
OOM When Building Large Solutions #4477

Closed
aolszowka opened this issue Jun 27, 2019 · 13 comments

@aolszowka

This topic has been touched on many times, and I know these problems are difficult to track down, but I am committed to finding a solution for us, since this affects many developers internally. I am willing to try anything to get this resolved and to see it through to the end.

We cannot quite get to Visual Studio 2019 yet (we are blocked on getting AnkhSVN support and a few other VSIX tooling packages).

Environment data

msbuild /version output:
Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.

OS info:
Windows 10 1803

If applicable, version of the tool that invokes MSBuild (Visual Studio, dotnet CLI, etc):
Visual Studio 2017 15.9.12

Issue

Pretty regularly (but not on demand) we encounter Out of Memory (OOM) errors in Visual Studio while the solution is just "sitting". We assume some background compiler process is killing us, but it's hard to prove (we are not skilled enough to do so). We'll see various stack traces in Event Viewer; here are two of them:

Trace 1

Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.OutOfMemoryException
   at System.Text.StringBuilder..ctor(System.String, Int32, Int32, Int32)
   at Microsoft.Build.InterningBinaryReader.ReadString()
   at Microsoft.Build.Framework.ProjectStartedEventArgs.CreateFromStream(System.IO.BinaryReader, Int32)
   at Microsoft.Build.Shared.LogMessagePacketBase.ReadFromStream(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.Shared.LogMessagePacketBase.Translate(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.LogMessagePacket.FactoryForDeserialization(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodePacketFactory+PacketFactoryRecord.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodePacketFactory.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.NodePacketType, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodeManager.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.NodePacketType, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodeProviderOutOfProcBase+NodeContext.ReadAndRoutePacket(Microsoft.Build.BackEnd.NodePacketType, Byte[], Int32)
   at Microsoft.Build.BackEnd.NodeProviderOutOfProcBase+NodeContext.BodyReadComplete(System.IAsyncResult)
   at System.IO.Pipes.PipeStream.AsyncPSCallback(UInt32, UInt32, System.Threading.NativeOverlapped*)
   at System.Threading._IOCompletionCallback.PerformIOCompletionCallback(UInt32, UInt32, System.Threading.NativeOverlapped*)

Trace 2

Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.OutOfMemoryException
   at System.Runtime.Serialization.ObjectManager.RegisterString(System.String, Int64, System.Runtime.Serialization.SerializationInfo, Int64, System.Reflection.MemberInfo)
   at System.Runtime.Serialization.Formatters.Binary.ObjectReader.RegisterObject(System.Object, System.Runtime.Serialization.Formatters.Binary.ParseRecord, System.Runtime.Serialization.Formatters.Binary.ParseRecord, Boolean)
   at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ParseMember(System.Runtime.Serialization.Formatters.Binary.ParseRecord)
   at System.Runtime.Serialization.Formatters.Binary.ObjectReader.Parse(System.Runtime.Serialization.Formatters.Binary.ParseRecord)
   at System.Runtime.Serialization.Formatters.Binary.__BinaryParser.ReadObjectString(System.Runtime.Serialization.Formatters.Binary.BinaryHeaderEnum)
   at System.Runtime.Serialization.Formatters.Binary.__BinaryParser.Run()
   at System.Runtime.Serialization.Formatters.Binary.ObjectReader.Deserialize(System.Runtime.Remoting.Messaging.HeaderHandler, System.Runtime.Serialization.Formatters.Binary.__BinaryParser, Boolean, Boolean, System.Runtime.Remoting.Messaging.IMethodCallMessage)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize(System.IO.Stream, System.Runtime.Remoting.Messaging.HeaderHandler, Boolean, Boolean, System.Runtime.Remoting.Messaging.IMethodCallMessage)
   at Microsoft.Build.BackEnd.NodePacketTranslator+NodePacketReadTranslator.TranslateDotNet[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]](System.__Canon ByRef)
   at Microsoft.Build.Shared.LogMessagePacketBase.ReadFromStream(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.Shared.LogMessagePacketBase.Translate(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.LogMessagePacket.FactoryForDeserialization(Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodePacketFactory+PacketFactoryRecord.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodePacketFactory.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.NodePacketType, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodeManager.DeserializeAndRoutePacket(Int32, Microsoft.Build.BackEnd.NodePacketType, Microsoft.Build.BackEnd.INodePacketTranslator)
   at Microsoft.Build.BackEnd.NodeProviderOutOfProcBase+NodeContext.ReadAndRoutePacket(Microsoft.Build.BackEnd.NodePacketType, Byte[], Int32)
   at Microsoft.Build.BackEnd.NodeProviderOutOfProcBase+NodeContext.BodyReadComplete(System.IAsyncResult)
   at System.IO.Pipes.PipeStream.AsyncPSCallback(UInt32, UInt32, System.Threading.NativeOverlapped*)
   at System.Threading._IOCompletionCallback.PerformIOCompletionCallback(UInt32, UInt32, System.Threading.NativeOverlapped*)
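
To illustrate what I think is happening (this is my own sketch, not MSBuild's actual code, and PacketStringReader is a made-up name): both traces fail while deserializing logging packets that out-of-proc build nodes send back to devenv.exe over a named pipe. In that pattern the allocation size is dictated entirely by a length prefix read off the pipe, so a flood of very large log strings becomes a flood of large contiguous allocations, which a 32-bit devenv.exe process cannot always satisfy even when physical RAM is free.

using System.IO;
using System.Text;

// Sketch only, not MSBuild's implementation: a length-prefixed string read
// from a node's pipe. The buffer size comes straight off the stream, so very
// large logged strings force very large contiguous allocations in devenv.exe.
static class PacketStringReader
{
    public static string ReadString(BinaryReader reader)
    {
        int byteLength = reader.ReadInt32();          // length prefix from the pipe
        byte[] buffer = reader.ReadBytes(byteLength); // contiguous allocation; OOM risk when huge
        return Encoding.UTF8.GetString(buffer);
    }
}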

I have been searching the issues page for similar stack traces and have found two that look most closely related and are still open:

#3577
#3210

Specifically, I am basing that on the Microsoft.Build.Shared.LogMessagePacketBase.ReadFromStream frame that appears in both.

Is there anything more we can do to try and track this down? I am setting up ProcDump on a few machines and will report back if I get any dumps. Beyond that, is there anything more we can do?
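
For reference, our ProcDump setup looks roughly like this (the dump path and filename are placeholders of ours; the flags are from the Sysinternals documentation):

REM -accepteula  suppress the EULA prompt
REM -ma          write a full memory dump
REM -e           dump when the process hits an unhandled exception (the OOM)
REM -w           wait for devenv.exe to start if it is not running yet
procdump -accepteula -ma -e -w devenv.exe C:\Dumps\devenv_oom.dmp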

@livarcocc
Contributor

@davkean since you have looked into a bunch of these, is there any workaround available?

@livarcocc livarcocc added this to the Discussion milestone Jun 27, 2019
@KonanM

KonanM commented Jun 16, 2020

I also want to note that we are facing the same issue and running into out-of-memory crashes quite regularly on huge solutions :-( You just leave VS running in the background, and at some point it crashes with stack traces similar to those shown above.

@aolszowka
Author

These issues persist even in Visual Studio 2019 16.6.2, and we have not been given any guidance on how best to track them down.

@davkean
Member

davkean commented Jun 16, 2020

Please report this via Help -> Send Feedback -> Report a Problem. This lets us correlate the Watson crashes with your sessions and helps us identify the underlying causes. I'm going to close this bug, as the stacks alone are not enough to identify the causes.

@davkean davkean closed this as completed Jun 16, 2020
@aolszowka
Author

Just to be clear: you are proposing that after we encounter the OOM, we file a report from within the Visual Studio feedback reporter? When we crash with an OOM, there is no way to report this from within Visual Studio?

@davkean
Member

davkean commented Jun 17, 2020

You are not going to catch the OOM unless you merge https://github.com/dotnet/project-system/blob/master/docs/repo/content/AlwaysSaveDevEnvCrashDumps.reg, which will save dumps automatically to a directory on disk. However, reporting a problem lets us look at past sessions and data about them (free virtual memory, managed heap size, etc.) and find associated past crashes.
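
For reference, that .reg file turns on Windows Error Reporting's LocalDumps feature for devenv.exe; the general shape is below (the values here are illustrative, the linked file is authoritative, and if no DumpFolder is set, dumps land in %LOCALAPPDATA%\CrashDumps):

Windows Registry Editor Version 5.00

; Illustrative only; see the linked .reg file for the real contents.
; DumpType 2 = full dump; DumpCount caps how many dumps are kept.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\devenv.exe]
"DumpCount"=dword:00000005
"DumpType"=dword:00000002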

@aolszowka
Author

We have ProcDump (https://docs.microsoft.com/en-us/sysinternals/downloads/procdump) set up on these boxes with /ma (much better than what Watson could give us, as we have the entire memory dump), and we are more than happy to share the dumps privately. We will apply the above registry key as instructed, though.

@davkean
Member

davkean commented Jun 17, 2020

That's fine, but for privacy reasons we'll need you to upload it via Send Feedback as an attachment.

@aolszowka
Author

The Send Feedback tooling crashes when you attempt to upload large dumps (>4 GB). We will try again, though. Thank you for your efforts.

@davkean
Member

davkean commented Jun 17, 2020

@aolszowka I forgot about that; I've started a thread to see if we can get the limit increased, or whether it already has been. If you add a private comment pointing to the dump, we can figure out how to get it into our system while respecting GDPR.

@aolszowka
Author

We appreciate the efforts; we really just want to get this fixed and are willing to jump through whatever hoops we need to. I am recreating a dump as we speak.

@mariaghiondea

@aolszowka please give me more details on how you are trying to upload and what you are seeing afterwards. You can follow the steps at https://docs.microsoft.com/en-us/visualstudio/ide/how-to-report-a-problem-with-visual-studio?view=vs-2019 and let me know which step fails.

@KirillOsenkov
Member

This should be mostly mitigated by #6155

@AR-May AR-May added the triaged label Feb 21, 2024