Memory leak with LogEventInfo object #2220

Closed
earlgumbo3 opened this issue Jul 21, 2017 · 7 comments
Comments

@earlgumbo3

Hi! Thanks for reporting this feature/bug/question!

Please keep / fill in the relevant info from this template so that we can help you as best as possible.

For .NET Core users, please check the platform support: https://github.com/NLog/NLog/wiki/platform-support

Type (choose one):

  • Bug

NLog version: All

Platform: .Net 4.0 / .Net 4.5

Current NLog config (xml or C#, if relevant)

<nlog>
	<variable name="HeaderLayout" value="${longdate}|INFO|General|${threadid}|MachineName: &#39;${machinename}&#39;, OS: &#39;${gdc:item=os}&#39;, Platform: &#39;${gdc:item=platform}&#39; ProcessId: &#39;${processid}&#39;, ProcessorCount: &#39;${gdc:item=processorCount}&#39;, TotalMemory: &#39;${gdc:item=totalMemory}&#39;, AppVersion: &#39;${gdc:item=appVersion}&#39;" />
	<variable name="StandardLayout" value="${longdate}|${level:uppercase=true}|${logger}|${threadid}|${message}" />
	<extensions>
		<add assembly="Aprimo.Diagnostics.ApplicationInsights" />
	</extensions>
	<targets>
		<target type="AsyncWrapper" name="asyncAprimoAi" overflowAction="Grow">
			<target type="AprimoAITarget" name="aiTarget" />
		</target>
		<target type="AsyncWrapper" name="asyncFile" overflowAction="Grow">
			<target name="file" type="File" fileName="${gdc:item=rootPath}\${machinename}.${gdc:item=appName}.log" archiveNumbering="Sequence" archiveAboveSize="10485760" maxArchiveFiles="10" archiveFileName="${gdc:item=rootPath}\Archives\${machinename}.${gdc:item=appName}.{#}.log" autoFlush="True">
				<layout type="LayoutWithHeaderAndFooter">
					<header type="SimpleLayout" text="${HeaderLayout}" />
					<layout type="SimpleLayout" text="${StandardLayout}" />
					<footer type="SimpleLayout" text= "" />
				</layout>
			</target>
		</target>
		<target type="AsyncWrapper" name="asyncEvent" overflowAction="Grow">
			<target type="EventLog" name="event" log="${gdc:item=eventLog}" source="${gdc:item=appName}"/>
		</target>
	</targets>
	<rules>
		<logger name="*" minlevel="Info" writeTo="asyncFile"/>
		<logger name="*" minlevel="Error" writeTo="asyncEvent"/>
		<logger name="*" minlevel="Warn" writeTo="asyncAprimoAi" />
	</rules>
</nlog>

In case of a BUG:

  • What is the current result?
    There is a large increase in memory consumption when the logging level is set to "Trace" and the application makes a large number of verbose logging calls. NLog objects (specifically LogEventInfo instances and their associated string data) are promoted to the Gen2 or higher heaps, where they persist longer than necessary. This leads to high memory consumption, or even overflow.

  • What is the expected result?
    The LogEventInfo objects should be eligible for collection once the event is written to all applicable NLog targets.

  • Did you check the internal log?
    Yes.

  • Please post full exception details (message, stacktrace, inner exceptions)
    There are no exception details for this issue. It will require a memory dump analysis to determine if the objects are being disposed of properly before they end up in higher generation memory heaps.

  • Are there any workarounds? yes/no
    The workaround for this issue is to turn logging levels up to "Warn" or above, and ensure that the underlying application is not making excessive logging calls that would cause additional NLog objects to be created.

  • Is there a version in which it did work?
    No.

  • Can you help us by writing a unit test?
    N/A

@snakefoot
Contributor

snakefoot commented Jul 21, 2017

Upgrade to newest NLog, and add the following setting to your AsyncWrapper-configurations:

timeToSleepBetweenBatches="0"

By default the AsyncWrapper keeps the LogEventInfo objects alive for 50 ms before flushing them to the wrapped target. This is long enough for the objects to be promoted to Gen1, causing unnecessary garbage. See also #1757
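
Applied to the configuration posted above, the setting would look like this (a fragment of the asyncFile wrapper; the other AsyncWrapper targets would take the same attribute):

```xml
<target type="AsyncWrapper" name="asyncFile" overflowAction="Grow"
        timeToSleepBetweenBatches="0">
	<target name="file" type="File" fileName="${gdc:item=rootPath}\${machinename}.${gdc:item=appName}.log">
		<!-- layout and archive settings unchanged -->
	</target>
</target>
```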

@snakefoot
Contributor

snakefoot commented Jul 22, 2017

You can also increase the batchSize for the AsyncWrapper configurations (increasing it beyond 1000 will cause extra memory allocations):

batchSize="1000"

This helps when using a File target with keepFileOpen="false" (the default configuration). If possible, change the File target to keepFileOpen="true"; if that is not possible, add bufferSize="4096" to the File target (no longer needed after #3444 in NLog 4.6.5).
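
Combining both suggestions, the asyncFile chain from the config above might look like this (a sketch; the attribute values are the ones discussed in this thread):

```xml
<target type="AsyncWrapper" name="asyncFile" overflowAction="Grow"
        timeToSleepBetweenBatches="0" batchSize="1000">
	<target name="file" type="File" keepFileOpen="true"
	        fileName="${gdc:item=rootPath}\${machinename}.${gdc:item=appName}.log">
		<!-- layout and archive settings unchanged -->
	</target>
</target>
```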

@earlgumbo3
Author

Thanks for those details. I will test them out.

Is there a code example for overriding the "Write" method for bulk handling in a custom target? I can't find any examples for the following overrides:
protected override void Write(IList<AsyncLogEventInfo> logEvents)
protected override void Write(AsyncLogEventInfo[] logEvents)
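
For reference, a minimal sketch of a bulk-write override (the target name and the SendBatch helper are hypothetical; RenderLogEvent is the NLog 4.5+ API, older versions use Layout.Render):

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using NLog.Common;
using NLog.Targets;

[Target("BulkDemo")] // hypothetical target name, for illustration only
public sealed class BulkDemoTarget : TargetWithLayout
{
    // Called by the AsyncWrapper with up to batchSize events at once.
    protected override void Write(IList<AsyncLogEventInfo> logEvents)
    {
        var batch = new StringBuilder();
        for (int i = 0; i < logEvents.Count; i++)
        {
            batch.AppendLine(RenderLogEvent(Layout, logEvents[i].LogEvent));
        }

        try
        {
            SendBatch(batch.ToString()); // hypothetical: one round-trip per batch
            for (int i = 0; i < logEvents.Count; i++)
                logEvents[i].Continuation(null); // report success per event
        }
        catch (Exception ex)
        {
            for (int i = 0; i < logEvents.Count; i++)
                logEvents[i].Continuation(ex); // report failure per event
        }
    }

    private void SendBatch(string payload)
    {
        // ... write the whole batch to the backing store in one operation ...
    }
}
```

The per-event Continuation calls matter: the AsyncWrapper relies on them to know the events have been handled.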

@snakefoot
Contributor

snakefoot commented Jul 24, 2017

Is there a code example for the overriding the "Write" method for bulk handling of a custom target? I can't find any examples for the overrides.

Just ignore my suggestions about improving the bulk handling in AprimoAITarget and EventLog. They are only used for occasional warnings and errors, so they will not be affected by lowering the log level to Debug or Trace for asyncFile.

@snakefoot
Contributor

@earlgumbo3 Were you able to resolve your problem? If so, can you tell us what fixed it? (And maybe close this issue.)

@earlgumbo3
Author

@snakefoot We are currently in QA and testing your suggested configuration changes. So far, my local testing looks good, and we are actually getting a performance boost from this configuration. I should be able to close this issue soon. The release will go RC tomorrow. Thanks for all the help. I'll be sure to close this once our official QA is completed.

@earlgumbo3
Author

This issue was resolved using the suggestions posted above.
