There are two distinct memory leaks: one in Windows PowerShell 5.1 and one in PowerShell 7.4.2.
Details
- [PS 5.1] `Write-Output` leaks memory when given a large `byte[]` as the input object to output to the success stream
- [PS 7.4] `[System.Convert]::FromBase64String` leaks memory when given a large `string` to convert to a `byte[]`
Code
The specific places where this issue occurs can be seen on lines 358 and 365 of `PowerPass.Common.ps1`.
```powershell
# PowerShell 7.4.2
# Memory leak: when calling FromBase64String on a large string, such as one that
# is over 100 million characters long, the .NET runtime starts to leak memory and
# the memory usage of pwsh.exe increases dramatically without end until the process
# is terminated manually by closing PowerShell.
$file = [System.Convert]::FromBase64String( $attachment.Data )
```
```powershell
# Windows PowerShell 5.1
# Memory leak: Windows PowerShell 5.1 will leak memory when a large byte[] is written
# to the output stream. This command may never actually finish. In PowerShell 7, this
# issue does not exist as the cmdlet immediately outputs the reference to the byte[]
# to the caller and does not leak memory during Write-Output.
Write-Output -InputObject $file -NoEnumerate
```
Synopsis
There is a known issue with the `[System.Convert]::FromBase64String` method in the .NET runtime which causes long delays and excessive memory usage when retrieving large attachments from your Locker. This issue was discovered when attempting to load a 200 MB binary file into a PowerPass Locker as an attachment. It was traced to a memory leak in the .NET runtime which is present in PowerShell 7.4.2, but not in Windows PowerShell 5.1.
Attachments are encoded as base64 text so they can be represented in JSON and serialized to disk, then deserialized back into memory, without losing their structure. While this increases the storage overhead required to save a binary file in a PowerPass Locker, PowerPass was never intended for large file storage, although it can be used that way.
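To put a rough number on that overhead: base64 encodes every 3 bytes of binary data as 4 text characters, so the stored form grows by about a third. A quick back-of-the-envelope calculation for the 200 MB attachment discussed here:

```powershell
# Base64 encodes 3 bytes as 4 characters, so storage grows by roughly 33%
# (plus a few bytes of '=' padding at the end of the string).
$binarySizeMB = 200
$base64SizeMB = [Math]::Ceiling( $binarySizeMB * 4 / 3 )
"$binarySizeMB MB binary -> ~$base64SizeMB MB of base64 text"   # 200 MB -> ~267 MB
```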
However, when attachment file size increases substantially, into the 200 MB and larger territory, the conversion functions native to .NET start to leak memory as the massive base64-encoded attachment string is converted back into a native `byte[]` data structure. The leak also causes a substantial delay in the conversion, upwards of 80 seconds for a 200 MB attachment. As a consequence, running `Read-PowerPassAttachment` may take a significant amount of time for large attachments.
Secondary Problem with Windows PowerShell 5.1
There is a secondary problem when using PowerPass in Windows PowerShell 5.1. When you call `Read-PowerPassAttachment` and expect to get the `byte[]` payload of the attachment back, if the attachment is large, the `Write-Output` call will leak memory and, apparently, never complete. This does not happen in PowerShell 7.4.2, where `Write-Output` completes immediately.
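One PowerShell 5.1 idiom that may sidestep the `Write-Output` call entirely is the unary comma operator, which wraps the array so the pipeline unwraps it exactly once instead of enumerating every byte. This is a sketch of the idea, not a verified fix for the leak; the function name and sample payload are invented for illustration:

```powershell
# Hypothetical sketch: return a byte[] from a function in Windows PowerShell 5.1
# without calling Write-Output -NoEnumerate. The unary comma wraps the array in a
# one-element outer array; the pipeline unwraps that outer array once, so the
# caller receives the byte[] itself rather than a stream of individual bytes.
$file = [byte[]](1..10)   # stand-in for a real attachment payload

function Get-PayloadByComma {
    ,$file
}

$result = Get-PayloadByComma
$result.GetType().Name   # Byte[]
```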
Workaround
Given the separate problems inherent in both PowerShell editions, and their correlation with large binary attachments, it is recommended at this time to avoid using PowerPass to store large binary attachments, i.e., files over 10 MiB in size.
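If large attachments are unavoidable, one mitigation worth trying (not part of PowerPass itself, and sketched here under the assumption that the base64 payload is a plain string with no embedded line breaks) is to decode the string in slices whose length is a multiple of 4, so that each slice is a self-contained run of base64 quads and no single `FromBase64String` call ever sees the whole payload:

```powershell
# Hypothetical sketch: decode a large base64 string in fixed-size chunks rather
# than one giant FromBase64String call. This works because base64 groups output
# in 4-character quads, so any slice whose length is a multiple of 4 decodes
# independently. The 3000-byte sample payload and 1024-character chunk size are
# examples only; a real attachment would be far larger.
[byte[]]$raw = 1..3000 | ForEach-Object { $_ % 256 }
$data = [System.Convert]::ToBase64String( $raw )   # 4000 characters, no line breaks

$chunkSize = 1024   # must be a multiple of 4
$out = New-Object System.IO.MemoryStream
for( $i = 0; $i -lt $data.Length; $i += $chunkSize ) {
    $len   = [Math]::Min( $chunkSize, $data.Length - $i )
    $bytes = [System.Convert]::FromBase64String( $data.Substring( $i, $len ) )
    $out.Write( $bytes, 0, $bytes.Length )
}
[byte[]]$decoded = $out.ToArray()
$decoded.Length   # 3000
```

For a file-sized payload, writing each decoded chunk to a `FileStream` instead of a `MemoryStream` would keep only one chunk's worth of bytes in memory at a time.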