Support upload from named pipes #748

Merged
merged 1 commit into from Nov 29, 2021
54 changes: 54 additions & 0 deletions packages/artifact/__tests__/upload.test.ts
@@ -2,6 +2,10 @@ import * as http from 'http'
import * as io from '../../io/src/io'
import * as net from 'net'
import * as path from 'path'
import {mocked} from 'ts-jest/utils'
import {exec, execSync} from 'child_process'
import {createGunzip} from 'zlib'
import {promisify} from 'util'
import {UploadHttpClient} from '../src/internal/upload-http-client'
import * as core from '@actions/core'
import {promises as fs} from 'fs'
@@ -174,6 +178,56 @@ describe('Upload Tests', () => {
expect(uploadResult.uploadSize).toEqual(expectedTotalSize)
})

function hasMkfifo(): boolean {
try {
// make sure we drain the stdout
return execSync('which mkfifo').toString().length > 0
} catch (e) {
return false
}
}
const withMkfifoIt = hasMkfifo() ? it : it.skip
withMkfifoIt(
'Upload Artifact with content from named pipe - Success',
async () => {
// create a named pipe 'pipe' with content 'hello pipe'
const content = Buffer.from('hello pipe')
const pipeFilePath = path.join(root, 'pipe')
await promisify(exec)('mkfifo pipe', {cwd: root})
// don't want to await here as that would block until read
fs.writeFile(pipeFilePath, content)

const artifactName = 'successful-artifact'
const uploadSpecification: UploadSpecification[] = [
{
absoluteFilePath: pipeFilePath,
uploadFilePath: `${artifactName}/pipe`
}
]

const uploadUrl = `${getRuntimeUrl()}_apis/resources/Containers/13`
const uploadHttpClient = new UploadHttpClient()
const uploadResult = await uploadHttpClient.uploadArtifactToFileContainer(
uploadUrl,
uploadSpecification
)

// accesses the ReadableStream that was passed into sendStream
// eslint-disable-next-line @typescript-eslint/unbound-method
const stream = mocked(HttpClient.prototype.sendStream).mock.calls[0][2]
Contributor:
Pretty cool trick, didn't know about this 👍

expect(stream).not.toBeNull()
// decompresses the passed stream
const data: Buffer[] = []
for await (const chunk of stream.pipe(createGunzip())) {
data.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk as string))
}
const uploaded = Buffer.concat(data)

expect(uploadResult.failedItems.length).toEqual(0)
expect(uploaded).toEqual(content)
}
)

it('Upload Artifact - Failed Single File Upload', async () => {
const uploadSpecification: UploadSpecification[] = [
{
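The reviewer comment on the mocked sendStream call above refers to ts-jest's mocked() helper, which gives typed access to a jest mock's recorded calls. A minimal, hypothetical sketch of that pattern, assuming the HttpClient module has been auto-mocked with jest.mock elsewhere in the test setup:

import {mocked} from 'ts-jest/utils'
import {HttpClient} from '@actions/http-client'

jest.mock('@actions/http-client')

// after the code under test has called client.sendStream(verb, url, stream),
// the typed mock exposes the recorded arguments of that call
const sendStream = mocked(HttpClient.prototype.sendStream)
const capturedStream = sendStream.mock.calls[0][2] // third argument: the ReadableStream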
15 changes: 11 additions & 4 deletions packages/artifact/src/internal/upload-http-client.ts
@@ -219,16 +219,22 @@
httpClientIndex: number,
parameters: UploadFileParameters
): Promise<UploadFileResult> {
-const totalFileSize: number = (await stat(parameters.file)).size
+const fileStat: fs.Stats = await stat(parameters.file)
+const totalFileSize = fileStat.size
+// on Windows with mkfifo from MSYS2 stats.isFIFO returns false, so we check if running on Windows node and
+// if the file has size of 0 to compensate
+const isFIFO =
+  fileStat.isFIFO() || (process.platform === 'win32' && totalFileSize === 0)
Contributor:

Changes look good to me! Would like to get one more pair of eyes on this before merging

@yacaovsnc could you quickly glance over this PR? The only slight concern I have is the (process.platform === 'win32' && totalFileSize === 0) block. I'm not sure if there are any other edge cases where on Windows the file size would be reported as 0 and we don't want to go with an in-memory buffer.

Contributor:

I think this is 👍🏼. The logic here will make sure any file with size 0 uses the file-on-disk approach, which should be safer than the in-memory buffer.
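
For reference, the branch being discussed can be condensed into a small sketch (simplified; not the exact upload-http-client implementation):

import {promises as fs} from 'fs'

async function chooseUploadPath(file: string): Promise<'in-memory-gzip' | 'disk-gzip-stream'> {
  const stats = await fs.stat(file)
  // MSYS2 mkfifo on Windows reports isFIFO() === false, so a zero-size file on win32
  // is also treated as a pipe; a genuinely empty regular file then simply takes the
  // disk-based streaming path, which is the safer of the two
  const isFIFO = stats.isFIFO() || (process.platform === 'win32' && stats.size === 0)
  // only small, regular files are gzipped into an in-memory buffer
  return !isFIFO && stats.size < 65536 ? 'in-memory-gzip' : 'disk-gzip-stream'
}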

let offset = 0
let isUploadSuccessful = true
let failedChunkSizes = 0
let uploadFileSize = 0
let isGzip = true

-// the file that is being uploaded is less than 64k in size, to increase throughput and to minimize disk I/O
+// the file that is being uploaded is less than 64k in size to increase throughput and to minimize disk I/O
// for creating a new GZip file, an in-memory buffer is used for compression
-if (totalFileSize < 65536) {
+// with named pipes the file size is reported as zero in that case don't read the file in memory
+if (!isFIFO && totalFileSize < 65536) {
const buffer = await createGZipFileInBuffer(parameters.file)

//An open stream is needed in the event of a failure and we need to retry. If a NodeJS.ReadableStream is directly passed in,
@@ -287,7 +293,8 @@ export class UploadHttpClient {
let uploadFilePath = tempFile.path

// compression did not help with size reduction, use the original file for upload and delete the temp GZip file
-if (totalFileSize < uploadFileSize) {
+// for named pipes totalFileSize is zero, this assumes compression did help
+if (!isFIFO && totalFileSize < uploadFileSize) {
uploadFileSize = totalFileSize
uploadFilePath = parameters.file
isGzip = false
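End to end, the change lets an artifact be uploaded straight from a named pipe, as the new test does. A hypothetical usage sketch, assuming the package's v1 create()/uploadArtifact() client API and a POSIX mkfifo on the runner:

import * as artifact from '@actions/artifact'
import {promises as fs} from 'fs'
import {execSync} from 'child_process'
import * as path from 'path'

async function uploadFromNamedPipe(dir: string): Promise<void> {
  const pipePath = path.join(dir, 'stream.log')
  execSync('mkfifo stream.log', {cwd: dir}) // create the named pipe
  // do not await: the write blocks until the uploader opens the pipe for reading
  void fs.writeFile(pipePath, 'content produced by another process')
  const client = artifact.create()
  await client.uploadArtifact('pipe-artifact', [pipePath], dir)
}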