Nock record outputs array of hex values instead of plain JSON when response is encoded #1212
FYI, this is the complete workaround:

```ts
import * as nock from 'nock';
import { gunzipSync } from 'zlib';

type Dictionary<T> = { [key: string]: T };

const parseNockDefs = (
  nockDefs: (nock.NockDefinition & { rawHeaders: string[] })[]
) => {
  return nockDefs.map(def => {
    try {
      // rawHeaders is a flat [name, value, name, value, ...] array
      const headers = def.rawHeaders.reduce<Dictionary<string>>(
        (acc, curr, i, arr) => {
          if (i % 2 === 0) {
            acc[arr[i].toLowerCase()] = arr[i + 1].toLowerCase();
          }
          return acc;
        },
        {}
      );
      if (
        headers['transfer-encoding'] === 'chunked' &&
        headers['content-encoding'] === 'gzip' &&
        Array.isArray(def.response)
      ) {
        // Join the recorded hex chunks, gunzip, and parse back to JSON
        def.response = JSON.parse(
          gunzipSync(Buffer.from(def.response.join(''), 'hex')).toString(
            'utf-8'
          )
        );
        // Drop the encoding headers since the response is now plain JSON
        def.rawHeaders = Object.entries(headers).flatMap(([key, value]) => {
          if (key === 'transfer-encoding' || key === 'content-encoding') {
            return [];
          }
          return [key, value];
        });
      }
    } catch (ex) {
      console.warn('Failed to decode response');
    }
    return def;
  });
};
```
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. We try to do our best, but nock is maintained by volunteers and there is only so much we can do at a time. Thank you for your contributions.
We would love it if you or someone else could work on this issue.
@richardscarrott thanks for pointing me in the right direction with this. The additional problem I had to figure out was that I needed to re-gzip after normalizing, because https://github.com/octokit/rest.js expects the response to be gzipped. This is what I ended up with:

```js
const zlib = require('zlib');

function decodeBuffer(fixture) {
  // Decode the hex buffer that nock made
  const response = Array.isArray(fixture.response)
    ? fixture.response.join('')
    : fixture.response;
  let unzipped;
  try {
    const decoded = Buffer.from(response, 'hex');
    unzipped = zlib.gunzipSync(decoded).toString('utf-8');
  } catch (err) {
    throw new Error(`Error decoding nock hex:\n${err}`);
  }
  return JSON.parse(unzipped);
}

function afterRecord(fixtures) {
  const normalizedFixtures = fixtures.map(fixture => {
    fixture.response = decodeBuffer(fixture);
    // do normalization stuff
    // Re-gzip to keep @octokit/rest happy
    const stringified = JSON.stringify(fixture.response);
    const zipped = zlib.gzipSync(stringified);
    fixture.response = zipped;
    return fixture;
  });
  return normalizedFixtures;
}
```
This has a PR: #1496
If someone has only a problem with encoding, then setting
Have taken a stab at making a PR to implement this: #2359
Related: #457 (comment)
What is the expected behavior?

When recording responses which are compressed and chunked, e.g. Content-Encoding: 'gzip' and Transfer-Encoding: 'chunked', I was expecting the generated fixture to decompress and combine the chunks into a human-readable response; this would then allow us to remove sensitive information from the response and modify it to satisfy scenarios which are hard to reproduce against the real APIs.

What is the actual behavior?

The fixture's response is an array of hex values.
Possible solution
I'm working around this by modifying the nockDefs like this:
If this is in fact a bug then it'd be good to fix it in nock itself -- if it's as intended, i.e. nock is technically returning the exact same response as the real server, then perhaps an option could be passed to nock record / nock.back to have it output the decompressed response?

How to reproduce the issue

The issue can be reproduced by recording an API which returns compressed, chunked JSON.
Does the bug have a test case?
https://github.com/richardscarrott/nock-record-chunked-encoding
Versions