
Nock record outputs array of hex values instead of plain JSON when response is encoded #1212

Open
richardscarrott opened this issue Sep 9, 2018 · 9 comments


richardscarrott commented Sep 9, 2018

Related: #457 (comment)

What is the expected behavior?
When recording responses that are compressed and chunked, e.g. Content-Encoding: 'gzip' and Transfer-Encoding: 'chunked', I was expecting the generated fixture to decompress and combine the chunks into a human-readable response; this would then allow us to remove sensitive information from the response and modify it to satisfy scenarios that are hard to reproduce against the real APIs.

What is the actual behavior?
The fixture's response is an array of hex strings (the raw gzipped body chunks).

Possible solution
I'm working around this by modifying the nockDefs like this:

const { ungzip } = require('node-gzip');

// Runs per recorded definition, inside an async function (ungzip returns a promise)
try {
  if (Array.isArray(def.response)) { // NOTE: this is a very naive check
    def.response = JSON.parse(
      (await ungzip(Buffer.from(def.response.join(''), 'hex'))).toString(
        'utf-8'
      )
    );
  }
} catch (ex) {
  console.warn('Failed to decode response');
}

If this is in fact a bug, it'd be good to fix it in nock itself -- if it's intended, i.e. nock is technically returning the exact same response as the real server, then perhaps an option could be passed to nock record / nock.back to have it output the decompressed response?

How to reproduce the issue
The issue can be reproduced by recording an API which returns compressed, chunked JSON.
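For reference, a minimal recording setup that reproduces it could look like the sketch below (the endpoint URL is illustrative; any API that responds with Content-Encoding: gzip and Transfer-Encoding: chunked will do):

const nock = require('nock');
const https = require('https');

// Start recording; output_objects yields nock definition objects instead of code strings
nock.recorder.rec({ output_objects: true, dont_print: true });

https.get(
  { hostname: 'api.example.com', path: '/data', headers: { 'Accept-Encoding': 'gzip' } },
  res => {
    res.resume();
    res.on('end', () => {
      const defs = nock.recorder.play();
      // defs[0].response is an array of hex strings rather than plain JSON
      console.log(defs[0].response);
    });
  }
);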

Does the bug have a test case?
https://github.com/richardscarrott/nock-record-chunked-encoding

Versions

Software Version(s)
Nock 9.6.1
Node 10.5.0
@richardscarrott (Author)

FYI, this is the complete afterNock function I'm using

import * as nock from 'nock';
import { gunzipSync } from 'zlib';

type Dictionary<T> = { [key: string]: T };

const parseNockDefs = (
  nockDefs: (nock.NockDefinition & { rawHeaders: string[] })[]
) => {
  return nockDefs.map(def => {
    try {
      // Collapse rawHeaders ([name, value, name, value, ...]) into a lowercased map
      const headers = def.rawHeaders.reduce<Dictionary<string>>(
        (acc, curr, i, arr) => {
          if (i % 2 === 0) {
            acc[arr[i].toLowerCase()] = arr[i + 1].toLowerCase();
          }
          return acc;
        },
        {}
      );
      if (
        headers['transfer-encoding'] === 'chunked' &&
        headers['content-encoding'] === 'gzip' &&
        Array.isArray(def.response)
      ) {
        // Join the hex chunks, gunzip them, and parse the result back into plain JSON
        def.response = JSON.parse(
          gunzipSync(Buffer.from(def.response.join(''), 'hex')).toString(
            'utf-8'
          )
        );
        // Drop the encoding headers, which no longer describe the stored response
        def.rawHeaders = Object.entries(headers).flatMap(([key, value]) => {
          if (key === 'transfer-encoding' || key === 'content-encoding') {
            return [];
          }
          return [key, value];
        });
      }
    } catch (ex) {
      console.warn('Failed to decode response');
    }
    return def;
  });
};
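For context, a hook like this can be handed to nock's nockBack as its afterRecord option; a rough sketch of the wiring (the fixture name and fixtures path are illustrative):

const nockBack = require('nock').back;

nockBack.fixtures = __dirname + '/fixtures'; // illustrative fixtures directory
nockBack.setMode('record');

nockBack('api-fixture.json', { afterRecord: parseNockDefs }, nockDone => {
  // run the code under test that makes the HTTP calls, then persist the fixture
  nockDone();
});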


stale bot commented Dec 10, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. We try to do our best, but nock is maintained by volunteers and there is only so much we can do at a time. Thank you for your contributions.

stale bot added the stale label Dec 10, 2018
gr2m (Member) commented Dec 10, 2018

We would love it if you or someone else could work on this issue.

@protoEvangelion

@richardscarrott thanks for pointing me in the right direction with this. The additional problem I had to figure out was that I needed to re-gzip after normalizing. This was because https://github.com/octokit/rest.js expects the response to be gzipped.

To work with nockBack, without changing nock internally, here is what I did in the afterRecord callback:

const zlib = require('zlib')

function decodeBuffer(fixture) {
  // Decode the hex buffer that nock made
  const response = Array.isArray(fixture.response) ? fixture.response.join('') : fixture.response

  let unzipped
  try {
    const decoded = Buffer.from(response, 'hex')
    unzipped = zlib.gunzipSync(decoded).toString('utf-8')
  } catch (err) {
    throw new Error(`Error decoding nock hex:\n${err}`)
  }

  return JSON.parse(unzipped)
}

function afterRecord(fixtures) {
  const normalizedFixtures = fixtures.map(fixture => {
    fixture.response = decodeBuffer(fixture)

    // do normalization stuff
    // Re-gzip to keep @octokit/rest happy
    const stringified = JSON.stringify(fixture.response)
    const zipped = zlib.gzipSync(stringified)

    fixture.response = zipped

    return fixture
  })

  return normalizedFixtures
}


stale bot commented Apr 10, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. We try to do our best, but nock is maintained by volunteers and there is only so much we can do at a time. Thank you for your contributions.

stale bot added the stale label Apr 10, 2019
stale bot closed this as completed Apr 17, 2019
paulmelnikow reopened this Apr 17, 2019
stale bot removed the stale label Apr 17, 2019
@paulmelnikow (Member)

This has a PR: #1496

protoEvangelion pushed a commit to protoEvangelion/gh that referenced this issue Apr 19, 2019

stale bot commented Jul 16, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. We try to do our best, but nock is maintained by volunteers and there is only so much we can do at a time. Thank you for your contributions.

@quarhodron

If the problem is only the content encoding, then setting 'Accept-Encoding': 'identity' on requests should fix it; at least that's when I started getting plain JSON.
I haven't tested it with a chunked response.
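For example, with Node's https module the header can be set like this (a sketch; the URL is illustrative):

const https = require('https');

// Ask the server not to compress the body, so nock records plain text
https.get(
  { hostname: 'api.example.com', path: '/data', headers: { 'Accept-Encoding': 'identity' } },
  res => {
    let body = '';
    res.on('data', chunk => (body += chunk));
    res.on('end', () => console.log(JSON.parse(body)));
  }
);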


sadams commented May 26, 2022

Have taken a stab at making a PR to implement this: #2359
