
Faster tokenizer lookahead #13341

Merged — 8 commits, merged May 26, 2021
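The PR title names the technique. As a minimal sketch of the idea (hypothetical names, not Babel's actual tokenizer API): lookahead can be made cheap by snapshotting and restoring only the tokenizer's scalar state, rather than cloning the entire parser state before peeking at the next token.

```javascript
// Hypothetical sketch of snapshot-based tokenizer lookahead.
// Names (Tokenizer, nextToken, lookahead) are illustrative only.
class Tokenizer {
  constructor(input) {
    this.input = input;
    this.pos = 0;
  }

  // Read the next whitespace-delimited token, advancing `pos`.
  nextToken() {
    while (this.pos < this.input.length && this.input[this.pos] === " ") this.pos++;
    const start = this.pos;
    while (this.pos < this.input.length && this.input[this.pos] !== " ") this.pos++;
    return start === this.pos ? null : this.input.slice(start, this.pos);
  }

  // Peek at the next token without consuming it: save the few scalar
  // fields that make up tokenizer state, scan ahead, then restore them.
  // This is far cheaper than deep-cloning a whole parser.
  lookahead() {
    const snapshot = { pos: this.pos }; // copy scalars, not the parser
    const token = this.nextToken();
    this.pos = snapshot.pos; // rewind to where we started
    return token;
  }
}

const tok = new Tokenizer("let x = 1");
console.log(tok.lookahead()); // "let" — position unchanged
console.log(tok.nextToken()); // "let" — now actually consumed
```

The design point is that only the fields the scanner mutates need saving and restoring, so lookahead costs a couple of property copies instead of an allocation-heavy clone.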

Commits on May 25, 2021

  1. 37afbd9
  2. add benchmark (723ff6e) — JLHwung committed May 25, 2021
  3. 2660173
  4. Update packages/babel-parser/src/tokenizer/index.js (b1ff7ea) — JLHwung and existentialism committed May 25, 2021; Co-authored-by: Brian Ng <bng412@gmail.com>
  5. Update packages/babel-parser/src/tokenizer/index.js (1453b86) — JLHwung and existentialism committed May 25, 2021; Co-authored-by: Brian Ng <bng412@gmail.com>
  6. remove irrelevant comment (7a47bb2) — JLHwung committed May 25, 2021
  7. c1a6957
  8. add test cases (8dafa65) — JLHwung committed May 25, 2021