
Build next generation polyfill service using core-js polyfills #638

Open
neenhouse opened this issue Sep 4, 2019 · 6 comments
@neenhouse

neenhouse commented Sep 4, 2019

Overview

Looking forward, we'd like to build a next generation polyfill service that uses core-js polyfills at its core to capture next generation performance. As @zloirock put it, looking to the future:

A service like this one integrated with a good polyfills source like core-js, which only loads the needed polyfills by statically analyzing the source like Babel's useBuiltIns: usage option does could cause a revolution in the way we think about polyfills.

We've tested and built this service and wish to open source it back to the community with the help and input of @zloirock. We should be able to share the details of the implementation in an RFC here, and we are actively working on open sourcing the internal components we have already built. We intend to publish a demo of this service to corejs.io as a proof of concept, and we're hoping to co-maintain this package as an extension of corejs.io.

Motivation

World class performance starts with a fully optimized critical load path. Polyfills are mandatory in the critical load path if you are building a modern UI application. A service oriented approach is the best way to optimize modern applications, and core-js is a world-class source of polyfills.

Details

In order to distribute polyfill bundles, we need to build a service that can:

  • Accept cross-origin GET HTTP requests
  • Normalize the user-agent header to a small subset
  • Respond with optimized JavaScript bundles based on the core-js library

The challenge is implementing a service that can bundle and respond quickly. The approach to achieve this is inspired by polyfill.io: you mainly depend on caching responses in a CDN network to ultimately get performance that is close to the user. The second caching layer, where this approach can improve on polyfill.io, is using pre-computation in CDN compute services to optimize cold cache misses from the CDN.

The CDN cache layer should memoize input based on:

  • normalized user agent string based on browserslist query
  • content-encoding (none, gzip, brotli)
  • query string params
    • features - The feature set of core-js polyfills to use
    • version - The version of the resource to request
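The memoization criteria above can be combined into a deterministic cache key. A minimal sketch, assuming the user agent has already been normalized and that `features` arrives as an array of core-js module names (both assumptions for this example):

```javascript
// Hypothetical sketch of the CDN memoization key:
// normalized UA + content-encoding + version + sorted feature list.
function cacheKey({ userAgent, encoding, features, version }) {
  // Only the supported Content-Encoding tokens participate in the key;
  // anything else falls back to an uncompressed ("identity") response.
  const enc = ['gzip', 'br'].includes(encoding) ? encoding : 'identity';
  // Sort features so that "es.promise,es.symbol" and "es.symbol,es.promise"
  // memoize to the same cached bundle.
  const featureList = (features || []).slice().sort().join(',');
  return [userAgent, enc, version, featureList].join('|');
}
```

Sorting the feature list is the important detail: without it, equivalent requests that differ only in parameter order would fragment the CDN cache.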

When the CDN does not have a cached resource, a pre-computed bundle should be made available based on a subset of the same memoization criteria above. We have successfully designed and deployed this code, and open sourcing it is pending.
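The cold-cache fallback described above can be sketched as a lookup cascade. The key names, the `precomputed` store, and the on-demand `computeFn` are assumptions for this illustration, not the deployed design:

```javascript
// Hypothetical sketch of the cold-cache fallback: on a CDN miss, fall back
// to a pre-computed bundle keyed by a coarser subset of the memoization
// criteria, and only build on demand as a last resort.
const precomputed = new Map([
  // Example coarse key: normalized UA + encoding, ignoring the feature list.
  ['chrome/76|gzip', '/* pre-computed bundle */'],
]);

function resolveBundle(edgeCache, fullKey, coarseKey, computeFn) {
  if (edgeCache.has(fullKey)) return edgeCache.get(fullKey); // warm CDN hit
  if (precomputed.has(coarseKey)) return precomputed.get(coarseKey); // cold-cache fallback
  return computeFn(fullKey); // slowest path: bundle on demand
}
```

The point of the coarse key is that a small set of pre-computed bundles can cover most cold requests, so the expensive on-demand path is rarely taken.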

The DNS record we have registered is corejs.io. A prototype service will be built in the open as part of this proposal, but it is pending a service library we have named nimbuild that enables the pre-computation of core-js polyfill bundles and makes the service possible.

We apologize again for the long wait; our open source process is taking longer than anticipated to finish.

Alternatives

Polyfill.io has greatly inspired this approach. We initially implemented the service with it, but found a few issues with performance and stability that ultimately led us to build a similar service with core-js polyfills as the source. We have opened PRs to polyfill-library to improve the quality of the lib and are still willing to support their effort, but we feel that starting with core-js polyfills as the source will most likely lead to the best solution in terms of both quality and performance.

@zloirock
Owner

zloirock commented Sep 8, 2019

Sorry for the delayed reply, I have some problems with time. Sounds great. Waiting for updates...

@neenhouse
Author

We also apologize for the delay on our end. We should have the code published soon.

@slavafomin

@neenhouse this looks extremely interesting, we would be very glad to see such a service implemented. Have you made more progress? Is there a way for us to help?

@neenhouse
Author

@slavafomin Yes! We fully intend to do all phases in the open. Since releasing our code has taken longer than anticipated, I would be open to meeting to discuss our next steps. I'm wondering if @zloirock would be open to meeting up with us as well?

@zloirock
Owner

zloirock commented Nov 8, 2019

@neenhouse sorry for the delay, I had (and still have) some serious problems, but let's try to find a convenient time for both of us. Could you email me a time that works for you?

@dargmuesli

@zloirock I've sent you an email on this topic 🚀 happy to hear your response so that I can get into contact with the others I mentioned in the mail 🙌

@neenhouse could you shed some light on how your plan developed? 🙏
