
It will break the web #23

Open
zloirock opened this issue Apr 26, 2021 · 29 comments

@zloirock

zloirock commented Apr 26, 2021

Dear TC39! It seems you forgot about polyfills.

The readme mentions that obsolete, pre-ES6-era core-js versions unconditionally set @@species on some built-ins. But you forgot about current core-js versions and their feature detection.

Detection of @@species support and of delegation to .exec, @@match, @@replace, @@search, and @@split (all the kinds of subclassing mentioned in this proposal) exists in most of the related core-js polyfills. Removing those kinds of subclassing means that... all of those features will be polyfilled.

At least all of the Array, %TypedArray%, and RegExp methods that currently should work with those kinds of subclassing, as well as the Promise constructor, will be replaced.

What does it mean?

  • Most of the Array, %TypedArray%, and RegExp methods will work tens or hundreds of times slower, since polyfilled methods can't be optimized the way natives can. For this reason, feature detection of subclassing was added to those methods only once it was supported in most modern engines. However, that is just a slowdown.
  • Promise will be replaced. That means that for every native API that should return a promise, the result will not be an instance of the global Promise:
fetch(url) instanceof Promise; // => false

For most cases that should return native promises (like the example above), workarounds were added in the most recent core-js versions, but there are cases that cannot be fixed: old core-js versions, APIs that accept only native promises (like zloirock/core-js#815), and other polyfills.
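The failure mode above can be sketched in a few lines (a simplified illustration, not core-js's actual code; real polyfills replace the constructor only after feature detection fails):

```javascript
const NativePromise = Promise;

// The polyfill ships its own implementation and exposes it as the global
// Promise (forced here for demonstration; real polyfills do this only when
// feature detection fails, e.g. when @@species support is missing):
class PolyfillPromise extends NativePromise {}

// Native APIs such as fetch keep producing NativePromise instances:
const fromNativeApi = NativePromise.resolve(42); // stands in for fetch(url)

// After `globalThis.Promise = PolyfillPromise`, user code checking against
// the global sees:
console.log(fromNativeApi instanceof PolyfillPromise); // => false
console.log(fromNativeApi instanceof NativePromise);   // => true
```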

These are only the most obvious cases; I'm sure there are deeper ones as well.

core-js is used on most websites: even naive detection shows that core-js is used on ~59% of the top 1000 websites, and you could add a few more percent for sites not covered by this detection.

@ljharb
Member

ljharb commented Apr 26, 2021

If Promise is replaced, and that causes fetch(url) instanceof Promise to be false, that seems like a bug in the polyfill, unrelated to this proposal?

A Promise polyfill should be able to use es6-shim's technique of having a slightly more convoluted inheritance chain, so that fetch(url) instanceof OriginalPromise and fetch(url) instanceof ReplacedPromise are both true.
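One way to get that behavior (a rough sketch of the technique being described, not es6-shim's actual code) is to reuse the native prototype object instead of creating a fresh one:

```javascript
const OriginalPromise = Promise;

function ReplacedPromise(executor) {
  // A real polyfill's construction logic would go here; for the sketch,
  // delegate to the original constructor with ReplacedPromise as new.target:
  return Reflect.construct(OriginalPromise, [executor], ReplacedPromise);
}
// Share the original prototype instead of creating a new one, so instanceof
// works in both directions:
ReplacedPromise.prototype = OriginalPromise.prototype;
Object.setPrototypeOf(ReplacedPromise, OriginalPromise);

const fromNativeApi = OriginalPromise.resolve(1); // stands in for fetch(url)
const fromPolyfill = new ReplacedPromise((resolve) => resolve(2));

console.log(fromNativeApi instanceof ReplacedPromise); // => true
console.log(fromPolyfill instanceof OriginalPromise);  // => true
```

Sharing the prototype means patched methods are visible to native promise instances too, which is exactly what makes both checks pass.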

@bakkot

bakkot commented Apr 26, 2021

It doesn't really matter if it's a bug in the polyfill or not. The question is whether the change proposed in this repo would cause websites to stop working from the user's perspective.

@ljharb
Member

ljharb commented Apr 26, 2021

Fair enough; the wording of the OP wasn't clear as to whether core-js actually has that bug or not, but I suppose I should assume it does given the context.

@zloirock
Author

@ljharb one more time: it's supported in current versions of core-js, but not in old ones (and IIRC not in old es6-shim versions either). But fetch is not the only API that should return a promise.

@ljharb
Member

ljharb commented Apr 26, 2021

Thanks for clarifying.

@zloirock
Author

@ljharb I checked the current version of es6-shim: fetch(url) instanceof Promise // => false when Promise is polyfilled but fetch is not.

@ljharb
Member

ljharb commented Apr 26, 2021

Thanks, looks like I need to fix that bug too then, if Promise is replaced in an environment where fetch exists (if there's more context, it'd be great to get an issue filed for it)

@zloirock
Author

@codehag @syg so, any feedback?

This proposal is a blocker for many other proposals, for example, tc39/proposal-collection-methods#45 (comment) or tc39/proposal-change-array-by-copy#33

@codehag
Collaborator

codehag commented May 31, 2021

It isn't quite clear to me what the exact issue here is, so I will try to answer but I may be misunderstanding your concern.

Question 1

Most of Array, %TypedArray%, RegExp methods will work tens and hundreds of times slower since polyfilled methods can't be so optimized like natives. Because of this, feature detection of subclassing was added to those methods only when it was supported in most modern engines. However, it's just a slowdown.

There might be a performance impact, but taking a look at core-js, I don't expect it to be as significant as you say. Performance issues can result in breakage, but that doesn't appear to be the case here.

I ran an experiment with array.map and 10,000 iterations. Just a small example:

for (var i = 0; i < 10000; i++) {
  [0,1,2,3,4,5,6,7,8,9].map( x => x + 1);
}

On Firefox, to give a ballpark: the polyfill was about 20% slower. Same for Safari.

What about V8? Well, you special-case V8 to avoid performance problems, and V8-based browsers cumulatively make up around 80% of the market: https://github.com/zloirock/core-js/blob/ce52fdc735c5c809c9e85b2072f92d41b5a3885a/packages/core-js/internals/array-method-has-species-support.js#L8-L10 .

For the other 20%: if we compare browsers, the difference between browsers for the same piece of code is up to 169%. So while the polyfill is slower, it is not so slow that it breaks the browsing experience.

We are looking at, potentially, a 20% slowdown within a browser; that is a less alarmist way to look at it. To say that methods "will work tens and hundreds of times slower" is simply not true. It isn't great, but this is not a web-compatibility issue in the way you have framed it.

That 20% could also be made up for by removing species: we expect a performance boost from not needing to do the species lookup every time we run a method, and in addition we will end up using less memory. The goal of this proposal is to investigate whether we can have a faster, more secure JavaScript.
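The per-method lookup being discussed is observable from script; a minimal demonstration of the cost the proposal would remove:

```javascript
// Each Array method that creates a result array consults @@species
// (ArraySpeciesCreate in the spec), so the lookup runs once per call.
let speciesReads = 0;

class Counted extends Array {
  static get [Symbol.species]() {
    speciesReads++; // count observable @@species lookups
    return Array;   // opt back into plain arrays
  }
}

const c = Counted.of(1, 2, 3);
c.map((x) => x * 2);
c.filter((x) => x > 1);
c.slice(1);

console.log(speciesReads); // => 3: one lookup per method call
```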

We can work together with you and make things faster for all users. There isn't any reason to do the version check only for V8 -- Firefox does too. If we address this, your library will have a better overall experience.

For your second question.

Question 2

Promise will be replaced. That means that for all native APIs that should return a promise, the result will not be an instance of global Promise.

fetch(url) instanceof Promise; // => false

It isn't clear to me what you are asking here. What do you mean by "Promise will be replaced"? Are you concerned that the following pattern will fail?

class MyArray extends Array {}
new MyArray() instanceof Array; // => true

If that is the case, then we generally don't think we will get rid of type II subclassing -- unless you meant something else and I misunderstood.

@zloirock
Author

zloirock commented May 31, 2021

@codehag

Answer 1

I ran an experiment with array.map

Array.prototype.map is a bad example. Let's look instead at the RegExp / String methods, for example String.prototype.split.

It should rely on RegExp.prototype[@@split]; RegExp.prototype[@@split] should rely on RegExp.prototype.exec and create new RegExp instances using the @@species pattern. These are the type III and type IV kinds of subclassing, and the same goes for the rest of the RegExp / String methods. Unlike the @@species-related methods, these detections have no special case for V8. A polyfill of all of this behavior is critically slow, and it's not even the worst example.
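The delegation chain described above is observable from script. A small sketch:

```javascript
// String.prototype.split hands off to the separator's @@split; the built-in
// RegExp.prototype[@@split] builds its internal sticky splitter through
// SpeciesConstructor (type IV) and drives it through .exec (type III).
let execCalls = 0;

class CountingRegExp extends RegExp {
  exec(str) {
    execCalls++; // proves split() reached the subclass's overridden .exec
    return super.exec(str);
  }
}

const parts = 'a1b2c'.split(new CountingRegExp('[0-9]'));
console.log(parts);         // => [ 'a', 'b', 'c' ]
console.log(execCalls > 0); // => true
```

Removing either hook silently changes which `exec` runs, which is why polyfills must feature-detect the whole chain.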

We can work together with you and make things faster for all users. There isn't any reason to do the version check for only v8 -- Firefox does too. If we address this, your library will have a better overall experience.

I would be happy to work together with you and make things faster for all users, but it seems you don't understand the problem.

The special case for V8 was added because the slowdown after some feature detection, even without polyfilling, was catastrophic in V8. See, for example, zloirock/core-js#677 (one more case of a 100x slowdown in V8, though not as critical since it's a rare use case); the problem is not as noticeable in other browsers. It's much better to use proper feature detection where possible instead of pinning engine versions.

If something works fine and then starts to work 100x slower in a new browser version, it could cause, for example, freezing of the page, and that's definitely breaking the web.

In weekly downloads I see that 7.5% of users use core-js@1 (a 6-year-old version) and 34% use core-js@2 (a version more than 2 years old). The situation in site statistics is similar, so even changing something in the current version will not solve the problem.

Answer 2

That just means that Promise will be replaced; see the Promise feature detection in the core-js code.

All Promise-based APIs will return native promises, which will not be instances of the global polyfilled Promise, so for old core-js versions and alternative polyfills the instanceof Promise check will return false. The instanceof Promise check is commonly used, and I have had issues filed about it. Previously they were related only to very old engines that had just a few Promise-based APIs, but now we have many such APIs, and the problem will appear in modern engines.

Another case is APIs that accept only native promises; I showed this example: zloirock/core-js#815. That's one more kind of issue that will appear in modern engines.

One more kind of issue is the core-js pure version (used, for example, by @babel/runtime). The core-js Promise polyfill is not monolithic; it's modular. New methods don't have extended feature detection (why would they, if we don't know of broken implementations?), and core-js relies on their generic behavior, so instance and static methods are copied from the native Promise onto the polyfilled Promise. What do you think will happen if we try to use those methods on the polyfilled Promise after removing their generic behavior (related: #5)?
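The copying pattern being described can be sketched roughly (a hypothetical simplification; the real core-js pure build does not subclass the native Promise like this):

```javascript
// Rough sketch (NOT core-js's actual code): a modular polyfill ships its own
// Promise but copies newer static methods from the native one, relying on
// their generic use of `this`.
const NativePromise = Promise;
class PurePromise extends NativePromise {} // stand-in for the pure polyfill

// Copy a newer static method instead of reimplementing it:
PurePromise.allSettled = NativePromise.allSettled;

// This works today only because allSettled is generic over `this`
// (it builds its result promise via its `this` constructor):
const result = PurePromise.allSettled([PurePromise.resolve(1)]);
console.log(result instanceof PurePromise); // => true

// If such methods stopped being generic, the copied method would produce
// native promises (or throw) instead of PurePromise instances.
```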

@ljharb
Member

ljharb commented May 31, 2021

Re answer 1: if it's so critically slow that it's unusable, why polyfill it at all in the first place? If it's still usable, just slow, then why is it a critical problem?

A 100x slowdown isn't "catastrophic" unless it's in a hot path, and if it is, then it wouldn't be able to pragmatically support an engine where it wasn't native.

@zloirock
Author

zloirock commented May 31, 2021

@ljharb because it causes a critical slowdown, it was polyfilled only once it was supported in all modern major browsers. A slowdown in obsolete browsers is not critical, but this proposal would cause it in all modern major browsers, so it's inapplicable.

A constant 100x slowdown after feature detection, even without polyfilling, is catastrophic anyway.

@ljharb
Member

ljharb commented May 31, 2021

The point of polyfilling is that older browsers are just as important as modern ones: an unacceptable slowdown in any is an unacceptable slowdown in all. If it's too slow for an old browser, then the old browser shouldn't be supported at all.

Put another way, if the performance is acceptable in any supported browser, then it is automatically acceptable in every supported browser.

@zloirock
Author

...however, your es6-shim contains Map / Set with O(n) access time, and many of your other polyfills are also unusably slow.

@ljharb
Member

ljharb commented May 31, 2021

Its performance (which is not O(n) in practice) is quite acceptable in every possible browser, including every modern one, or else es6-shim wouldn't be polyfilling it at all.

@zloirock
Author

which is not O(n) in practice

It IS O(n) in most cases. That's unacceptable, and it could be polyfilled with sublinear access time. However, this is off topic, so let's close this discussion.

@codehag
Collaborator

codehag commented Jun 1, 2021

I think we outlined this in the exit criteria in the proposal itself. I have been trying to understand why this issue has such an alarmist tone, but it might be unclear from the proposal how the investigation is going to go. We have a segment on it; please read it (it seems you focused on core-js and not on the method for testing). If it is still unclear, let us know, in which case we should clarify the readme.

It would be impossible for us to move forward with this without testing the compatibility of removal. Firefox, and I believe Chrome, have test implementations. On Firefox we are planning a test to determine the breakage on top sites, as well as manual testing; if there are serious issues like the ones you are concerned about, we will catch them. At present this isn't high priority, so I can't say when this test will be run. Perhaps something we should add to the readme to make this clear is the method by which we evaluate that criterion. This was always a hard stop for us, and the testing method was obvious to us.

As for your answers:

I've tested the RegExp functions on Firefox with your library, and the difference is still around 20%. I wasn't too careful, so I may have missed something, but I am not going to spend more time investigating. Benchmarks like these are pretty meaningless, and the test we have planned as part of our exit criteria should cover web breakages of this sort. Once that happens, we will update the committee.

For promises: I tested your library. Completely disabling species doesn't cause any failures on modern browsers, even when evaluating promises produced by the async keyword against the forced promise from core-js in the absence of the species symbol. Failures only happen when I force the polyfill to overwrite the Promise itself. You mentioned your library has issues logged regarding this failure, like the one you linked, but it doesn't seem to apply in the case of removing species support. I am not an expert on your library, so again, I likely missed something, but the full browser implementation test will uncover any failures. This is, again, something that would be covered as part of the exit criteria for stage 1.

This is something we want to verify experimentally. As mentioned in our readme exit criteria -- if there are web breakages and depending on the type and severity, we will change approach or abandon the proposal.

edit: for wording and understanding, and to make it shorter :/

@zloirock
Author

zloirock commented Jun 3, 2021

I think we outlined this in the exit criteria in the proposal itself. I have been trying to understand why there was such an alarmist tone to this issue, but it might be unclear from the proposal how the investigation is going to go. We have a segment on it, please read it (as it seems you were focused on corejs and not on the method for testing). if it is still unclear let us know. In which case, we should clarify the readme.

I had read it initially. What can I say... since the method for testing found usage of @@species subclassing only in outdated core-js versions, it's the worst kind of testing.

I'm concentrating on core-js because core-js is used on most sites on the internet, because core-js uses all the types of subclassing you want to remove in this proposal, because core-js uses them more than any other library, and because I know that this proposal will break core-js, and a big part of the web with it.

It would be impossible for us to move forward with this without testing the compatibility of removal

One more time: it makes no sense even to try to advance this proposal, since it will break the web, and I have shown where it will break. Do you want to do useless work?

At present this isn't high priority so I can't say when this test will be run.

This proposal is a blocker for some other proposals; given that, it would be better to tell them to use the current ways of subclassing.

I've tested the RegExp functions on firefox with your library, and the difference is still around 20%.

I've tested the RegExp functions on Chrome; in my test cases, it's a 6x slowdown.

For promises -- I tested your library -- Completely disabling species doesn't cause any failures on modern browsers, even with evaluating promises produced by the async keyword against the forced promise from corejs in the absence of the species symbol. It only happens in two cases: when I force the polyfill to overwrite the promise itself. You mentioned your library has issues logged regarding this failure, like the one you linked, but it doesn't seem to apply in the case of removing species support.

I don't know how else to say it; I have already written it too many times: THE POLYFILL WILL OVERWRITE THE Promise CONSTRUCTOR IN ENGINES WITHOUT @@species SUPPORT, SEE THE FEATURE DETECTION. Even if something mentioned is not broken with the current core-js version, I already wrote that it will be broken with old versions, or, for example, with @ljharb's es6-shim.


Let's take a look at why polyfills for these kinds of subclassing appeared in core-js in the first place.

Six years ago @cscott asked for a forced replacement of the Promise constructor without @@species support in core-js. As you can see, type II and type III Promise subclassing are required for his prfun library. After that, I saw many other cases like this.

Take a look at babel's RegExp named capturing groups transform. As you can see, it uses delegation to @@replace and .exec (@nicolo-ribaudo). If delegation to .exec is removed, regular expressions will simply lose the .groups property:

[screenshot: the transpiled regexp losing its .groups property]

Even this case alone should be enough to stop thinking about removing these ways of subclassing.
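The transform's reliance on `.exec` delegation can be reproduced with a toy wrapper (hypothetical helper name and shape, not babel's actual implementation):

```javascript
// Toy version of the technique (NOT babel's real helper): override .exec so
// matches gain a .groups property, and let the built-in @@replace machinery
// delegate to it.
function wrapNamedGroups(re, names /* { captureIndex: groupName } */) {
  const wrapped = new RegExp(re.source, re.flags);
  wrapped.exec = function (str) {
    const match = RegExp.prototype.exec.call(this, str);
    if (match) {
      match.groups = {};
      for (const [index, name] of Object.entries(names)) {
        match.groups[name] = match[index];
      }
    }
    return match;
  };
  return wrapped;
}

// Simulates /(?<year>\d{4})/ for an engine without named-group syntax:
const re = wrapNamedGroups(/(\d{4})/, { 1: 'year' });

console.log(re.exec('in 2021').groups.year);     // => 2021
// @@replace reads match.groups to resolve $<year>, but only because it
// delegates to the own .exec method; remove that delegation and .groups
// is never attached:
console.log('in 2021'.replace(re, '[$<year>]')); // => in [2021]
```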


Sorry, I'm fucking tired of explaining obvious things to you. If you want to break the web, feel free; I wash my hands of it.

@codehag
Collaborator

codehag commented Jun 3, 2021

The old species subclassing was detected by a crawler running on the top 10,000 sites, not by the actual web-compat test, which will be run in browsers. The test, as I mentioned before, has not yet been run, which is why we are still at stage 1. As mentioned in the readme, all of your concerns are covered (and have been since long before this was published), and the proposal will not be pursued if it does indeed break the web. I have said that a couple of times now. The example from babel as a potential breakage is useful. Class IV is orthogonal to the other classes, and we can investigate further.

Examples and concrete points are much more useful for evaluation than "it will break the web". The examples you initially gave from core-js were not concrete; they were vague and difficult to evaluate, leading to a long and unproductive discussion. It is up to you how you want to communicate, but this absolutely did not need to be this frustrating for either of us.

Thanks for your input.

@codehag
Collaborator

codehag commented Jun 3, 2021

I've now tested the regex example from babel; here is the output on the Firefox build with class IV (the class that would cause breakage) disabled:

[screenshot: Firefox build output, 2021-06-03]

I also verified that we never run the babel function that is implemented via Symbol.replace. The wrapper does run.

You wrote the following

One more time - makes no sense even try to advance this proposal since it will break the web and I showed where it will be broken. Do you want to do a useless job?

Well, examples like this are why. You cannot -- purely from reading code and making a conjecture, know for certain that the code will break. We cannot know the behaviour of all libraries and their interactions with a given engine. The most thorough way to test is to actually check.

@zloirock
Author

zloirock commented Jun 3, 2021

Did you see the link above?

[screenshot: the babel transform's generated wrapper code]

How will .exec be called?

You cannot -- purely from reading code and making a conjecture, know for certain that the code will break. We cannot know the behaviour of all libraries and their interactions with a given engine.

Do you really think that I didn't run the tests? I wrote this code; I know how it works and why it could stop working.

@codehag
Collaborator

codehag commented Jun 3, 2021

Can you tone it down a bit? The aggression is absolutely not useful.

Also, what are you trying to achieve? That we abandon the proposal? It is stage 1; this is an incredibly weak signal from the committee. It is literally an investigation -- nothing more.

The broader question of how we handle subclassing is what the other proposals are waiting on. That is partially related to this proposal, but it is also a question of direction. Even if this proposal fails, we may choose not to do subclassing for future proposals.

@zloirock
Author

zloirock commented Jun 3, 2021

I only asked how the .exec method will be called from the babel wrapper, and I asked it in an acceptable tone; please show me where I'm wrong.

I'm just showing cases that will be broken by this proposal. I showed examples for each of the type II-IV kinds of subclassing, and that's from just 4 libraries (core-js, babel, es6-shim, prfun; I didn't check more), so I have no idea how something like this can be accepted.

@codehag
Collaborator

codehag commented Jun 3, 2021

When you say "something like that can be accepted" I guess you mean into stage 1?

This is an unusual proposal. We are considering a new process for investigating the removal of features; we have never had one before. For now, we are using the pre-existing proposal system. I can see that this is causing concern. What stage 1 means is that the committee has acknowledged that we are investigating.

The staging process is not linear; many proposals enter stage 1 and are retracted, and some reach stage 3 and are rolled back to stage 1. That something is at stage 1 is not a signal that "it will be implemented"; it is a signal that "we are investigating". Stage 1 is also highly malleable. Your contribution here can shape the way this goes. Complete abandonment of the proposal just gets us back to square one, whereas investigation and collecting information might actually be productive and help us rethink the spec in a web-compatible way.

For why this moved to stage 1: this proposal has been on the minds of many implementers for a long, long time, but we never sat down and thought about how to fix it. Stage 1 seemed like a good way to say "hey, we are thinking about this, and about how we might do it". In the end, it is likely that the shape of this will change.

We might instead come up with a recommendation for the future, based on how subclassing works today and which classes are heavily used. Or we may be able to restrict subclassing in such a way that it maintains web compatibility while improving what engines can do to optimize. Ultimately that is the goal. We might not achieve it, but we are giving it a shot, and doing so in public.

In part, it is to invite library authors such as yourself to comment and give advice. Ideally, that would take a more productive form than this discussion has, with concrete examples we can look at and consider. Keep in mind that, much like a language might not have all of its parts used, a library might not either. Or, alternatively, as we have done in the past when introducing new methods on built-ins, we can reach out to individual sites if the breakage is small. It isn't productive to say "just abandon this", because the investigation may uncover other paths forward.

I would be very, very thankful for clear examples from each of the classes. I think a good way to do this is to have dedicated issues ("potential class II breakage", etc.) for each of the classes. This way we can collect them and make sure they are not missed when we run the actual test. It can also help us verify that our implementation is correct.

Does this bring us to the same page?

@zloirock
Author

zloirock commented Jun 3, 2021

I mean that nothing that will break the web can be accepted, and this can't be fixed within the scope of this proposal, even for stage 1. Since none of the type II-IV ways of subclassing can be removed from current APIs, the proposal should be withdrawn or changed to something that rethinks subclassing only for new APIs, as you wrote. The existence and unresolved state of this proposal is a blocker for other proposals; see the links above.

We are considering a new process for investigation of removal of features -- we never had one before.

We had Reflect.enumerate.

Does this bring us to the same page?

Yes.

@codehag
Collaborator

codehag commented Jun 3, 2021

Reflect.enumerate was a normative PR. I did not say this was the first backwards-breaking change; it isn't, and neither was Reflect.enumerate. I was referring to a new process designed to ensure we fully vet removals, which we do not have.

The unresolved state of this proposal is not in and of itself a blocker to those other proposals. The blocker is the question "what do we do about subclassing"; even if this proposal did not exist, this would still potentially be a blocker, as it is something implementations are concerned about. We have similar discussions about other "exists, but we really don't like it, so we won't allow it into new language features" cases, but they are not recorded. The question is being investigated in depth here, which is why people are pointing here.

As a way forward: I can update the proposal readme to be more explicit about this and reword things a bit to make the fluidity clear. Stage 1 is not about the solution; it is about the problem.

As for why exec isn't being called: that is a good question. I had other work to do, so I didn't read the code; I just ran it. But now that you mention it, I threw in a few logs, and it looks like we are saving the state of the object somewhere, so it is failing on every other run. For some reason we are not calling this consistently: https://searchfox.org/mozilla-central/rev/85a74c965bd4bcab091b5767c13c44f84671d9ce/js/src/builtin/RegExp.js#9-48 and that is leading to false positives, which is what I saw. So the breakage is legit; looks like we have a bug. Thanks for pointing it out.

@zloirock
Author

zloirock commented Jun 3, 2021

Ok, so the list of problems from this issue:

@codehag
Collaborator

codehag commented Jun 3, 2021

Fantastic, thank you!

@ljharb
Member

ljharb commented Jun 3, 2021

To be clear, we can always accept something that would break the web into stage 1, because stage 1 is solely about exploring a problem space; only a solution could break the web, and no solution exists until at least stage 2.
