Entirety of speculation rules should be filterable, not just the paths to exclude #1156
Comments
I had thought that only one

It would be great to have the ability to set a limit when using wildcards. For example, eagerly prerender /product/* up to 8. In this case, given the limit of 10, we would still have 2 "slots" available on hover, provided by the moderate eagerness limit.
For context, my original use case was validation in unit tests, but runtime validation is interesting too. To make sure that's not too slow, I'd transform the JSON schema into a PHP file and then use that for the validation. Then there's no need to do any JSON parsing.
This sounds like a feature request for https://github.com/WICG/nav-speculation/issues.

These are already separate limits, so you can prerender up to 10 eagerly, and 2 moderately.
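For illustration, separate eager and moderate rules in one ruleset count against separate limits. This is a sketch, not the exact markup from the site in question (the URL patterns are placeholders):

```html
<script type="speculationrules">
{
  "prerender": [
    { "where": { "href_matches": "/product/*" }, "eagerness": "eager" },
    { "where": { "href_matches": "/*" }, "eagerness": "moderate" }
  ]
}
</script>
```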
Check https://altsetup.com. I have: (two speculation rules snippets, shown as images in the original comment)

The 10 eager prerenders happen, but the 2 moderate ones (on hover) don't. Why?

I believe it's because Chrome has a limit of 10 prerenders.
According to @tunetheweb, the limits are 10 eagerly + 2 moderately, and I can confirm it works like that on another site I'm testing it on. The issue is on https://altsetup.com and some other sites with a similar stack. I can provide admin access to check it.

I see Chrome is reporting this error with the prerenders: (error message shown as a screenshot in the original comment)
I see there is only one URL which is "Not triggered", and that is the Contact page. However, that

Did you try excluding it? For example:

<script type="speculationrules">
{
"prerender": [{
"where": {
"and": [
{ "href_matches": "/*" },
{ "not": { "href_matches": "/contact/" }},
{ "not": {"selector_matches": ".do-not-prerender"}},
{ "not": {"selector_matches": "[rel=nofollow]"}}
]
},
"eagerness": "eager"
}]
}
</script>

Actually all URLs having

I just replaced the script with yours, and now
This is basically an extension of this bug: https://issues.chromium.org/issues/335277576. Once a speculation fails it is not retried.

But to be honest, a high eagerness is best used with a targeted list of URLs where you KNOW there is a high probability the link will be used. The lower eagerness settings (moderate or conservative) are for URLs where this cannot be predicted. The scattergun approach of trying to eagerly fetch ALL URLs, with the low-eagerness backup for the same URLs, is not recommended. Speculating 12 URLs per page load is a LOT of potential wastage, which will affect both your hosting costs (every visitor is now effectively 12 visitors) and, more importantly, your visitors' bandwidth and CPU resources. This is precisely why we have this 10 URL limit. And to be honest, I think that's a little high and it may be lowered if we see a lot of this scattergun approach. I would suggest a more targeted approach for eagerly fetched URLs, or just using fewer eagerly fetched URLs.
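As a sketch of the more targeted approach suggested above, the eager rule can name an explicit list of high-probability URLs while the wildcard rule stays at moderate eagerness (the URLs here are placeholders, not from the site being discussed):

```html
<script type="speculationrules">
{
  "prerender": [
    { "urls": ["/pricing/", "/signup/"], "eagerness": "eager" },
    { "where": { "href_matches": "/*" }, "eagerness": "moderate" }
  ]
}
</script>
```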
Feature Description
Currently the Speculative Loading plugin provides a `plsr_speculation_rules_href_exclude_paths` filter to exclude URLs (and URL patterns) from being speculatively loaded. However, two situations came up recently where this was not sufficient. One of them concerns excluding links by adding `rel=nofollow` to them; there is no way to exclude links via such an attribute without manually modifying the default rules, which was done in Exclude rel=nofollow links from prefetch/prerender #1142. (This was to avoid having to add a WooCommerce-specific URLPattern to exclude URLs with an `add-to-cart` query param, since excluding links with `rel=nofollow` may be generally advisable: Should we exclude `rel=nofollow` by default? WICG/nav-speculation#309.)

To account for these two use cases, I suggest that the entire set of speculation rules (speculation ruleset?) be filterable, doing something like this:
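A minimal sketch of what that could look like, assuming a hypothetical `plsr_speculation_rules` filter that receives the full ruleset as an associative array (the filter name, callback name, and array shape are illustrative, not part of the plugin today):

```php
<?php
// Hypothetical: a callback for a proposed filter that receives the
// entire speculation ruleset, appending a rel=nofollow exclusion
// to every prerender rule's "where" condition.
function myplugin_filter_speculation_rules( array $rules ): array {
	foreach ( $rules['prerender'] as &$rule ) {
		$rule['where']['and'][] = array(
			'not' => array( 'selector_matches' => '[rel~="nofollow"]' ),
		);
	}
	unset( $rule );
	return $rules;
}
// In a real plugin this would be registered with:
// add_filter( 'plsr_speculation_rules', 'myplugin_filter_speculation_rules' );

// Stand-alone demonstration of the transformation:
$rules = array(
	'prerender' => array(
		array(
			'where'     => array( 'and' => array( array( 'href_matches' => '/*' ) ) ),
			'eagerness' => 'moderate',
		),
	),
);
echo json_encode( myplugin_filter_speculation_rules( $rules ) );
```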
Also, @swissspidy suggested in #1144 (comment) that a JSON Schema could be written which could validate whatever is being returned by the filter. If not valid, it could trigger a `_doing_it_wrong()`.
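As a rough illustration of that validation idea, a schema could be compiled down to plain PHP checks so no JSON parsing happens at runtime. The function name and the specific checks below are hypothetical, not the plugin's API; a real schema would cover far more of the ruleset grammar:

```php
<?php
// Hypothetical validator that a JSON Schema could be compiled into:
// checks a few minimal structural constraints of a speculation ruleset.
function myplugin_validate_speculation_rules( $rules ): bool {
	if ( ! is_array( $rules ) ) {
		return false;
	}
	$allowed_eagerness = array( 'immediate', 'eager', 'moderate', 'conservative' );
	foreach ( array( 'prefetch', 'prerender' ) as $action ) {
		if ( ! isset( $rules[ $action ] ) ) {
			continue;
		}
		foreach ( $rules[ $action ] as $rule ) {
			if ( ! is_array( $rule ) ) {
				return false;
			}
			if ( isset( $rule['eagerness'] )
				&& ! in_array( $rule['eagerness'], $allowed_eagerness, true ) ) {
				return false;
			}
		}
	}
	return true;
}
// In the plugin, a failing check could then trigger something like:
// _doing_it_wrong( 'plsr_speculation_rules', 'Invalid speculation rules.', 'n.e.x.t' );
```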