
Long URLs can cause seemingly exponential parse times, which makes untrusted input problematic #868

Open
Byron opened this issue Oct 5, 2023 · 0 comments

Byron commented Oct 5, 2023

Attached are three URLs generated by a fuzzer; each of them takes more than 25s (in debug mode) to parse with url::Url::parse(long_url).

long-urls.zip
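
A minimal timing harness along these lines can demonstrate the slowdown. The file name here is hypothetical; substitute one of the URLs extracted from long-urls.zip:

```rust
use std::time::Instant;
use url::Url;

fn main() -> std::io::Result<()> {
    // Hypothetical file name: use one of the URLs from long-urls.zip.
    let long_url = std::fs::read_to_string("long-url-1.txt")?;
    let start = Instant::now();
    let result = Url::parse(long_url.trim());
    // For the attached inputs, the elapsed time exceeds 25s in debug mode.
    println!("parse took {:?}, ok = {}", start.elapsed(), result.is_ok());
    Ok(())
}
```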

About Security

After first getting in touch privately, as per the Security Policy, I was advised to open an issue here.

Possible Fix

It seems feasible to limit the length of the host name, which is the portion that causes the long parse times, to a value that cannot be exploited. DNS itself already bounds host names (255 octets in wire format, i.e. 253 characters in presentation form, with at most 63 octets per label), so enforcing comparable limits would remain safe for anything found on the current internet.
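
As a purely illustrative sketch of what such a limit could look like (the constants follow the DNS limits from RFC 1035; where such a check would actually live inside the crate is an open question):

```rust
/// Illustrative only: reject host names that exceed the DNS limits
/// before the expensive parsing/normalization paths run.
const MAX_HOST_LEN: usize = 253; // presentation-format limit from RFC 1035
const MAX_LABEL_LEN: usize = 63; // per-label limit from RFC 1035

fn host_within_dns_limits(host: &str) -> bool {
    host.len() <= MAX_HOST_LEN
        && host.split('.').all(|label| label.len() <= MAX_LABEL_LEN)
}
```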

Workaround

For now, the only known mitigation is to inspect the URL before passing it to url for parsing. However, at least for gix-url, that is easier said than done: the fuzzer keeps finding bypasses for the extra logic I put in place.
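
A blunt version of that pre-check caps the total input length before calling Url::parse. The cap below is an assumed, application-specific value, not a recommendation from the url crate, and as noted above, such ad-hoc checks have proven easy for the fuzzer to sidestep:

```rust
use url::Url;

/// Caller-side guard (sketch): refuse oversized input outright rather than
/// risking a pathologically slow parse. The cap is an assumed value.
fn parse_untrusted(input: &str) -> Option<Url> {
    const MAX_INPUT_LEN: usize = 2048;
    if input.len() > MAX_INPUT_LEN {
        return None;
    }
    Url::parse(input).ok()
}
```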
