
Question - why is there a max URI length? #462

Closed
mlodato517 opened this issue Jan 6, 2021 · 13 comments

Comments

@mlodato517

I'm running into an issue very similar to seanmonstar/reqwest#668 (comment). To get production working in the near future, I was going to fork this and change `MAX_LEN` to `u32::MAX - 1` until reqwest is maybe updated to return a `Result` instead of panicking.

Before doing that - I wanted to make sure I wasn't opening myself up to any severe issues - is there a reason for the current URI max length being what it is?

@olix0r

olix0r commented Jan 6, 2021

If a client can force hyper to allocate `u32::MAX` bytes per request, this becomes a denial-of-service vector for internet-exposed services.

@dekellum
Contributor

dekellum commented Jan 6, 2021

The current `u16::MAX - 1` (= 65,534) seems pretty generous, from what I recall of browser and common server behavior.

@mlodato517
Author

Interesting - so basically hyper is trying to "not be a tool" used for DoSing?

@dekellum
Contributor

dekellum commented Jan 6, 2021

Sounds like reqwest panicking could also be a DoS vector, however.

Better to fix that in reqwest.
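To illustrate what the non-panicking path looks like, here's a self-contained sketch. `checked_parse` is a stand-in for the real `http::Uri` parser (not the actual implementation), with `MAX_LEN` mirroring the `u16::MAX - 1` limit discussed here; the point is that a caller can match on the `Result` instead of calling `.expect()` and panicking:

```rust
// Sketch only: `checked_parse` stands in for the real URI parser.
const MAX_LEN: usize = (u16::MAX - 1) as usize; // 65,534 bytes

#[derive(Debug, PartialEq)]
enum UriError {
    TooLong,
}

fn checked_parse(raw: &str) -> Result<&str, UriError> {
    if raw.len() > MAX_LEN {
        return Err(UriError::TooLong);
    }
    // Real scheme/authority/path validation would happen here.
    Ok(raw)
}

fn main() {
    // A short URI parses fine; an oversized one yields Err, not a panic.
    assert!(checked_parse("https://example.com/short").is_ok());
    let huge = "a".repeat(100_000);
    match checked_parse(&huge) {
        Ok(_) => println!("parsed"),
        Err(e) => println!("rejected oversized URI: {:?}", e),
    }
}
```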

@mlodato517
Author

If hyper is worried about being used as a tool for DoSing, I'm interested in how useful this guard is. If I were trying to do damage, it seems trivial to patch hyper to allow a larger limit.

Again, the "real" solution is to not `.expect` this in reqwest. But if, for whatever reason, your application can reasonably expect to receive a request with a 100k-character URI that you need to modify or proxy off to another service you manage, then there'd still be the issue of reqwest returning a `Result::Err`, and you wouldn't be able to send the request you wanted to.

@dekellum
Contributor

dekellum commented Jan 6, 2021

Oh, I think the primary concern that motivates the limit here is a reasonable "default" maximum URI length for users implementing servers?

@mlodato517
Author

> Oh, I think the primary concern that motivates the limit here is a reasonable "default" maximum URI length for users implementing servers?

I'm not sure I understand. Are you clarifying the limit as more for how much data one can receive instead of send? And then is the security just ensuring the server doesn't have to spend a lot of time parsing a lot of long requests? (sorry I'm very green on the subject of security here)

@dekellum
Contributor

dekellum commented Jan 6, 2021

No worries.

> Are you clarifying the limit as more for how much data one can receive instead of send?

yes

> And then is the security just ensuring the server doesn't have to spend a lot of time parsing a lot of long requests?

That could also be a factor, but it's not the primary one: holding more than 64KiB (2^16 bytes) per pending request just for the URL could be an easy DoS. For example, your proposed limit of `u32::MAX` is ~4GiB; that could easily kill a server with one (1) request!
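Some back-of-the-envelope numbers make the scale of the difference concrete. This sketch assumes a hypothetical server holding 10,000 pending requests, each buffering one URI at the maximum allowed length (the 10,000 figure is illustrative, not from the thread):

```rust
fn main() {
    let pending: u64 = 10_000;                    // hypothetical concurrent requests
    let current_max: u64 = (u16::MAX - 1) as u64; // 65,534 bytes per URI
    let proposed_max: u64 = u32::MAX as u64;      // ~4 GiB per URI

    let current_total = pending * current_max;    // roughly 0.6 GiB in total
    let proposed_total = pending * proposed_max;  // roughly 39 TiB in total

    println!("current limit:  {} MiB", current_total / (1u64 << 20));
    println!("proposed limit: {} TiB", proposed_total / (1u64 << 40));
}
```

That is, the worst case goes from hundreds of mebibytes, which a server can survive, to tens of tebibytes, which it cannot.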

@mlodato517
Author

Great, that all makes sense. Thanks for those explanations!

@mlodato517
Author

Actually, I wonder if I might re-open this 🤔

If a server wanted to accept URLs that were, for instance, 100k characters - should they be able to? Currently, if they're using this library, they just can't, can they? Should this max length be configurable - maybe a parameter with a default size or an environment variable or something else? Maybe with a local check to warn users if they put in something REALLY big to let them know that they maybe don't want that and are maybe inviting DoS?

@dekellum
Contributor

dekellum commented Jan 7, 2021

I don't want to get your hopes up (I don't have the commit bit myself), but in my opinion the best you could hope for here would be some kind of compile-time setting, where the default max continues to be 64KiB. Since cargo features don't work with scalar values (they are just boolean flags), it might have to be a feature named something like "dangerously-disable-max-url-len", but even that kind of sucks because once the first dependency enables it, it can't be disabled (currently).

Forking off might be the better option for you.

@mlodato517
Author

Huhhh interesting. Can you explain more about

the best you could hope for here would be some kind of compile time setting

It's not totally obvious to me why we couldn't add a new function that does the same parsing but takes a max_length argument. It wouldn't be as ergonomic (which would maybe encourage users to generally stick to stuff like `let uri: Uri = string.parse()`) but would allow those who want a different limit to still use this library.
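A minimal sketch of what that proposal could look like. Note that `parse_with_max_len` is hypothetical (it does not exist in the `http` crate); the idea is that the ergonomic `parse()` path keeps the safe default while callers who need more pass an explicit cap:

```rust
const DEFAULT_MAX_LEN: usize = (u16::MAX - 1) as usize; // the current 65,534-byte default

#[derive(Debug, PartialEq)]
struct UriTooLong;

// Hypothetical API: parse with a caller-chosen length limit.
fn parse_with_max_len(raw: &str, max_len: usize) -> Result<String, UriTooLong> {
    if raw.len() > max_len {
        return Err(UriTooLong);
    }
    // The real URI validation would go here.
    Ok(raw.to_string())
}

fn main() {
    let long = "a".repeat(100_000);
    // Rejected under the default limit...
    assert_eq!(parse_with_max_len(&long, DEFAULT_MAX_LEN), Err(UriTooLong));
    // ...but accepted when the caller explicitly opts into a larger one.
    assert!(parse_with_max_len(&long, 1 << 20).is_ok());
}
```

The default stays conservative, and raising the limit becomes a visible, per-call-site decision rather than a global compile-time flag.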

@dekellum
Contributor

dekellum commented Jan 7, 2021

Best of luck, if you wish to propose that. I don't really have anything more to add here.
