Max size of URL parameters in $_GET

OK, it seems that some PHP setups have a limit on the length of GET parameters:

Please note that PHP setups with the suhosin patch installed will have a default limit of 512 characters for GET parameters. Although bad practice, most browsers (including IE) support URLs up to around 2000 characters, while Apache has a default of 8000.

To add support for longer parameters with suhosin, set the following in php.ini:

    suhosin.get.max_value_length = <limit>

Source: http://www.php.net/manual/en/reserved.variables.get.php#101469
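
If you want to detect this situation at runtime, here is a minimal PHP sketch; it assumes suhosin registers its ini directive, and the log message is purely illustrative:

    <?php
    // Minimal sketch: read the effective suhosin limit at runtime.
    // ini_get() returns false when a directive is not registered, so a
    // value only appears if the suhosin extension is actually loaded.
    $suhosinLimit = ini_get('suhosin.get.max_value_length'); // default 512 with suhosin
    $rawQueryLen  = strlen($_SERVER['QUERY_STRING'] ?? '');

    if ($suhosinLimit !== false && $suhosinLimit !== '' && $rawQueryLen > (int) $suhosinLimit) {
        // Individual GET values longer than the limit may not survive intact.
        error_log("Query string is {$rawQueryLen} bytes; suhosin per-value limit is {$suhosinLimit}");
    }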

What is the maximum possible length of a query string?

RFC 2616 (Hypertext Transfer Protocol — HTTP/1.1) states there is no limit to the length of a query string (section 3.2.1). RFC 3986 (Uniform Resource Identifier — URI) also states there is no limit, but indicates that the hostname is limited to 255 characters because of DNS limitations (section 3.2.2).

While the specifications do not specify any maximum length, practical limits are imposed by web browser and server software. The figures below are based on research that is unfortunately no longer available on its original site (the domain now redirects to a dubious-looking loan site), but which can still be found in the Internet Archive's copy of Boutell.com:

  • Microsoft Edge (Browser)

    The limit appears to be around 81578 characters. See URL Length limitation of Microsoft Edge

  • Chrome

    It stops displaying the URL after 64k characters, but can serve more than 100k characters. No further testing was done beyond that.

  • Firefox (Browser)

    After 65,536 characters, the location bar no longer displays the URL in Windows Firefox 1.5.x. However, longer URLs will work. No further testing was done after 100,000 characters.

  • Safari (Browser)

    At least 80,000 characters will work. Testing was not tried beyond that.

  • Opera (Browser)

    At least 190,000 characters will work. Stopped testing after 190,000 characters. Opera 9 for Windows continued to display a fully editable,
    copyable and pasteable URL in the location bar even at 190,000 characters.

  • Microsoft Internet Explorer (Browser)

    Microsoft states that the maximum length of a URL in Internet Explorer is 2,083 characters, with no more than 2,048 characters in the path portion of the URL. Attempts to use URLs longer than this produced a clear error message in Internet Explorer.

  • Apache (Server)

    Early attempts to measure the maximum URL length in web browsers bumped into a server URL length limit of approximately 4,000 characters, after which Apache produces a "413 Entity Too Large" error. The then-current Apache build shipped with Red Hat Enterprise Linux 4 was used. The official Apache documentation only mentions an 8,192-byte limit on an individual field in a request; see the configuration sketch after this list for the relevant directives.

  • Microsoft Internet Information Server (Server)

    The default limit is 16,384 characters (yes, Microsoft's web server accepts longer URLs than Microsoft's web browser). This is configurable.

  • Perl HTTP::Daemon (Server)

    Up to 8,000 bytes will work. Those constructing web application servers with Perl's HTTP::Daemon module will encounter a 16,384 byte limit on the combined size of all HTTP request headers. This does not include POST-method form data, file uploads, etc., but it does include the URL. In practice this resulted in a 413 error when a URL was significantly longer than 8,000 characters. This limitation can be easily removed. Look for all occurrences of 16x1024 in Daemon.pm and replace them with a larger value. Of course, this does increase your exposure to denial of service attacks.
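
If you control the Apache configuration, the directives involved here are LimitRequestLine and LimitRequestFieldSize, both of which the later browser roundup in this page also mentions. A hedged httpd.conf sketch (the values are illustrative, not a recommendation) might look like:

    # Raise the request-line (method + URL + protocol) and header-field limits.
    # Apache's compiled-in default for both is 8190 bytes.
    LimitRequestLine 16384
    LimitRequestFieldSize 16384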

What is the maximum length of a URL in different browsers?

Short answer - de facto limit of 2000 characters

If you keep URLs under 2000 characters, they'll work in virtually any combination of client and server software.

If you are targeting particular browsers, see below for more details on specific limits.

Longer answer - first, the standards...

RFC 2616 (Hypertext Transfer Protocol HTTP/1.1) section 3.2.1 says

The HTTP protocol does not place any a priori limit on the length of a URI. Servers MUST be able to handle the URI of any resource they serve, and SHOULD be able to handle URIs of unbounded length if they provide GET-based forms that could generate such URIs. A server SHOULD return 414 (Request-URI Too Long) status if a URI is longer than the server can handle (see section 10.4.15).

That RFC has been obsoleted by RFC 7230, which is a refresh of the HTTP/1.1 specification. It contains similar language, but also goes on to suggest this:

Various ad hoc limitations on request-line length are found in
practice. It is RECOMMENDED that all HTTP senders and recipients
support, at a minimum, request-line lengths of 8000 octets.

...and the reality

That's what the standards say. As for the reality, there was an article on boutell.com (link goes to an Internet Archive backup) that discussed what individual browser and server implementations will support. The executive summary is:

Extremely long URLs are usually a mistake. URLs over 2,000 characters will not work in the most popular web browsers. Don't use them if you intend your site to work for the majority of Internet users.

(Note: this is a quote from an article written in 2006, but as of 2015, IE's declining usage means that longer URLs do work for the majority of users. However, IE still has the limitation...)

Internet Explorer's limitations...

IE8's maximum URL length is 2083 chars, and it seems IE9 has a similar limit.

I've tested IE10 and the address bar will only accept 2083 chars. You can click a URL which is longer than this, but the address bar will still only show 2083 characters of this link.

There's a nice writeup on the IE Internals blog which goes into some of the background to this.

There are mixed reports that IE11 supports longer URLs; see the comments below. Given that some people report issues, the general advice still stands.

Search engines like URLs < 2048 chars...

Be aware that the sitemaps protocol, which allows a site to inform search engines about available pages, has a limit of 2048 characters in a URL. If you intend to use sitemaps, a limit has been decided for you! (see Calin-Andrei Burloiu's answer below)

There's also some research from 2010 into the maximum URL length that search engines will crawl and index. They found the limit was 2,047 characters, which lines up with the sitemap protocol spec. However, they also found that the Google SERP tool wouldn't cope with URLs longer than 1,855 characters.
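
If you generate sitemaps programmatically, a short hedged PHP sketch of that 2,048-character check (the URL list and output file name are illustrative placeholders) could be:

    <?php
    // Drop any URL longer than the sitemaps protocol's 2,048-character cap
    // before it is written out.
    $allUrls = [
        'https://example.com/short',
        'https://example.com/very-long?' . str_repeat('x', 3000),
    ];

    $sitemapUrls = array_filter($allUrls, fn (string $u): bool => strlen($u) <= 2048);

    file_put_contents('sitemap-urls.txt', implode(PHP_EOL, $sitemapUrls) . PHP_EOL);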

CDNs have limits

CDNs also impose limits on URI length, and will return a 414 (URI Too Long) response when these limits are exceeded, for example:

  • Fastly: 8 KB
  • CloudFront: 8 KB
  • CloudFlare: 16 KB

(credit to timrs2998 for providing that info in the comments)

Additional browser roundup

I tested the following against an Apache 2.4 server configured with a very large LimitRequestLine and LimitRequestFieldSize.

Browser     Address bar   document.location or anchor tag
----------------------------------------------------------
Chrome      32779         >64k
Android     8192          >64k
Firefox     >64k          >64k
Safari      >64k          >64k
IE11        2047          5120
Edge 16     2047          10240

See also this answer from Matas Vaitkevicius below.

Is this information up to date?

This is a popular question, and as the original research is ~14 years old I'll try to keep it up to date: As of Jan 2021, the advice still stands. Even though IE11 may possibly accept longer URLs, the ubiquity of older IE installations plus the search engine limitations mean staying under 2000 chars is the best general policy.

What is the limit on QueryString / GET / URL parameters

There is no limit in theory. For HTTP URLs, the HTTP 1.1 specification states:

The HTTP protocol does not place any a priori limit on the length of
a URI. Servers MUST be able to handle the URI of any resource they
serve, and SHOULD be able to handle URIs of unbounded length if they
provide GET-based forms that could generate such URIs. A server
SHOULD return 414 (Request-URI Too Long) status if a URI is longer
than the server can handle (see section 10.4.15).

But in practice, many clients and servers do only support URLs up to a certain length. The rule of thumb is not to use URLs longer than 2000 characters (percent encoding already taken into account).
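
As a minimal illustration of the "SHOULD return 414" clause, here is a hedged PHP sketch of an application-level guard at the top of a front controller; the 8,000-byte threshold is an assumption borrowed from the RFC 7230 recommendation quoted above, not a requirement:

    <?php
    // Reject over-long request URIs before doing any real work.
    $maxUriLength = 8000;                       // assumed budget, tune to taste
    $uri = $_SERVER['REQUEST_URI'] ?? '';

    if (strlen($uri) > $maxUriLength) {
        http_response_code(414);                // 414 URI Too Long
        header('Content-Type: text/plain');
        exit('Request-URI exceeds the configured limit.');
    }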

Maximum length of HTTP GET request

The limit is dependent on both the server and the client used (and if applicable, also the proxy the server or the client is using).

Most web servers have a limit of 8,192 bytes (8 KB), which is usually configurable somewhere in the server configuration. On the client side, the HTTP 1.1 specification even warns about this. Here's an extract from chapter 3.2.1:

Note: Servers ought to be cautious about depending on URI lengths above 255 bytes, because some older client or proxy implementations might not properly support these lengths.

The limit in Internet Explorer and Safari is about 2 KB, in Opera about 4 KB, and in Firefox about 8 KB. We may thus assume that 8 KB is the maximum possible length, that 2 KB is a more conservative length to rely on at the server side, and that 255 bytes is the safest length if you need to be sure the entire URL arrives intact.

If the limit is exceeded in either the browser or the server, most will just truncate the characters outside the limit without any warning. Some servers however may send an HTTP 414 error.

If you need to send large amounts of data, it is better to use POST instead of GET. Its limit is much higher, but depends more on the server used than on the client. Usually up to around 2 GB is allowed by the average web server.

This is also configurable somewhere in the server settings. The average server will display a server-specific error/exception when the POST limit is exceeded, usually as an HTTP 500 error.
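
A hedged PHP sketch of that fallback on the client side, assuming the cURL extension is available (the endpoint URL, parameter names, and 2,000-character budget are illustrative assumptions):

    <?php
    // Send data via GET while the resulting URL stays under a conservative
    // budget; otherwise fall back to POST with the same payload.
    function send_request(string $baseUrl, array $params, int $maxUrlLength = 2000): string
    {
        $query  = http_build_query($params);        // percent-encoding included
        $getUrl = $baseUrl . '?' . $query;

        $ch = curl_init();
        if (strlen($getUrl) <= $maxUrlLength) {
            curl_setopt($ch, CURLOPT_URL, $getUrl);  // short enough: plain GET
        } else {
            curl_setopt($ch, CURLOPT_URL, $baseUrl); // too long: switch to POST
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $query);
        }
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

        $response = curl_exec($ch);
        curl_close($ch);
        return $response === false ? '' : $response;
    }

    // Example: a large payload silently goes out as POST instead of GET.
    echo send_request('https://example.com/search', ['q' => str_repeat('x', 5000)]);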

Maximum URL query string length: too many different opinions?

The maximum length of a URL that can be processed by Firebase Hosting is 8KiB (8,192 bytes).

Browsers / HTTP clients may have more restrictive limitations, but Firebase Hosting will allow URLs up to 8KiB.

How do I change the maximum length of an HTTP GET request to store the params? Is it possible?

Search returned this question:
Is there a limit to the length of a GET request?

To conclude: the specifications set no limit on GET request size, but some web browsers set limits (for example IE, which limits the GET request size to about 2,000 characters). Also, from my own experience I know that some web servers (like Apache) allow you to configure the maximum GET/POST request size, but this depends on which web server you are using in the backend.

Suggestion: use POST instead of GET if you need a request size larger than 2,000 characters. Verify whether your web server sets limits on GET/POST request sizes.

What is a safe maximum length a segment in a URL path should be?

Possibly related: What is the maximum length of a URL in different browsers?

In short

According to the HTTP spec, there is no limit to a URL's length. Keep your URLs under 2,048 characters; this will ensure the URLs work in all client and server configurations. Also, search engines like URLs to remain under approximately 2,000 characters.

Chrome has a 2 MB limit for URLs, while IE8 and 9 have a 2,083-character limit. So everything points to keeping your URLs limited to approximately 2,000 characters.

Also, from a usability point of view, URLs that long are neither readable nor usable.

However, the domain name has a maximum length of 255 characters. So to be on the safe side, the maximum length of a URL segment would be around 1,745 characters (roughly the 2,000-character budget minus the 255-character hostname maximum), given that your URL consists of a single path segment.


