11-22-2009, 06:06 PM
#1
Lifetime Suspension
Join Date: Sep 2009
Location: Calgary Alberta
Google Chrome users:
In Firefox, you can type about:config into the address bar, set network.http.pipelining to true, and raise the connection limit to 10+ to make Firefox load pages much faster. Does Chrome have anything like this?
Btw, if you're using Firefox and want to make it a lot faster, check out this link:
http://www.dagorret.net/2009/03/25/a...browser-speed/
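For reference, the Firefox settings mentioned above are ordinary preference keys and can also be set in a user.js file. This is a sketch using the pref names from Firefox 3.x-era builds; the maxrequests and connection values are illustrative, and Chrome exposes no about:config equivalent for them:

```javascript
// user.js -- illustrative Firefox 3.x-era preferences.
// These names apply to Firefox only; Chrome has no about:config.
user_pref("network.http.pipelining", true);                // enable pipelining
user_pref("network.http.proxy.pipelining", true);          // also when behind a proxy
user_pref("network.http.pipelining.maxrequests", 8);       // requests per pipeline
user_pref("network.http.max-connections-per-server", 10);  // the "10+" connections tweak
```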
11-22-2009, 08:00 PM
#2
#1 Goaltender
This doesn't do what you think it does, and it can actually degrade performance.
HTTP pipelining is the act of issuing multiple requests for HTTP elements and then waiting for them to arrive back at the client, in the order they were requested, over a single persistent HTTP connection. In other words, rather than requesting a page element, getting it back, requesting the next one, and so on (within your HTTP 1.1-compliant persistent connection, of course), you fire off all your requests up to some upper limit and then wait for them to come back one after another.
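A toy latency model (my own sketch, not anything from actual browser code, with illustrative numbers) makes the difference concrete: sequential requests pay one round trip per element, while a pipeline pays roughly one round trip total plus the transfer times:

```python
# Toy model: total fetch time for N page elements over one persistent
# connection, sequential vs. pipelined. All numbers are illustrative.

RTT = 0.080  # assumed round-trip time to the server, in seconds

def sequential_time(transfer_times):
    # Request an element, wait a full round trip, receive it,
    # then request the next one.
    return sum(RTT + t for t in transfer_times)

def pipelined_time(transfer_times):
    # Fire all requests at once; responses stream back in order,
    # so the total is one round trip plus the summed transfer times.
    return RTT + sum(transfer_times)

elements = [0.010] * 10  # ten small elements, 10 ms of transfer each
print(f"sequential: {sequential_time(elements):.3f} s")  # sequential: 0.900 s
print(f"pipelined:  {pipelined_time(elements):.3f} s")   # pipelined:  0.180 s
```

This is the best case for pipelining: many small elements dominated by round-trip latency.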
So there are two problems here - first, if one of the elements requested in the pipeline is slow coming down, such as data coming back from a database query on the server, a URL pointing to a resource that is offline or has a bad DNS entry, etc, then the entire pipeline stalls. You will hit that one bad or slow element, and no more data will come down until the bad request times out.
Even a large graphic in the pipeline that takes a couple of seconds to download will cause all the other HTTP elements that are potentially available for download to queue up behind it.
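The head-of-line blocking described above can be sketched with the same kind of toy model: because responses must come back in request order, each element is delivered no earlier than everything queued ahead of it (again an illustration, not browser code):

```python
from itertools import accumulate

def delivery_times(transfer_times, rtt=0.080):
    # In-order pipeline: element i is fully delivered only after every
    # element requested before it has finished transferring.
    return [rtt + t for t in accumulate(transfer_times)]

# Nine fast elements stuck behind one slow (e.g. stalled) element:
times = delivery_times([5.0] + [0.010] * 9)
print(times[0])   # the slow element arrives after ~5.08 s
print(times[-1])  # the last fast element still waits ~5.17 s
```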
Second, pipelining support can be spotty at best - you'll find websites that aren't designed with pipelining in mind, hosted by servers, content distribution systems, and load balancers that don't correctly implement the specification either.
Finally, all modern browsers support more than one simultaneous persistent HTTP connection to the server. This allows them to issue multiple (usually between 4 and 6) HTTP requests simultaneously, and independently, of one another, so that once they've requested the initial HTML document, they can immediately request, and receive in parallel, multiple elements that make up the page.
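Extending the same toy model to several independent connections (a greedy scheduler, which roughly approximates what browsers do; the connection count and timings are illustrative) shows why this works so well: a slow element only ties up its own connection instead of the whole queue:

```python
import heapq

def parallel_time(transfer_times, k=6, rtt=0.080):
    # k independent persistent connections; each new request goes to
    # whichever connection frees up first. Requests on different
    # connections overlap, and responses don't block one another.
    conns = [0.0] * k  # time at which each connection becomes free
    heapq.heapify(conns)
    finish = 0.0
    for t in transfer_times:
        start = heapq.heappop(conns)
        done = start + rtt + t
        finish = max(finish, done)
        heapq.heappush(conns, done)
    return finish

# Same ten elements with one slow one: it no longer blocks the rest.
elements = [5.0] + [0.010] * 9
print(f"{parallel_time(elements):.3f} s")  # ~5.080 s, vs ~5.170 s pipelined
```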
IE, Safari, Chrome, and Firefox all come with pipelining disabled by default, for good reasons - better compatibility and reduced risk of unpredictable performance decreases caused by stalled pipelines.
__________________
-Scott
The Following 2 Users Say Thank You to sclitheroe For This Useful Post:
11-22-2009, 09:36 PM
#3
Franchise Player
Join Date: Nov 2006
Location: Supporting Urban Sprawl
Reducing the RAM used for cache is only useful if you rarely, or never, hit your browser's 'back' button. Loading from RAM is fast, loading from HDD is slower, and loading across the network from your cache server is slower still.
The overhead of deallocating 10 MB of RAM on minimize and reallocating it when the window comes back to the foreground probably far outweighs any benefit you might get from having 10 MB free while it is minimized.
Both of these points are even more true when you consider the amount of RAM in most modern systems.
__________________
"Wake up, Luigi! The only time plumbers sleep on the job is when we're working by the hour."
The Following User Says Thank You to Rathji For This Useful Post:
11-23-2009, 02:55 PM
#4
Lifetime Suspension
Join Date: Sep 2009
Location: Calgary Alberta
Quote:
Originally Posted by sclitheroe
This doesn't do what you think it does, and it can actually degrade performance.
Cool man, thanks for the info. I have to say, though, that in my experience using FF with pipelining on has reaped big speed benefits on most websites (but I love the look and feel of Chrome).