At what point is it better to have more HTTP requests if that means the page size goes down? For example, if I have an image that is 20 KB, how much would I have to reduce its size before it makes more sense to use two images?
-
The requests are literally a few hundred bytes. There is no reason to break up images to reduce size, as the sum of the sizes of the parts will be greater than the size of the original (each file includes a header, which takes up space). – Blender Mar 24 '12 at 20:59
-
Fewer requests is better. One request with 20 KB performs better than two requests with 10 KB + 10 KB. – dotoree Mar 24 '12 at 20:59
-
@dotoree There are situations where you can greatly reduce the size of an image by splitting it up into more than one image file. This is what I am asking about. Instead of splitting a 20 KB image into two 10 KB files, I'm talking about splitting it into two 4 KB files (for example). At what point would it be better to have multiple files? – Mike H Mar 24 '12 at 21:26
-
@MikeH You're totally on the wrong path. There are patterns to do exactly the opposite; see http://www.w3schools.com/css/css_image_sprites.asp. All images in one! – dotoree Mar 24 '12 at 21:29
-
@dotoree I know about and use CSS sprites. I can reverse the situation if it makes it easier. Let's say you have two 4 KB images. At what point will it no longer be worth it to combine the two images? When the single image is 20 KB? More? Less? – Mike H Mar 24 '12 at 21:42
-
@MikeH No straight answer, I'm afraid. It depends. If you have something like Google Maps you've got to have an AJAX loading technique. The tile size depends on your network and various other things. Maybe this will help: http://www.codeproject.com/Articles/14793/How-Google-Map-Works – dotoree Mar 24 '12 at 21:47
2 Answers
The practical answer is never, especially when you're talking about relatively minuscule amounts of data like a kilobyte or two.
The real enemy of a web page's performance is not the number of bytes transferred; rather, it is network latency. Let's take your example and consider a 5 Mb/s connection (the average connection speed in the US is a little over that) with a ping time to your server of 80ms:
1x 20 kB file: 80 ms latency + 31 ms transfer time = 111 ms
2x 4 kB files: 160 ms latency + 13 ms transfer time = 173 ms
This "optimization" just cost at least 62ms with all other variables being equal. In the real world, I'd wager that performance would be even worse due to things like extra server load.
Also consider that you're now using an extra one of the limited number of parallel requests a browser will make (somewhere between 2 and 8 depending on browser) for half of an image rather than something more valuable like a script, CSS, or another non-spritable image. This will slow down the overall load time of your page.
Furthermore, I have a suspicion that your entire premise is flawed. In general, splitting an image into two files cannot truly yield a smaller overall file size because every image container format has header data; for example, a PNG file has at least 57 bytes of overhead before any actual image data. Plus, an extra HTTP request means an additional ~800-900 bytes of overhead over the wire.
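As a rough illustration of where that 57-byte figure comes from (my own breakdown, assuming a single IDAT chunk; byte counts per the PNG spec):

```typescript
// Minimum structural overhead of a PNG, before any compressed pixel data.
const signature = 8;           // fixed 8-byte PNG signature
const ihdr = 4 + 4 + 13 + 4;   // IHDR chunk: length + type + 13-byte data + CRC = 25
const idatFraming = 4 + 4 + 4; // one IDAT chunk's length + type + CRC (compressed data not counted) = 12
const iend = 4 + 4 + 0 + 4;    // empty IEND chunk = 12
console.log(signature + ihdr + idatFraming + iend); // 57 bytes
```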
I suspect you'll find that one properly compressed PNG will be no larger than the total size of two PNGs making up the same image.
[Image: a single PNG containing both elements (1027 bytes), source: josh3736.net]
[Images: the same elements split into two separate PNGs (730 + 809 = 1539 bytes), source: josh3736.net]
Even though the first single PNG has 150x100 pixels of "dead" transparent space, it is 33% smaller than the two PNGs that represent the same image. (Disregard that I can't align the <img> tags properly here to make the two examples look the same.)
-
I appreciate the response. I do want to say that there are places where you can shrink file size. For example, the situation that first got me thinking about this: I needed a gradient background that became transparent and tapered off at the top and bottom. I could either use one large image or make a 1px-high image for the majority of the gradient and separate images for the ends. This would reduce the size by 80% but add 2 HTTP requests. Also, if the answer truly were never, I could theoretically put every image from a site into one CSS sprite and only make one image call per page. – Mike H Mar 25 '12 at 01:20
-
@MikeH, gradient backgrounds are their own challenge, but why not [skip images altogether](http://www.colorzilla.com/gradient-editor/)? At any rate, that's why I said "in general" -- there are situations where you *need* separate images, such as a background that repeats on both axes. (A background that repeats on one axis can still be sprited -- for x-repeating, stack the backgrounds on the y-axis and stretch the background over the width of the output image. That's how [SmartSprites](http://csssprites.org/) does it.) – josh3736 Mar 25 '12 at 03:23
-
Of course, I'm not suggesting you put *every* image into a sprite -- that would be silly. There is a balancing act between HTTP roundtrip efficiency and client-side cache longevity. In other words, it would be counter-productive to put images likely to change in a sprite, since the whole thing would have to be re-downloaded even though only a small part changed. My rule of thumb is that design elements/small icons (which would generally be PNGs) go into sprites, and photographic things (which would generally be JPEGs) are their own files. – josh3736 Mar 25 '12 at 03:24
-
I also understand CSS gradients. In the case I explained, it's not possible to use CSS gradients while making IE look good. Before you say just use an image for IE to fall back on, I'd have to go back and ask my original question again. This was just an example, though. Why does everyone assume that I am ignorant about the basic principles of my question? And yes, there is a balancing act to maintain. That was the point of my question: to find out where the scales tip. – Mike H Mar 25 '12 at 14:51
-
@mikeh There may be some edge cases where two images are more efficient than one, but my guess is that it will only hold true in environments where there is low latency. The way to find out is to test using some carefully designed images in varying bandwidth and latency scenarios. Don't forget to consider how slow-start might complicate things too. – Andy Davies Mar 26 '12 at 09:52
-
@josh3736 I can see some situations where Mike's premise might hold true - perhaps also if slow-start is involved. That said, I'd go looking for other optimisations and probably wouldn't worry about this! – Andy Davies Mar 26 '12 at 10:06
-
@josh3736, is your calculation at the top entirely accurate? I mean, do the latencies of the http requests stack up with no overlap? Can't requests be happening simultaneously? – Moss Mar 15 '14 at 18:00
-
@Moss: It's hard to say since browsers do make requests in parallel, but the [concurrent request limit](http://stackoverflow.com/q/985431/201952) is different in every browser. We also have to account for other resources that might also need to be loaded (script, stylesheet, XHR, etc). So take those numbers as an illustration -- YMMV. – josh3736 Mar 15 '14 at 19:34
The conclusion of josh's answer hasn't really changed. According to the "Mobile Network Experience Report Jan. 2020", latency has decreased by around 30% (from 80 ms to approx. 55 ms) while the average mobile download rate has increased to about 23 Mb/s for the lowest-rated operator.
So we end up with these theoretical numbers for the lowest-rated US mobile operator of 2019:
1x 20 kB file: 55 ms latency + 7 ms transfer time = 62 ms
2x 4 kB files: 110 ms latency + 3 ms transfer time = 113 ms
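Plugging those figures into the same back-of-the-envelope model as above (a sketch only; 55 ms and 23 Mb/s are the approximate values quoted from the report):

```typescript
// Same simple model: total ≈ requests × latency + payload ÷ bandwidth.
const timeMs = (requests: number, totalKB: number, latencyMs: number, linkMbps: number) =>
  requests * latencyMs + (totalKB * 8) / linkMbps;

console.log(timeMs(1, 20, 55, 23)); // ≈ 62 ms for one 20 kB file
console.log(timeMs(2, 8, 55, 23));  // ≈ 113 ms for two 4 kB files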
