
At present, if a page has 2 images and 2 JavaScript files, the browser makes 5 calls (the HTML page plus the 4 resources). Sure, you can have keep-alive and pipelining, but it is still 5 network calls. Is there a way to send one zip file and let the browser resolve the resources within the zip, or a similar compressed file?

5 calls is just an example. On large websites, 30-50 calls per page are not uncommon. Also, in-lining does not help, because your subsequent pages are going to refer to the individual js/css/image/icon files, so those requests should be served from cache.

Network calls matter, especially if you have a hybrid application running on a cell phone and the user is on the East Coast while your data center is on the West Coast, or worse, your user is in Europe and your data center is on the West Coast, etc.

Jimm
    Not that I'm aware of. But really, 5 requests is not a big deal in 2014. – Christopher Marshall Mar 08 '14 at 20:44
  • You could compress the resources you serve using gzip, which is common practice, but it's still 5 calls. – Daniel Kmak Mar 08 '14 at 20:45
    The network calls themselves don't really matter AFAIK, it's the time these calls take, which depends on the size of the files, which you can minify etc. – matthijs Mar 08 '14 at 20:47
    You could place all your JS on the page and embed your images as Base64 or use inline SVG. Not sure what you'd gain from that, though. – DA. Mar 08 '14 at 20:47
  • See http://en.wikipedia.org/wiki/HTTP_pipelining and http://en.wikipedia.org/wiki/SPDY – ddelemeny Mar 08 '14 at 21:10
  • What matthijs said. The number of calls isn't an issue. HTTP is stateless and can handle hundreds of calls if required. What matters is the size of the total payload required to deliver the page. If you're shipping lots of JS and images and performance is a pain point, you need to be looking at application design and CDN solutions like Akamai or Amazon Cloudfront. – Garreth McDaid Mar 08 '14 at 21:37
  • If you're not paying attention to the number of required round trips you're making the browser do or if you're in-lining all of your js and base64 images, please let me know what websites you've created so I know to avoid them. – Brian McGinity Mar 08 '14 at 23:23
    @ChristopherMarshall That is the kind of attitude I hate. Just because you *can* make a hundred requests in just a few seconds, doesn't mean you should. I spend hours of my time optimising the hell out of my HTTP traffic, and my users have actually noticed and thanked me for it because of how fast things are on their mobile devices. – Niet the Dark Absol Mar 08 '14 at 23:29
  • @NiettheDarkAbsol I was not advocating a hundred requests. I'm well aware of optimization techniques, and would obviously advise against 100 requests. Come on dude. – Christopher Marshall Mar 09 '14 at 21:08

2 Answers


Well, if you're willing to buy an SSL certificate, or already have one, I would recommend using SPDY. It's available in the nginx-extras package; to enable it you just add it to the listen line:

listen 443 spdy;
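
For context, a fuller server block might look roughly like the sketch below; the server name and certificate paths are placeholders, not part of the original setup, and it assumes an nginx build that includes the SPDY module (such as nginx-extras):

# rough sketch only; replace server_name and the certificate paths with your own
server {
    listen 443 ssl spdy;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    root /var/www/example.com;
}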

EDIT:

There are a few other things you can do. For JS and CSS files there are a lot of frameworks that compile them into one single file (you can also do that manually), minifying the files in the process. Also take a look at preprocessors like LESS; they make my life easier, and you might like using them.
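
As a rough sketch of doing that by hand (uglify-js, clean-css and lessc are just example tools here, and the file names are placeholders):

# concatenate then minify the JS; compile and minify the LESS
cat vendor.js app.js > bundle.js
uglifyjs bundle.js -o bundle.min.js
lessc styles.less styles.css
cleancss -o styles.min.css styles.css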

As for the images, you can't really do much unless they are small icon files, in which case you should create an image sprite. Also consider using an icon font instead if possible, like Font Awesome or Glyphicons; since those are fonts, you can set their size and color easily.
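
To illustrate the sprite idea, a small CSS sketch (the sprite file name, icon classes and offsets are all hypothetical) might look like this:

/* one image, many icons; each class just shifts the background */
.icon {
    display: inline-block;
    width: 16px;
    height: 16px;
    background: url(icons-sprite.png) no-repeat;
}
.icon-home { background-position: 0 0; }
.icon-user { background-position: -16px 0; }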

Also make sure you have gzip on; check the response headers to confirm that gzip is enabled.
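
If you're using nginx as above, a minimal sketch of the relevant directives (usually placed in the http block) could look like this; the compression level and MIME types are only illustrative:

# illustrative values only; tune to your own content
gzip            on;
gzip_comp_level 5;
gzip_min_length 256;
gzip_types      text/css application/javascript application/json image/svg+xml;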

Mohammad AbuShady

If you want fast page loads, pay close attention to the number of round trips the browser needs to make before the page can be rendered. Each trip is 20ms? 30ms? 40ms? Going halfway across the US, 50ms? Going from EST to PST, 80ms... yeah, it adds up.

The best way that I've found to reduce the number of "gets" and "304 not modified" responses is to cache all of your JavaScript files, images and CSS for 1 year.

The first time the browser needs a resource, it makes a bunch of round trips; you can't help that. The 2nd time it needs the resource, it's served from cache in 0ms and won't even show up in the webserver log files.

You want to set max-age and Expires to 1 year. Here's how I did it in Apache: Apache: set max-age or expires in .htaccess for directory
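
As a rough sketch of that approach (assuming mod_expires and mod_headers are enabled; the file extensions here are just examples):

<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
</IfModule>
<IfModule mod_headers.c>
    <FilesMatch "\.(js|css|png|jpe?g|gif|ico)$">
        Header set Cache-Control "public, max-age=31536000"
    </FilesMatch>
</IfModule>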

Brian McGinity