The biggest draw on our bandwidth is the podcast.xml feed (the mp3 files are hosted elsewhere). We have Apache set to use HTTP compression, which is the only way to stay below the 4TB limit imposed by our ISP. But our Apache threads are now consuming an average of 46MB(!) of RAM each, and we suspect this is because all that compression is running constantly to serve that same file. This is a LAMP machine (RHEL). Is there a way to compress the RSS feed once each day when it's generated and serve that out, rather than compressing it on each request? Obviously we need something that will be compatible with every possible podcast reader out there.
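For context, the "compress once a day" half is straightforward on its own: a daily cron job can gzip the feed right after it's regenerated. This is only a sketch; the paths are assumptions, and the `.tmp`-then-`mv` dance is there so a request never sees a half-written file:

```shell
#!/bin/sh
# Hypothetical paths -- adjust to wherever the feed generator writes the file.
FEED=/var/www/html/podcast.xml
GZFEED="$FEED.gz"

# Compress to a temp file first, then move it into place atomically so a
# request arriving mid-compression never sees a partial .gz.
gzip -9 -c "$FEED" > "$GZFEED.tmp"
mv "$GZFEED.tmp" "$GZFEED"
```

The harder half is getting Apache to hand that `.gz` only to clients that advertise gzip support, and the plain file to everyone else, which is what keeps arbitrary podcast readers working.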
Have you thought about using Varnish to cache that file? – Tyler Marien Aug 24 '14 at 18:26
I am not familiar with it. Could it cache the HTTP-compressed version? – CaymanCarver Aug 24 '14 at 18:55
I believe it would, as Varnish just caches whatever Apache returns. It will also cache different versions based on whether the client accepts compressed responses. If you are just trying to reduce the resources your server is using, you could also investigate putting an nginx server in front of Apache to serve static files. As well, I found this thread that might help you specifically with your question: http://stackoverflow.com/a/9158330/3610351 – Tyler Marien Aug 24 '14 at 19:43
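The approach in that linked answer, serving a pre-gzipped file only to clients that send `Accept-Encoding: gzip`, can be sketched as an Apache config fragment along these lines (requires mod_rewrite and mod_headers; the directory, filename, and MIME type here are assumptions, not a tested config):

```apache
# Serve podcast.xml.gz (regenerated daily by cron) to clients that accept
# gzip; everyone else falls through to the uncompressed podcast.xml.
<Directory /var/www/html>
    RewriteEngine On
    RewriteCond %{HTTP:Accept-Encoding} gzip
    RewriteCond %{REQUEST_FILENAME}.gz -f
    RewriteRule ^(podcast\.xml)$ $1.gz [L]
</Directory>

<FilesMatch "podcast\.xml\.gz$">
    # Present the .gz as the feed itself, not as a gzip download.
    ForceType application/rss+xml
    Header set Content-Encoding gzip
    Header append Vary Accept-Encoding
</FilesMatch>
```

With something like this in place, mod_deflate can be disabled for the feed entirely, so no per-request compression runs at all.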