I am hosting an ~18 MB PDF file in an S3 bucket and fetching it from a mobile app, but it takes a long time to load on a slow network. I also tried converting the file to HTML and rendering that instead, but the HTML comes out at around 48 MB, which makes the phone hang. I have already moved the bucket to the Singapore region to reduce latency and have also tried piping the file through my server. The only option I have left is to split the PDF into one image per page and load pages on request. Is there anything I am missing that would make the PDF load time bearable?
- Google "linearized pdf" – samgak Mar 14 '16 at 15:15
- Check out this Stack Overflow answer for a good description of what linearizing a PDF does: http://stackoverflow.com/a/8390572/924 – Brandon Haugen Mar 15 '16 at 13:28
1 Answer
Since the limitation is on end-user devices, you have the following options:

- Split the large PDF into several parts and let users download the parts separately.
- Linearize the PDF. This changes how the file is loaded (the first pages can render before the download finishes) but does not reduce its size, so you may still see crashes on low-end devices.
- Optimize the PDF's file size by re-compressing the images embedded in it.
- Render low-resolution JPEG images of the PDF pages (with Ghostscript or ImageMagick), but do not use JPEG as the primary format for text-heavy pages: JPEG compression is designed for photographic content, not for text, and produces visible artifacts around glyphs.
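As a rough sketch, the last three options can all be done from the command line. These commands assume `qpdf` and Ghostscript (`gs`) are installed; the file names are placeholders.

```shell
# Linearize ("fast web view"): lets a viewer render the first pages
# while the rest of the file is still downloading.
qpdf --linearize input.pdf linearized.pdf

# Re-compress embedded images; the /ebook preset downsamples them
# to 150 dpi, which often shrinks scan-heavy PDFs considerably.
gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -o optimized.pdf input.pdf

# Render each page as a low-resolution PNG preview (one file per page).
gs -sDEVICE=png16m -r72 -o page-%03d.png input.pdf
```

Note that linearization only helps if the client streams the file over HTTP and the server honors Range requests, which S3 does.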