21

I understand there are ways to upload multiple images from the browser to the server, and that the upload speed depends on the speed of the server and the network. The standard way of doing it:

Click an upload button on the website -> choose the images you want to upload -> click the Submit button to upload all the images to the server (prompt "please wait" to the user) -> upload successful!

But in terms of the coding, I was wondering: is there a faster way to efficiently upload images from the client's device to the server? (Using JavaScript and PHP.)

Currently, I first "cut down" the images on the client side, then send them to the server. But this is very slow, since JavaScript takes time to "cut down" the size of the images. By "cut down", I mean making the image width and height smaller. Is there a faster way to do it?

(Some JavaScript and PHP code examples would help as well.)

Ryan Fung
    Why the hurry? Just upload the image – Ed Heal Dec 07 '15 at 02:09
  • @EdHeal What if the image is really big? It will cause my server to overload. Would it not? – Ryan Fung Dec 07 '15 at 02:10
  • You can limit the upload size. You can limit the bandwidth per client; you should do this anyway to prevent DoS. – Ed Heal Dec 07 '15 at 02:14
  • Please try to better define the constraints. How big are the images you would like to optimize? Are you considering only a client-side solution? Is mobile an issue? How slow is the current implementation? How much faster should it be? Could you share your code? – Davide Ungari Jan 04 '16 at 11:12
  • @Davide Consider a user uploading a few 8 MB pictures taken from his/her phone to the website and pressing the upload button. I want to speed up the uploading process from client to server, so the server can quickly receive the images. I don't have a specific time in mind; I just want to know the best practice for doing this. – Ryan Fung Jan 04 '16 at 12:32
  • @RyanFung thanks for the clarification. My understanding is that it is not really a problem of backend scalability, but mostly of client-side performance and user experience. I'll try to simplify a bit; otherwise the problem is too broad. – Davide Ungari Jan 04 '16 at 14:23
  • if you have many images, you can upload them in parallel instead of sequentially. network perf factors are complex, but pipelining usually decreases overall transfer time. – dandavis Jan 05 '16 at 05:30
  • @dandavis could you advise how this could be achieved? – Ryan Fung Jan 05 '16 at 06:29
  • @RyanFung you claim that the cut-down process is slow. Can you elaborate on the techniques you've used to cut-down? From my personal experience, it shouldn't take more than 200ms. – iMoses Jan 08 '16 at 15:49

8 Answers

7

Short answer: No. JavaScript is not slow, and neither is PHP.

If the image is big (several megabytes), it will take a moment to crop/resize it in JavaScript/PHP. There is no way to avoid that.

I believe you're having a performance issue because:

  1. You are uploading many images at once, and
  2. some of them are big (several megabytes).

Consider optimizing images before uploading them. I suggest kraken.io, with which you can reduce the size of the images without changing image quality (lossless mode).

Also, consider uploading images in chunks, which:

  1. allows you to track upload progress, and
  2. is AJAX-driven.

There are many chunk upload plugins, here is one of them:

https://github.com/blueimp/jQuery-File-Upload/wiki/Chunked-file-uploads
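If you'd rather not pull in a plugin, the idea is easy to sketch in plain JavaScript: slice the File into Blobs and POST them one after another, reassembling them on the server. This is only a rough sketch; the `/upload-chunk.php` endpoint and the 1 MB chunk size are assumptions, and the blueimp plugin above adds retries and server-side helpers for you.

```javascript
// Rough sketch of a chunked upload (endpoint name and chunk size are assumptions).
async function uploadInChunks(file, url = '/upload-chunk.php', chunkSize = 1024 * 1024) {
  const totalChunks = Math.ceil(file.size / chunkSize);

  for (let index = 0; index < totalChunks; index++) {
    const start = index * chunkSize;
    const chunk = file.slice(start, start + chunkSize); // Blob.slice is cheap: a view, not a copy

    const form = new FormData();
    form.append('chunk', chunk, file.name);
    form.append('index', String(index));
    form.append('total', String(totalChunks));

    const response = await fetch(url, { method: 'POST', body: form });
    if (!response.ok) throw new Error('Chunk ' + index + ' failed');

    // Report progress here, e.g. (index + 1) / totalChunks.
  }
}
```

On the PHP side you would append each chunk to the target file (for example with file_put_contents and FILE_APPEND) until all `total` chunks have arrived.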

Last but not least, look at kraken.io itself; you may learn a few things about how to speed up image uploading.

evilReiko
3

I previously had to solve a similar issue, although my solution was to optimize the images on the client side, which you claimed to be slow. Since my own experience says otherwise, I'm interested in knowing what solutions you have already tried.

My solution was:

  1. Read the images on the client side using the FileReader API (it turns out this step isn't strictly necessary).
  2. Optimize each image by scaling it and setting the quality, using the Canvas API. If for some reason this process turns out to be slow, you can always use the Web Workers API to split the load across multiple background processes. The benefit is that performance issues in the optimization process won't affect your UI (won't make it freeze).
  3. Merge all images into a single sprite, also using Canvas, and save the sprite's metadata (each image: x, y, width, height) in a separate object.
  4. Extract the sprite's base64 using Canvas toDataURL.
  5. Upload the compressed sprite file to the server along with the sprite's metadata. On the server decompress the file, then split and save it into separate images files according to the metadata.

This puts the client to work, which in most cases will reduce your network use and bandwidth requirements.

Update:

Here's a code example for the client-side code. You select as many files as you want using the input field; then only the images are picked, resized, packed into a sprite, and exported as an object containing the base64 version of the sprite and metadata about each image within it. I made the base64 data URI clickable so you can see the results immediately.
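(The original embedded snippet is not included above; what follows is only an approximate sketch of that flow, not the author's code. It assumes an `<input type="file" id="files" multiple>` element, and the target width and JPEG quality are arbitrary.)

```javascript
// Rough sketch: resize the selected images and pack them into one sprite plus metadata.
document.getElementById('files').addEventListener('change', async (event) => {
  const files = [...event.target.files].filter((f) => f.type.startsWith('image/'));
  const MAX_WIDTH = 800; // arbitrary target size

  // Decode and scale each image (object URLs avoid the FileReader step).
  const scaled = await Promise.all(files.map((file) => new Promise((resolve) => {
    const img = new Image();
    img.onload = () => {
      const ratio = Math.min(1, MAX_WIDTH / img.width);
      resolve({ img, width: Math.round(img.width * ratio), height: Math.round(img.height * ratio) });
      URL.revokeObjectURL(img.src);
    };
    img.src = URL.createObjectURL(file);
  })));

  // Lay the images out side by side on a single canvas and record where each one ends up.
  const sprite = document.createElement('canvas');
  sprite.width = scaled.reduce((sum, s) => sum + s.width, 0);
  sprite.height = Math.max(...scaled.map((s) => s.height));
  const ctx = sprite.getContext('2d');

  let x = 0;
  const metadata = scaled.map((s) => {
    ctx.drawImage(s.img, x, 0, s.width, s.height);
    const entry = { x, y: 0, width: s.width, height: s.height };
    x += s.width;
    return entry;
  });

  // One base64 payload plus the metadata needed to split it apart again on the server.
  const payload = { sprite: sprite.toDataURL('image/jpeg', 0.7), metadata };
  console.log(payload); // send this via AJAX instead of logging it
});
```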

What's missing is the server-side part, in which you take the object and use it: create an image from the base64 data URI (I suggest ImageMagick, but any such library will do the trick), then crop your images out according to the sprite's metadata, using the library you chose, and save each image separately (or whatever).

You can do a lot of small but cool optimizations on the client-side to reduce the load on your servers.

iMoses
3

Let's try to face this challenge systematically! Your task is as simple as this: send X bytes from the user's computer to server Y as fast as possible. Let's assume that once the data reaches the server, processing time is negligible – any modern server can process big images very quickly.

We can divide possible solutions into two categories:

  1. Decrease X – reduce the amount of data being sent. Scale down the image on the user's side in JavaScript and send smaller images to server Y.

  2. Move server Y closer to the user (surprisingly, nobody has mentioned this). There is no need to deploy multiple servers worldwide; you could just use a CDN (and many of them are free or very cheap today). Even though the CDN would still hit your web server in the end, it's usually safe to assume that CDN providers have better international bandwidth than the average user.

Extra notes:

  1. Do some speed tests; your current server may have bandwidth issues, and images may be uploading far more slowly than they should. Simply moving the site to another hosting company could make your image uploads several times faster.

  2. Depending on your application's UX, you can improve the speed as perceived by the user by displaying image thumbnails as soon as the files are chosen (done on the JS side), while the files are still being uploaded, as sketched below. You could even allow the user to move around the page/site while the upload is happening – if you don't block the UI, slow uploads are less annoying.
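For that second note, thumbnails are cheap to show: point an `<img>` at an object URL for each chosen file while the real upload continues in the background. This is only a rough sketch; the element ids are assumptions.

```javascript
// Show previews the moment files are chosen (element ids are assumptions).
const input = document.getElementById('file-input');
const previews = document.getElementById('previews');

input.addEventListener('change', () => {
  previews.innerHTML = '';
  for (const file of input.files) {
    const img = document.createElement('img');
    img.width = 120;                         // small thumbnail
    img.src = URL.createObjectURL(file);     // instant: no read, no upload needed
    img.onload = () => URL.revokeObjectURL(img.src);
    previews.appendChild(img);
  }
  // Kick off the actual upload here and let it run while the user keeps browsing.
});
```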

In general, it's a complex issue and there is no single approach – it's a set of steps. And if a user has a terrible connection, nothing else is going to matter!

Denis Mysenko
2

No, the upload will still be slow. The speed has almost nothing to do with the code. The main reason it is slow is that images can take up a lot of space, often many megabytes. If you were to upload a string of text for comparison, it would be much faster, as that string is only a few bytes or kilobytes. One way to make this a little faster is to put the images in a compressed .zip archive before uploading them to the server. It would still take quite a long time, though.

OneStig
2
  1. When original images are uploaded, optimization gains will likely be insignificant unless your users upload uncompressed *.bmp files. If you're resizing images locally, make sure suitable compression has been applied - at least set the toDataURL encoderOptions when working with canvas, or use something like JIC (demo). Beware of poor resampling quality.

  2. If you expect massive uploads or deal with low-bandwidth users, you may want to introduce resumable uploads. A library like resumable.js uses the HTML5 File API and supports chunking, making the upload process less painful on poor connections. That's a pretty powerful combo when coupled with UX enhancements!

  3. Rather than focusing on upload performance, you may achieve better results by focusing on the perception of performance. A useful pattern is moving the upload to an earlier stage in the workflow - Instagram starts an upload as soon as the image has been selected and continues in the background while the user works on the caption, location, sharing options... (a rough sketch follows this list).

    More thoughts from "Secrets to Lightning Fast Mobile Design" slides summarized on highscalability:

    https://speakerdeck.com/mikeyk/secrets-to-lightning-fast-mobile-design?slide=82

    Move bits when no one is watching. Picture uploading starts before you share them; most apps wait until after the share screen. The rule is to send data as soon as it's ready and match it with user actions later. This doubles the number of requests, but perceived performance improves, even if a cancel is sent out later (data is deleted on a cancel).

  4. Upload to a CDN. Stack Overflow integrates with Imgur for image uploads, and so can you. Of course, this depends on how sensitive the uploads are, the expected functionality, and the CDN's T&Cs, but it may save you traffic and storage while providing specialist API access to original images, resized copies, and additional metadata.
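Here is a rough sketch of the early-upload pattern from point 3: start sending each file the moment it is selected, in parallel, and only match the results with the user's final action later. The endpoint and element ids are assumptions, and a real implementation would also support cancellation (e.g. AbortController) so data can be discarded if the user backs out. As a side effect, this also uploads files in parallel rather than sequentially, as suggested in the comments on the question.

```javascript
// Start uploading on selection, before the user presses Submit (endpoint/ids are assumptions).
const pendingUploads = new Map();

document.getElementById('photo-input').addEventListener('change', (event) => {
  for (const file of event.target.files) {
    const form = new FormData();
    form.append('photo', file);

    // Kick off the upload immediately; keep the promise so it can be awaited later.
    pendingUploads.set(file.name, fetch('/upload.php', { method: 'POST', body: form })
      .then((res) => res.json()));
  }
});

document.getElementById('submit').addEventListener('click', async () => {
  // By the time the user presses submit, most of the bytes are usually already on the server.
  const results = await Promise.all(pendingUploads.values());
  console.log('All uploads finished', results);
});
```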

Oleg
0

This is a great solution; it reduces the size of the image before the upload.

https://github.com/blueimp/jQuery-File-Upload

You didn't say whether you're opposed to using jQuery; if you are, you may be able to find an equivalent solution in pure JavaScript.

user2182349
0

I think that even if you find a better optimization algorithm, at the end of the day, if we are talking about mobile phones, you should consider that the network can be pretty slow and so can the CPU. My suggestion is to give web workers a try, and this tutorial could be very helpful.

Web workers run scripts in the background, independently of user interaction, which is very useful for avoiding any impact on the user while managing image processing and upload in the background.

Two main advantages:

  • you can run multiple threads / web workers in parallel
  • web workers do not block the main thread or the user experience

These advantages can not only speed up the image processing but, above all, avoid freezing the user interface.
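A rough sketch of the idea follows, assuming a browser that supports OffscreenCanvas (older browsers would have to send pixel data back to the main thread instead); the worker file name, endpoint, and element id are assumptions.

```javascript
// resize-worker.js – runs off the main thread, so the UI never freezes.
self.onmessage = async (event) => {
  const { file, maxWidth } = event.data;
  const bitmap = await createImageBitmap(file);            // decode the image inside the worker
  const ratio = Math.min(1, maxWidth / bitmap.width);

  const canvas = new OffscreenCanvas(Math.round(bitmap.width * ratio),
                                     Math.round(bitmap.height * ratio));
  canvas.getContext('2d').drawImage(bitmap, 0, 0, canvas.width, canvas.height);

  const blob = await canvas.convertToBlob({ type: 'image/jpeg', quality: 0.8 });
  self.postMessage(blob);                                   // hand the resized image back
};
```

```javascript
// main.js – delegate the heavy work to the worker, then upload the result.
const worker = new Worker('resize-worker.js');
worker.onmessage = (event) => {
  const form = new FormData();
  form.append('photo', event.data, 'resized.jpg');
  fetch('/upload.php', { method: 'POST', body: form });     // endpoint is an assumption
};

document.getElementById('file-input').addEventListener('change', (e) => {
  worker.postMessage({ file: e.target.files[0], maxWidth: 1024 });
});
```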

Davide Ungari
  • So, like a behind-the-scenes way of running JavaScript? – Ryan Fung Jan 05 '16 at 01:11
  • I don't think this will speed up the process; it just creates another process behind the scenes when the user uploads. What I want is to speed up the process in one go when the user hits the submit button right after selecting all the images. – Ryan Fung Jan 07 '16 at 03:52
  • @RyanFung I introduced my answer by saying that it's not easy to optimize on mobile phones and you can't be sure how good your solution will be. My suggestion, in any case, is that you need a more robust architecture that does not directly impact the user experience. It is more robust because the work runs in the background and the user is free to do other things. – Davide Ungari Jan 07 '16 at 05:10
  • Just an example that my proposed approach is not so uncommon: https://blogs.msdn.microsoft.com/eternalcoding/2012/09/20/using-web-workers-to-improve-performance-of-image-manipulation/ – Davide Ungari Jan 07 '16 at 05:53
  • Another example of web worker and image processing https://dl.dropboxusercontent.com/u/2272348/codez/resizes/index.html – Davide Ungari Jan 07 '16 at 06:21
  • The link you provided is only about the performance of image manipulation. That's not necessarily the same as uploading images from client to server. – Ryan Fung Jan 07 '16 at 07:22
  • @RyanFung I never said this would optimize the upload itself, and I specified that I'm suggesting a different architecture, independent of the optimization itself. – Davide Ungari Jan 07 '16 at 07:30
-1

In this case, it might turn out that if you stop cutting down the size of the images, the whole process will be faster.

How big are the images, both in width/height and in file size (MB, KB)? How long does it take to upload an image? Why are you cutting down the images?

I think something in your JavaScript is not right. Here is a way, using canvas and the File API, to resize an image before uploading it.
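A rough sketch of that technique (the endpoint and target width are assumptions); as the comment below notes, canvas.toBlob avoids the base64 overhead of a data URL where it is supported:

```javascript
// Rough sketch: resize one image with canvas + the File API and upload the result as a Blob.
function resizeAndUpload(file, maxWidth = 1200) {
  const img = new Image();
  img.onload = () => {
    const ratio = Math.min(1, maxWidth / img.width);
    const canvas = document.createElement('canvas');
    canvas.width = Math.round(img.width * ratio);
    canvas.height = Math.round(img.height * ratio);
    canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
    URL.revokeObjectURL(img.src);

    // toBlob is cheaper than toDataURL: no base64 encoding, no giant string in memory.
    canvas.toBlob((blob) => {
      const form = new FormData();
      form.append('image', blob, file.name);
      fetch('/upload.php', { method: 'POST', body: form }); // endpoint is an assumption
    }, 'image/jpeg', 0.8);
  };
  img.src = URL.createObjectURL(file); // File API object URL: no FileReader needed
}
```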

inf1ux
  • Let's say someone uploads a huge, high-resolution image (10 MB or above), and with today's cameras there might be wallpaper-sized images being uploaded as well. I thought it would be better to first trim down the image before uploading it to the server. (It saves my server from overloading as well.) – Ryan Fung Dec 07 '15 at 02:10
  • Actually, by trim down, I mean exactly what's in the answer you posted. And it seems to take a while before those images upload to the server using the File API and the canvas element. – Ryan Fung Dec 07 '15 at 02:20
  • you can make that type of process MUCH faster by using a window.URL instead of a dataURL (where supported) https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/toBlob – dandavis Jan 05 '16 at 05:29