11

So far I have been able to optimise performance quite a bit, but it is still somewhat slow :/

LATEST EDIT:

My current solution (the fastest so far, though still slow, and it keeps the order):

server

router.post('/images', function(req, res, next) {
    var image = bucket.file(req.body.image);
    image.download(function(err, contents) {
        if (err) {
            console.log(err);
            res.status(500).send(err); // respond on error too, so the client isn't left hanging
        } else {
            var resultImage = base64_encode(contents);
            var index = req.body.index;
            var returnObject = {
                image: resultImage,
                index: index
            };
            res.send(returnObject);
        }
    });
});

client query

$scope.getDataset = function() {

    fb.orderByChild('id').startAt(_start).limitToFirst(_n).once("value", function(dataSnapshot) {

        dataSnapshot.forEach(function(childDataSnapshot) {
            _start = childDataSnapshot.child("id").val() + 1;

            var post = childDataSnapshot.val();
            var image = post.image;

            var imageObject = {
                image: image,
                index: position
            };
            position++;

            $.ajax({
                type: "POST",
                url: "images",
                data: imageObject
            }).done(function(result) {
                post.image = result.image;
                $scope.data[result.index] = post;
                $scope.$apply();
                firstElementsLoaded = true;
            });
        });
    });
};

client HTML

<div ng-controller="ctrl">
        <div class="allcontent">
            <div id="pageContent" ng-repeat="d in data track by $index"><a href="details/{{d.key}}" target="_blank"><h3 class="text-left">{{d.title}}<a href="../users/{{d.author}}"><span class="authorLegend"><i> by {{d.username}}</i></span></a></h3>
                </a>
                <div class="postImgIndex" ng-show="{{d.upvotes - d.downvotes > -50}}">
                    <a href="details/{{d.key}}" target="_blank"><img class="imgIndex" ng-src="data:image/png;base64,{{d.image}}"></a>
                </div>
                <div class="postScore">{{d.upvotes - d.downvotes}} HP</div>
            </div>
        </div>
    </div>
TheProgrammer
  • 1,409
  • 4
  • 24
  • 53
  • What is `bucket.file()`? I'm not sure what package that's from. – Patrick Roberts Jun 20 '17 at 13:46
  • @PatrickRoberts Forgot to add it. Same as last time. Question edited. – TheProgrammer Jun 20 '17 at 13:58
  • After looking through documentation and your code, this is the same issue as last time. You're `push()`ing each result to `$scope.data` like how you did [here](https://stackoverflow.com/q/44638301/1541563) with `fileArray`. Given that, can you figure out how to fix the issue in a similar way? You should try to learn how to identify and resolve this problem so you don't have to keep asking each time. – Patrick Roberts Jun 20 '17 at 14:22
  • @PatrickRoberts My main issue here is that I do not know how to deal with the different segmentation between server and client – TheProgrammer Jun 20 '17 at 14:29
  • Your server code here is fine, it's your firebase code that's a bit messy. I was looking for how to at least get the index of each snapshot, but apparently I have a fundamental misunderstanding about how snapshots work, but check https://stackoverflow.com/questions/26471666/how-to-get-the-index-of-snapshot-on-child-added for how to get a reference to each child. – Patrick Roberts Jun 20 '17 at 14:37
  • @PatrickRoberts Actually, the firebase code worked perfectly so far ^^ The only thing I have changed is that I am trying to retrieve the actual images from my bucket using an ajax request instead of just using a link like before. – TheProgrammer Jun 20 '17 at 15:19
  • `messy != broken`, I wasn't saying it doesn't work, I'm saying it could be cleaner. Since I'm not familiar with firebase or its usage though, I don't know how to answer this question. – Patrick Roberts Jun 20 '17 at 15:21
  • @PatrickRoberts Oh ^^ I misunderstood, my bad. Well, thanks for taking the time to try at least :) and giving me a few clues ^^ – TheProgrammer Jun 20 '17 at 15:22
  • @PatrickRoberts All right, would you be interested in trying again now ? – TheProgrammer Jun 22 '17 at 15:30
  • @PatrickRoberts All right. I thought you literally said you stopped working on it because there was not enough rep to be gained. I also believe it's actually the first time you tell me you are not interested anymore without mentioning rep as a cause and the second time you tell me you are not interested which is strange considering the first time you seemed to suggest more rep to gain would entice you to keep trying. – TheProgrammer Jun 22 '17 at 16:05
  • @TheProgrammer try adding an index to the node on "id". its been used using the database rules, with ".indexOn" https://firebase.google.com/docs/database/security/indexing-data – Ben Yitzhaki Jun 22 '17 at 16:51
  • @BenYitzhaki Already done that ^^ – TheProgrammer Jun 22 '17 at 17:41
  • So obviously your promise architecture deals with the order, I have used local storage to speed things up in the past at least on reload. Would this work for you? Essentially the image could be stored locally so you would not have to get it each time. http://dexie.org/ is a good manager for this that is pretty simple to set up. If that seems useful I can provide code and examples to deal with this. The only other way I can think of would be to optimize the images and perhaps the database, but it seems like the query part is not the bottleneck, the data transfer is. – aduss Jun 26 '17 at 17:44
  • @aduss Check my latest edit. I keep the order and load it faster without using Promises. Sadly, local storage does not work in my case :/ Optimising the images is definitely something I would love to be able to do. Currently, I need to encode them in base64 to transfert them from my server to the client, maybe there is a better solution ? – TheProgrammer Jun 27 '17 at 11:59
  • https://stackoverflow.com/questions/522897/base-64-encode-vs-loading-an-image-file – l2aelba Jun 27 '17 at 13:51
  • @l2aelba Then why is it so slow in my case ? (although in my case each image is approx 5MB) – TheProgrammer Jun 27 '17 at 14:02
  • Why you just don't compile the link to the image and put it in the `ng-src` directive? Then from your server return the image with the right encoding and as a byte array (Remember that Base64 will add a 30% extra size to your original image, you said that the size of the images that you use is ~5mb, a 30% it's a lot here). This will improve the performance of your original request and you don't need to worry about the image anymore. – dlcardozo Jun 27 '17 at 14:51
  • @camaron I am not sure I understand you :/ What do you mean by: "compile the link to the image" ? – TheProgrammer Jun 27 '17 at 14:54
  • @TheProgrammer my bad, I saw that is a Post request to the server, you do something with the image there and then you return it in the response. Dismiss my comment. – dlcardozo Jun 27 '17 at 15:02
  • @camaron All right :/ – TheProgrammer Jun 27 '17 at 15:03
  • @TheProgrammer you could keep the promise architecture and just respond to each AJAX rather than the promise.all. This would allow you to use a get rather than a post which may shave off a very small amount of time. What are you using as a server? Restify, Express or something else? The next two steps would be to optimize the images, making a second smaller version of them automatically when they are added and serving that up by default. And exposing the urls to the images directly rather than doing a conversion. That way you can just add the img tag to the html page and skip all of this. – aduss Jun 27 '17 at 19:43
  • @aduss I used urls before. But urls cannot be secured. Anyone can hit a url millions of times with a bot. As for making smaller versions of the images, that's an excellent idea. I will do that. I just need to learn how to do it ^^ – TheProgrammer Jun 27 '17 at 20:25
  • @TheProgrammer you can block this to an extent using the same sort of thing (with restify). Essentially just check for repeat offenders before allowing this to execute: server.get(/\/img\/?.*/, restify.serveStatic({ directory: "./server_imgs" })); For the smaller versions of the image there are a number of tools out there. I would start with a simple resize, then move to a compress. If you are running linux you could use imagemagick. – aduss Jun 27 '17 at 21:10
  • so the problem is to show `post.image` locally without help of server, and it should be OK on modern browsers, what is the type of `post.image`? I mean `console.log(typeof post.image)` – Josh Lin Jun 28 '17 at 08:49
  • @JoshLin I am not sure I understand your question ? It will be a 64 bit encoded string representing an image atm. – TheProgrammer Jun 28 '17 at 10:43
  • @TheProgrammer performance means less network, less calculation, no bottle neck, `base64_encode(contents)` might be a bottle neck, because it would cost a lot CPU and a lot request come concurrently, May be some thing like `bucket.file(post.image)` and calculate data-URI at client side would be better – Josh Lin Jun 28 '17 at 11:35
  • @JoshLin What do you mean by "and calculate data-URI at client side" ? – TheProgrammer Jun 28 '17 at 11:45
  • @JoshLin Maybe it's better you post an answer so I can see in detail what you mean ? – TheProgrammer Jun 28 '17 at 11:46

4 Answers

3

Your solution is slow because you are downloading the images from your Cloud Storage and serving them on your own server. You get a delay on the download and upload, a ~33% overhead using base64-encoded data, plus your server is strained in delivering images instead of focusing on delivering your website content.
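To put a rough number on that overhead (a back-of-the-envelope sketch, not from the original answer): base64 turns every 3 bytes of binary data into 4 ASCII characters, so a 5 MB image grows to roughly 6.7 MB on the wire.

```javascript
// Approximate size of a base64 string for a payload of n bytes:
// every 3-byte group becomes 4 characters, padded up to a multiple of 4.
function base64Size(bytes) {
  return Math.ceil(bytes / 3) * 4;
}

console.log(base64Size(5 * 1024 * 1024)); // 6990508 characters, ~33% larger
```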

As pointed out by many in the comments, the best-practice solution is to use the public URL for the images, like so:

function getPublicUrl (filename) {
  return `https://storage.googleapis.com/${CLOUD_BUCKET}/${filename}`;
}

By using the public URL, you are directly serving from Cloud Storage leveraging Google’s global serving infrastructure. And the application does not have to respond to requests for images, freeing up CPU cycles for other requests.

If you do not want bots to crawl your images using the above method, Google recommends using a robots.txt file to block access to your images.
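For reference, a minimal robots.txt along those lines might look like this (the `/images/` path is a placeholder for wherever the bucket content is exposed, not from the original answer):

```
User-agent: *
Disallow: /images/
```

Note that well-behaved crawlers honour this, but it is advisory only; it does not stop a malicious bot.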

Kenworth
  • 593
  • 2
  • 9
  • The robots.txt file is not observed by malicious bots. I was using public URLs before, but that's not a secure solution :/ Signed URLs also offer limited protection against such attacks. – TheProgrammer Jun 30 '17 at 10:31
  • I guess there is no better solution than signed URLs atm :/ If only there were a way to restrict reads of the files to only originate from my domain :( – TheProgrammer Jun 30 '17 at 10:45
1
  1. Before optimizing, you'd better collect some data; I will show how in the following snippets
  2. Looks like base64_encode(contents) may cost a lot of CPU, and your logic seems to do this repeatedly. This is a guess; the true bottleneck you have to find yourself
  3. Other suggestions may each bring a smaller improvement, but together the effect adds up (gzip, CDN, HTTP/2, load balancing...)

Optimization data collection, server side: which operation takes too much time?

router.post('/images', function(req, res, next) {
  var d = new Date()
  var image = bucket.file(req.body.image);
  image.download(function(err, contents) {
    console.log('download: ' + (new Date() - d))
    if (err) {
      console.log(err);
    } else {
      var resultImage = base64_encode(contents);
      console.log('base64_encode: ' + (new Date() - d))
      var index = req.body.index;
      var returnObject = {
        image: resultImage,
        index: index
      }
      res.send(returnObject);
    }
  });
});

Optimization data collection, client side: check the request timings in the Chrome DevTools Network panel.

Spare the server the base64_encode(contents) work where possible:

$scope.getDataset = function() {

  fb.orderByChild('id').startAt(_start).limitToFirst(_n).once("value", function(dataSnapshot) {

    dataSnapshot.forEach(function(childDataSnapshot, index) {
      _start = childDataSnapshot.child("id").val() + 1;

      var post = childDataSnapshot.val();
      getImageBase64(post.image)
        .then((image) => {
          post.image = image;
          $scope.data[index] = post;
          $scope.$apply();
          firstElementsLoaded = true;
        })
    })
  });

  function getImageBase64(imageName) {
    //without help of the server, this will make your app faster:
    //network traffic reduced, server calculation reduced
    if (CanIUseBucketAndBase64Here) {
      return new Promise((resolve, reject) => {
        var image = bucket.file(imageName);
        image.download(function(err, contents) {
          if (err) {
            reject(err);
          } else {
            //a worker thread might gain better performance here
            resolve(base64_encode(contents));
          }
        });
      })
    }
    //with help of the server
    return $.ajax({
        type: "POST",
        url: "images",
        data: { image: imageName }
      })
      .then(result => result.image)
  }
};

Avoid downloading the same image every time:

//------------ load all images into a local cache; suits a small image set ----------
// if you have many images, use a cheaper cache such as a file cache

//--init.js: download all images, run only once
downloadAll()

//--server.js
//when an image is updated, let the server know and refresh the cache
server.get('/imageupdated', (req, res) => {
  downfile(req.params.imgid)
  res.send('got it')
})

//serve from cache first, fall back to a fresh download
server.post('/image', (req, res) => {
  memcache.get(req.body.imgid)
    .then((content) => {
      if (!content) return downfile(req.body.imgid)
      res.send({
        content
      })
      return true
    })
    .then((content) => {
      if (content === true) return
      res.send({
        content
      })
    })
})

server.listen()


//--common.js: download a file and cache it in memcache
function downfile(imgid) {
  var base64 = ''
  return bucket.download(imgid)
    .then((file) => {
      base64 = base64_encode(file)
      return memcache.set(imgid, base64)
    })
    .then(() => {
      return base64
    })
}
//download a batch of images
async function downfileBatch(skip, limit) {
  return cloudDrive.getImages(skip, limit)
    .then((list) => {
      // assuming each item exposes an id
      return Promise.all(list.map((img) => downfile(img.id)))
    })
}
//download all images, batch by batch
async function downloadAll() {
  var i = 0,
    limit = 5
  while (true) {
    var list = await downfileBatch(i, limit)
    if (list.length < limit) break
    i += limit
  }
  return true
}
Josh Lin
  • 2,397
  • 11
  • 21
  • The problem is that image.download is part of the google cloud storage Node.js library. I can't use it on the client to my knowledge. – TheProgrammer Jun 29 '17 at 09:23
  • Regardless, I can't expose my api key on the client, it defeats the whole purpose. – TheProgrammer Jun 29 '17 at 11:17
  • @TheProgrammer then maybe some solution may have less effect, use more servers to do the task,e.g. loadbanlance through nginx; If `image.download` costs alot time, cache your images somewhere fast, e.g. local file, cache hot ones if too many – Josh Lin Jun 30 '17 at 01:42
  • @TheProgrammer look at Kenworth's answer, I think that should be what you want – Josh Lin Jun 30 '17 at 01:46
  • It's not. That's exactly what I was doing before :/ – TheProgrammer Jun 30 '17 at 10:32
  • @TheProgrammer have you collect your data? Is it download took too much time, or concurrent request in browser silled up? – Josh Lin Jun 30 '17 at 11:33
  • It's Download :( – TheProgrammer Jun 30 '17 at 12:05
  • @TheProgrammer add some code to show how to cache them up before they use the api, see the last snippet – Josh Lin Jun 30 '17 at 14:02
  • I think it would be of very limited use :/ My website is an UGC (User Generated Content) website. The content changes constantly. – TheProgrammer Jun 30 '17 at 14:40
  • @TheProgrammer let server know the instant the change happens, see the `imageupdated` api; If the amount is too large to cache, then cache the hot ones, you should know how to find hot ones yourself – Josh Lin Jul 01 '17 at 03:50
-1

The easiest way to optimize your code is to send a single ajax request to the server-side, rather than performing an ajax request for each of the images.

Remove the ajax calls from the loop. Loop through the collection, gather the data you need to send to the server in an array and then send it via a single ajax request. This will speed things up.

Also, make sure to modify your server-side POST handler so that it can handle the array you'll be passing from the client-side.
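As a sketch of that idea (the helper names here are illustrative, not from the original code): build one payload containing every image name plus its index, and merge the server's array response back into place by index, so order is preserved even though there is only a single round trip.

```javascript
// Hypothetical helpers for batching: one POST carrying all image names,
// and an index-based merge of the array the server sends back.
function buildBatchPayload(posts) {
  // posts: [{ image: <bucket filename>, ... }, ...]
  return { images: posts.map(function (p, i) { return { image: p.image, index: i }; }) };
}

function applyBatchResult(data, results) {
  // results: [{ image: <base64 string>, index: <n> }, ...], in any order
  results.forEach(function (r) { data[r.index] = r.image; });
  return data;
}
```

On the client this replaces the per-image `$.ajax` loop with a single request, e.g. `$.ajax({ type: "POST", url: "images", data: buildBatchPayload(posts) })`.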

neatsu
  • 162
  • 2
  • 6
  • Sadly, this means the images will display only once they have all been downloaded, which means a much slower feedback time for the user. I just tried it: 15 seconds to load 5 images per row with your approach, with nothing happening for that time. 5 seconds with my current approach. – TheProgrammer Jun 27 '17 at 11:56
  • I find it odd that you're downvoting a common, well-known optimization for ajax requests. This optimization works as you'll be performing a single request for all images. This will reduce the communication time between your front and back end. Whether this affects the visual feedback for the user is out of the question and I don't see anything about that in your initial post. – neatsu Jun 28 '17 at 13:01
  • 2
    What makes you think I downvoted you ? You do realise I don't have enough rep to do so ? (Min 125 rep to downvote) – TheProgrammer Jun 28 '17 at 15:53
-2

I am unsure how to increase the speed of the delivered files, but the reason they arrive out of order is that the requests are asynchronous. Basically, you are telling the server to get you a bunch of files, then you wait. Once the server sends them to you, you handle them, but you don't know the order they arrive in. What you need to do is keep track of which response belongs to which post, by attaching some tracking information to each request. For example, say you have:

var arr = new Array(10);
for (var i = 0; i < arr.length; i++) {
  $.ajax({
    url: ____,
    type: "GET", // or whatever type you want
    data: _____,
    context: { index: i },
    success: function(data) {
      arr[this.index] = data;
    }
  });
}

This should set the values correctly. The syntax might be a little off, but I believe this is the right idea for setting them in the correct order. The important part is setting the `context` option, which determines what `this` refers to inside the success handler.
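For completeness, the same ordering guarantee can be had without jQuery's `context` option by capturing the index in a closure; a small sketch (the `get` function here stands in for any async request, and the names are illustrative):

```javascript
// Fetch every URL, writing each response into the slot matching its
// request index, regardless of the order the callbacks fire in.
function fetchAllInOrder(urls, get, done) {
  var results = new Array(urls.length);
  var remaining = urls.length;
  urls.forEach(function (url, i) {
    get(url, function (data) { // `i` is captured per iteration by forEach
      results[i] = data;
      if (--remaining === 0) done(results);
    });
  });
}
```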

Luple
  • 481
  • 3
  • 21