I have an API endpoint in Node that uses a simple query to retrieve about 1.8 MB of data from MongoDB (size as shown in MongoDB Compass). The query uses an index, and the explain plan shows an execution time of almost 0 ms. But when I console.log the time to retrieve the data in Node, it takes about 200 ms. What is happening between MongoDB finding the results and the results being available in Node? Is there a way to speed this up? The MongoDB server runs on the same machine as the Node app.

Here is how I'm timing the retrieval in my Node/Express route:
app.get('/myRoute', function(req, res) {
  // get the mongodb connection client
  const db = app.locals.db
  // start timer
  const start = Date.now()
  // query the DB
  db.collection("my_collection").find({year: 2019}).toArray()
    .then((data) => {
      // timer logs about 200 ms here
      console.log(Date.now() - start)
      res.writeHead(200, {'Content-Type': 'application/json'})
      res.end(JSON.stringify(data))
    })
})
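For what it's worth, here is how I figure the driver fetch could be timed separately from the BSON-to-object conversion. This is just a sketch: it relies on the driver's raw collection option (which, as I understand it, returns undeserialized BSON buffers), and I haven't verified it against my driver version:

  // sketch: time the raw fetch without BSON-to-JS-object conversion
  // (assumes the driver supports the { raw: true } collection option)
  const rawColl = db.collection("my_collection", { raw: true })
  const t0 = Date.now()
  rawColl.find({year: 2019}).toArray()
    .then((buffers) => {
      // buffers hold raw BSON; no deserialization has happened yet
      console.log('raw fetch (ms):', Date.now() - t0)
    })

Comparing that number against the ~200 ms above should show how much of the time is deserialization versus the driver/socket round trip.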
My guess is that, since the result set is fairly large, the time goes into converting BSON into JS objects. If so, is there any way to speed that up? I send the data to the client as JSON, ideally as gzipped JSON (to speed up the download).
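To be concrete about the gzip part, inside the .then() handler above it would look roughly like this (a sketch using Node's built-in zlib, not measured):

  const zlib = require('zlib')

  // sketch: compress the serialized JSON before sending it
  const json = JSON.stringify(data)
  zlib.gzip(json, (err, compressed) => {
    if (err) {
      res.writeHead(500)
      return res.end()
    }
    res.writeHead(200, {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip'   // browser decompresses transparently
    })
    res.end(compressed)
  })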
The data are for reporting purposes, but the pipeline just seems inefficient: query MongoDB -> driver converts BSON to JS objects -> stringify to JSON -> potentially gzip the JSON -> send to client.
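If those steps could at least be pipelined instead of fully buffered, I imagine replacing the toArray() + stringify in the route with something like this (a sketch assuming the cursor's stream() method and zlib.createGzip; each document still pays the BSON-to-object cost, and I'm ignoring backpressure for brevity):

  const zlib = require('zlib')

  // sketch: stream cursor -> JSON text -> gzip -> response instead of
  // buffering the whole array and stringifying it in one go
  res.writeHead(200, {
    'Content-Type': 'application/json',
    'Content-Encoding': 'gzip'
  })
  const gzip = zlib.createGzip()
  gzip.pipe(res)

  let first = true
  gzip.write('[')
  db.collection("my_collection").find({year: 2019}).stream()
    .on('data', (doc) => {
      gzip.write((first ? '' : ',') + JSON.stringify(doc))
      first = false
    })
    .on('end', () => {
      gzip.end(']')
    })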
In production I currently use a file system with many files and folders of pre-gzipped JSON that the API looks up and sends to the client. This is very fast but hard to maintain. Any thoughts? Thanks!
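For context, that production approach boils down to something like this (a sketch; the reports/2019.json.gz layout is made up):

  const fs = require('fs')
  const path = require('path')

  // sketch: serve a pre-gzipped JSON file as-is; the client decompresses
  // (the reports/2019.json.gz path is hypothetical)
  app.get('/myRoute', function(req, res) {
    const file = path.join(__dirname, 'reports', '2019.json.gz')
    res.writeHead(200, {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip'
    })
    fs.createReadStream(file).pipe(res)
  })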