I have a MongoDB database with about a year of temperature readings from 4 different sensors (one reading per sensor every 5 minutes) running on a Raspberry Pi 3. I am using Mongoose in Node.js to query the results. I realize there are limitations to reading large datasets from an SD card, but I am wondering if there is anything I can do in code to reduce the query time. I have tried 2 methods of getting the data and get varying results depending on the number of results requested. The "old" method was my first attempt, and I have read that it does not scale well to a large number of entries. The "new" method I found here and modified to work for my situation. Is there a method I have missed in my search for faster queries?
console.time(`new ${number}`);
// "new" method: sort newest-first, take the most recent `number` readings,
// then reverse back into chronological order.
Readings.find({
    sensor: 'Freezer'
}).sort('-time').limit(number).exec((err, readings) => {
    if (err) {
        console.log(err);
        return;
    }
    readings.reverse();
    // readings.forEach(reading => {
    //     console.log(new Date(reading.time).toLocaleString());
    // });
    console.timeEnd(`new ${number}`);
    return;
});
console.time(`old ${number}`);
// "old" method: count the matching documents, then skip to the tail
// of an ascending sort.
Readings.count({
    sensor: 'Freezer'
}, (err, count) => {
    if (err) {
        console.log(err);
        return;
    }
    Readings.find({
        sensor: 'Freezer'
    }).sort({ 'time': 1 }).skip(count - number).limit(number).exec((err, readings) => {
        if (err) {
            console.log(err);
            return;
        }
        // readings.forEach(reading => {
        //     console.log(new Date(reading.time).toLocaleString());
        // });
        console.timeEnd(`old ${number}`);
        return;
    });
});
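For what it's worth, the two approaches ask for the same window of documents, just from opposite ends of the sort. A plain-JavaScript sketch (using an in-memory array in place of the collection, with made-up data) shows the two windowing strategies produce identical results:

```javascript
// Fake "collection": 10 readings with ascending timestamps.
const readings = Array.from({ length: 10 }, (_, i) => ({ time: i }));
const number = 3;

// "new" strategy: sort descending, limit, then reverse into ascending order.
const viaNew = [...readings]
    .sort((a, b) => b.time - a.time)
    .slice(0, number)
    .reverse();

// "old" strategy: sort ascending, skip count - number, then take `number`.
const viaOld = [...readings]
    .sort((a, b) => a.time - b.time)
    .slice(readings.length - number);

console.log(viaNew.map(r => r.time)); // [ 7, 8, 9 ]
console.log(viaOld.map(r => r.time)); // [ 7, 8, 9 ]
```

The difference in MongoDB is that `skip()` still has to walk past every skipped document, while a descending sort with `limit()` can stop after `number` documents, which is why the "new" method tends to win on small windows.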
The "new" method is faster for small and medium queries, while the "old" method is faster when getting a large number of readings.
new method 288 results = 2719 ms
old method 288 results = 3831 ms
new method 8640 results = 3288 ms
old method 8640 results = 4184 ms
new method 103680 results = 3364 ms
old method 103680 results = 2795 ms
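One thing likely to matter more than the query shape is whether the query can use an index. Assuming a schema roughly like the sketch below (the `sensor` and `time` field names come from the queries above; the rest is guesswork), a compound index on `{ sensor: 1, time: -1 }` would let MongoDB satisfy both the equality match and the sort from the index instead of scanning and sorting every matching document:

```javascript
const mongoose = require('mongoose');

// Hypothetical schema, reconstructed from the fields the queries use.
const readingSchema = new mongoose.Schema({
    sensor: String,
    time: Date,
    temperature: Number
});

// Compound index: equality match on sensor, then ordered by time.
// find({ sensor }).sort('-time').limit(n) can walk this index directly.
readingSchema.index({ sensor: 1, time: -1 });

const Readings = mongoose.model('Readings', readingSchema);
```

You can check whether the index is actually used by running the query with `.explain()` in the mongo shell and looking for an index scan rather than a collection scan.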