I am thinking of re-developing an application that stores and visualises sensor data. The application is accessible by multiple users, and each user can add an unlimited number of sensors. We are 10 users right now with about 1,000 sensors, and the number of users will probably not change. Each sensor sends data every 5 seconds when it has enough power (they run on solar panels).
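To put numbers on it (assuming every sensor actually manages to report every 5 seconds, which the solar power does not always allow): 1,000 sensors × (86,400 s ÷ 5 s) = 17,280,000 rows per day, or roughly 6.3 billion rows over a year of retention.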
Right now data is stored in 4 tables.
- User [id, email, password, etc.]
- Device [id, name, user_id] (user_id is a foreign key to User)
- Sensor [id, device_id, type] (device_id is a foreign key to Device)
- Data [id, sensor_id, time, data] (sensor_id is a foreign key to Sensor)
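In SQL terms it looks roughly like this (the column types shown here are only illustrative, my real definitions may differ slightly):

```sql
CREATE TABLE User (
    id       INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    email    VARCHAR(255) NOT NULL,
    password VARCHAR(255) NOT NULL
    -- plus a few more account columns
);

CREATE TABLE Device (
    id      INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    name    VARCHAR(100) NOT NULL,
    user_id INT UNSIGNED NOT NULL,
    FOREIGN KEY (user_id) REFERENCES User(id)
);

CREATE TABLE Sensor (
    id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    device_id INT UNSIGNED NOT NULL,
    type      VARCHAR(50) NOT NULL,
    FOREIGN KEY (device_id) REFERENCES Device(id)
);

-- The table that blows up: at the full reporting rate it gains
-- roughly 17 million rows per day.
CREATE TABLE Data (
    id        BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    sensor_id INT UNSIGNED NOT NULL,
    time      DATETIME NOT NULL,
    data      DOUBLE NOT NULL,
    FOREIGN KEY (sensor_id) REFERENCES Sensor(id)
);
```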
The problem is that the Data table gets extremely big; the goal is to retain data for almost a year. I use MySQL and I am very disappointed with its performance. The application runs on Flask with Gunicorn, and I use RabbitMQ to queue the storing procedure. My questions:

- Is there anything I can change to increase the performance of the existing system?
- If you were starting this from scratch, what changes would you make?
- Would NoSQL make a big difference in this situation?

I know I am asking a lot, but it's the first time I have faced this kind of problem.