In a MySQL database, I have a single table with 330 columns, each holding either a float or an integer value. The rows are indexed by a millisecond timestamp column. Over the life of the application I expect on the order of 100-200 million rows. The table is completely independent and has no relations to other tables, and the only queries are ones that filter on the timestamp index.
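For reference, the schema and query pattern look roughly like the sketch below. The table, column, and value names are made up for illustration, and whether the timestamp is the primary key or a secondary index is an open detail of my design:

```sql
-- Illustrative sketch only; names and the choice of PRIMARY KEY are assumptions.
CREATE TABLE sensor_readings (
    ts        BIGINT NOT NULL,   -- millisecond Unix timestamp
    value_001 FLOAT,
    value_002 INT,
    -- ... roughly 330 float/int columns in total ...
    value_330 FLOAT,
    PRIMARY KEY (ts)             -- could instead be a secondary INDEX (ts)
) ENGINE=InnoDB;

-- The only query pattern: filter on a timestamp range, e.g.
SELECT *
FROM sensor_readings
WHERE ts BETWEEN 1672531200000 AND 1672617600000;
```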
Assuming I have a modern Intel server with 6 cores, 32 GB of RAM, and enough disk storage for the data, will any size limits be hit, or will performance degrade significantly?
If there will be problems, what should be done to mitigate them?
I know similar questions have been asked, but the answer always seems to be "it depends". Hopefully I've provided enough information for a definitive answer.