I have a project that requires me to create tables on the fly. Each table needs to support at least 1,000 columns. All INT, DATE, and BIT columns will be indexed, so there could be roughly 400 indexes on one table. Once data is uploaded to the server, no further inserts or updates will be performed on the table. I will use a library such as Lucene to index text.
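To give a concrete idea of the scale, here is a sketch of the kind of DDL I would be generating dynamically (table and column names are purely illustrative, not my actual schema):

```python
# Build a CREATE TABLE statement with ~1000 columns and
# CREATE INDEX statements for ~400 of them.
n_cols = 1000
n_indexed = 400

cols = [f"c{i} INT" for i in range(n_cols)]
ddl = f"CREATE TABLE wide_table ({', '.join(cols)});"

index_ddl = [
    f"CREATE INDEX idx_c{i} ON wide_table (c{i});"
    for i in range(n_indexed)
]

print(ddl[:60], "...")
print(f"{len(index_ddl)} index statements generated")
```

This is the shape of workload I would be asking CUBRID to handle, repeated across thousands of tables.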
My questions are:
- Can CUBRID handle thousands of tables in one database?
- What is the performance of CUBRID when selecting on a table with a thousand columns and hundreds of indexes?
- Does CUBRID have a Windows GUI for running ad hoc queries and managing tables?
- Can DDL and DML statements be in the same transaction?
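To clarify the last question, this is the behavior I am hoping for. The sketch below uses SQLite (not CUBRID, and only because it is easy to demonstrate inline) to show DDL and DML in one transaction that is then rolled back; I want to know whether CUBRID behaves the same way:

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode,
# so we control the transaction boundaries ourselves.
conn = sqlite3.connect(":memory:", isolation_level=None)
cur = conn.cursor()

cur.execute("BEGIN")
cur.execute("CREATE TABLE t (id INTEGER, created DATE)")  # DDL
cur.execute("INSERT INTO t VALUES (1, '2013-01-01')")     # DML, same transaction
cur.execute("ROLLBACK")

# After the rollback, the table should not exist at all.
tables = cur.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
print(tables)  # → []
```

In other words: if the upload fails partway through, I want the CREATE TABLE and all the inserts to disappear together.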
I am well aware of the database normalization issues with this design. I have researched them thoroughly and concluded that creating many tables and columns is the best solution for my case.