I have an application with a JTable whose data is held in RAM. The table can be up to 1,000,000 rows by 100 columns, so it simply uses too much memory. I'm now moving to backing the JTable with a database, but I don't think going directly from JTable to database will help much: either all the data from the database gets loaded into memory when the JTable is initialized, or I issue SELECT statements to fetch the next chunk of data as the user scrolls down the table, which would be too slow. And how would I handle sorting?
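To make it concrete, this is roughly the "fetch as the user scrolls" approach I'm worried about (a rough sketch only; `RowSource` and its methods are placeholders for whatever ends up doing the actual SELECTs, not code I already have):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import javax.swing.table.AbstractTableModel;

    // Sketch of a table model that loads pages of rows on demand.
    public class PagedTableModel extends AbstractTableModel {

        // Placeholder for the database access layer.
        public interface RowSource {
            int getRowCount();                              // e.g. SELECT COUNT(*)
            List<Object[]> getRows(int offset, int limit);  // e.g. SELECT ... LIMIT/OFFSET
            String[] getColumnNames();
        }

        private static final int PAGE_SIZE = 200;
        private final RowSource source;
        private final Map<Integer, List<Object[]>> pageCache = new HashMap<>();

        public PagedTableModel(RowSource source) {
            this.source = source;
        }

        @Override public int getRowCount()    { return source.getRowCount(); }
        @Override public int getColumnCount() { return source.getColumnNames().length; }
        @Override public String getColumnName(int col) { return source.getColumnNames()[col]; }

        @Override
        public Object getValueAt(int rowIndex, int columnIndex) {
            int page = rowIndex / PAGE_SIZE;
            // Every cache miss is a database round trip, which is exactly the
            // slowness I'm afraid of when the user scrolls quickly.
            List<Object[]> rows = pageCache.computeIfAbsent(
                    page, p -> source.getRows(p * PAGE_SIZE, PAGE_SIZE));
            return rows.get(rowIndex % PAGE_SIZE)[columnIndex];
        }
    }

The page cache here also grows without bound if the user scrolls through everything, which doesn't really solve the memory problem either.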
So I think the correct solution is to put Hibernate between the JTable and the database, but I still can't quite see how this is going to help with a really large JTable.
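As far as I understand it, paging with Hibernate would look something like the sketch below, using `setFirstResult`/`setMaxResults` (assuming a Hibernate 5.x `SessionFactory`; `MyRow` is a made-up entity standing in for my actual table):

    import java.util.List;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import org.hibernate.Session;
    import org.hibernate.SessionFactory;

    // Made-up entity representing one row of my big table.
    @Entity
    class MyRow {
        @Id Long id;
        String name;
    }

    public class HibernatePageLoader {

        private final SessionFactory sessionFactory;

        public HibernatePageLoader(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // Load one page of rows, leaving everything else in the database.
        public List<MyRow> loadPage(int offset, int pageSize) {
            Session session = sessionFactory.openSession();
            try {
                return session.createQuery("from MyRow order by id", MyRow.class)
                              .setFirstResult(offset)   // skip rows already shown
                              .setMaxResults(pageSize)  // fetch only one page
                              .list();
            } finally {
                session.close();
            }
        }
    }

But this seems to boil down to the same SELECT-per-page idea as above, so I don't see what Hibernate adds for a huge table beyond the mapping.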
Can anyone point me to a good example, or share experience of using this approach with a large dataset to keep memory usage down?
EDIT: I've read comments on other threads saying that a table with this much data should have filters so that only a subset of the data is ever shown. I agree with that as a general principle and I will provide a filter. HOWEVER, only the user can decide how they want to filter it, and I still need to provide an 'All' option, and this is where it could all blow up.
I also remember something about having one table model on top of another, showing some kind of subset of the data that changes as you scroll down, but I haven't seen a concrete example of this idea.
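My rough guess at what that "moving subset" might look like is below, reusing the placeholder `RowSource` interface from the first sketch: the model reports the full row count so the scrollbar is sized correctly, but only ever keeps a fixed window of rows in memory and reloads it when the user scrolls outside it. I have no idea if this is the technique people actually mean.

    import java.util.List;
    import javax.swing.table.AbstractTableModel;

    // Sketch: a model that holds only a sliding window of rows.
    public class SlidingWindowTableModel extends AbstractTableModel {

        private static final int WINDOW_SIZE = 500;

        private final PagedTableModel.RowSource source;
        private List<Object[]> window;
        private int windowStart = -1;

        public SlidingWindowTableModel(PagedTableModel.RowSource source) {
            this.source = source;
        }

        @Override public int getRowCount()    { return source.getRowCount(); }
        @Override public int getColumnCount() { return source.getColumnNames().length; }
        @Override public String getColumnName(int col) { return source.getColumnNames()[col]; }

        @Override
        public Object getValueAt(int rowIndex, int columnIndex) {
            if (window == null || rowIndex < windowStart
                    || rowIndex >= windowStart + window.size()) {
                // Centre the window on the requested row and throw the old block
                // away, so memory stays bounded no matter how far the user scrolls.
                windowStart = Math.max(0, rowIndex - WINDOW_SIZE / 2);
                window = source.getRows(windowStart, WINDOW_SIZE);
            }
            return window.get(rowIndex - windowStart)[columnIndex];
        }
    }

Even if something like this keeps memory down, it still hits the database on every scroll past the window, and it doesn't answer the sorting question.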