You can still use Hibernate to fetch millions of records; you just can't do it in one round, because loading millions of rows at once will give you an OutOfMemoryError. Instead, divide the result set into pages and dump each page to XML as you go, so the records are not all kept in RAM and your program doesn't need a huge amount of memory.
I used these two methods very frequently in a previous project. Unfortunately I never liked HQL much, so I don't have an HQL version of the code.
Here, INT_PAGE_SIZE is the number of rows to fetch in each round, getPageCount returns the total number of rounds needed to fetch all the records, and paging fetches the records for a given page, from 1 up to getPageCount.
public int getPageCount(Criteria criteria) {
    // Temporarily replace the selection with a row count projection.
    criteria.setProjection(Projections.rowCount());
    // rowCount() returns a Long on Hibernate 3.2+, an Integer on older
    // versions; going through Number works on both.
    int rowCount = ((Number) criteria.list().get(0)).intValue();
    // Reset the Criteria so it can be reused to fetch the entities
    // (setProjection switches the result transformer, so restore it too).
    criteria.setProjection(null);
    criteria.setResultTransformer(Criteria.ROOT_ENTITY);
    // Round up so a partial last page still counts as a page.
    if (rowCount % INT_PAGE_SIZE == 0) {
        return rowCount / INT_PAGE_SIZE;
    }
    return rowCount / INT_PAGE_SIZE + 1;
}
public Criteria paging(Criteria criteria, int page) {
    // Pass page == -1 to skip paging and fetch everything in one go.
    if (page != -1) {
        criteria.setFirstResult((page - 1) * INT_PAGE_SIZE);
        criteria.setMaxResults(INT_PAGE_SIZE);
    }
    return criteria;
}