I am working on an ASP.NET MVC application that reads rows from an Oracle database using a DataReader and presents them to the user (sometimes up to 10 million rows). The read loop throws an OutOfMemoryException after about 900,000 rows.
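
For context, here is a simplified sketch of my read path (table, column, and type names are placeholders, not my real schema). The reader itself streams, but I buffer every row into a list before handing it to the view, which is where the memory pressure builds up:

```csharp
using System.Collections.Generic;
using Oracle.ManagedDataAccess.Client;

public class OrderRow
{
    public decimal Id { get; set; }
    public string Description { get; set; }
}

public List<OrderRow> ReadAllRows(string connectionString)
{
    var rows = new List<OrderRow>();

    using (var conn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand("SELECT id, description FROM orders", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Every row is materialized into memory before anything is returned
                rows.Add(new OrderRow
                {
                    Id = reader.GetDecimal(0),
                    Description = reader.GetString(1)
                });
            }
        }
    }
    return rows; // up to ~10 million of these is where it falls over
}
```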
I discussed the issue with a colleague, and he suggested either moving to a disconnected approach (perhaps Entity Framework) or using a stored procedure and bringing the data back in chunks, roughly as sketched below.
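
If I understand the chunking suggestion correctly, it would look something like the following (again only a sketch with placeholder names; the OFFSET/FETCH syntax assumes Oracle 12c or later). Each call would materialize only one page of rows instead of the whole result set:

```csharp
using System.Collections.Generic;
using Oracle.ManagedDataAccess.Client;

public List<OrderRow> ReadPage(string connectionString, int pageIndex, int pageSize)
{
    var rows = new List<OrderRow>();
    const string sql =
        @"SELECT id, description
            FROM orders
           ORDER BY id
          OFFSET :skip ROWS FETCH NEXT :take ROWS ONLY";

    using (var conn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand(sql, conn))
    {
        cmd.BindByName = true;
        cmd.Parameters.Add("skip", pageIndex * pageSize);
        cmd.Parameters.Add("take", pageSize);

        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                rows.Add(new OrderRow
                {
                    Id = reader.GetDecimal(0),
                    Description = reader.GetString(1)
                });
            }
        }
    }
    return rows; // only one page of rows is held in memory at a time
}
```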
Can anyone say authoritatively which approach is the best way to handle this volume of data?