This answer gives a method to split a DataTable into smaller chunks (it's generic, but it works on DataTable rows), and since the issue you need to fix is the size of the DataTable, something like this should work:
public static IEnumerable<IEnumerable<T>> ToChunks<T>(this IEnumerable<T> enumerable, int chunkSize)
{
    int itemsReturned = 0;
    var list = enumerable.ToList(); // Materialize once to prevent multiple enumeration.
    int count = list.Count;
    while (itemsReturned < count)
    {
        int currentChunkSize = Math.Min(chunkSize, count - itemsReturned);
        yield return list.GetRange(itemsReturned, currentChunkSize);
        itemsReturned += currentChunkSize;
    }
}
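As a quick sanity check (a standalone sketch, not part of the original code — the class name ChunkDemo is just for illustration), here is how ToChunks behaves on a plain list: ten items with a chunk size of 3 yield chunks of 3, 3, 3, and 1.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class ChunkDemo
{
    // Same extension method as above, repeated so this snippet compiles on its own.
    public static IEnumerable<IEnumerable<T>> ToChunks<T>(this IEnumerable<T> enumerable, int chunkSize)
    {
        int itemsReturned = 0;
        var list = enumerable.ToList(); // Materialize once to prevent multiple enumeration.
        int count = list.Count;
        while (itemsReturned < count)
        {
            int currentChunkSize = Math.Min(chunkSize, count - itemsReturned);
            yield return list.GetRange(itemsReturned, currentChunkSize);
            itemsReturned += currentChunkSize;
        }
    }

    public static void Main()
    {
        // 10 items split into chunks of 3 -> sizes 3, 3, 3, 1.
        foreach (var chunk in Enumerable.Range(1, 10).ToChunks(3))
            Console.WriteLine(string.Join(",", chunk));
        // Prints:
        // 1,2,3
        // 4,5,6
        // 7,8,9
        // 10
    }
}
```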
After that, use it as follows:
int rowCount = 1;
int chunkSize = 50000;
bool firstChunk = true;
using (ExcelPackage objExcelPackage = new ExcelPackage())
{
    ExcelWorksheet objWorksheet = objExcelPackage.Workbook.Worksheets.Add("Conversion Data");
    //objWorksheet.Cells["A1"].LoadFromDataTable(FinalResult, true);
    foreach (var smallerTable in FinalResult.AsEnumerable().ToChunks(chunkSize))
    {
        // CopyToDataTable (System.Data.DataSetExtensions) turns the chunk of DataRows
        // back into a DataTable, which is what LoadFromDataTable expects.
        DataTable chunkTable = smallerTable.CopyToDataTable();
        objWorksheet.Cells["A" + rowCount].LoadFromDataTable(chunkTable, firstChunk);
        rowCount += chunkTable.Rows.Count + (firstChunk ? 1 : 0); // write the header row only once
        firstChunk = false;
    }
    // save the package here as before
}
Basically, rowCount keeps track of where on the Excel sheet to insert the next chunk: it advances by the number of rows just written (plus one for the header row on the first chunk), so with a chunk size of 50,000 the chunks land at A1, A50002, A100002, and so on.
If you still get an out-of-memory error, keep reducing chunkSize until the export succeeds.