I have a 70 MB CSV file with stock trading history. I want to run my program on it, but I don't want to wait 30 seconds for it to parse on every start. I can think of two approaches:

1. Just translate the CSV file into a Haskell source file, like this:
   From                       | To
   ---------------------------|--------------------------------
   1380567537,122.166,2.30243 | history = [
   ...                        |   (1380567537,122.166,2.30243)
   ...                        |   , ...
   ...                        |   ]
2. Use Template Haskell to parse the file at compile time.
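For the first approach, the translation step itself is trivial; here is a minimal sketch of the converter I have in mind (module and function names are my own, and it assumes well-formed `time,price,volume` rows with no header):

```haskell
import Data.List (intercalate)

-- | Render a CSV row "1380567537,122.166,2.30243"
-- as the tuple literal "(1380567537,122.166,2.30243)".
rowToTuple :: String -> String
rowToTuple r = "(" ++ r ++ ")"

-- | Generate a whole Haskell module from the raw CSV contents,
-- so the data gets compiled straight into the program.
csvToHs :: String -> String
csvToHs csv = unlines $
     [ "module History (history) where"
     , ""
     , "history :: [(Integer, Double, Double)]"
     , "history ="
     ]
  ++ zipWith (++) ("  [ " : repeat "  , ") (map rowToTuple rows)
  ++ [ "  ]" ]
  where
    rows = filter (not . null) (lines csv)
```

Running it as a filter (`interact csvToHs`) over the CSV produces exactly the source layout shown in the table above.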
Trying the first approach, I found that GHC had eaten 12 GB of memory after three hours of trying to compile that one list (a 70 MB source file).
So is TH the only viable approach? Or can I somehow use a hard-coded large data structure in a source file after all? And why can't GHC compile the file? Does it hit a combinatorial explosion because of complex optimizations, or something else?
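For completeness, the Template Haskell variant (approach 2) I was considering looks roughly like this. It is a sketch only: the parsing is done inside the splice while GHC compiles the module, and the result is spliced in as a literal via `lift`. Here I parse from an inline sample string so the snippet stands alone; for the real file I would replace `sampleCsv` with `runIO (readFile "history.csv")` (the file name is assumed):

```haskell
{-# LANGUAGE TemplateHaskell #-}

import Language.Haskell.TH.Syntax (lift)

-- The splice below runs at compile time; GHC embeds the resulting
-- list in the binary, so nothing is parsed at program start.
history :: [(Integer, Double, Double)]
history = $(
  let -- Inline sample standing in for the real 70 MB file.
      sampleCsv = "1380567537,122.166,2.30243\n1380567538,122.167,2.5"
      -- Split a string on a separator character.
      split c s = case break (== c) s of
                    (w, [])     -> [w]
                    (w, _:rest) -> w : split c rest
      -- Parse one "time,price,volume" row; error handling omitted.
      row r = case split ',' r of
                [t, p, v] -> (read t, read p, read v)
                              :: (Integer, Double, Double)
                _         -> error ("bad CSV row: " ++ r)
  in lift (map row (lines sampleCsv)))
```

The helpers are defined locally inside the splice to sidestep the TH stage restriction (top-level functions from the same module can't be called from a splice in that module).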