I'm compiling a Haskell executable that, on startup, reads about 50MB of data from the file system (serialized with the serialise package) and then applies some transformations to it before continuing.
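For context, the startup path currently looks roughly like this (a sketch; `Foo` and `data.cbor` are stand-ins for my real type and file):

```haskell
{-# LANGUAGE DeriveGeneric, DeriveAnyClass #-}

import Codec.Serialise (Serialise, readFileDeserialise)
import GHC.Generics (Generic)

-- Stand-in for my real record type.
data Foo = Foo Int String
  deriving (Generic, Serialise)

-- Reads ~50MB of CBOR from disk; the result gets transformed
-- further before the program continues.
loadData :: IO [Foo]
loadData = readFileDeserialise "data.cbor"
```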
I'd like to improve the startup speed of the executable. In theory I could use Template Haskell to deserialize the files at compile time and splice the results in as data constructor applications, but I'm wondering whether this would actually improve performance. If the bulk of the time is spent calling the data constructors (i.e. if the file I/O and deserialization are fast), it wouldn't be worth it, whereas if calling the data constructors is fast, it may be.
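Concretely, the Template Haskell version I have in mind looks something like this (again a sketch; `Embedded`, `Foo`, and `data.cbor` are placeholders, and `Foo` would need a `Lift` instance, e.g. via DeriveLift):

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Embedded where

import Codec.Serialise (readFileDeserialise)
import Language.Haskell.TH.Syntax (lift, runIO)

-- Foo needs Serialise and Lift instances, and (because of the TH
-- stage restriction) has to be defined in a separate module.
import Foo (Foo)

-- Deserialize at compile time and splice the result in as one big
-- literal expression.
bigData :: [Foo]
bigData = $(runIO (readFileDeserialise "data.cbor" :: IO [Foo]) >>= lift)
```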
Also, does GHC have any notion of compile-time evaluation for large data structures? I.e. if I have something of type [Foo] that is known at compile time and contains ~50MB of data, is there any way for the executable to contain that data precompiled in whatever the Haskell equivalent of the static data segment is, or will it be lazily evaluated like everything else?
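For concreteness, this is the shape of thing I'm asking about, whether produced by the splice above or written out by hand (the field values are placeholders):

```haskell
-- A top-level CAF whose value is fully known at compile time.
bigList :: [Foo]
bigList = [Foo 1 "a", Foo 2 "b" {- , ... roughly 50MB more of this ... -}]
```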
Thanks in advance for your help & advice!