I need the ability to import some fairly large text files (100 MB+) into Core Data in an application targeted at mobile devices, where memory is constrained. Each file contains a large number of small records which will be processed before being added to the database. Looking through many sources, the recommended method for reading in a text file seems to be:
NSString *stringFromFileAtPath = [[NSString alloc] initWithContentsOfURL:url encoding:NSUTF8StringEncoding error:&error];
At first glance this seems like a very memory-intensive way of doing what I require, but given that there seems to be no other recommended way to read the file, would I be right in guessing that Apple has taken this into account and does its own memory management, perhaps faulting data in from the file only when necessary?
If not, would the best way to proceed be to use NSStream and NSScanner to retrieve and process one line of text at a time, along the lines of the sketch below?
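To make the question concrete, this is roughly the streaming approach I have in mind: read the file in fixed-size chunks through NSInputStream and carry any partial trailing line over to the next chunk. processLine: is just a placeholder for my record processing, and this naive version assumes a chunk boundary never splits a multi-byte UTF-8 character (a real version would have to buffer incomplete byte sequences):

- (void)importFileAtURL:(NSURL *)url
{
    NSInputStream *stream = [NSInputStream inputStreamWithURL:url];
    [stream open];

    uint8_t buffer[64 * 1024];                          // 64 KB read buffer
    NSMutableString *carry = [NSMutableString string];  // partial line held over between chunks

    while ([stream hasBytesAvailable]) {
        @autoreleasepool {
            NSInteger bytesRead = [stream read:buffer maxLength:sizeof(buffer)];
            if (bytesRead <= 0) break;

            NSString *chunk = [[NSString alloc] initWithBytes:buffer
                                                       length:bytesRead
                                                     encoding:NSUTF8StringEncoding];
            // chunk is nil if the read ended mid-character; this sketch assumes it never does
            if (chunk == nil) break;
            [carry appendString:chunk];

            // Everything up to the last newline is a run of complete lines;
            // whatever follows it is carried over to the next chunk.
            NSArray *lines = [carry componentsSeparatedByString:@"\n"];
            for (NSUInteger i = 0; i + 1 < [lines count]; i++) {
                [self processLine:[lines objectAtIndex:i]]; // placeholder for my record processing
            }
            [carry setString:[lines lastObject]];
        }
    }
    if ([carry length] > 0) {
        [self processLine:carry];                       // final line with no trailing newline
    }
    [stream close];
}

This way only one chunk (plus the carried-over fragment) is in memory at a time, rather than the whole file.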
If the recommended method does handle memory well, then the next step is often:
NSArray *lines = [stringFromFileAtPath componentsSeparatedByCharactersInSet:[NSCharacterSet newlineCharacterSet]];
If I use this method, I'm assuming it would need the complete text file in memory, plus the array of substrings it returns, so again it would be memory intensive. To save memory, would I be better off using NSScanner, or, given the limited processing power of mobile devices (certainly some of the older ones), would that take forever to complete?
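For comparison, this is the NSScanner loop I'm considering (processLine: again being a placeholder for my record processing). It avoids building the intermediate array of lines, though the full file contents would still be resident as stringFromFileAtPath:

NSScanner *scanner = [NSScanner scannerWithString:stringFromFileAtPath];
[scanner setCharactersToBeSkipped:nil];   // don't silently skip whitespace and newlines
NSCharacterSet *newlines = [NSCharacterSet newlineCharacterSet];

while (![scanner isAtEnd]) {
    NSString *line = nil;
    if ([scanner scanUpToCharactersFromSet:newlines intoString:&line]) {
        [self processLine:line];          // placeholder for my record processing
    }
    // Consume the newline character(s) so the next pass starts on a fresh line.
    // Note that runs of newlines are consumed together, so blank lines are skipped.
    [scanner scanCharactersFromSet:newlines intoString:NULL];
}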
Thanks in advance for any help you can give me with this question.
Dave