I have a SQL file that I am trying to parse. I can easily load the whole file and delimit it properly with `string.split(";")`. However, as a challenge and a learning exercise, I would like to apply some functional foo and streams to handle a file of arbitrary size. That gets a lot trickier.
Essentially I need to:

- Load an arbitrary buffer of data (let's say a line, since I can use Java's `BufferedReader` to stream those)
- Continue to concatenate those lines until there is a delimiter `;`
- Emit the concatenated result
- Start a new SQL command after the delimiter
- Final result: a `List<String>` representing all the SQL commands to execute (a rough sketch of one possible approach follows this list)
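To make the target concrete, here is a minimal sketch of the kind of solution I am imagining, using `Scanner` (Java 9+ for `tokens()`). The method name `readStatements` is my own placeholder, and this naive split would also cut on `;` inside string literals:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Scanner;
import java.util.stream.Collectors;

// Scanner streams tokens lazily, so the file is never fully in memory.
static List<String> readStatements(Path sqlFile) throws IOException {
    try (Scanner scanner = new Scanner(Files.newBufferedReader(sqlFile))) {
        return scanner.useDelimiter(";").tokens()
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .collect(Collectors.toList());
    }
}
```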
Here is a sample:
```sql
CREATE SCHEMA HelloWorld;
USE HelloWorld;
CREATE TABLE HelloWorldTest (
  id int PRIMARY KEY NOT NULL AUTO_INCREMENT,
  message VARCHAR(32)
);
INSERT INTO HelloWorldTest (message) VALUES ('Hello world!');
INSERT INTO HelloWorldTest (message) VALUES ('this is a test!');
```
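For that sample, the result I am after would be roughly this (assuming each statement gets trimmed; whether internal line breaks are preserved does not matter much to me):

```java
// Expected output for the sample above, trimmed, trailing ';' dropped.
List<String> expected = List.of(
    "CREATE SCHEMA HelloWorld",
    "USE HelloWorld",
    "CREATE TABLE HelloWorldTest (\n  id int PRIMARY KEY NOT NULL AUTO_INCREMENT,\n  message VARCHAR(32)\n)",
    "INSERT INTO HelloWorldTest (message) VALUES ('Hello world!')",
    "INSERT INTO HelloWorldTest (message) VALUES ('this is a test!')"
);
```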
Inefficient Java that does the job:

```java
import static java.util.Arrays.stream;
import static java.util.stream.Collectors.toList;

String fileData = IOUtils.toString(sqlFile); // whole file in memory at once
List<String> sqlStats = stream(fileData.split(";")).filter(s -> s.length() > 0)
        .collect(toList());
```
How can I do this with streams and functional programming so that it handles a file of any size? Non-Java answers are welcome, as Java may not be able to do the job with its native libraries.
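For reference, the closest I have gotten to a functional formulation is a custom `Collector` over `Files.lines()`. The names `StatementBuffer` and `toSqlStatements` are my own, this is untested, and it only works for sequential streams (the combiner simply refuses to merge):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collector;
import java.util.stream.Stream;

// Mutable accumulation state: finished statements plus the partial one being built.
final class StatementBuffer {
    final List<String> statements = new ArrayList<>();
    final StringBuilder current = new StringBuilder();

    void accept(String line) {
        current.append(line).append('\n');
        int idx;
        while ((idx = current.indexOf(";")) >= 0) { // cut on each delimiter
            String stmt = current.substring(0, idx).trim();
            if (!stmt.isEmpty()) statements.add(stmt);
            current.delete(0, idx + 1);
        }
    }

    StatementBuffer combine(StatementBuffer other) {
        // Sequential-only sketch; merging two partial buffers in order is the hard part.
        throw new UnsupportedOperationException("parallel streams not supported");
    }

    List<String> finish() {
        String tail = current.toString().trim(); // flush a final statement missing its ';'
        if (!tail.isEmpty()) statements.add(tail);
        return statements;
    }
}

class SqlSplit {
    static Collector<String, StatementBuffer, List<String>> toSqlStatements() {
        return Collector.of(StatementBuffer::new, StatementBuffer::accept,
                            StatementBuffer::combine, StatementBuffer::finish);
    }

    static List<String> parse(Path sqlFile) throws IOException {
        try (Stream<String> lines = Files.lines(sqlFile)) { // lazy, one line at a time
            return lines.collect(toSqlStatements());
        }
    }
}
```

Is there a cleaner way to express the fold itself, or a combiner design that would make this safe for parallel streams?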