I have a file called "integers.csv" containing 20 million values (type: long). I wrote this function to store them into a Record class (which has only a long field). It works, but it is very slow: it stores only about 1k numbers per second. Is there a way to store them faster?

private static void loadArray(String filepath, Sorting<Record> orderedArray) throws IOException, SortingException {
    System.out.println("\nLoading data from file...\n");
    Path inputFilePath = Paths.get(filepath);
    try (BufferedReader fileInputReader = Files.newBufferedReader(inputFilePath, ENCODING)) {
        String line = null;
        while ((line = fileInputReader.readLine()) != null) {
            String[] lineElements = line.split("\n");
            Record record1 = new Record(Long.parseLong(lineElements[0]));
            orderedArray.add(record1);
        }
    }
    System.out.println("\nData loaded\n");
}
  • What is `Sorting`? – xingbin Apr 09 '18 at 15:08
  • First of all, you could eliminate `line.split("\n")`, because the `readLine()` method already strips the newline. – Robert Kock Apr 09 '18 at 15:08
  • You can benchmark (even a simple `System.currentTimeMillis()` will help) and check where the bottleneck is: is it reading from the file, or the logic in the while block (split/parse/create instance)? – Bojan Trajkovski Apr 09 '18 at 15:14
  • The `Sorting` suggests it's some kind of sorted array. Inserting a great amount of data into such an array can be expensive. You might try inserting into a normal array and sorting it only once at the end. – Robert Kock Apr 09 '18 at 15:25
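
Following Bojan Trajkovski's suggestion above, here is a minimal timing sketch that separates the raw file-reading cost from the parse cost. It assumes, as in the question, one long per line in "integers.csv"; `System.currentTimeMillis()` is coarse, but enough to spot an order-of-magnitude bottleneck.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LoadBenchmark {
    public static void main(String[] args) throws IOException {
        long t0 = System.currentTimeMillis();
        long lines = 0;
        // Pass 1: pure I/O -- read every line, do nothing else.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("integers.csv"), StandardCharsets.UTF_8)) {
            while (reader.readLine() != null) {
                lines++;
            }
        }
        long t1 = System.currentTimeMillis();
        long checksum = 0;
        // Pass 2: I/O plus parsing, still without inserting into any data structure.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("integers.csv"), StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                checksum += Long.parseLong(line.trim());
            }
        }
        long t2 = System.currentTimeMillis();
        System.out.println("read only:    " + (t1 - t0) + " ms (" + lines + " lines)");
        System.out.println("read + parse: " + (t2 - t1) + " ms (checksum " + checksum + ")");
    }
}

If both passes finish quickly, the bottleneck is almost certainly the `orderedArray.add(...)` call, which points to the sorted-insert issue raised in the last comment.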
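
And here is a sketch of the load-then-sort-once approach from Robert Kock's comments. The question doesn't show `Sorting`, `SortingException`, or `Record`, so this uses a plain `ArrayList` and a stand-in `Record` with a single long field; adapt the names to the real classes.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class FastLoad {
    // Stand-in for the Record class from the question: a single long field.
    static final class Record {
        final long value;
        Record(long value) { this.value = value; }
    }

    static List<Record> loadRecords(String filepath) throws IOException {
        // Pre-size the list so it never has to grow while loading 20M entries.
        List<Record> records = new ArrayList<>(20_000_000);
        try (BufferedReader reader = Files.newBufferedReader(Paths.get(filepath), StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                // readLine() already strips the newline, so no split("\n") is needed.
                records.add(new Record(Long.parseLong(line.trim())));
            }
        }
        // Sort once at the end (O(n log n)) instead of keeping the structure
        // ordered on every insert, which shifts elements on each add.
        records.sort(Comparator.comparingLong((Record r) -> r.value));
        return records;
    }
}

Appending to the end of an `ArrayList` is amortized O(1), so the load itself becomes I/O-bound, and the single sort at the end replaces 20 million ordered insertions.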
