14

I have a flat file that contains 339,276 lines of text, for a size of 62.1 MB. I am attempting to read in all the lines, parse them based on some conditions I have, and then insert them into a database.

I originally attempted to use a `bufio.Scanner` loop with `Scan()` and `Text()` to get each line, but I was running out of buffer space. I switched to using `bufio.Reader`'s ReadLine/ReadString/ReadByte (I tried each) and had the same problem with each: I didn't have enough buffer space.

I tried setting the read buffer size, but as the documentation says, it is actually a const that can be made smaller but never bigger than 64*1024 bytes. I then tried to use File.ReadAt, where I set the starting position and moved it along as I brought in each section, to no avail. I have looked at the following examples and explanations (not an exhaustive list):

  • Read text file into string array (and write)
  • How to Read last lines from a big file with Go every 10 secs
  • reading file line by line in go

How do I read in an entire file (either line by line or the whole thing at once) into a slice so I can then go do things to the lines?

Here is some code that I have tried:

file, err := os.Open(feedFolder + value)
handleError(err)
defer file.Close()
// fileInfo, _ := file.Stat()
var linesInFile []string

r := bufio.NewReader(file)
for {
    path, err := r.ReadLine("\n") // 0x0A separator = newline

    linesInFile = append(linesInFile, path)
    if err == io.EOF {
        fmt.Printf("End Of File: %s", err)
        break
    } else if err != nil {
        handleError(err) // if you return error
    }
}
fmt.Println("Last Line: ", linesInFile[len(linesInFile)-1])

Here is something else I tried:

var fileSize int64 = fileInfo.Size()
fmt.Printf("File Size: %d\t", fileSize)
var bufferSize int64 = 1024 * 60
bytes := make([]byte, bufferSize)
var fullFile []byte
var start int64 = 0
var iterationCounter int64 = 1
var currentErr error = nil
for currentErr != io.EOF {
    _, currentErr = file.ReadAt(bytes, start)
    fullFile = append(fullFile, bytes...)
    start = (bufferSize * iterationCounter) + 1
    iterationCounter++
}
fmt.Printf("Err: %s\n", currentErr)
fmt.Printf("fullFile Size: %d\n", len(fullFile))
fmt.Printf("Start: %d", start)

var currentLine []string

for _, value := range fullFile {
    if string(value) != "\n" {
        currentLine = append(currentLine, string(value))
    } else {
        singleLine := strings.Join(currentLine, "")
        linesInFile = append(linesInFile, singleLine)
        currentLine = nil
    }
}

I am at a loss. Either I don't understand exactly how the buffer works or I don't understand something else. Thanks for reading.

Jonathan Hall
rvrtex
  • Don't read it all in at once. Steam it. Use `bufio.Scanner` (since you seem to indicate it's line based), process the line, insert into your db, *then forget that line*. – Dave C Mar 28 '15 at 03:04
  • Thank you for the response. How do I forget that line? In my attempts to use bufio.Scanner when I hit line 63700 (roughly) in my file I stop reading in new lines. My understanding is that it's because I hit the MaxScanTokenSize (http://golang.org/pkg/bufio/#pkg-constants) of the scanner. I would love to read the line, parse it, and throw it away but I don't know how to do the throw it away part so the scanner keeps moving through the whole file. – rvrtex Mar 28 '15 at 11:05
  • @DaveC Hm... Steamed buffers. – fuz Mar 28 '15 at 13:03
  • @rvrtex You forget a line by not storing a reference to the corresponding string any more. For instance, if `line` is the only variable containing the line to forget, assign something else to `line` to forget the original content. – fuz Mar 28 '15 at 13:04
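A minimal sketch of the stream-and-discard approach described in the comments above. The file name and the insertIntoDB helper are hypothetical stand-ins for the real parsing and database code:

package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
)

// insertIntoDB is a hypothetical stand-in for whatever parsing and
// database insert the real program does with each line.
func insertIntoDB(line string) error {
    _ = line // parse and insert here
    return nil
}

func main() {
    file, err := os.Open("flat.file") // hypothetical file name
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        line := scanner.Text() // only the current line is held
        if err := insertIntoDB(line); err != nil {
            log.Fatal(err)
        }
        // nothing retains a reference to line after this point, so the
        // text of previous lines can be garbage collected
    }
    // Err surfaces problems such as bufio.ErrTooLong when a single line
    // exceeds the scanner's 64K default token limit
    if err := scanner.Err(); err != nil {
        fmt.Println("scan error:", err)
    }
}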

4 Answers

8

bufio.Scanner's Scan() and Text() in a loop work perfectly for me on files of much larger size, so I suppose you have lines that exceed the buffer capacity. Then

  • check your line endings
  • and check which Go version you use: path, err := r.ReadLine("\n") // 0x0A separator = newline? It looks like func (b *bufio.Reader) ReadLine() (line []byte, isPrefix bool, err error) has the return value isPrefix specifically for your use case: http://golang.org/pkg/bufio/#Reader.ReadLine (see the sketch below)
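A minimal sketch of that isPrefix pattern, assuming a hypothetical file name and that each completed line is simply printed:

package main

import (
    "bufio"
    "fmt"
    "io"
    "log"
    "os"
)

func main() {
    file, err := os.Open("flat.file") // hypothetical file name
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    r := bufio.NewReader(file)
    var full []byte // accumulates the fragments of one long line
    for {
        chunk, isPrefix, err := r.ReadLine()
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal(err)
        }
        full = append(full, chunk...)
        if isPrefix {
            // the line did not fit in the reader's buffer;
            // keep reading until ReadLine reports a complete line
            continue
        }
        fmt.Println(string(full)) // one complete line
        full = full[:0]
    }
}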
Uvelichitel
  • This is the correct way to do this. After some refactoring to forget each line of input, as @DaveC suggested, and using `.Scan()` and `.Text()`, I ran it again and had the same problem. I then went and looked at the file I was actually running my program against and found that the file was the problem. The program was doing exactly what it should be doing, and I had bad files on my server side. Lesson learned: sometimes it is not bad programming but bad input. Thanks for your help; with it I made my program run much more efficiently. – rvrtex Mar 30 '15 at 23:23
6

It's not clear that it's necessary to read in all the lines before parsing them and inserting them into a database. Try to avoid that.

You have a small file: "a flat file that contains 339,276 lines of text, for a size of 62.1 MB." For example,

package main

import (
    "bytes"
    "fmt"
    "io"
    "io/ioutil"
)

func readLines(filename string) ([]string, error) {
    var lines []string
    file, err := ioutil.ReadFile(filename)
    if err != nil {
        return lines, err
    }
    buf := bytes.NewBuffer(file)
    for {
        line, err := buf.ReadString('\n')
        if len(line) == 0 {
            if err != nil {
                if err == io.EOF {
                    break
                }
                return lines, err
            }
        }
        lines = append(lines, line)
        if err != nil && err != io.EOF {
            return lines, err
        }
    }
    return lines, nil
}

func main() {
    // a flat file that has 339276 lines of text in it for a size of 62.1 MB
    filename := "flat.file"
    lines, err := readLines(filename)
    fmt.Println(len(lines))
    if err != nil {
        fmt.Println(err)
        return
    }
}
peterSO
2

It seems to me that this variant of readLines is shorter and faster than the one peterSO suggested:

func readLines(filename string) (map[int]string, error) {
    lines := make(map[int]string)

    data, err := ioutil.ReadFile(filename)
    if err != nil {
        return nil, err
    }

    for n, line := range strings.Split(string(data), "\n") {
        lines[n] = line
    }

    return lines, nil
}
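For reference, a minimal self-contained sketch of the same read-whole-file-then-split technique, without the map. The file name is hypothetical and the loop body is a placeholder for the parsing and database insert:

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "strings"
)

func main() {
    // read the whole 62 MB file in one call, then split it into lines
    data, err := ioutil.ReadFile("flat.file") // hypothetical file name
    if err != nil {
        log.Fatal(err)
    }
    lines := strings.Split(string(data), "\n")

    fmt.Println("lines read:", len(lines))
    for _, line := range lines {
        _ = line // parse the line and insert it into the database here
    }
}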
serghei
-1
package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
)

func main() {
    FileName := "assets/file.txt"
    file, err := os.Open(FileName)
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    scanner := bufio.NewScanner(file)

    for scanner.Scan() {
        fmt.Println(scanner.Text())
    }
    // report any error that stopped the scan (for example a line that
    // exceeds the scanner's token limit), instead of failing silently
    if err := scanner.Err(); err != nil {
        log.Fatal(err)
    }
}
Divakarcool