
I would like to read in delimited data (of unknown length) embedded in a larger .txt file. The usual approaches (np.loadtxt, np.genfromtxt, or pd.read_csv) don't seem to work, as they throw an error when encountering a bad line. You can of course handle bad lines, but I haven't found an option to simply stop and return the data imported so far.

Is there such an option that I overlooked, or do I have to go back and evaluate the file line by line?

Any suggestions would be appreciated :)

funnydman
  • Does this answer your question? [Python: read all text file lines in loop](https://stackoverflow.com/questions/17949508/python-read-all-text-file-lines-in-loop) – Fran Arenas Aug 24 '22 at 08:29
  • Please provide enough code so others can better understand or reproduce the problem. – Community Aug 24 '22 at 12:52

1 Answer


Something like this should work, though it might well be better to pre-process the file to fix whatever is causing the issue instead of only reading in data up to that point.

import csv

with open('try.csv', newline='') as csvfile:
    rows = []
    reader = csv.reader(csvfile)

    try:
        for row in reader:
            rows.append(row)

    # You should change Exception to be more specific
    except Exception as e:
        print("Caught", e)

    # These are the rows that could be read
    print(rows)
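Since the delimited block is embedded in a larger file, another option is to stop at the first row whose field count no longer matches the header, rather than waiting for an exception. This is only a sketch under assumptions: the sample data, and the check that every data row must have as many fields as the header, are illustrative.

```python
import csv
import io

# Illustrative sample: two valid data rows, then non-delimited text.
data = "a,b\n1,2\n3,4\nthis line is not delimited data\n5,6\n"

rows = []
reader = csv.reader(io.StringIO(data))  # in practice: open('try.csv', newline='')
header = next(reader)

for row in reader:
    # Stop as soon as a row no longer matches the header's width
    # (assumed criterion for "bad line" -- adapt to your format).
    if len(row) != len(header):
        break
    rows.append(row)

print(header)  # ['a', 'b']
print(rows)    # [['1', '2'], ['3', '4']]
```

This avoids relying on the parser raising an error at all, which matters because csv.reader happily parses many malformed lines without complaint.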
FiddleStix
  • Thx for the suggestion! However, performance wise I'm wondering whether it is actually worse to handle the exception than to simply write my own line by line reader... And it's not an issue, it's done by design :( – SpinOrbiter Aug 24 '22 at 09:20