I'm working with a CSV file received from outside my organization, so I have no control over how the file gets created. The file has ~108K records in it, and I'm LOADing it into a table created specifically to receive this file. There are two problem records in the file (wrong number of commas/fields), which I discovered by executing the LOAD and seeing the errors. "Awesome," I thought, "I'll just fix those two records manually and I'll get a clean LOAD." After re-creating the table (to LOAD into a virgin table), I executed the LOAD of the "fixed" csv. This time 0 (zero) records would LOAD. All I did was find the offending records, line their fields up the way they needed to be (to match the other 107,998 records/lines), and save the file. They are the ~58,000th and ~63,000th records in the file. I made no other changes.
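In case it helps, this is roughly how I located the offending records: count the fields per line and flag any line whose count differs from the header's. A minimal sketch (the sample data here is made up; with the real file you'd pass an open file handle instead of the StringIO):

```python
import csv
import io

# Hypothetical stand-in for the real 108K-record file.
sample = "a,b,c\n1,2,3\n4,5\n6,7,8\n"

reader = csv.reader(io.StringIO(sample))
header = next(reader)
expected = len(header)

# Collect (line number, field count) for every malformed line.
bad = [(lineno, len(row))
       for lineno, row in enumerate(reader, start=2)
       if len(row) != expected]
print(bad)  # → [(3, 2)]
```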
As an experiment in my diagnostic process, I tried simply opening a copy of the original file in my editor (the Text Editor that comes standard with the Ubuntu 18.04 LTS distro) and saving it, with no changes. That file would not load either. In other words, the simple act of opening and then saving the file "ruins it" as far as the MySQL LOAD command is concerned. When I reopen the "altered" file, it opens fine and appears to be a completely normal csv file.
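My next diagnostic step was going to be comparing the raw bytes of the two files, since an editor can silently change things that don't show on screen (line terminators, a BOM). A sketch of what I had in mind, using hypothetical byte strings in place of the two real files:

```python
from collections import Counter

def ending_counts(data: bytes) -> Counter:
    """Count CRLF vs bare LF vs bare CR line terminators in raw bytes."""
    crlf = data.count(b"\r\n")
    lf = data.count(b"\n") - crlf   # LFs not preceded by CR
    cr = data.count(b"\r") - crlf   # CRs not followed by LF
    return Counter(crlf=crlf, lf=lf, cr=cr)

# Hypothetical contents; in practice: data = open(path, "rb").read()
original = b"id,name\r\n1,foo\r\n2,bar\r\n"
resaved  = b"id,name\n1,foo\n2,bar\n"

print(ending_counts(original))          # → Counter({'crlf': 3, 'lf': 0, 'cr': 0})
print(ending_counts(resaved))           # → Counter({'lf': 3, 'crlf': 0, 'cr': 0})
print(original.startswith(b"\xef\xbb\xbf"))  # UTF-8 BOM check → False
```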
In all my decades of coding and data management, I have never encountered this issue. I don't even know where to start. Clearly, the "save" is altering the file in some way that makes it unusable, but what could it possibly be?
This is my LOAD command:

LOAD DATA LOCAL INFILE '/home/[user]/myfile.csv'
INTO TABLE temp005
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;