
I am reading data line by line from an Excel file (which is actually a comma-separated CSV file); this file gets sent by an external entity. Among the columns to be read is the time, which is in 00.00 format, so a split method is used to read the different columns. However, the file sometimes comes with extra columns (extra commas between the elements), so the split elements are not always correct. Below is the code used to read and split the different columns; these elements will be saved in the database.

public void SaveFineDetails()
    {
        List<string> erroredFines = new List<string>();
        try
        {
            log.Debug("Start : SaveFineDetails() - Saving Downloaded files fines..");
            if (!this.FileLines.Any())
            {
                log.Info("End : SaveFineDetails() - DataFile was Empty");
                return;
            }

            using (RAC_TrafficFinesContext db = new RAC_TrafficFinesContext())
            {
                this.FileLines.RemoveAt(0);
                this.FileLines.RemoveAt(FileLines.Count - 1);

                int itemCnt = 0;
                int errorCnt = 0;
                int duplicateCnt = 0;
                int count = 0;

                foreach (var line in this.FileLines)
                {
                    count++;
                    log.DebugFormat("Inserting {0} of {1} Fines..", count, FileLines.Count);

                    string[] bits = line.Split(',');
                    int bitsLength = bits.Length;
                    if (bitsLength == 9)
                    {
                        string fineNumber = bits[0].Trim();
                        string vehicleRegistration = bits[1];
                        string offenceDateString = bits[2];
                        string offenceTimeString = bits[3];
                        int trafficDepartmentId = this.TrafficDepartments.Where(tf => tf.DepartmentName.Trim().Equals(bits[4], StringComparison.InvariantCultureIgnoreCase)).Select(tf => tf.DepartmentID).FirstOrDefault();
                        string proxy = bits[5];
                        decimal fineAmount = GetFineAmount(bits[6]);
                        DateTime fineCreatedDate = DateTime.Now;
                        DateTime offenceDate = GetOffenceDate(offenceDateString, offenceTimeString);
                        string username = Constants.CancomFTPServiceUser;
                        bool isAartoFine = bits[7] == "1";
                        string fineStatus = "Sent";

                        try
                        {
                            var dupCheck = db.GetTrafficFineByNumber(fineNumber);
                            if (dupCheck != null)
                            {
                                duplicateCnt++;
                                string ExportFileName = (base.FileName == null) ? string.Empty : base.FileName;
                                DateTime FileDate = DateTime.Now;
                                db.CreateDuplicateFine(ExportFileName, FileDate, fineNumber);
                            }
                            else
                            {
                                var adminFee = db.GetAdminFee();
                                db.UploadFTPFineData(fineNumber, fineAmount, vehicleRegistration, offenceDate, offenceDateString, offenceTimeString, trafficDepartmentId, proxy, false, "Imported", username, adminFee, isAartoFine, dupCheck != null, fineStatus);
                            }

                            itemCnt++;
                        }
                        catch
                        {
                            errorCnt++;
                        }
                    }
                    else
                    {
                        erroredFines.Add(line);
                        continue;
                    }

                }
            }
        }
        catch (Exception ex)
        {
            log.Error("SaveFineDetails() failed.", ex);
        }
    }

Now the problem is that this file doesn't always come with the 9 elements we expect; for example, in the sample data below, the lines are not all the same length (ignore the first line, it's headers).

On the first line, FM is supposed to be part of 36DXGP instead of being two separate elements. This means there is now an extra column. This brings us to the issue at hand, which is the time element: because of the extra comma, the time is shifted and is now read as 20161216, so the split on the time element is not working at all. So what I did was read the incorrect line, check its length, and if the length is not 9, add it to the error list and continue.

But my continue keyword doesn't seem to work: it gets into the else part and then goes back to read the very same error line.

I have checked the answers on Break vs Continue and they provide good examples of how continue works. I introduced the else because the format in those examples did not work for me (the else did not make any difference either). Here is the sample data,

NOTE the first line to be read starts with 96

 H,1789,,,,,,,,
96/17259/801/035415,FM,36DXGP,20161216,17.39,city hall-cape town,Makofane,200,0,0
MA/80/034808/730,CA230721,20170117,17.43,malmesbury,PATEL,200,0,0,

What is it that I am doing so wrong here?
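What the debugger is showing can be reproduced with a minimal sketch (the line contents are taken from the sample data above): both data lines split into 10 elements, so neither line ever enters the `bitsLength == 9` branch, and both fall through to the else.

```csharp
using System;

class SplitDemo
{
    static void Main()
    {
        // Both sample lines from the question.
        string line1 = "96/17259/801/035415,FM,36DXGP,20161216,17.39,city hall-cape town,Makofane,200,0,0";
        string line2 = "MA/80/034808/730,CA230721,20170117,17.43,malmesbury,PATEL,200,0,0,";

        // Line 1 has a genuine extra column (FM split off from 36DXGP): 10 fields.
        Console.WriteLine(line1.Split(',').Length); // 10

        // Line 2 looks correct, but it ends with a trailing comma, so Split
        // produces a 10th, empty element instead of the expected 9.
        Console.WriteLine(line2.Split(',').Length); // 10
    }
}
```

So the continue itself works; the length check simply never passes for either line, which is why the loop appears to skip everything.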

PaulF
Mronzer
  • Please provide a [mcve] — it's really hard to understand what you're trying to achieve and what your code is supposed to do. – Felix Av Jul 27 '17 at 10:36
  • It looks like your code should be OK. Adding a sample of the data, rather than a screenshot would be far more useful - then we could do a cut&paste to write a bit of code to test. – PaulF Jul 27 '17 at 10:43
  • Have you tried stepping through the code with a debugger to see what may be happening? – PaulF Jul 27 '17 at 10:44
  • Nothing. You do not need the continue; the else does the same thing. The line with FM I do not think is an error, I think it indicates a different format. The right thing to do is to check if(bits[1] == "FM") and treat the data differently. – jdweng Jul 27 '17 at 11:04
  • @PaulF I have added sample data, removed some unnecessary information, and did some grammar fixes. And yes I did: when I step through the code, it just goes back to the start of the loop, but instead of reading the second line, it reads the very same first line. – Mronzer Jul 27 '17 at 11:10
  • Looking at your code - as there is no additional processing after the end of the else statement - then continue is not necessary - have you tried removing that statement to see what happens. – PaulF Jul 27 '17 at 11:25
  • @PaulF when I stepped in and looked closely at the code, it looks like the length is the issue here, it always comes back as 10 which is fine for first line but then it should be 9 for the second line, should it not? – Mronzer Jul 27 '17 at 11:34
  • You have a trailing comma on the end of the second line which causes that line to return 10. – PaulF Jul 27 '17 at 11:40
  • @PaulF yes, I have realized that. I am now trying to figure out how to ignore the empty element (the space after the trailing comma), because now the only difference between the two lines is that the first one has 0 as data whereas the second line has "" (empty string) as its data. – Mronzer Jul 27 '17 at 11:45
  • It is not the only difference - as you state in your question the date & time values are not in the same columns - so just checking the last data item may not resolve the issue. – PaulF Jul 27 '17 at 11:50

2 Answers


I have found a way to solve my problem. There was an issue with the length of the line because of the trailing comma, which caused an empty element. I got rid of this empty element with the following code and determined the new length:

bits = bits.Where(x => !string.IsNullOrEmpty(x)).ToArray();
int length = bits.Length;

All is well now
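One caveat worth noting, sketched below under the assumption that a data line could legitimately contain an empty middle column: the `Where` filter drops every empty element, not just the one created by the trailing comma. If only the trailing comma is the problem, trimming it first is a narrower fix.

```csharp
using System;
using System.Linq;

class TrailingCommaDemo
{
    static void Main()
    {
        string line = "MA/80/034808/730,CA230721,20170117,17.43,malmesbury,PATEL,200,0,0,";

        // The answer's approach: drop every empty element after splitting.
        string[] filtered = line.Split(',')
                                .Where(x => !string.IsNullOrEmpty(x))
                                .ToArray();
        Console.WriteLine(filtered.Length); // 9

        // A narrower alternative: strip only trailing commas before splitting,
        // so legitimately empty columns in the middle of a line survive.
        string[] trimmed = line.TrimEnd(',').Split(',');
        Console.WriteLine(trimmed.Length); // 9
    }
}
```

Both approaches give 9 fields for this line; they only differ when a line has empty fields somewhere other than the end.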

Mronzer
  • "suggest you use the following overload" — you might have forgotten to include the actual overload code (method) you're referring to here :) – Mronzer Jul 28 '17 at 06:13

I suggest you use the following overload for performance and readability reasons:

line.Split(new char[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
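Applied to the question's sample line, this overload has the same effect as filtering afterwards: the empty element produced by the trailing comma is discarded at split time, so the expected 9 fields come back directly.

```csharp
using System;

class OverloadDemo
{
    static void Main()
    {
        string line = "MA/80/034808/730,CA230721,20170117,17.43,malmesbury,PATEL,200,0,0,";

        // RemoveEmptyEntries discards the empty element created by the
        // trailing comma in a single Split call.
        string[] bits = line.Split(new char[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
        Console.WriteLine(bits.Length); // 9
    }
}
```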
dnickless