251

My CSV data looks like this:

heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2
...

How do you read this data and convert it to an array like this using JavaScript?

[
    heading1: value1_1,
    heading2: value2_1,
    heading3: value3_1,
    heading4: value4_1
    heading5: value5_1
],[
    heading1: value1_2,
    heading2: value2_2,
    heading3: value3_2,
    heading4: value4_2,
    heading5: value5_2
]
....

I've tried this code, but no luck:

<script type="text/javascript">
    var allText =[];
    var allTextLines = [];
    var Lines = [];

    var txtFile = new XMLHttpRequest();
    txtFile.open("GET", "file://d:/data.txt", true);
    txtFile.onreadystatechange = function()
    {
        allText = txtFile.responseText;
        allTextLines = allText.split(/\r\n|\n/);
    };

    document.write(allTextLines);
    document.write(allText);
    document.write(txtFile);
</script>
Stephen Ostermiller
Mahesh Thumar

16 Answers

240

No need to write your own...

The jQuery-CSV library has a function called $.csv.toObjects(csv) that does the mapping automatically.

Note: The library is designed to handle any CSV data that is RFC 4180 compliant, including all of the nasty edge cases that most 'simple' solutions overlook.

Like @Blazemonger already stated, first you need to add line breaks to make the data valid CSV.

Using the following dataset:

heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2

Use the code:

var data = $.csv.toObjects(csv);

The output saved in 'data' will be:

[
  { heading1: "value1_1", heading2: "value2_1", heading3: "value3_1", heading4: "value4_1", heading5: "value5_1" },
  { heading1: "value1_2", heading2: "value2_2", heading3: "value3_2", heading4: "value4_2", heading5: "value5_2" }
]
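
To tie this to the question, here is a minimal sketch of fetching a CSV file over AJAX and mapping it to objects (it assumes jQuery and the jquery-csv plugin are loaded, and that data.csv is served over HTTP -- an XMLHttpRequest against a file:// URL, as in the question, generally won't work):

// Fetch the raw CSV text, then let jquery-csv do the heading-to-key mapping
$.ajax({
    type: "GET",
    url: "data.csv", // illustrative file name
    dataType: "text",
    success: function(csv) {
        var data = $.csv.toObjects(csv);
        console.log(data); // [{ heading1: "value1_1", ... }, { heading1: "value1_2", ... }]
    }
});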

Note: Technically, the way you wrote the key-value mapping is invalid JavaScript. The objects containing the key-value pairs should be wrapped in curly braces, not square brackets.

If you want to try it out for yourself, I suggest you take a look at the Basic Usage Demonstration under the 'toObjects()' tab.

Disclaimer: I'm the original author of jQuery-CSV.

Update:

Edited to use the dataset that the OP provided, and included a link to the demo where the data can be tested for validity.

Update2:

Due to the shuttering of Google Code, jquery-csv has moved to GitHub.

meJustAndrew
Evan Plaice
  • 4
    IOW, "toObject" is or can be thought of as "toJSON", no? And, is the colon following the call to toObjects(csv) a typo? IOW, shouldn't that be a semicolon? – B. Clay Shannon-B. Crow Raven Jul 31 '13 at 15:27
  • @ClayShannon toJSON() (ie JSON.stringify()) implies that the CSV data is parsed and outputted in string format (ie text/json). toObjects actually maps the CSV into memory as an array of objects where the column names are used as the key names. $.csv.toObjects() is the CSV equivalent to JSON.parse(). The output above is just to simplify what the data would look like in memory. – Evan Plaice Aug 01 '13 at 19:15
  • 11
    Fantastic library. FYI, parameter `csv` passed is a csv string - read the csv file as text to get the csv string. – callmekatootie Jan 17 '14 at 05:36
  • @callmekatootie Thanks and, yes input is a string. The input can come from anything from AJAX requests to butterfly wing flapping. All jquery-csv is concerned with is chomping on CSV and spitting out javascript data (or vice versa). – Evan Plaice Jan 17 '14 at 22:41
  • 4
    @Evan Plaice How to use this library for reading from a csv file? – Richa Sinha Jun 04 '15 at 13:15
  • 1
    @RichaSinha Read the file in as a text buffer via the HTML5 File API or AJAX. Then pass the string buffer into the parser. It'll spit out an array of data as a result. See the project page for examples. – Evan Plaice Sep 08 '15 at 16:39
  • @RichaSinha Also, if you need to parse a very large CSV file (ie > 10 MB) you'll need a parser capable of stream parsing. PapaParse is a great alternative to jquery-csv capable of more, including stream parsing. – Evan Plaice May 04 '16 at 15:11
  • 1
    @GreySage Understandable, jquery-csv doesn't actually require jquery. It's just a set of add-on utility functions that get attached to the jquery namespace for consistency. The original intent was to extend the jquery (pseudo monad) object model. Maybe one day if the :: operator is added to Javascript, that will become a reality, for now the project scope is frozen in maintenance mode. – Evan Plaice Apr 06 '17 at 05:51
  • Using jquery in the name was a huge benefit in attracting a user base, which in total measures somewhere in the 500k-1m downloads. The end goal was to create a parser implementation that actually supported the entire RFC spec (at the time every other alternative fell short). Now there are many alternatives, and it seems that most if not all CSV parsers for JS now follow the trend of implementing RFC compliance as a baseline. In that sense the project was a success and if I kill it tomorrow I'm happy to say that there are now better alternatives available in the JS ecosystem. – Evan Plaice Apr 06 '17 at 05:58
  • Has anyone been able to get this to work. I can't even get the $.csv to work. It says $ is not defined. – JokerMartini Feb 14 '20 at 15:25
  • You need to include Jquery CDN in your html – Vishal Jain Jul 17 '20 at 08:26
  • The link gives a 404, kindly fix it – Eagle Aug 27 '20 at 00:39
  • Hi @JokerMartini, I think Vishal Jain is right. You are missing the Jquery, you can import it from [CDNjs](https://cdnjs.com/libraries/jquery). And I provide an [example](https://stackoverflow.com/a/67513624/9935654) you can reference. – Carson May 13 '21 at 03:57
140

NOTE: I concocted this solution before I was reminded about all the "special cases" that can occur in a valid CSV file, like escaped quotes. I'm leaving my answer for those who want something quick and dirty, but I recommend Evan's answer for accuracy.


This code will work when your data.txt file is one long string of comma-separated entries, with no newlines:

data.txt:

 heading1,heading2,heading3,heading4,heading5,value1_1,...,value5_2

javascript:

$(document).ready(function() {
    $.ajax({
        type: "GET",
        url: "data.txt",
        dataType: "text",
        success: function(data) {processData(data);}
     });
});

function processData(allText) {
    var record_num = 5;  // or however many elements there are in each row
    var allTextLines = allText.split(/\r\n|\n/);
    var entries = allTextLines[0].split(',');
    var lines = [];

    var headings = entries.splice(0,record_num);
    while (entries.length>0) {
        var tarr = [];
        for (var j=0; j<record_num; j++) {
            tarr.push(headings[j]+":"+entries.shift());
        }
        lines.push(tarr);
    }
    // alert(lines);
}

The following code will work on a "true" CSV file with linebreaks between each set of records:

data.txt:

heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2

javascript:

$(document).ready(function() {
    $.ajax({
        type: "GET",
        url: "data.txt",
        dataType: "text",
        success: function(data) {processData(data);}
     });
});

function processData(allText) {
    var allTextLines = allText.split(/\r\n|\n/);
    var headers = allTextLines[0].split(',');
    var lines = [];

    for (var i=1; i<allTextLines.length; i++) {
        var data = allTextLines[i].split(',');
        if (data.length == headers.length) {

            var tarr = [];
            for (var j=0; j<headers.length; j++) {
                tarr.push(headers[j]+":"+data[j]);
            }
            lines.push(tarr);
        }
    }
    // alert(lines);
}

http://jsfiddle.net/mblase75/dcqxr/
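
If you would rather end up with actual objects keyed by the headers (as in the question) instead of "heading:value" strings, the inner loop of the second example can be adapted; a minimal sketch reusing the same structure:

function processData(allText) {
    var allTextLines = allText.split(/\r\n|\n/);
    var headers = allTextLines[0].split(',');
    var lines = [];

    for (var i = 1; i < allTextLines.length; i++) {
        var data = allTextLines[i].split(',');
        if (data.length == headers.length) {
            var obj = {};
            for (var j = 0; j < headers.length; j++) {
                obj[headers[j]] = data[j]; // e.g. obj.heading1 = "value1_1"
            }
            lines.push(obj);
        }
    }
    return lines; // an array of plain objects keyed by the header names
}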

Community
Blazemonger
  • 4
    By the way, this assumes that the CSV file does in fact have multiple rows -- that's what the `allText.split(/\r\n|\n/)` splits on. If all your data is in fact one long string of comma-separated data with no newlines, it's not a real CSV file. – Blazemonger Sep 15 '11 at 13:28
  • 2
    Hi, I've used this code, but there is no output -- just a blank alert is displayed. My file looks like: heading1,heading2,heading3,heading4,heading5,value1_1,value2_1,value3_1,value4_1,value5_1,value1_2,value2_2,value3_2,value4_2,value5_2. Both csv.html and data.txt are in the same folder. – Mahesh Thumar Sep 15 '11 at 13:29
  • If this is not the correct file (or data), then what should my file look like? – Mahesh Thumar Sep 15 '11 at 13:36
  • What do you really mean by the word "true CSV"? – Mahesh Thumar Sep 15 '11 at 13:40
  • 8
    The code may not handle all valid IETF standard CSV files, and may fail if there are strings which have embedded commas, line breaks or double quotes. For instance, `1, "IETF allows ""quotes"", commas and \nline breaks"` which is allowed since the string is surrounded with double quotes, and the double quotes are escaped. – prototype Apr 12 '12 at 01:39
  • You forgot to remove quote delimiters on quoted values, unescape special characters, remove excess whitespace, among other things. The IETF spec for CSV is more complicated than most people assume. You're right stating that entries should be separated by a (unquoted) newline char but otherwise this solution is lacking. – Evan Plaice Sep 05 '12 at 20:40
  • True. However, for the OP's simpler CSV, the above code does the job. – Blazemonger Sep 06 '12 at 12:57
  • 1
    I was trying to read a .csv file from a mac. I was only able to get this script to recognize newline characters when I changed the first split to this `var allTextLines = allText.split("\r");` After that it worked great! Thanks! – Joe Aug 14 '15 at 12:50
  • @Joe I'd suggest keeping it `var allTextLines = allText.split(/\r\n|\n|\r/);` to continue leveraging the regex functionality and not a predefined string that will only work with Mac-format line returns. – Edwin Jun 28 '16 at 21:13
  • @Blazemonger what if instead of creating an array for each row - if i wanted to create an array based on column so if i have 6 columns with 40 rows. Instead of 40 arrays with 6 arrays within... i want 6 arrays with an array of 40. – bluePearl Apr 26 '19 at 22:38
  • @Blazemonger Can specify in the answer that everyone should use this on localhost? Since I struggled with it so, I don't others do the same. Cheers. – Utkarsh May 28 '20 at 17:18
  • How do you add this part of code in a website? As `` or? How can you re-run this function in intervals? – FotisK Apr 15 '21 at 19:17
102

Don't split on commas -- it won't work for most CSV files, and this question has wayyyy too many views for the asker's kind of input data to apply to everyone. Parsing CSV is kind of scary since there's no truly official standard, and lots of delimited text writers don't consider edge cases.

This question is old, but I believe there's a better solution now that Papa Parse is available. It's a library I wrote, with help from contributors, that parses CSV text or files. It's the only JS library I know of that supports files gigabytes in size. It also handles malformed input gracefully.

A 1 GB file parsed in 1 minute.

(Update: With Papa Parse 4, the same file took only about 30 seconds in Firefox. Papa Parse 4 is now the fastest known CSV parser for the browser.)

Parsing text is very easy:

var data = Papa.parse(csvString);

Parsing files is also easy:

Papa.parse(file, {
    complete: function(results) {
        console.log(results);
    }
});

Streaming files is similar (here's an example that streams a remote file):

Papa.parse("http://example.com/bigfoo.csv", {
    download: true,
    step: function(row) {
        console.log("Row:", row.data);
    },
    complete: function() {
        console.log("All done!");
    }
});

If your web page locks up during parsing, Papa can use web workers to keep your web site reactive.

Papa can auto-detect delimiters and match values up with header columns, if a header row is present. It can also turn numeric values into actual number types. It appropriately parses line breaks and quotes and other weird situations, and even handles malformed input as robustly as possible. I've drawn on inspiration from existing libraries to make Papa, so props to other JS implementations.
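
To get objects keyed by the header row (the shape the question asks for), a minimal sketch using the header option (csvString is assumed to hold the file contents):

var results = Papa.parse(csvString, { header: true });
// results.data is an array of objects keyed by the header row, e.g.
// [{ heading1: "value1_1", heading2: "value2_1", ... }, ...]
console.log(results.data);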

Matt
  • +1 Good job on Papa Parse. I'd like to study it in detail someday to see how you handled large files and streaming. I'm very happy to see other developers writing fully featured parsers that pick up where jquery-csv left off. – Evan Plaice Jun 06 '15 at 01:37
  • 3
    @EvanPlaice Thanks. You might like this presentation I gave last night at a local meetup: https://docs.google.com/presentation/d/1bmK96ETMtUHG3LFU2sN05ztrdLDq-5WJwJOJttQ515Y/edit?usp=sharing – Matt Jun 06 '15 at 01:40
  • 1
    @Matt That was an awesome presentation; it describes Papa Parse in a more understandable way – siva Aug 25 '15 at 10:13
  • @Matt Interesting presentation, I wish I could have seen it in person. I have so many questions. The main one being, considering the memory limitations what would you do with the results after parsing a 1GB CSV file. Since you can't display it all in the client (ie due to memory constraints), do you stream it to a different format, filter/display a subset of the data, etc? – Evan Plaice Sep 08 '15 at 16:50
  • @EvanPlaice The answer to that depends on your application. The parser will get the data into memory for you, and it's up to you whether you keep it there. I know it's a long shot for you in SD, but I'm actually speaking about this in a couple weeks in Salt Lake City at [UtahJS Conf](http://conf.utahjs.com/schedule), and I'll cover at least one way to deal with the loaded data. – Matt Sep 08 '15 at 20:53
  • @Matt Too bad, I wish I could make it. The original use case I had for writing jquery-csv was to create a client-side application where the user could load a CSV file with data to be edited and batch imported into a database. I didn't really run into memory limitation issues until users posted complaints of Chrome throwing 'Aw Snap' messages. – Evan Plaice Sep 09 '15 at 01:27
  • The big downside with papa parse is that you can't point to local folders : https://threejs.org/docs/#manual/introduction/How-to-run-thing-locally – stallingOne Jul 19 '17 at 13:16
  • Hi @Matt! Papa Parse is fast and impressive! However, I am having trouble with parsing .csv files that came from MS Excel. It is formatted this way: `column1,column2 (enter) valueA1,valueA2 (enter) valueB1, valueB2(enter).....` Can Papa Parse parse that string? – Malcolm Salvador Jul 31 '17 at 08:07
  • 1
    @Malky.Kid That's not valid CSV (ie spaces in a non-delimited value are not good). MS Excel's CSV format implementation sucks. If you still have access to the source file, there should be an option to enable quote delimiters. Once you do that, your data should work with any csv parser. – Evan Plaice Jan 16 '18 at 01:55
15

I am using d3.js for parsing the CSV file. It is very easy to use. Here are the docs.

Steps:

  • npm install d3-request

Using ES6:

import { csv } from 'd3-request';
import url from 'path/to/data.csv';

csv(url, function(err, data) {
 console.log(data);
})

Please see docs for more.

Update: d3-request is deprecated; you can use d3-fetch instead.
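
With d3-fetch the call returns a promise instead of taking a callback; a minimal sketch (the import and file path are illustrative):

import { csv } from 'd3-fetch';

csv('path/to/data.csv').then(function(data) {
  // data is an array of objects keyed by the header row,
  // and the column names are also available on data.columns
  console.log(data);
});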

ZaidRehman
Bimal Grg
8

Here's a JavaScript function that parses CSV data, accounting for commas found inside quotes.

// Parse a CSV row, accounting for commas inside quotes                   
function parse(row){
  var insideQuote = false,                                             
      entries = [],                                                    
      entry = [];
  row.split('').forEach(function (character) {                         
    if(character === '"') {
      insideQuote = !insideQuote;                                      
    } else {
      if(character == "," && !insideQuote) {                           
        entries.push(entry.join(''));                                  
        entry = [];                                                    
      } else {
        entry.push(character);                                         
      }                                                                
    }                                                                  
  });
  entries.push(entry.join(''));                                        
  return entries;                                                      
}

Example use of the function to parse a CSV file that looks like this:

"foo, the column",bar
2,3
"4, the value",5

into arrays:

// csv could contain the content read from a csv file
var csv = '"foo, the column",bar\n2,3\n"4, the value",5',

    // Split the input into lines
    lines = csv.split('\n'),

    // Extract column names from the first line
    columnNamesLine = lines[0],
    columnNames = parse(columnNamesLine),

    // Extract data from subsequent lines
    dataLines = lines.slice(1),
    data = dataLines.map(parse);

// Prints ["foo, the column","bar"]
console.log(JSON.stringify(columnNames));

// Prints [["2","3"],["4, the value","5"]]
console.log(JSON.stringify(data));

Here's how you can transform the data into objects, like D3's csv parser (which is a solid third party solution):

var dataObjects = data.map(function (arr) {
  var dataObject = {};
  columnNames.forEach(function(columnName, i){
    dataObject[columnName] = arr[i];
  });
  return dataObject;
});

// Prints [{"foo":"2","bar":"3"},{"foo":"4","bar":"5"}]
console.log(JSON.stringify(dataObjects));

Here's a working fiddle of this code.

Enjoy! --Curran

curran
  • Just what I'm looking for and far simpler than what I'd written. Thanks. I made it into this function on CodePen: https://codepen.io/rgraph/pen/NWBwbWp?editors=1010 – Richard Jan 18 '23 at 20:29
6

You can use PapaParse to help. https://www.papaparse.com/

Here is a CodePen. https://codepen.io/sandro-wiggers/pen/VxrxNJ

Papa.parse(e, {
    header: true,
    before: function(file, inputElem) { console.log('Attempting to Parse...'); },
    error: function(err, file, inputElem, reason) { console.log(err); },
    complete: function(results, file) { $.PAYLOAD = results; }
});
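
If you are starting from a file input element rather than a string, a minimal sketch along the same lines (the element id is an assumption):

// Parse the selected file straight into objects keyed by the header row
document.getElementById('csv-input').addEventListener('change', function(e) {
    Papa.parse(e.target.files[0], {
        header: true,
        complete: function(results) {
            console.log(results.data);
        }
    });
});
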
Sandro Wiggers
4

If you want to solve this without using Ajax, use the FileReader() Web API.

Example implementation:

  1. Select .csv file
  2. See output

function readSingleFile(e) {
  var file = e.target.files[0];
  if (!file) {
    return;
  }

  var reader = new FileReader();
  reader.onload = function(e) {
    var contents = e.target.result;
    displayContents(contents);
    displayParsed(contents);
  };
  reader.readAsText(file);
}

function displayContents(contents) {
  var element = document.getElementById('file-content');
  element.textContent = contents;
}

function displayParsed(contents) {
  const element = document.getElementById('file-parsed');
  const json = contents.split(',');
  element.textContent = JSON.stringify(json);
}

document.getElementById('file-input').addEventListener('change', readSingleFile, false);
<input type="file" id="file-input" />

<h3>Raw contents of the file:</h3>
<pre id="file-content">No data yet.</pre>

<h3>Parsed file contents:</h3>
<pre id="file-parsed">No data yet.</pre>
Robin
3
function CSVParse(csvFile)
{
    this.rows = [];

    // Backslashes are doubled so \s, \r, and \n survive the string literal;
    // the regex is matched against the csvFile argument passed in.
    var fieldRegEx = new RegExp('(?:\\s*"((?:""|[^"])*)"\\s*|\\s*((?:""|[^",\\r\\n])*(?:""|[^"\\s,\\r\\n]))?\\s*)(,|[\\r\\n]+|$)', "g");
    var row = [];
    var currMatch = null;

    while (currMatch = fieldRegEx.exec(csvFile))
    {
        row.push([currMatch[1], currMatch[2]].join('')); // concatenate with potential nulls

        if (currMatch[3] != ',')
        {
            this.rows.push(row);
            row = [];
        }

        if (currMatch[3].length == 0)
            break;
    }
}

I like to have the regex do as much as possible. This regex treats all items as either quoted or unquoted, followed by either a column delimiter, or a row delimiter. Or the end of text.

That is why the last condition is needed -- without it, this would be an infinite loop, since the pattern can match a zero-length field (totally valid in CSV), and because $ is a zero-length assertion, the regex never advances to a non-match that would end the loop.

And FYI, I had to make the second alternative exclude quotes surrounding the value; seems like it was executing before the first alternative on my javascript engine and considering the quotes as part of the unquoted value. I won't ask -- just got it to work.
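
A usage sketch (the sample input is illustrative):

var parsed = new CSVParse('a,"b,1"\nc,d');
console.log(parsed.rows); // expected: [["a", "b,1"], ["c", "d"]]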

Gerard ONeill
  • Unfortunately I got into an infinite loop with this function. – Hauke Aug 06 '19 at 15:31
  • @Hauke -- if you could break the data down into a couple of columns and lines that still produce the infinite loop, I'd appreciate it -- it may give me insight into why I was failing before. – Gerard ONeill Aug 06 '19 at 20:03
3

This is an old question, and in 2022 there are many ways to achieve this. First, I think D3 is one of the best alternatives for data manipulation. It's open source and free to use, and it's also modular, so we can import just the fetch module.

Here is a basic example. We will use legacy mode, so I will import the entire D3 library. Now, let's call the d3.csv function and we're done. This function internally uses the Fetch API, so it can open data URLs, URLs, files, blobs, and so on.

const fileInput = document.getElementById('csv')
const outElement = document.getElementById('out')
const previewCSVData = async dataurl => {
  const d = await d3.csv(dataurl)
  console.log({
    d
  })
  outElement.textContent = d.columns
}

const readFile = e => {
  const file = fileInput.files[0]
  const reader = new FileReader()
  reader.onload = () => {
    const dataUrl = reader.result;
    previewCSVData(dataUrl)
  }
  reader.readAsDataURL(file)
}

fileInput.onchange = readFile
<script type="text/javascript" src="https://unpkg.com/d3@7.6.1/dist/d3.min.js"></script>
<div>
  <p>Select local CSV File:</p>
  <input id="csv" type="file" accept=".csv">
</div>
<pre id="out"><p>File headers will appear here</p></pre>

If we don't want to use any library, just plain JavaScript (vanilla JS), and we already have the text content of a file as data, we can implement a simple function: split the data into an array of lines, extract the first line and split it into a headers array, and treat the remaining lines as the rows to process. Then we map each line, extract its values, and build a row object by mapping each header to its corresponding value from values[index].

NOTE:

We are also going to use a little trick: arrays in JavaScript are objects, so they can also have attributes. We will define an attribute rows.headers and assign the headers to it.

const data = `heading_1,heading_2,heading_3,heading_4,heading_5
value_1_1,value_2_1,value_3_1,value_4_1,value_5_1
value_1_2,value_2_2,value_3_2,value_4_2,value_5_2
value_1_3,value_2_3,value_3_3,value_4_3,value_5_3`

const csvParser = data => {
  const text = data.split(/\r\n|\n/)
  const [first, ...lines] = text
  const headers = first.split(',')
  const rows = []
  rows.headers = headers 
  lines.map(line => {
    const values = line.split(',')
    const row = Object.fromEntries(headers.map((header, i) => [header, values[i]]))
    rows.push(row)
  })

  return rows
}

const d = csvParser(data)
// Accessing the headers attribute
const headers = d.headers
console.log({headers})
console.log({d})

Finally, let's implement a vanilla JS file loader that uses fetch and parses the CSV file.

const fetchFile = async dataURL => {
  return await fetch(dataURL).then(response => response.text())
}

const csvParser = data => {
  const text = data.split(/\r\n|\n/)
  const [first, ...lines] = text
  const headers = first.split(',')
  const rows = []
  rows.headers = headers 
  lines.map(line => {
    const values = line.split(',')
    const row = Object.fromEntries(headers.map((header, i) => [header, values[i]]))
    rows.push(row)
  })

  return rows
}

const fileInput = document.getElementById('csv')
const outElement = document.getElementById('out')
const previewCSVData = async dataURL => {
  const data = await fetchFile(dataURL)
  const d = csvParser(data)
  console.log({ d })
  outElement.textContent = d.headers
}

const readFile = e => {
  const file = fileInput.files[0]
  const reader = new FileReader()
  reader.onload = () => {
    const dataURL = reader.result;
    previewCSVData(dataURL)
  }
  reader.readAsDataURL(file)
}

fileInput.onchange = readFile
<script type="text/javascript"  src="https://unpkg.com/d3@7.6.1/dist/d3.min.js"></script>
<div>
  <p>Select local CSV File:</p>
  <input id="csv" type="file" accept=".csv">
</div>
<pre id="out"><p>File contents will appear here</p></pre>

I used this file to test it

Teocci
2

Per the accepted answer,

I got this to work by changing the 1 to a 0 here:

for (var i=1; i<allTextLines.length; i++) {

changed to

for (var i=0; i<allTextLines.length; i++) {

A file with one continuous line is computed as having an allTextLines.length of 1. So if the loop starts at 1 and only runs while the index is less than 1, it never runs. Hence the blank alert box.

Liam
Adam Grant
2
$(function() {

      $("#upload").bind("click", function() {
            var regex = /^([a-zA-Z0-9\s_\\.\-:])+(.csv|.xlsx)$/;
            if (regex.test($("#fileUpload").val().toLowerCase())) {
              if (typeof(FileReader) != "undefined") {
                var reader = new FileReader();
                reader.onload = function(e) {
                    var customers = new Array();
                    var rows = e.target.result.split("\r\n");
                    for (var i = 0; i < rows.length - 1; i++) {
                      var cells = rows[i].split(",");
                      if (cells[0] == "" || cells[0] == undefined) {
                        var s = customers[customers.length - 1];
                        s.Ord.push(cells[2]);
                      } else {
                        var dt = customers.find(x => x.Number === cells[0]);
                        if (dt == undefined) {
                          if (cells.length > 1) {
                            var customer = {};
                            customer.Number = cells[0];
                            customer.Name = cells[1];
                            customer.Ord = new Array();

                            customer.Ord.push(cells[2]);
                            customer.Point_ID = cells[3];
                            customer.Point_Name = cells[4];
                            customer.Point_Type = cells[5];
                            customer.Set_ORD = cells[6];
                            customers.push(customer);
                          }
                        } else {
                          var dtt = dt;
                          dtt.Ord.push(cells[2]);

                        }
                      }
                    }
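                    // A hedged completion of the truncated snippet; the element
                    // ids (#fileUpload, #upload) and the alert messages are assumptions.
                    console.log(customers); // customers now holds the parsed objects
                };
                reader.readAsText($("#fileUpload")[0].files[0]);
              } else {
                alert("This browser does not support the HTML5 FileReader API.");
              }
            } else {
              alert("Please upload a valid CSV file.");
            }
      });
});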
double-beep
  • While this code may solve the question, [including an explanation](//meta.stackexchange.com/q/114762) of how and why this solves the problem would really help to improve the quality of your post, and probably result in more up-votes. Remember that you are answering the question for readers in the future, not just the person asking now. Please [edit] your answer to add explanations and give an indication of what limitations and assumptions apply. [From Review](/review/late-answers/26218139) – double-beep May 24 '20 at 18:05
2

Actually, you can use a lightweight library called any-text.

  • Install the dependency:
npm i -D any-text
  • Use the reader to read files:
var reader = require('any-text');
 
reader.getText(`path-to-file`).then(function (data) {
  console.log(data);
});

or use async/await:

var reader = require('any-text');
 
const chai = require('chai');
const expect = chai.expect;
 
describe('file reader checks', () => {
  it('check csv file content', async () => {
    expect(
      await reader.getText(`${process.cwd()}/test/files/dummy.csv`)
    ).to.contains('Lorem ipsum');
  });
});
Abhinaba
1

Here is another way to read an external CSV into Javascript (using jQuery).

It's a little more long-winded, but I feel that by reading the data into arrays you can follow the process exactly, which makes for easy troubleshooting.

Might help someone else.

The data file example:

Time,data1,data2,data3
08/11/2015 07:30:16,602,0.009,321

And here is the code:

$(document).ready(function() {
 // AJAX in the data file
    $.ajax({
        type: "GET",
        url: "data.csv",
        dataType: "text",
        success: function(data) {processData(data);}
        });

    // Let's process the data from the data file
    function processData(data) {
        var lines = data.split(/\r\n|\n/);

        //Set up the data arrays
        var time = [];
        var data1 = [];
        var data2 = [];
        var data3 = [];

        var headings = lines[0].split(','); // Split up the first row to get the headings

        for (var j=1; j<lines.length; j++) {
        var values = lines[j].split(','); // Split up the comma-separated values
           // We read the key (time) column and the 1st, 2nd and 3rd data columns
           time.push(values[0]); // Read in as string
           // Recommended to read in as float, since we'll be doing some operations on this later.
           data1.push(parseFloat(values[1])); 
           data2.push(parseFloat(values[2]));
           data3.push(parseFloat(values[3]));

        }

    // For display
    var x= 0;
    console.log(headings[0]+" : "+time[x]+headings[1]+" : "+data1[x]+headings[2]+" : "+data2[x]+headings[3]+" : "+data3[x]);
    }
})

Hope this helps someone in the future!

FredFury
  • Hello from the future, so I tried this answer out and it was missing a `)` sign at line 45 so I added it, but now on line 9 it's giving me a console error `Uncaught ReferenceError: $ is not defined at index.html:9` Could you assist in this? – Lasagna Cat Apr 18 '17 at 17:13
1

A bit late, but I hope it helps someone.

Some time ago I faced a problem where the string data contained \n in between, and while reading the file it was read as separate lines.

Eg.

"Harry\nPotter","21","Gryffindor"

While reading:

Harry
Potter,21,Gryffindor

I used the csvtojson library in my Angular project to solve this problem.

You can read the CSV file as a string using the following code, then pass that string to the csvtojson library, and it will give you a list of JSON objects.

Sample Code:

const csv = require('csvtojson');
if (files && files.length > 0) {
    const file: File = files.item(0);
    const reader: FileReader = new FileReader();
    reader.readAsText(file);
    reader.onload = (e) => {
    const csvs: string = reader.result as string;
    csv({
        output: "json",
        noheader: false
    }).fromString(csvs)
        .preFileLine((fileLine, idx) => {
        //Convert csv header row to lowercase before parse csv file to json
        if (idx === 0) { return fileLine.toLowerCase() }
        return fileLine;
        })
        .then((result) => {
        // list of json in result
        });
    }
}
bhavya_karia
1

I use jquery-csv to do this.

evanplaice/jquery-csv

and I provide two examples below.

async function ReadFile(file) {
  return await file.text()
}

function removeExtraSpace(stringData) {
  stringData = stringData.replace(/,( *)/gm, ",")  // remove extra space
  stringData = stringData.replace(/^ *| *$/gm, "") // remove space on the beginning and end.
  return stringData  
}

function simpleTest() {
  let data = `Name, Age, msg
  foo, 25, hello world   
  bar, 18, "!!  !!"
  `
  data = removeExtraSpace(data)
  console.log(data)
  const options = {
      separator: ",", // default "," . (You may want to Tab "\t" or somethings.
      delimiter: '"', // default "
      headers: true // default true
  }
  // const myObj = $.csv.toObjects(data, options)
  const myObj = $.csv.toObjects(data) // If you want to use default options, then you can omit them.
  console.log(myObj)
}

window.onload = () => {
  const inputFile = document.getElementById("uploadFile")
  inputFile.onchange = () => {
    const inputValue = inputFile.value
    if (inputValue === "") {
      return
    }

    const selectedFile = document.getElementById('uploadFile').files[0]
    const promise = new Promise(resolve => {
      const fileContent = ReadFile(selectedFile)
      resolve(fileContent)
    })

    promise.then(fileContent => {
      // Use promise to wait for the file reading to finish.
      console.log(fileContent)
      fileContent = removeExtraSpace(fileContent)
      const myObj = $.csv.toObjects(fileContent)
      console.log(myObj)
    })
  }
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery-csv/1.0.11/jquery.csv.min.js"></script>

<label for="uploadFile">Demo 1</label>
<input type="file" id="uploadFile" accept=".csv"/>
<button onclick="simpleTest()">Demo 2</button>
Carson
1

With this csvToObjs function, you can transform CSV data entries into an array of objects.

function csvToObjs(string) {
    // Work on the string argument rather than an outer variable
    const lines = string.split(/\r\n|\n/);
    let [headings, ...entries] = lines;
    headings = headings.split(',');
    const objs = [];
    entries.forEach(entry => {
        const values = entry.split(',');
        objs.push(Object.fromEntries(headings.map((head, i) => [head, values[i]])));
    });
    return objs;
}


const data = `heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2`
console.log(csvToObjs(data));
XMehdi01