Read a CSV file and convert it to an array in Node.js


#1
var fs = require('fs');
var csv = require('fast-csv');
var list = [];
const data = "";
var read = fs.createReadStream('my.csv')
    .pipe(csv())
    .on('data', function(data){
        list.push(data[1]); // I already tried to get the data out here, but it never leaves this function
        console.log(data);
    })
    .on('end', function(){
        console.log('Read finished');
    });

console.log("test"); // why does this run first?
console.log(data); // empty
for (var i = 0; i < list.length; i++){
    console.log(list[i]); // list is empty. Why?
}

I want to get the data out of the variable in .on('data', function(data){}) and save it into list. Can you explain why that doesn't work, and why console.log("test") runs first? Can you edit my code and show me how to solve the problem?

link my.csv:https://ufile.io/p50et

input: my.csv
output: list ["0xdD7Af25bCC73f6583136858c687C44aD35aad3bC","0xfa634311F614A24e4Ef9fe82699800054Af291e4"]


#2

If console.log("test") is running, then your program is doing something.
If it is running first, then it must be doing at least two somethings.

What output are you seeing on the console? Are the contents of your CSV file displayed as an array (or arrays)?

What are you expecting the program to do or wanting it to do?


#3

Sorry, my question wasn't clear. I have edited my post.


#4

Hi,

This part of your program runs asynchronously, which means the rest of your program continues while fast-csv is busy retrieving the data. That is why console.log("test") runs first. It is also why console.log(data) is empty: that line runs before the callback has finished retrieving the data.

.on('data', function(data){
    list.push(data[1]); // I already tried to get the data out here, but it never leaves this function
    console.log(data);
})

Also, by the time your loop runs, nothing has made it into your list array yet. You probably don't really need it anyway, because data is already an array.
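For illustration, here is roughly the shape of what each 'data' callback receives (the row values here are made up, and fast-csv's real parser also handles quoting, which a plain split does not):

```javascript
// One 'data' event delivers one parsed row of the CSV, as an array of strings.
const line = '0xdD7A,0xfa63';   // hypothetical row from the file
const data = line.split(',');   // fast-csv does this parsing for you

console.log(data);    // [ '0xdD7A', '0xfa63' ]
console.log(data[1]); // '0xfa63' – the second column, what list.push(data[1]) grabs
```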

Here is some example code that should display your CSV file. In this example I did assign your data array to list.

var fs = require('fs');
var csv = require('fast-csv');
var list = [];

var read = fs.createReadStream('my.csv')
    .pipe(csv())
    .on('data', function(data){ // this function runs once for each parsed row
        console.log(data);      // see, data is already an array
        list = data;            // so you might not need to do this

        for (let i = 0; i < list.length; i++){
            console.log(list[i]);
        }
    })
    .on('end', function(){
        console.log('Read finished');
    });

console.log("test");

To fully understand this, it would be best to read up on JavaScript asynchronous programming.
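The ordering described above can be seen in a tiny self-contained sketch, with setTimeout standing in for the stream's 'data'/'end' callbacks (no CSV library needed):

```javascript
const order = [];

setTimeout(function () {  // stands in for the asynchronous stream callback
  order.push('callback');
  console.log(order);     // [ 'sync', 'callback' ]
}, 0);

order.push('sync');       // the rest of the program keeps running first
console.log(order);       // [ 'sync' ]
```

The synchronous code after the setTimeout call runs to completion before the callback ever fires, which is exactly why console.log("test") prints before any row data.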


#5

Well, I know this question was asked a long time ago, but I only recently got to work with CSV files while creating an API with Node.js. Being a typical programmer, I googled something like "Reading from a file with fast-csv and writing into an array", and to date there isn't a proper answer to the question, so I decided to write one.
The .on handlers are asynchronous, so execution continues in the main flow while the file is being read; wrapping the stream in a Promise lets the caller wait until the 'end' event has fired:

var querryParameter = () => new Promise((resolve) => {
    let returnLit = [];
    csv.fromPath("<fileName>", { headers: true })
        .on('data', (data) => {
            returnLit.push(data[<header name>].trim());
        })
        .on('end', () => {
            resolve(returnLit); // resolve with the array we built up from the rows
        });
});

var mainList = [];
querryParameter().then((res) => mainList = res);

If you want to validate something, pass an argument into querryParameter() and use that argument in your validate method.