
I am using a Jupyter Notebook to build a machine learning model with Turi Create. Whenever I upload a big .csv file, I get the message "kernel died". Since I am new to Python: is there another way to batch-load the file, or does anyone know how to fix this issue?

The CSV file is 52.9 MB.

Thanks

P S

1 Answer


53 MB is not a big file! You should try loading it in an IPython terminal to test.

Load it the way you would in Jupyter and see whether you hit the same issue. If there is no issue, the problem could be a bad Jupyter installation.

Note: the kernel mostly dies when you run out of RAM. But 53 MB is not that big; assuming you have at least 2 or 4 GB of RAM on your laptop, an error like this shouldn't happen.
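If memory really were the problem, one common workaround is to read the CSV in chunks instead of all at once. Here is a minimal sketch using pandas (the in-memory CSV string is a placeholder; in practice you would pass the path to your real file):

```python
import io
import pandas as pd

# Placeholder standing in for the real .csv file on disk.
csv_data = io.StringIO("a,b\n1,2\n3,4\n5,6\n")

# Read the file in fixed-size chunks so the parser never holds the
# whole file in memory at once, then combine the pieces at the end.
chunks = [chunk for chunk in pd.read_csv(csv_data, chunksize=2)]
df = pd.concat(chunks, ignore_index=True)
print(df.shape)  # (3, 2)
```

With a real 53 MB file you would use a larger `chunksize` (e.g. 10,000 rows) and could filter or aggregate each chunk before concatenating, which keeps peak memory low.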

LaSul
  • Hi @Alexandre Sullet. I read about that, but I use a 2018 16 GB MacBook Pro, so I guess it's not a RAM issue. – P S Dec 07 '18 at 14:44
  • Did you try to load it into an _ipython_ terminal? – LaSul Dec 07 '18 at 15:29
  • No, I am completely new to Python and I am not sure how to do all those steps. But I will try sooner or later, since I am facing the same issue with the same JSON file as well. – P S Dec 10 '18 at 09:29
  • Open your terminal and type "ipython". The Python interpreter will load. Then load your CSV as you would in a Jupyter notebook, e.g. `import pandas as pd` followed by `pd.read_csv(YOUR_DATA, sep=YOUR_SEP)`, and see if there is any issue. You can also replace the last line with `pd.read_csv(YOUR_DATA, sep=YOUR_SEP, nrows=10)` to see if it loads correctly for the first 10 lines. If not, your file might be corrupted. – LaSul Dec 10 '18 at 09:51
  • Hi Alexander, I posted a more detailed question, since the one above is not very detailed. Here is the link: https://stackoverflow.com/questions/53702855/the-kernel-appears-to-have-died-always-when-i-am-loading-a-large-file – P S Dec 10 '18 at 09:58
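The sanity check described in the comments above can be sketched like this (an inline string stands in for the real file path, and the column names are made up for illustration):

```python
import io
import pandas as pd

# Placeholder standing in for the path to the real .csv file.
csv_data = io.StringIO("x,y\n1,2\n3,4\n")

# Load only the first few rows as a quick check that the file parses
# at all; if even this fails, the file itself is likely the problem.
sample = pd.read_csv(csv_data, sep=",", nrows=10)
print(sample.shape)  # (2, 2)
```

If the `nrows` read succeeds but the full read still kills the kernel, that points at memory or the environment rather than a corrupted file.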