I just started a new job and have been given a Dell XPS 13 7390 laptop, and it is really struggling with large data files / processing.
I'm currently working with a 1.5 GB CSV, and I get a memory error when I try to open it with pandas in a Jupyter Notebook:
Error tokenizing data. C error: out of memory
I was sure I'd opened files like this with ease on my personal laptop, a 10-year-old MacBook, so I tested it on the same file, and it opened fine.
Why is my Dell laptop struggling despite having far more RAM available? Are there settings I could adjust to allocate more memory to Jupyter Notebooks? What tests could I run to look into this further?
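One thing I thought of checking myself is whether the Python install on the Dell is 32-bit, since I gather a 32-bit process can't use all the installed RAM. This is just a sketch of that check (standard library only; the 32-bit theory is my own guess, not something I've confirmed):

```python
import platform
import sys

# A 32-bit Python process can only address roughly 2-4 GB,
# which could cause an out-of-memory error no matter how much
# physical RAM the machine has.
print(platform.architecture()[0])  # '64bit' or '32bit'
print("64-bit" if sys.maxsize > 2**32 else "32-bit")
```

Would that kind of mismatch produce exactly this pandas error?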
Hardware details below. The obvious difference is processor speed - does that explain it?
Dell laptop:
RAM: 16 GB
Processor: Intel Core i7-10510U CPU @ 1.80 GHz
MacBook:
RAM: 4 GB
Processor: 2.7 GHz Intel Core i7
Code used to open the file:
import pandas as pd
data = pd.read_csv('data.csv')
data.shape
On the MacBook, data.shape returns (2250493, 218).