I am trying to load a 1 GB Pandas DataFrame on a GCP AI Platform virtual machine with a 100 GB disk and 15 GB of RAM, but I get the following error:
MemoryError: Unable to allocate 1.16 GiB for an array with shape (20, 7762852) and data type object
Do you know why the allocation fails even though the virtual machine has a 100 GB disk and 15 GB of RAM? Here is the code:
import csv
import os

import pandas as pd

# Load both CSVs entirely as strings, then concatenate them row-wise.
df_event = pd.concat(
    [
        pd.read_csv(
            os.getcwd() + '/data/lead_inscrit_train.csv.gz',
            compression='gzip',
            sep=';',
            quotechar='"',
            quoting=csv.QUOTE_ALL,
            dtype=str,
            parse_dates=["date"],
        ),
        pd.read_csv(
            os.getcwd() + '/data/lead_inscrit_test.csv.gz',
            compression='gzip',
            sep=';',
            quotechar='"',
            quoting=csv.QUOTE_ALL,
            dtype=str,
            parse_dates=["date"],
        ),
    ],
    axis=0,
)
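For reference, here is a minimal sketch of how the real in-memory footprint of one of the frames could be checked before concatenating (the 1 GB figure above is the compressed on-disk size, so the object-dtype frame in RAM may be much larger). The `df_train` name is just for illustration, and this assumes the train file alone fits in memory:

# Sketch: measure the RAM footprint of the train frame on its own.
df_train = pd.read_csv(
    os.getcwd() + '/data/lead_inscrit_train.csv.gz',
    compression='gzip',
    sep=';',
    quotechar='"',
    quoting=csv.QUOTE_ALL,
    dtype=str,
    parse_dates=["date"],
)

# Per-column memory usage in bytes, including the Python string objects.
print(df_train.memory_usage(deep=True))
print(f"total: {df_train.memory_usage(deep=True).sum() / 1024**3:.2f} GiB")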