I have a problem with Python lists and NumPy arrays. I want to convert my Python lists to NumPy arrays, but the lists are so big that they do not fit in RAM all at once. Is there a method that does the conversion batch by batch?
- Does this answer your question? [Working with big data in python and numpy, not enough ram, how to save partial results on disc?](https://stackoverflow.com/questions/16149803/working-with-big-data-in-python-and-numpy-not-enough-ram-how-to-save-partial-r) – Ant Jul 14 '21 at 11:29
- What do you intend to do with the array(s)? – hpaulj Jul 14 '21 at 11:42
- I use the arrays for machine learning. The model expects NumPy arrays as its input type. – Jul 14 '21 at 18:37
- Some machine-learning packages have a way of defining inputs as batches. For home-brewed `numpy`-based learning we can't help without details. – hpaulj Jul 14 '21 at 18:47
- The model can handle batches, but the data has the type of a Python list, so first I have to convert the Python lists (the data) to NumPy arrays. – Jul 14 '21 at 19:29
1 Answer
You can use the `split` function from
Splitting a list into N parts of approximately equal length

This is the function from the link above:

```python
def split(a, n):
    # divmod gives the base chunk size k and the remainder m;
    # the first m chunks each get one extra element.
    k, m = divmod(len(a), n)
    return (a[i*k + min(i, m):(i+1)*k + min(i+1, m)] for i in range(n))
```
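For comparison, NumPy's built-in `array_split` produces the same "n parts of approximately equal length" behavior. Note, though, that it converts its whole input to an array first, so on its own it does not solve the RAM problem; this small sketch is only to show the equivalence:

```python
import numpy as np

# Split 10 elements into 3 parts: the first part gets the extra element.
chunks = np.array_split(np.arange(10), 3)
print([len(c) for c in chunks])  # chunk lengths: [4, 3, 3]
```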
This should work:

```python
import numpy as np

def split(a, n):
    k, m = divmod(len(a), n)
    return (a[i*k + min(i, m):(i+1)*k + min(i+1, m)] for i in range(n))

# Convert one chunk at a time instead of the whole list at once.
for chunk in split(yourList, n):
    print(np.array(chunk))
```

This should print out the split NumPy arrays.
Note that `split` divides the list into `n` parts of approximately equal length, so the list does not need to divide evenly by `n`.
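If the end goal is a single NumPy array that is too large to build in RAM alongside the list, one option is to write the array to disk with a memory map and fill it batch by batch. This is a sketch, not part of the answer above; `list_to_memmap`, `path`, and `batch_size` are illustrative names, and it assumes a flat list of numbers with a known dtype:

```python
import numpy as np

def list_to_memmap(data, path, dtype=np.float64, batch_size=100_000):
    """Copy a large Python list into an on-disk .npy array,
    one batch at a time, so peak RAM stays small."""
    out = np.lib.format.open_memmap(path, mode="w+",
                                    dtype=dtype, shape=(len(data),))
    for start in range(0, len(data), batch_size):
        # Only this slice of the list is materialized as an array at once.
        out[start:start + batch_size] = data[start:start + batch_size]
    out.flush()
    return out
```

The resulting file can later be reopened lazily with `np.load(path, mmap_mode="r")`, so a model can read batches from it without loading everything into memory.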
- Wait, can you mark this answer as correct so that people do not mistake this for an unsolved question? – Aug 11 '21 at 12:32