I'm writing a Python program that performs some processing on data for a couple of million (N) users. The output of the process is several 1D arrays per user (the process is applied to each user independently). I need get and set functions for each user's output data.
I see two options for implementing this:

1. Create a single class whose attributes are arrays of shape N by column_size, so one object holding a few big arrays.
2. Create a class representing one user and store N instances of it in a list, so a list of N small objects.
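To make the comparison concrete, here is a minimal sketch of the two options, assuming NumPy arrays and hypothetical names (`AllUsers`, `User`, `result`) that I made up for illustration:

```python
import numpy as np

N = 1000          # number of users (millions in the real program)
COL = 10          # length of each per-user 1D array

# Option 1: one object holding a big 2D array, one row per user
class AllUsers:
    def __init__(self, n_users, col_size):
        self.result = np.zeros((n_users, col_size))

    def get(self, user_id):
        # returns a view of that user's row
        return self.result[user_id]

    def set(self, user_id, values):
        self.result[user_id] = values

# Option 2: one small object per user, stored in a list
class User:
    def __init__(self, col_size):
        self.result = np.zeros(col_size)

    def get(self):
        return self.result

    def set(self, values):
        self.result[:] = values

store = AllUsers(N, COL)
store.set(3, np.arange(COL))

users = [User(COL) for _ in range(N)]
users[3].set(np.arange(COL))
```

Both versions expose the same get/set behavior; the difference I'm asking about is how they compare at N in the millions.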
My question is: what are the pros and cons of each approach in terms of speed and memory consumption?