
I have a large amount of data on which I need to carry out some calculations, and I wonder what the most efficient way to iterate through the data is.

For now I have tried putting the data into a pandas DataFrame and using iterrows, but I wonder whether there is a faster, more efficient way.

For example, I have the following data in a DataFrame:

[screenshot of the example DataFrame]

I need to compare the "sum" column across rows and conclude whether some "event" happened at that point in time. If it did, I add the event to an event list. What besides iterrows might be helpful for doing this? A sketch of my current approach follows.
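Roughly, what I do now looks like this (a minimal runnable sketch; the real "event" condition depends on my data, so the jump-by-more-than-10 threshold below is just a placeholder):

    import pandas as pd

    # Hypothetical stand-in data; the real DataFrame has a "sum" column as in the screenshot.
    df = pd.DataFrame({
        "time": pd.date_range("2023-01-01", periods=5, freq="min"),
        "sum": [10, 12, 30, 31, 9],
    })

    events = []
    prev_sum = None
    for _, row in df.iterrows():
        # Placeholder condition: an "event" is a jump of more than 10 vs. the previous row.
        if prev_sum is not None and row["sum"] - prev_sum > 10:
            events.append(row["time"])
        prev_sum = row["sum"]

    print(events)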

  • Don't use iterrows for large amounts of data if you want speed. Use the Pandas or Numpy vectorized functionality. You would get a better answer if you gave an example of the data and the calculations you want. – user19077881 Jun 25 '23 at 14:21
  • The most effective way is not to. Never iterate through large amounts of data. Find a way to let numpy or pandas do it for you. But outside that remark, there isn't any generic recipe. It heavily depends on what the iterations are. – chrslg Jun 25 '23 at 14:24
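Following the comments, a vectorized sketch of the same placeholder logic (still assuming the hypothetical jump-by-more-than-10 condition above, not my actual event rule) avoids the Python-level loop entirely:

    import pandas as pd

    df = pd.DataFrame({
        "time": pd.date_range("2023-01-01", periods=5, freq="min"),
        "sum": [10, 12, 30, 31, 9],
    })

    # diff() computes row-to-row differences in one vectorized pass;
    # boolean indexing then selects the rows where the "event" condition holds.
    mask = df["sum"].diff() > 10
    events = df.loc[mask, "time"].tolist()

    print(events)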

0 Answers