
I would like to separate a list of 80 sets of coordinates into buckets of 8 sets each.

This is what I tried. I found the indexes where the buckets start, sliced the list of coordinates between one index and the next, and finally used an if statement to create the last bucket, since there is no 'next' index for the final one. Any ideas to improve this approach? Thank you.

nested_lst = [[0.5, 11.3, 5.1]] * 80
# indexes where each bucket starts: 0, 8, 16, ...
indexes = list(range(len(nested_lst)))[::8]
buckets = []
for i in range(len(indexes)):
    if i != len(indexes) - 1:
        bucket = nested_lst[indexes[i]:indexes[i + 1]]
    else:
        # no 'next' index for the last bucket, so slice to the end
        bucket = nested_lst[indexes[i]:]
    buckets.append(bucket)
Rea Kalampaliki

3 Answers


A simple iteration over the coordinates list should work:

nested_lst = [[0.5, 11.3, 5.1]] * 80
buckets = []

i = 0
while i < len(nested_lst):
    chunk = []
    # take up to 8 coordinate sets per bucket
    for _ in range(8):
        if i >= len(nested_lst):
            break
        chunk.append(nested_lst[i])
        i += 1
    buckets.append(chunk)
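
For completeness, the same buckets can be built with a single slicing comprehension; a minimal sketch, assuming the nested_lst from the question:

nested_lst = [[0.5, 11.3, 5.1]] * 80

# step through the list 8 items at a time and slice out each bucket;
# the final slice is simply shorter if the length is not a multiple of 8
buckets = [nested_lst[i:i + 8] for i in range(0, len(nested_lst), 8)]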

Could this work for you? Reference answer here

def batch(iterable, n=1):
    # yield successive slices of length n (the last one may be shorter)
    l = len(iterable)
    for ndx in range(0, l, n):
        yield iterable[ndx:min(ndx + n, l)]

nested_lst = [[0.5, 11.3, 5.1]] * 80
buckets = list(batch(nested_lst, n=8))

print(buckets)

The results match yours, but this might be a more efficient and better-looking way to do it.
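
If you are on Python 3.12 or newer, the standard library offers itertools.batched, which does the same thing; a minimal sketch, again assuming the nested_lst from the question:

import itertools

nested_lst = [[0.5, 11.3, 5.1]] * 80

# itertools.batched (Python 3.12+) yields tuples of up to 8 items each
buckets = [list(b) for b in itertools.batched(nested_lst, 8)]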

ALai

Use numpy! It'll be much faster and simpler

import numpy as np

coordinates = np.array([[0.5, 11.3, 5.1]] * 80)   # shape (80, 3)

# Size of each bucket
chunksize = 8

# Equally sized buckets are a must; reshape will also fail if this is not the case
assert coordinates.shape[0] % chunksize == 0

# Number of buckets
chunks = coordinates.shape[0] // chunksize

# There you go: 10 buckets of 8 coordinate sets each, shape (10, 8, 3)
res = coordinates.reshape((chunks, chunksize, 3))
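
If the number of coordinate sets were not a multiple of the bucket size, reshape would fail; a sketch of a fallback using np.array_split with explicit split indices (the 83-element array is just an illustration):

# np.array_split tolerates an uneven final bucket
uneven = np.array([[0.5, 11.3, 5.1]] * 83)                  # 83 is not divisible by 8
buckets = np.array_split(uneven, range(8, len(uneven), 8))  # 10 buckets of 8, plus one of 3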
simre
  • Why do you think this will be faster? Constructing a numpy array costs as much as iterating through the array once. – erip Apr 20 '22 at 11:59
  • If you have gone to "numpy world" once, you beat the sh*t out of python for these kinds of list manipulations... The only drawback is looping through a numpy array, but it can be avoided most of the time, and looping in python can be really slow anyway.... And you don't know the size of that thing. For small data python can be faster. But on real-world problems numpy will save you a lot of time. – simre Apr 20 '22 at 12:04
  • The "list manipulation" is a single iteration through the list, which you already incur by line 2 of your code. I promise it's not faster. If you're confident, feel free to post a microbenchmark. :-) – erip Apr 20 '22 at 12:11
  • As I said, "if you have gone to numpy world once".... If you do this kind of stuff purely in numpy (because you can do nearly anything), numpy will beat python like hell... That's what I said, and this is the sad truth, believe me or not. In this case, for this one manipulation, python can be faster, but I doubt this is the whole program, and from that array creation step onward you will beat python at every single step by a significant margin... ;) – simre Apr 20 '22 at 12:20
  • I know a lot about numpy and in many cases it is faster. This is not one of them. – erip Apr 20 '22 at 18:28