My aim is to help you get the fastest and most readable version possible. My suggestion is to first create a generator that yields the values you want, and then call the builtin max() on that generator. Performance-wise this is almost the same as embedding the generator expression inside the max() call itself: local variable lookups are faster in Python than global ones, and since the generator performs the x[0][1] indexing up front, max() only has to compare plain values.
vals = (abs(x[0][1]) for x in dCF3v)
print max(vals)
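For completeness, here is a self-contained Python 3 version of the same idea (the sample data here is made up for illustration; only the x[0][1] shape of the elements matters):

```python
# A list of single-element lists of (a, b) tuples; we want max(|b|).
dCF3v = [[(1.90689635276794, -44706.76171875)],
         [(3.14, 123.456)]]

# Generator that yields only the absolute values we care about.
vals = (abs(x[0][1]) for x in dCF3v)

print(max(vals))  # -> 44706.76171875
```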
Timing:
I timed the difference between my answer and mgilson's using the following code:
import time
dCF3v = [[(1.90689635276794, -44706.76171875)], [(1.90689635276794, -44706.76171875)], [(1.90689635276794, -44706.76171875)], [(1.90689635276794, -44706.76171875)]]
def method_inbar(l):
    vals = (abs(x[0][1]) for x in l)
    max(vals)

def method_mgilson(l):
    max(abs(x[0][1]) for x in l)

def timer(multiplier=[1, 10, 100, 1000]):
    for m in multiplier:
        print "timing the speed using multiplier: %s" % m
        now = time.time()
        for i in range(100000):
            method_inbar(dCF3v*m)
        print "inbar's method: %r" % (time.time() - now)
        now = time.time()
        for i in range(100000):
            method_mgilson(dCF3v*m)
        print "mgilson's method: %r" % (time.time() - now)

timer()
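If you want to reproduce this kind of measurement yourself, the standard library's timeit module handles the loop and the clock for you; here is a minimal Python 3 sketch (the number and data-size values are arbitrary choices, not from the original benchmark):

```python
import timeit

# One row of the test data, repeated to simulate a larger input.
setup = "dCF3v = [[(1.90689635276794, -44706.76171875)]] * 100"

# Time the generator-inside-max variant over many runs.
elapsed = timeit.timeit("max(abs(x[0][1]) for x in dCF3v)",
                        setup=setup, number=10000)
print(elapsed)  # total seconds for 10000 runs
```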
This will run the test each time on a larger set of data:
>>>
timing the speed using multiplier: 1
inbar's method: 0.18899989128112793
mgilson's method: 0.192000150680542
timing the speed using multiplier: 10
inbar's method: 0.8540000915527344
mgilson's method: 0.8229999542236328
timing the speed using multiplier: 100
inbar's method: 7.287999868392944
mgilson's method: 7.45199990272522
timing the speed using multiplier: 1000
inbar's method: 71.42099976539612
mgilson's method: 77.18499994277954
As you can see, on larger amounts of data it is faster. The only reason it is slightly slower in some runs is the time it takes to create vals; because I run the functions many, many times, that small overhead accumulates. If you run this only once, you should feel no difference on small data sets, but a noticeable one on large data sets (a few seconds at just the 1000 multiplier).