
In general, log and exp functions should be roughly the same speed, and I would expect the numpy and scipy implementations to be relatively straightforward wrappers. numpy.exp() and scipy.exp() have similar speed, as expected. However, I found that numpy.log() is ~60% slower than these exp() functions, and scipy.log() is 100% slower. Does anyone know the reason for this?
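
A minimal sketch of the kind of timing comparison I mean (the array contents and size are arbitrary choices; scipy.log() and scipy.exp() can be timed the same way):

```python
import timeit

import numpy as np

# Positive values so log is well defined; the size is an arbitrary choice.
x = np.random.rand(1_000_000) + 0.1

for name, func in [("numpy.exp", np.exp), ("numpy.log", np.log)]:
    per_call = timeit.timeit(lambda: func(x), number=100) / 100
    print(f"{name}: {per_call * 1e3:.2f} ms per call")
```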

Bitwise
  • In my simple tests, `np.log` is 60% faster. Yes, list comprehensions using `math.log` and `math.exp` give the same times. But the numpy code isn't (necessarily) a simple compiled loop using the `math` equivalents. – hpaulj May 21 '15 at 16:15
  • Why do you think that log and exp should be "roughly the same speed"? – tmyklebu May 21 '15 at 17:57

1 Answer


Not sure why you think that both should be "roughly the same speed". It's true that both can be calculated using a Taylor series (which by itself means little without analyzing the error term), but then the numerical tricks kick in.

E.g., an algebraic identity can be used to transform the original exp Taylor series into a more efficient two-jump power series. For the log power series, on the other hand, see here a discussion of case-by-case optimizations, some of which involve a lookup table.
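
To make the flavor of these tricks concrete, here is a toy sketch of range reduction for both functions (illustrative only: the constants, term counts, and series choices are mine, not what any libm or numpy backend actually uses):

```python
import math

def exp_reduced(x, terms=12):
    # exp(x) = 2**k * exp(r), with k = round(x / ln 2) and r = x - k*ln 2,
    # so |r| <= ln(2)/2 and a short Taylor series for exp(r) converges fast.
    k = round(x / math.log(2))
    r = x - k * math.log(2)
    s, term = 1.0, 1.0
    for n in range(1, terms):
        term *= r / n
        s += term
    return math.ldexp(s, k)  # exact scaling by 2**k

def log_reduced(x, terms=8):
    # x = m * 2**e with m in [0.5, 1) via frexp, so log(x) = log(m) + e*ln 2.
    # log(m) comes from the atanh-style series 2*(t + t**3/3 + t**5/5 + ...)
    # with t = (m - 1)/(m + 1), which converges quickly on this reduced range.
    m, e = math.frexp(x)
    t = (m - 1.0) / (m + 1.0)
    t2, s, p = t * t, 0.0, t
    for n in range(terms):
        s += p / (2 * n + 1)
        p *= t2
    return 2.0 * s + e * math.log(2)

print(exp_reduced(5.3), math.exp(5.3))          # ~200.34 for both
print(log_reduced(123.456), math.log(123.456))  # ~4.8158 for both
```

The point is simply that the two reductions are structurally different, so there is no a priori reason for the finished implementations to cost the same.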


A fair comparison also depends on how the benchmark was run:

  • Which arguments did you give the functions: the same for both, or the worst case for each?
  • What was the accuracy of the results, and how did you measure it for each function: absolute or relative error? (A small check of this kind is sketched below.)

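For instance, a relative-error check might look like this (a sketch only; using a higher-precision numpy computation as the reference is my assumption about what counts as ground truth):

```python
import numpy as np

x = np.linspace(2.0, 100.0, 10_000)
result = np.log(x)                   # float64 result under test
# Higher-precision reference; on platforms where long double is just float64
# (e.g. Windows/MSVC) this check is vacuous.
ref = np.log(x.astype(np.longdouble))

abs_err = float(np.max(np.abs(result - ref)))
rel_err = float(np.max(np.abs((result - ref) / ref)))
print(f"max abs err {abs_err:.2e}, max rel err {rel_err:.2e}")
```
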
Edit: It should be noted that these libraries can also have different backends.

Ami Tavory