
I'm wondering how to get functionality similar to numpy.einsum in Julia.

Specifically, I have a 3rd order tensor that I'm looking to multiply by a 2nd order tensor (matrix), contracting over both shared dimensions to yield a 1st order tensor (vector).

Currently, I'm using PyCall so that I can use the numpy.einsum function like so:

using PyCall
@pyimport numpy as np

a = rand(5,4,3)
b = rand(5,4)

c = np.einsum("ijk,ij", a,b)
size(c) == (3,)

It feels kind of silly to rely on calling Python in order to do tensor math. I also imagine that a Julia implementation would have speed advantages. However, I haven't found a function for this in Julia, and the brute-force summation is 1-2 orders of magnitude slower. What functions can I use?
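For reference, a minimal numpy sketch (not part of the original question) of what the `"ijk,ij"` contraction computes, compared against the explicit triple loop it replaces:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.random((5, 4, 3))
b = rng.random((5, 4))

# einsum with no "->" output spec sums over every repeated index,
# leaving only k: c[k] = sum over i, j of a[i,j,k] * b[i,j]
c = np.einsum("ijk,ij", a, b)

# The explicit "brute force" loop it is equivalent to:
c_loop = np.zeros(3)
for i in range(5):
    for j in range(4):
        for k in range(3):
            c_loop[k] += a[i, j, k] * b[i, j]

assert c.shape == (3,)
assert np.allclose(c, c_loop)
```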

stords
    This isn't really a function – it's an entire tensor library stuffed into a single-function interface! You can do a lot of these kinds of computations with comprehensions, but you might want to open an issue requesting this function as a feature. – StefanKarpinski Mar 20 '14 at 03:14
  • @stords - I just created a small package to do exactly this. I posted it in response to this duplicate question. http://stackoverflow.com/questions/32094338/numpy-einsum-for-julia-2?lq=1 – digbyterrell Apr 17 '16 at 18:10

2 Answers


Doesn't sum(a.*b,(1,2)) do what you want?
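A sketch of the same idea in numpy terms (an editorial illustration, not part of the original answer). Note that Julia broadcasts a 5×4 array against a 5×4×3 array by padding trailing singleton dimensions, whereas numpy aligns trailing axes, so `b` needs an explicit singleton axis there:

```python
import numpy as np

a = np.random.rand(5, 4, 3)
b = np.random.rand(5, 4)

# numpy analogue of Julia's sum(a.*b, (1,2)):
# broadcast the product over the third axis, then sum out the first two
c = (a * b[:, :, None]).sum(axis=(0, 1))

assert np.allclose(c, np.einsum("ijk,ij", a, b))
```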

tholy
  • `a.*b` throws an error because the dimensions don't match – stords Mar 21 '14 at 00:02
  • What version of Julia are you using? `a.*b` is broadcasting, and for your example it works fine on 0.3. (Probably 0.2 too.) – tholy Mar 21 '14 at 22:21
  • Ah, you're right. I quickly tested it on my Windows machine, where I have a really old version of Julia, which is why it wasn't working. This does work! It's still a little slower, though. – stords Mar 22 '14 at 07:01
  • 1
    An explicit implementation would avoid the allocation of the temporary, which for small arrays is surely the bottleneck. Since Julia has fast loops, you can write this yourself if you really need it. I'm not sure what you mean by "brute-force summation" being slow; if you weren't creating temporaries, brute force/devectorized should be the fastest approach. – tholy Mar 22 '14 at 13:13
  • The string argument to numpy's `einsum` e.g. `"j,ijl->il"` is compact and easy to read, and makes it clear what the expected shapes of the inputs are and how they will be broadcasted/multiplied. Julia's `sum` is more opaque. There is the [einsum](https://stackoverflow.com/questions/32094338/numpy-einsum-for-julia-2) package, but it only provides a macro for assignment. I'm new to Julia, but it would seem that it can't support einsum in expressions (e.g. `exp(einsum("j,ijl->il",x,y))`). So, on balance, my vote is for importing einsum from numpy. – MRule Aug 28 '20 at 15:12
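To make the comment above concrete, a small numpy sketch (editorial, not from the original page) of the `"j,ijl->il"` signature it mentions, contracting the shared index j:

```python
import numpy as np

x = np.random.rand(4)        # indexed by j
y = np.random.rand(2, 4, 3)  # indexed by i, j, l

# r[i,l] = sum over j of x[j] * y[i,j,l]
r = np.einsum("j,ijl->il", x, y)

# the same contraction via broadcasting and an axis sum
r2 = (x[None, :, None] * y).sum(axis=1)

# einsum composes inside expressions, e.g. np.exp(np.einsum("j,ijl->il", x, y))
assert r.shape == (2, 3)
assert np.allclose(r, r2)
```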

There is Tullio, a pretty flexible einsum macro. It understands array operations written in index notation: not just permutations and matrix multiplication, but also convolutions, stencils, scatter/gather, and broadcasting.
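A minimal sketch of the question's contraction with Tullio (assuming the Tullio package is installed; `@tullio` with `:=` allocates a new output array):

```julia
using Tullio

a = rand(5, 4, 3)
b = rand(5, 4)

# c[k] = sum over i, j of a[i,j,k] * b[i,j]
# i.e. the same contraction as np.einsum("ijk,ij", a, b)
@tullio c[k] := a[i, j, k] * b[i, j]

size(c)  # (3,)
```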

Andrew Hardy