
I have a matrix P with shape MxN and a 3d tensor T with shape KxNxR. I want to multiply P with every NxR matrix in T, resulting in a KxMxR 3d tensor.

P.dot(T).transpose(1,0,2) gives the desired result. Is there a nicer solution (i.e. one that gets rid of the transpose) to this problem? This must be quite a common operation, so I assume others have found different approaches, e.g. using tensordot (which I tried, but failed to get the desired result). Opinions/views would be highly appreciated!
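For concreteness, a minimal sketch of the shapes involved (example sizes chosen arbitrarily):

```python
import numpy as np

M, N, K, R = 5, 3, 2, 4
P = np.random.rand(M, N)     # shape (M, N)
T = np.random.rand(K, N, R)  # shape (K, N, R)

# P.dot(T) contracts P's last axis with T's second-to-last axis,
# giving shape (M, K, R); the transpose reorders it to (K, M, R).
out = P.dot(T).transpose(1, 0, 2)
print(out.shape)  # (2, 5, 4)
```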

osdf

2 Answers

scipy.tensordot(P, T, axes=[1,1]).swapaxes(0,1)
Steve Tjoa
    Ha! I stared at the result of `scipy.tensordot(P, T, axes=[1,1])` for hours yesterday, despairing over the swapped dimensions. Didn't know about `swapaxes`, thanks! – osdf Dec 21 '10 at 10:21
  • You're welcome. I also checked that swapping the axes gives the correct numerical answer, and it does. – Steve Tjoa Dec 21 '10 at 14:00
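To illustrate the check mentioned in the comments, here is a small verification sketch (using `numpy.tensordot`, which is the same function the `scipy` name above re-exports):

```python
import numpy as np

M, N, K, R = 5, 3, 2, 4
P = np.random.rand(M, N)
T = np.random.rand(K, N, R)

# tensordot contracts axis 1 of P with axis 1 of T -> shape (M, K, R);
# swapaxes(0, 1) then reorders it to the desired (K, M, R).
result = np.tensordot(P, T, axes=[1, 1]).swapaxes(0, 1)

# Same as multiplying P with each (N, R) slice of T:
expected = np.stack([P.dot(T[k]) for k in range(K)])
print(np.allclose(result, expected))  # True
```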

You could also use Einstein summation notation:

import numpy
P = numpy.random.randint(1,10,(5,3))    # shape (5, 3)
T = numpy.random.randint(1,10,(2,3,4))  # shape (2, 3, 4)

# contract P's second axis (j) with T's middle axis (j):
numpy.einsum('ij,kjl->kil', P, T)       # shape (2, 5, 4)

which should give you the same results as:

P.dot(T).transpose(1,0,2)
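A quick sanity check (a sketch, not part of the original answer) that the einsum form agrees with the dot/transpose form:

```python
import numpy as np

P = np.random.randint(1, 10, (5, 3))
T = np.random.randint(1, 10, (2, 3, 4))

a = np.einsum('ij,kjl->kil', P, T)
b = P.dot(T).transpose(1, 0, 2)
print(a.shape)            # (2, 5, 4)
print(np.allclose(a, b))  # True
```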
cwitte