
We all know that the dot product between vectors must return a scalar:

import numpy as np
a = np.array([1,2,3])
b = np.array([3,4,5])
print(a.shape) # (3,)
print(b.shape) # (3,)
a.dot(b) # 26
b.dot(a) # 26

Perfect. BUT WHY, if we use a "real" row vector or column vector (take a look at Difference between numpy.array shape (R, 1) and (R,)), does the numpy dot product return a dimension error?

arow = np.array([[1,2,3]])
brow = np.array([[3,4,5]])
print(arow.shape) # (1,3)
print(brow.shape) # (1,3)
arow.dot(brow) # ValueError: shapes (1,3) and (1,3) not aligned
brow.dot(arow) # ValueError: shapes (1,3) and (1,3) not aligned

acol = np.array([[1,2,3]]).reshape(3,1)
bcol = np.array([[3,4,5]]).reshape(3,1)
print(acol.shape) # (3,1)
print(bcol.shape) # (3,1)
acol.dot(bcol) # ValueError: shapes (3,1) and (3,1) not aligned
bcol.dot(acol) # ValueError: shapes (3,1) and (3,1) not aligned
Mad Physicist
arj
  • You forgot to post (or read?) the whole error message. It tells you which dimensions it's trying to match up. `dot`, despite the name, is a matrix product. The docs make it clear that the handling of 1d arrays is a special case. – hpaulj Jan 04 '19 at 10:20
  • If `a` and `b` are not 1d, the `dot` is: "a sum product over the last axis of `a` and the second-to-last axis of `b`". A (3,1) can work with a (1,3), but not with a (3,1). – hpaulj Jan 04 '19 at 21:34

3 Answers


Because by explicitly adding a second dimension, you are no longer working with vectors but with two-dimensional matrices. When taking the dot product of matrices, the inner dimensions of the product must match.

You therefore need to transpose one of your matrices. Which one you transpose will determine the meaning and shape of the result.

A 1x3 times a 3x1 matrix will result in a 1x1 matrix (i.e., a scalar). This is the inner product. A 3x1 times a 1x3 matrix will result in a 3x3 outer product.
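For example, using the question's arow and brow (a quick sketch; the same applies to acol and bcol):

arow.dot(brow.T)  # array([[26]]) -- (1,3) times (3,1): the 1x1 inner product
arow.T.dot(brow)  # (3,1) times (1,3): the 3x3 outer product
# array([[ 3,  4,  5],
#        [ 6,  8, 10],
#        [ 9, 12, 15]])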

Mad Physicist
  • Then should the answer at https://stackoverflow.com/questions/22053050/difference-between-numpy-array-shape-r-1-and-r be reviewed? – arj Jan 04 '19 at 08:42
  • @arj. That answer is definitely very nice, but not completely relevant to your question. It will help you understand that a 3-element array is always going to be the same thing in memory no matter what shape or number of dimensions you give it. It should also help you understand how the matrix dimensions work out for multiplications. – Mad Physicist Jan 04 '19 at 09:05
  • Your 1x1 inner is doing the same math as your 3x3 outer, `ij,jk->ik` in `einsum` notation. The `j` dimension is 3 in one case and 1 in the other. – hpaulj Jan 04 '19 at 10:32

You can also use the @ operator, which is matrix multiplication. Here, as with the dot product, you need to pay attention to the matrix sizes (the ndarrays must have compatible dimensions), but it's more readable:

>>> a = np.array([1,2,3])
>>> a.shape
(3,)
>>> b= np.array([[1,2,3]])
>>> b.shape
(1, 3)
>>> a@b
Traceback (most recent call last):
  File "<input>", line 1, in <module>
ValueError: shapes (3,) and (1,3) not aligned: 3 (dim 0) != 1 (dim 0)
>>> a@b.T
array([14])
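For completeness, with two 1-D arrays @ behaves like the plain dot product and returns a scalar (a quick sketch; c here is just a second 1-D vector, not from the question):

>>> c = np.array([3,4,5])
>>> a @ c
26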
YoniChechik

You can also do it like this:

import numpy as npy

Vector1 = npy.array([0, 2, 3])
Vector2 = npy.array([3, 5, 1])
print("Dot Product of", Vector1, "and", Vector2)

# compute the dot product with a plain Python loop
def DotProduct(a, b):
    NetValue = 0
    for i in range(len(a)):
        NetValue += a[i] * b[i]
    return NetValue

ans = DotProduct(Vector1, Vector2)
print("The answer is =", ans)  # 13
  • Thank you for your effort to contribute to SO. However, you are not answering the question. The question was why there is an error, and you're providing a workaround via a custom-made dot product. Please read the question more carefully next time. – MjH Jan 23 '22 at 10:04