np.transpose, np.dot and np.matmul all document their behavior when given 1d arrays. Broadcasting is also a good thing to understand.
Your two arrays are both 1d:
In [250]: x=np.array([1,2])
...: y=np.array([3,4])
In [251]: x.shape
Out[251]: (2,)
transpose does not change the number of dimensions of a 1d array; it does not add a dimension. It just reorders existing dimensions:
In [252]: y.T.shape
Out[252]: (2,)
outer explicitly says it works with 1d arrays (and ravels others):
In [254]: np.outer(x,y)
Out[254]:
array([[3, 4],
       [6, 8]])
As a long-term numpy user, I prefer to use the broadcasted multiply:
In [255]: x[:,None]*y # (n,1)*(m) => (n,1)*(1,m) => (n,m)
Out[255]:
array([[3, 4],
       [6, 8]])
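The comment in that cell can be verified directly: `np.broadcast_shapes` (available in numpy >= 1.20) reports the shape that broadcasting produces, and the result matches `np.outer`:

```python
import numpy as np

x = np.array([1, 2])
y = np.array([3, 4])

# (2,1) paired with (2,): the (2,) is padded to (1,2),
# then both size-1 dimensions expand, giving (2,2)
assert np.broadcast_shapes((2, 1), (2,)) == (2, 2)

res = x[:, None] * y
assert res.shape == (2, 2)
assert (res == np.outer(x, y)).all()
```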
outer also likens its action to np.einsum, but I'll skip that for now.
dot and matmul explicitly document what they do with 1d arrays:
In [256]: np.dot(x,y) # dot of 1d arrays is inner product
Out[256]: 11
In [257]: np.matmul(x,y) # x@y
Out[257]: 11
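For two 1d arrays, dot, matmul, the @ operator, and an explicit sum-of-products all collapse to the same inner product; a quick check:

```python
import numpy as np

x = np.array([1, 2])
y = np.array([3, 4])

# all four spellings of the inner product agree: 1*3 + 2*4 = 11
assert np.dot(x, y) == 11
assert np.matmul(x, y) == 11
assert (x @ y) == 11
assert (x * y).sum() == 11
```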
Broadcasting doesn't work with @/matmul (for the last 2 dimensions, that is):
In [260]: x[:,None]@y
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Input In [260], in <cell line: 1>()
----> 1 x[:,None]@y
ValueError: matmul: Input operand 1 has a mismatch in its core dimension 0, with gufunc signature (n?,k),(k,m?)->(n?,m?) (size 2 is different from 1)
But (n,1) with (1,m) => (n,m), with the shared size-1 dimension as the sum-of-products dimension:
In [261]: x[:,None]@y[None,:]
Out[261]:
array([[3, 4],
       [6, 8]])
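Because the summed dimension has size 1, each "sum of products" is a single product, which is why this matmul reproduces the outer product. A sketch confirming that:

```python
import numpy as np

x = np.array([1, 2])
y = np.array([3, 4])

# (2,1) @ (1,2): the size-1 inner dimension is summed over,
# but a sum over one term is just that term, so this is np.outer
res = x[:, None] @ y[None, :]
assert res.shape == (2, 2)
assert (res == np.outer(x, y)).all()
```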
In this case using the broadcasted multiply does the same thing:
In [262]: x[:,None]*y[None,:] # "column vector" * "row vector"
Out[262]:
array([[3, 4],
       [6, 8]])
Another way to control dimensions is to use np.einsum:
In [263]: np.einsum('i,i',x,y)
Out[263]: 11
In [264]: np.einsum('i,j',x,y)
Out[264]:
array([[3, 4],
       [6, 8]])
In [265]: np.einsum('i,i->i',x,y) # x*y
Out[265]: array([3, 8])
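The three einsum subscript strings above line up one-to-one with the earlier operations, which a short check makes explicit:

```python
import numpy as np

x = np.array([1, 2])
y = np.array([3, 4])

# repeated index, no output index -> summed over: inner product
assert np.einsum('i,i', x, y) == np.dot(x, y)
# distinct indices -> all pairwise products: outer product
assert (np.einsum('i,j', x, y) == np.outer(x, y)).all()
# repeated index kept in the output -> elementwise product
assert (np.einsum('i,i->i', x, y) == x * y).all()
```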
Because 1d arrays really are 1d, not row/column vectors in disguise, we may need to add a dimension here or there to match the behavior we are used to seeing in matrix-oriented languages like MATLAB.