Following the example here, I can get an array of the column indices of all occurrences of the max value in a 2D numpy array.
But now I want to do the same thing but on a sparse csr_matrix.
import numpy as np

x = np.array([[0,0,1,0,0,0,2],
              [0,0,0,4,0,0,0],
              [0,9,1,0,0,0,2],
              [0,0,1,0,0,9,2]])
max_col_inds = np.argwhere(x == np.max(x))[:,1]
# array([1, 5], dtype=int64)
Then I want to use that result to get the elements at indices 1 and 5 of a 1D array:
words[max_col_inds]
If x is a 2D numpy array and words is a 1D numpy array, this works. But if I replace x with a scipy.sparse.csr_matrix, the call to np.argwhere() raises:

TypeError: tuple indices must be integers, not tuple
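A workaround sketch I have tried (assuming the max value is nonzero, so it appears among the stored entries) is to convert to COO format, whose .data and .col attributes expose the nonzero values and their column coordinates directly:

```python
import numpy as np
from scipy.sparse import csr_matrix

x = np.array([[0,0,1,0,0,0,2],
              [0,0,0,4,0,0,0],
              [0,9,1,0,0,0,2],
              [0,0,1,0,0,9,2]])
xs = csr_matrix(x)

# COO stores explicit (row, col, value) triples, so boolean masks
# over .data line up with the .col coordinate array.
coo = xs.tocoo()
max_col_inds = np.unique(coo.col[coo.data == coo.data.max()])
# array([1, 5])
```

Is this the right approach, or is there a way to make np.argwhere() work on the sparse matrix directly?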