
I want to convert a vector of indices into a matrix with ones in the columns of the indices.

x = [2;1;3;1];
m = someFunc(x,3)
% m =
%
%   0   1   0
%   1   0   0
%   0   0   1
%   1   0   0
Jacob Eggers
  • possible duplicate of [How can I change the values of multiple points in a matrix?](http://stackoverflow.com/questions/6850368/how-can-i-change-the-values-of-multiple-points-in-a-matrix) – gnovice Mar 30 '12 at 17:26

4 Answers


I tested the sub2ind function, but on the Coursera Machine Learning forum I was pointed to this beauty.

m = eye(num_cols)(x,:);

It indexes the identity matrix by row: for each value in x, it picks the corresponding row of eye(num_cols), and that row has its single 1 in exactly that column.
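For example (note that chained indexing like `eye(n)(x,:)` is Octave syntax; in MATLAB, assign the identity matrix to a variable first, as below):

```octave
x = [2;1;3;1];
num_cols = 3;
I = eye(num_cols);   % 3x3 identity matrix
m = I(x,:)           % row x(i) of I has its 1 in column x(i)
% m =
%
%   0   1   0
%   1   0   0
%   0   0   1
%   1   0   0
```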

Hans Then
  • I don't understand the syntax for this – Dan May 09 '19 at 13:03
  • Play with it. Try with a small identity matrix and first run single values instead of a vector x. E.g. `eye(4)(3,:)` This will take the third row of the identity matrix. Then try with a vector `eye(4)([2,3,1], :)`. This will take the second, third and first row of the identity matrix. – Hans Then May 17 '19 at 11:48

One way is to use the SUB2IND function:

colN = 3;
assert(max(x)<=colN,'Not enough columns') %# check that you have enough columns
%# other checks that x is valid indices

m = zeros(numel(x),colN);
m(sub2ind(size(m),1:numel(x),x')) = 1;
yuk

I had a very similar question, so I didn't want to open a new one. I wanted to convert a row vector of indices into a matrix with ones in the rows (instead of columns) of the indices. I could have used the previous answer and transposed the result, but I thought this would perform better with very large matrices.

octave> x = [2 1 3 1];
octave> m = setRowsToOne(x, 3)
m =

   0   1   0   1
   1   0   0   0
   0   0   1   0

I couldn't see how to use sub2ind to accomplish this, so I calculated it myself.

function matrixResult = setRowsToOne(indexOfRows, minimumNumberOfRows)
   assert(all(indexOfRows > 0), 'Indices must be positive.');
   numRows = max([indexOfRows minimumNumberOfRows]);
   numCols = columns(indexOfRows);   %# Octave-only; numel(indexOfRows) also works
   matrixResult = zeros(numRows, numCols);
   matrixResult(([0:numCols-1]) * numRows + indexOfRows) = 1;
end
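For reference, sub2ind can produce the same linear indices, with each element's value as the row subscript and its position as the column subscript (a sketch, reusing the variable names from the function above):

```octave
%# Equivalent to the manual arithmetic above, since
%# sub2ind([numRows numCols], r, c) = (c-1)*numRows + r
matrixResult(sub2ind([numRows, numCols], indexOfRows, 1:numCols)) = 1;
```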

x = [2 1 3 1];
m = setRowsToOne(x, 3)
Charity Leschinski

You can use accumarray, which makes this very easy:

accumarray([ (1:length(x))', x ], 1, [4, 3])

The 1:length(x) part specifies into which rows the ones go, and x into which columns.
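The output size can also be derived from x rather than hard-coded, for example (assuming x is a column vector as in the question):

```octave
x = [2;1;3;1];
m = accumarray([(1:length(x))', x], 1, [length(x), max(x)])
% m =
%
%   0   1   0
%   1   0   0
%   0   0   1
%   1   0   0
```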

jlh