Given I have an array like this:
array = [Array[8], Array[8], Array[8], ...]
# array.length is 81; each octet represents a point on a 9x9 grid
where each nested array contains 8 numeric elements ranging from -2 to 2, how would I apply the following step to get a vector in JavaScript?
Step 5. The signature of an image is simply the concatenation of the 8-element arrays corresponding to the grid points, ordered left-to-right, top-to-bottom. Our signatures are thus vectors of length 648. We store them in 648-byte arrays, but because some of the entries for the first and last rows and columns are known to be zeros and because each byte is used to hold only 5 values, signatures could be represented by as few as ⌈544 log2 5⌉ = 1264 bits.
(Towards the end, those are supposed to be ceiling notations; that's the best I could do given SO's lack of LaTeX formatting.)
I have the array ready to go and ordered properly, but my knowledge of matrices and vectors is a little rusty, so I'm not sure how to tackle this next step. I'd appreciate any clarifications!
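In case it helps frame the question, here's my current understanding of what step 5 might amount to — `grid` below is placeholder data standing in for my real 81-element array, and I'm not certain this interpretation is correct:

```javascript
// Placeholder data standing in for the real grid: 81 octets
// (8-element arrays), already ordered left-to-right, top-to-bottom.
const grid = Array.from({ length: 81 }, () => new Array(8).fill(0));

// If step 5 is plain concatenation, flattening the 81 octets
// should yield the 648-element signature vector.
const signature = grid.flat(); // or [].concat(...grid) on older engines

// The paper stores signatures in 648-byte arrays; since every entry
// is an integer in [-2, 2], an Int8Array seems like a natural fit.
const signatureBytes = Int8Array.from(signature);

console.log(signature.length, signatureBytes.length); // 648 648
```

Is simple concatenation like this all the step requires, or is there some vector/matrix operation I'm missing?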
Background: I'm trying to create a JS implementation of an image processing algorithm published by the Xerox Palo Alto Research Center for a side-project I'm currently working on.