
I am trying to look for trends in a large data matrix, say 50,000 × 1000 elements. If I use the `image` command, the matrix is drawn on screen. How does Matlab's `image` function decide which elements of the matrix to display, given that my screen does not have enough pixels to show them all? In other words, which downsampling algorithm does it apply to the matrix?

  • You are using `imshow()` I presume? – kkuilla Jan 06 '15 at 12:05
  • `nearest` interpolation (as described in the `interp2` documentation) is used. For alternatives check this question: http://stackoverflow.com/questions/25342693/how-to-avoid-image-display-artifacts-in-matlab – Daniel Jan 06 '15 at 12:38
  • Thank you for your reply. Do you know if `print` will print the interpolated or the raw image? – user1860389 Jan 07 '15 at 13:55
  • Don't know. I would export at full resolution using `imsave`. – Daniel Jan 31 '15 at 15:37
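One way to sidestep the export question raised in the comments is to write the raw matrix directly with `imwrite`, which stores one image pixel per matrix element and involves no screen rendering at all. A minimal sketch, assuming the data is first scaled to [0, 1] (the file name is hypothetical):

```matlab
% Write the matrix at full resolution, one pixel per element,
% bypassing the screen renderer (and hence any interpolation).
A = rand(5000, 1000);                              % stand-in for the real data
An = (A - min(A(:))) ./ (max(A(:)) - min(A(:)));   % scale to [0, 1]
imwrite(An, 'matrix_fullres.png');                 % 5000 x 1000 pixel PNG
```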

1 Answer


Nearest-neighbour interpolation (as described in the `interp2` documentation) is used. For alternatives, check this question: How to avoid image display artifacts in Matlab?
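To see why this matters for trend-spotting, here is a minimal sketch contrasting nearest-neighbour sampling (roughly what the renderer does when the matrix outsizes the screen) with an antialiased resize. It uses a smaller stand-in matrix so it runs quickly, and it assumes the Image Processing Toolbox is available for `imresize`:

```matlab
% Stand-in for the large data matrix (smaller than 50,000 x 1000 so the
% example runs quickly, but fine-grained enough to alias on screen).
[x, y] = meshgrid(1:1000, 1:5000);
A = sin(0.5 * (x + y));                 % fine diagonal stripes

% Roughly what nearest-neighbour display does: keep only the source
% element closest to each screen pixel (here a 500 x 100 target).
rows = round(linspace(1, size(A, 1), 500));
cols = round(linspace(1, size(A, 2), 100));
nearestView = A(rows, cols);            % aliased: stripes become moire

% Antialiased alternative: imresize low-pass filters before shrinking
% (antialiasing is on by default for bilinear/bicubic downsizing).
smoothView = imresize(A, [500 100], 'bilinear');

figure;
subplot(1, 2, 1); imagesc(nearestView); title('nearest sampling');
subplot(1, 2, 2); imagesc(smoothView);  title('antialiased imresize');
colormap(gray);
```

The left panel shows the kind of spurious structure nearest-neighbour sampling can introduce; the right panel is the low-pass-filtered view, which is usually the safer one for judging trends.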
