
R's `heatmap` and `pheatmap` both failed to produce a clustered heatmap, throwing this error:

```
Error in vector("double", length) : vector size specified is too large
```

Does anyone know how I could visualize such a big matrix?

ferrelwill
  • You could work on a subset, create a heatmap (and export it), and stitch the pieces together with tools like GIMP or ImageMagick. – Roman Luštrik Nov 23 '11 at 15:07
  • 5
  • A little more context, please? You may get a lot of random ideas thrown around, but you will almost certainly have to reduce the data in some way prior to visualizing it to get anything sensible. For example, are you interested in the correlations among the columns, or ...? – Ben Bolker Nov 23 '11 at 15:08
  • Yes, I need correlations, and I want to see the clusters that are highly correlated. – ferrelwill Nov 23 '11 at 15:19
  • possible duplicate of [R: How can I make a heatmap with a large matrix?](http://stackoverflow.com/questions/5667107/r-how-can-i-make-a-heatmap-with-a-large-matrix) – Andrie Nov 23 '11 at 15:27
  • 1
    There are likely folks here on SO that could help you with this, but if you don't get the answer you're looking for, you might try asking at stats.stackexchange.com. In either case, you're going to want to follow the advice you've been given and provide more concrete details. – joran Nov 23 '11 at 15:28
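Roman Luštrik's subsetting suggestion can be sketched in base R. This is only an illustrative sketch: the simulated matrix, the subset size of 20 columns, and the use of base `heatmap()` are assumptions standing in for the asker's actual data and tooling.

```r
## Hypothetical sketch: correlate a manageable subset of columns,
## then cluster and draw just that block with base R's heatmap().
set.seed(1)
m <- matrix(rnorm(1000 * 50), ncol = 50)  # stand-in for the big matrix
idx <- sample(ncol(m), 20)                # pick a subset of columns
cc <- cor(m[, idx])                       # 20 x 20 correlation matrix
heatmap(cc, symm = TRUE)                  # small enough to cluster and plot
```

Repeating this over several subsets and exporting each image gives the tiles that could then be assembled externally, as the comment suggests.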

1 Answer


The hexbin package is designed to efficiently summarize large datasets as 2D count displays. Only with more than ten million points does it slow down somewhat on my 3-year-old Mac. (Honest, @James, I did think of this independently.) This is a minor modification of the example on one of its help pages:

```r
library(hexbin)
h <- hexbin(rnorm(10000), rnorm(10000))   # bin 10,000 points into hexagonal cells
plot(h, colramp = function(n) magent(n, beg = 15, end = 225))
```

(Image: hexbin plot of the two simulated normal samples, shaded with the magenta color ramp.)
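If installing hexbin is not an option, a similar density-shaded view is available in base R via `smoothScatter()` from the graphics package (it uses a kernel density estimate rather than hexagonal bins). The simulated data below is illustrative, not the asker's matrix:

```r
## Sketch of a base-R alternative: smoothScatter() shades by point density,
## so 100,000 points render as a smooth color field instead of overplotting.
set.seed(1)
x <- rnorm(1e5)
y <- x + rnorm(1e5)
smoothScatter(x, y)
```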

IRTFM