I have a large table to plot (rows are measurements at different places and columns are different samples): about 30 million rows and 60 columns. I can draw a density plot for each group with a smaller data set, but the full file is too big to read into memory all at once.
The data looks like this:
variable value
1 V5 0.95
2 V5 0.98
3 V5 0.98
4 V5 0.95
5 V5 0.98
6 V5 0.98
The R code I use to draw the plot is:
ggplot(df2.m, aes(x = value, colour = variable)) +
  geom_density(alpha = .2) +
  theme_bw() +
  theme(text = element_text(size = 30),
        panel.border = element_rect(linetype = "solid", colour = "black", size = 2.8),
        panel.grid.major = element_line(size = 1.2),
        axis.ticks = element_line(size = 1.4),
        axis.ticks.length = unit(.5, "cm"),
        legend.position = "none")
Is there a way to compute the density for each sample first, save each result as an intermediate object, and then, once all the samples have been processed, combine them into a single plot?
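Something along these lines is what I have in mind (just a rough sketch, not working code for my data: the file name, the column names, and the use of data.table::fread with select to read one column at a time are placeholders/assumptions):

library(data.table)
library(ggplot2)

infile  <- "big_table.txt"      # placeholder for the real file
samples <- paste0("V", 1:60)    # placeholder column names

## Compute the density for one sample at a time, keeping only the
## density curve points instead of the millions of raw values
dens_list <- lapply(samples, function(s) {
  col <- fread(infile, select = s)[[1]]   # read a single column
  d   <- density(col, na.rm = TRUE)
  data.table(variable = s, x = d$x, y = d$y)
})
dens_all <- rbindlist(dens_list)

## Plot the precomputed curves with geom_line() instead of geom_density()
ggplot(dens_all, aes(x = x, y = y, colour = variable)) +
  geom_line(alpha = .2) +
  theme_bw() +
  theme(text = element_text(size = 30), legend.position = "none")

Would this kind of approach work, or is there a better way to do it?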