I am using the scipy.cluster.hierarchy.fclusterdata
function to cluster a list of vectors (each vector has 384 components).
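Here is a simplified sketch of how I call it (the data array, threshold, and criterion below are placeholders, not my real values):

```python
import numpy as np
from scipy.cluster.hierarchy import fclusterdata

# Placeholder for my real data: n vectors, each with 384 components.
# (n is much larger in practice, which is when the crash happens.)
X = np.random.rand(10_000, 384)

# Simplified version of my call; the actual threshold and criterion differ.
labels = fclusterdata(X, t=1.0, criterion='distance', metric='euclidean')
print(labels.shape)  # one cluster id per input vector
```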
This works fine, but when I try to cluster a large amount of data I run out of memory and the program crashes.
How can I perform the same task without running out of memory?
My machine has 32 GB of RAM and runs Windows 10 x64 with Python 3.6 (64-bit).