I am using Thrust with CUDA 5.5 to sort an integer vector. Sorting 100*1024*1024 ints should allocate 400 MB of memory, but nvidia-smi always shows "Memory-Usage 105MB / 1023MB" (my test GPU is a GTX 260M).
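For reference, the sort is roughly this (a minimal sketch; my real code fills the vector differently):

    #include <thrust/device_vector.h>
    #include <thrust/sequence.h>
    #include <thrust/sort.h>

    int main()
    {
        const size_t N = 100 * 1024 * 1024;           // 100M ints, ~400 MB on the device
        thrust::device_vector<int> d_vec(N);          // device allocation
        thrust::sequence(d_vec.begin(), d_vec.end()); // placeholder data
        thrust::sort(d_vec.begin(), d_vec.end());
        return 0;
    }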
Sorting 150*1024*1024 ints instead gives an allocation error:
terminate called after throwing an instance of 'thrust::system::detail::bad_alloc'
what(): std::bad_alloc: out of memory
Aborted (core dumped)
Before allocating the array I check memory with cudaMemGetInfo, which returns:
GPU memory usage: used = 105.273682, free = 918.038818 MB, total = 1023.312500 MB
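The check itself looks roughly like this (it produces the output above):

    #include <cstdio>
    #include <cuda_runtime.h>

    void printMemInfo()
    {
        size_t free_byte, total_byte;
        cudaError_t status = cudaMemGetInfo(&free_byte, &total_byte);
        if (status != cudaSuccess) {
            printf("cudaMemGetInfo failed: %s\n", cudaGetErrorString(status));
            return;
        }
        double free_mb  = free_byte  / 1024.0 / 1024.0;
        double total_mb = total_byte / 1024.0 / 1024.0;
        printf("GPU memory usage: used = %f, free = %f MB, total = %f MB\n",
               total_mb - free_mb, free_mb, total_mb);
    }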
Can I check the maximum memory available for my integer array before starting the GPU analysis?
EDIT:
OK, just before the sort my memory usage is about this:
GPU memory usage: used = 545.273682, free = 478.038818 MB, total = 1023.312500 MB
It seems to me the sort algorithm needs some additional memory beyond the input array itself.
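If the temporary storage is roughly the size of the input, I could guard the sort with something like this (just a sketch; the one-extra-buffer assumption is my guess, not a documented Thrust guarantee):

    #include <cstdio>
    #include <cuda_runtime.h>
    #include <thrust/device_vector.h>
    #include <thrust/sort.h>

    bool sortIfItFits(thrust::device_vector<int>& d_vec)
    {
        size_t free_byte, total_byte;
        cudaMemGetInfo(&free_byte, &total_byte);

        // Guess: the sort needs scratch space about equal to the input size.
        size_t needed = d_vec.size() * sizeof(int);
        if (free_byte < needed) {
            printf("not enough free memory for sort scratch space\n");
            return false;
        }
        thrust::sort(d_vec.begin(), d_vec.end());
        return true;
    }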