I am trying to build arrays of histograms of unsigned char values, one per pixel of an image, for an implementation of the gPb algorithm. I get a crash on a cudaMalloc call which I cannot solve. I have looked through other similar questions, and I always check whether the previous operations returned cudaSuccess. Here is my code:
First, I allocate this structure in the constructor of my class CudaImage:
bool CudaImage::create2DHistoArray()
{
    // allocate the array of row pointers (one histogram row per image row)
    m_LastCudaError = cudaMalloc((void**)&m_dHistograms, (m_Height + 2 * m_Scale) * sizeof(unsigned int*));
    if (m_LastCudaError != cudaSuccess)
        return false;

    // set all histogram row pointers to nullptr
    m_LastCudaError = cudaMemset(m_dHistograms, 0, (m_Height + 2 * m_Scale) * sizeof(unsigned int*));
    if (m_LastCudaError != cudaSuccess)
        return false;

    return true;
}
Then, at some point, I call a member function to allocate some of the m_dHistograms[i] rows as follows:
bool CudaImage::initializeHistoRange(int start, int stop)
{
    for (int i = start; i < stop; ++i) {
        // allocate the histogram row for image row i
        m_LastCudaError = cudaMalloc((void**)&m_dHistograms[i], 256 * 2 * m_ArcNo * (m_Width + 2 * m_Scale) * sizeof(unsigned int));
        if (m_LastCudaError != cudaSuccess)
            return false;

        // set all bins of the row to 0
        m_LastCudaError = cudaMemset(m_dHistograms[i], 0, 256 * 2 * m_ArcNo * (m_Width + 2 * m_Scale) * sizeof(unsigned int));
        if (m_LastCudaError != cudaSuccess)
            return false;
    }
    return true;
}
The first cudaMalloc in this second function crashes without any warning. When running under cuda-memcheck I get the following message:
"The application may have hit an error when dereferencing Unified Memory from the host. Please rerun the application under a host debugger to catch such errors."
Can anyone help? A second question is whether the array allocation is implemented correctly. I do not want to allocate all the memory up front because it would be too much, so in the constructor (the first function) I allocate only the row pointers of the array; later, in the application, I allocate memory when I need it and free whatever I no longer need.
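To make the intended layout concrete, here is a minimal standalone sketch of the same two-level allocation (names and sizes are simplified stand-ins, not my actual members). Note that it stages each row pointer in a host variable and copies it into the device pointer array with cudaMemcpy, rather than writing m_dHistograms[i] directly from host code, since the latter dereferences device memory on the host:

```cuda
#include <cuda_runtime.h>

int main()
{
    const int rows     = 4;    // stands in for m_Height + 2 * m_Scale
    const int rowElems = 256;  // stands in for 256 * 2 * m_ArcNo * (m_Width + 2 * m_Scale)

    // device array of row pointers, zero-initialised (all rows unallocated)
    unsigned int** dHistograms = nullptr;
    if (cudaMalloc((void**)&dHistograms, rows * sizeof(unsigned int*)) != cudaSuccess)
        return 1;
    cudaMemset(dHistograms, 0, rows * sizeof(unsigned int*));

    // allocate one row on demand: the row pointer lives in host memory first...
    unsigned int* dRow = nullptr;
    if (cudaMalloc((void**)&dRow, rowElems * sizeof(unsigned int)) != cudaSuccess)
        return 1;
    cudaMemset(dRow, 0, rowElems * sizeof(unsigned int));

    // ...and is then copied into slot i of the device pointer array.
    // dHistograms + i is plain pointer arithmetic (no dereference), so it is
    // safe on the host and names the destination inside device memory.
    const int i = 2;
    cudaMemcpy(dHistograms + i, &dRow, sizeof(unsigned int*), cudaMemcpyHostToDevice);

    // a row can later be freed individually, then the pointer array itself
    cudaFree(dRow);
    cudaFree(dHistograms);
    return 0;
}
```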