When rendering the depth buffer in the iOS simulator (following Getting the true z value from the depth buffer), everything is fine. But rendering on a real device gives a bad result: the depth comes out with only a few distinct values (there is no gradual fade), much like an image quantized down to a 256-value color range.
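For context, the fragment shader that displays the depth texture is roughly the following (a minimal sketch; uDepthTexture and vTexCoord are placeholder names, and the depth is read straight from the texture's red channel):

static const char *kDepthDisplayFragmentShader =
    "precision highp float;\n"              // highp to avoid extra quantization when sampling
    "uniform sampler2D uDepthTexture;\n"
    "varying vec2 vTexCoord;\n"
    "void main() {\n"
    "    float z = texture2D(uDepthTexture, vTexCoord).r;\n"  // depth in [0, 1]
    "    gl_FragColor = vec4(vec3(z), 1.0);\n"
    "}\n";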
Here is the code for the FBO generation:
// Create the FBO and its two attachments: one color texture, one depth texture.
glGenFramebuffers(1, &sceneFBO);
glGenTextures(2, textures_scene);

// Color attachment: RGBA8.
glBindTexture(GL_TEXTURE_2D, textures_scene[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, widthResolution, heightResolution, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// Depth attachment, as a texture so it can be sampled afterwards.
glBindTexture(GL_TEXTURE_2D, textures_scene[1]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, widthResolution, heightResolution, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
// TODO: why can't GL_DEPTH_COMPONENT be paired with GL_UNSIGNED_BYTE?

// Attach both textures to the FBO.
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textures_scene[0], 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, textures_scene[1], 0);
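For reference, a minimal sanity check of this setup (a sketch, using plain C logging) would verify the GL_OES_depth_texture extension, which ES 2.0 requires for depth textures, and the framebuffer status:

#include <stdio.h>
#include <string.h>

// Depth textures are only legal under ES 2.0 when GL_OES_depth_texture is exposed.
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
if (ext == NULL || strstr(ext, "GL_OES_depth_texture") == NULL) {
    printf("GL_OES_depth_texture is not supported on this device\n");
}

// The FBO must report completeness before rendering into it.
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    printf("FBO incomplete: 0x%04x\n", (unsigned int)status);
}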
It seems that GL_UNSIGNED_INT is the only type that works for the texture attached to GL_DEPTH_ATTACHMENT. But that alone is not enough to render the depth correctly...
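For what it's worth, the OES_depth_texture extension only accepts GL_UNSIGNED_SHORT or GL_UNSIGNED_INT as the type for GL_DEPTH_COMPONENT, so GL_UNSIGNED_BYTE (as asked in the TODO above) is indeed invalid. The 16-bit variant would be:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, widthResolution, heightResolution, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, NULL);  // 16-bit depth texture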
PS: For performance reasons, I don't want to render the scene a second time into GL_COLOR_ATTACHMENT0 with the z-values written into the color channels just to get a correct depth image, nor switch to OpenGL ES 3.0 or the newer Metal API (I need to keep supporting the iPhone 4S).
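To be clear, the workaround I want to avoid is the usual trick of re-rendering the scene with a shader that packs gl_FragCoord.z into the four 8-bit color channels, roughly like this (sketch):

static const char *kPackDepthFragmentShader =
    "precision highp float;\n"
    // Pack a [0, 1) float into four 8-bit channels (the usual bit-shift trick).
    "vec4 packDepth(float depth) {\n"
    "    const vec4 bitShift = vec4(256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0);\n"
    "    const vec4 bitMask  = vec4(0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0);\n"
    "    vec4 res = fract(depth * bitShift);\n"
    "    res -= res.xxyz * bitMask;\n"
    "    return res;\n"
    "}\n"
    "void main() {\n"
    "    gl_FragColor = packDepth(gl_FragCoord.z);\n"
    "}\n";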
Any idea?