
I have a 3D scan of a rock that I have represented as a point cloud using Three.js, and I want to show the features of the rock more explicitly, perhaps using lighting to convey depth. Viewing the rock from the top, I see this: Top image of the rock. When looking closely from the side, you can see ridge-like features that I would like to bring out more clearly:

Side image

I am unsure how to approach this and was hoping for some help with showing these rock features within this visualization. For context, this is my current shader setup:

vertexShader: `
    precision mediump float;
    varying vec3 vColor;
    attribute float alpha;
    varying float vAlpha;
    
    void main() {
        vAlpha = alpha;
        vColor = color;
        vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );

        gl_PointSize = 2.0;

        gl_Position = projectionMatrix * mvPosition;
    }
`

fragmentShader: `
    #ifdef GL_OES_standard_derivatives
    #extension GL_OES_standard_derivatives : enable
    #endif
    precision mediump float;
    varying vec3 vColor;
    varying float vAlpha;

    
    void main() {
        float r = 0.0, delta = 0.0, alpha = 1.0;
        // position within the point sprite, remapped to [-1, 1]
        vec2 cxy = 2.0 * gl_PointCoord.xy - 1.0;
        // squared distance from the sprite center
        r = dot(cxy, cxy);

        //#ifdef GL_OES_standard_derivatives
        // soft, anti-aliased edge for the round point
        delta = fwidth(r);
        alpha = vAlpha - smoothstep(vAlpha - delta, vAlpha + delta, r);
        //#endif
        // discard fragments outside the circle so the point is rendered round
        if (r > 1.0) {
            discard;
        }
        gl_FragColor = vec4(vColor, alpha);
    }
    //varying vec3 vColor;
    //varying float vAlpha;
    //void main() {
        //gl_FragColor = vec4( vColor, vAlpha );
    //}
`

I am creating my point cloud using THREE.Points, with a BufferGeometry and a ShaderMaterial.
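In case it is useful, this is roughly how I assemble the points (the array names are placeholders; the attributes match the shaders above):

var geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
geometry.setAttribute('alpha', new THREE.BufferAttribute(alphas, 1));

var material = new THREE.ShaderMaterial({
    vertexShader: vertexShader,
    fragmentShader: fragmentShader,
    vertexColors: true, // exposes the 'color' attribute to the vertex shader
    transparent: true
});

var points = new THREE.Points(geometry, material);
scene.add(points);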

Is there a way to show the depth of my point cloud more explicitly?

Thank you!

Imas
  • The question is unclear. What do you mean by "depth"? Do you mean the z-coordinate of the point? – Rabbid76 Dec 31 '19 at 09:43
  • I am unsure if 'depth' is the best description of what I want - mostly, the ability to view the features of the rock more clearly given lighting or a similar effect. – Imas Dec 31 '19 at 15:27
  • Lighting is not an option, because any lighting model needs the normal vector of the fragment, and points have no normal vectors. To calculate some kind of normal vector you would need a surface, so you would have to do some triangulation and somehow generate a concave hull. THREE provides a [`ConvexGeometry`](https://threejs.org/docs/#examples/en/geometries/ConvexGeometry), but you would need something concave. – Rabbid76 Dec 31 '19 at 15:54
  • An option would be to do some kind of [Screen Space Ambient Occlusion](https://de.wikipedia.org/wiki/Screen_Space_Ambient_Occlusion) in a post-processing pass. But a standard algorithm won't work, because your object consists of dots and there is empty space between the dots. – Rabbid76 Dec 31 '19 at 16:00

1 Answer


The depth of the current fragment is stored in the .z component of gl_FragCoord. The depth is in the range [0.0, 1.0] (unless this range is changed by glDepthRangef).

With that information you can set the alpha channel of the point so that opacity decreases with depth:

float depth = gl_FragCoord.z;
gl_FragColor = vec4(vColor, 1.0 - depth);
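
Applied to your point shader, this could look like the following sketch (it keeps your round-point code and just scales your existing alpha by the depth term):

#ifdef GL_OES_standard_derivatives
#extension GL_OES_standard_derivatives : enable
#endif
precision mediump float;
varying vec3 vColor;
varying float vAlpha;

void main() {
    // round, soft-edged point sprite, as in your current shader
    vec2 cxy = 2.0 * gl_PointCoord.xy - 1.0;
    float r = dot(cxy, cxy);
    float delta = fwidth(r);
    float alpha = vAlpha - smoothstep(vAlpha - delta, vAlpha + delta, r);
    if (r > 1.0) {
        discard;
    }

    // fade the point with its depth: 0.0 at the near plane, 1.0 at the far plane
    float depth = gl_FragCoord.z;
    gl_FragColor = vec4(vColor, alpha * (1.0 - depth));
}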

Since the depth with a perspective projection is not linear (see How to render depth linearly in modern OpenGL with gl_FragCoord.z in fragment shader?), it would be nice to have a value in the range [0.0, 1.0] which linearly represents the depth of the point between the near and the far plane.
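In other words, the depth buffer value is first mapped back to an eye-space distance, and that distance is then normalized between the near plane n and the far plane f:

z_ndc = 2 * depth - 1
z_eye = 2 * n * f / (f + n - z_ndc * (f - n))
linear depth = (z_eye - n) / (f - n)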
This can be done with the function LinearizeDepth in the following example:

uniform vec2 u_depthRange;

float LinearizeDepth(float depth, float near, float far)
{
  float z = depth * 2.0 - 1.0; // Back to NDC 
  return (2.0 * near * far / (far + near - z * (far - near)) - near) / (far-near);
}

void main()
{
    // [...]

    // depth in [0.0, 1.0], linear between the near and the far plane
    float lineardepth = LinearizeDepth(gl_FragCoord.z, u_depthRange[0], u_depthRange[1]);
    gl_FragColor = vec4(vColor, 1.0 - lineardepth);
}

To make the example run, the depth range has to be set on the uniform u_depthRange. The near plane of the PerspectiveCamera is stored in the .x component and the far plane in the .y component:

var near = 1, far = 10;
camera = new THREE.PerspectiveCamera(fov, width / height, near, far);
var uniforms = {
      // [...]
      u_depthRange: {type: 'v2', value: {x:near, y: far}} 
};

var material = new THREE.ShaderMaterial({  
      uniforms: uniforms,
      // [...]
});

Note, for a "good" effect, the near and the far plane of the camera, have to be as close to the geometry as possible!

Rabbid76