
I am trying to ray trace a sphere inside a cube. The cube is simply constructed out of 12 triangles with normals.

The cube has unit coordinates and unit normals. So within its local space (between -1 and 1), there should be a sphere of radius 0.5.

So I thought I should calculate the ray in the vertex shader: the ray origin is the interpolated vertex position, and the ray direction is the vertex normal (or its opposite; I think that shouldn't matter). Interpolation should do the rest.

Then, in the fragment shader, I calculate the ray-sphere intersection points and, if there are any, change the color of the fragment.

On the front and back sides of the cube, the result seems to be correct, but on the left, right, top, and bottom sides, the result seems to come from the wrong angle. I should see the sphere in the middle at all times, and that is not the case on those sides.

Can someone tell me what I am doing wrong?

Here is the shader code:

Vertex shader:

#version 400

layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aNor;

uniform mat4 uProj;
uniform mat4 uView;
uniform mat4 uModel;

out vec3 vRayPos;
out vec3 vRayDir;

void main(void)
{
  gl_Position = uProj * uView * uModel * vec4(aPos, 1);
  vRayPos = aPos;                         // ray origin in the cube's local space
  vRayDir = inverse(mat3(uModel)) * aNor; // ray direction taken from the vertex normal
}

Fragment shader:

#version 400

in vec3 vRayPos;
in vec3 vRayDir;

out vec4 oFrag;

void main(void)
{
  const vec3 sphereCenter = vec3(0, 0, 0);
  const float sphereRadius = 0.5;

  vec3 rayPos = vRayPos;
  vec3 rayDir = normalize(vRayDir);
  float a = dot(rayDir, rayDir); // TODO: rayDir is a unit vector, so: a = 1.0?
  float b = 2 * dot(rayDir, (rayPos - sphereCenter));
  float c = dot(rayPos - sphereCenter, rayPos - sphereCenter) - sphereRadius * sphereRadius;
  float d = b * b - 4 * a * c;
  float t = min((-b + sqrt(max(0, d))) / (2 * a), (-b - sqrt(max(0, d))) / (2 * a)); // roots: (-b +/- sqrt(d)) / (2a)

  vec3 color = (1.0 - step(0, d)) * vec3(0.554, 0.638, 0.447) + step(0, d) * abs(t) * vec3(0.800, 0.113, 0.053);

  oFrag = vec4(color, 1);
}

Notes: The factor t is not strictly necessary, but it gives an idea of how far from the side the ray touches the sphere, which gives it a shaded look. The step(0, d) function is used to check whether there are any intersection points, and max(0, d) keeps sqrt from being called on a negative number; both are there to avoid branching in the shader.

Reference: I got the calculations from https://en.wikipedia.org/wiki/Line%E2%80%93sphere_intersection

Edit: here is a video of the problem: Video

  • this [Reflection and refraction impossible without recursive ray tracing?](https://stackoverflow.com/a/45140313/2521214) might interest you. Anyway, a unit sphere has radius `1.0` instead of `0.5`. You can cross-check your intersection with this [ray and ellipsoid intersection accuracy improvement](https://stackoverflow.com/q/25470493/2521214). Do you have screenshots of the problem? Also, your ray direction/position description sounds off to me; look at the first link for how I do it (a quad covering the whole screen, the position is the vertex position, but the direction is position-focus instead of the normal!) – Spektre Jul 22 '19 at 07:33
  • I never said I wanted a unit sphere. I know it's only a half-size sphere, so it would not touch the edges of the cube. I will look into those links you provided in more detail, but a quick look doesn't seem to help. The ray pos/dir are not the usual kind where you render a big screen-filling quad (I have done that a lot before). This is different: I have a scene inside a cube that is rendered in the usual way. Every side of the cube is the camera lens itself, viewing in the opposite direction of the normal of that side. But I agree that there is probably something wrong in that part of my code. – scippie Jul 22 '19 at 16:35
  • heh, I do not know where I saw the unit sphere yesterday :) sorry, probably a mix-up, as 2 ray tracing questions were asked yesterday ([second QA](https://stackoverflow.com/q/57134950/2521214)) after a long time, or I just read this too quickly ... What you are describing is like 360-degree scene ray tracing, similar to CUBEMAP texture rendering? That can get tricky; do you have some screenshots? – Spektre Jul 22 '19 at 17:51
  • I added a video in my OP! – scippie Jul 22 '19 at 19:25
  • Oh, that's entirely different than I thought. Anyway, for the stuff you're doing, `vRayDir = inverse(mat3(uModel)) * aNor;` looks suspicious. I would use your cube center and translate it by cube half size + focal length in the normal direction to get the focal point, and then `vRayDir = vRayPos - focal_point` (see the sketch after these comments) ... – Spektre Jul 22 '19 at 19:33
  • Also: in one of your other comments, you mentioned not using a full-screen quad because it doesn't work when you rotate it. If you rotate your ray origin (viewpoint) rather than rotating the quad, purely for the purpose of ray calculation, you'll likely get the effect you're looking for. (Rotating the raytraced / raycast contents rather than the quad.) I use this for some interesting effects, like lensing: https://www.youtube.com/watch?v=kTliozdVeh0 – 3Dave Jul 23 '19 at 14:01
  • I never said that, but I understand where that thought may have come from. What I said was that I can't use simple billboards for the 3D particles. But if I am reading your comment correctly, I think it may still be possible: keep the billboard unrotated and change its viewport angle. Thanks. Cool video, by the way! – scippie Jul 23 '19 at 19:31
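
A minimal GLSL vertex shader sketch of the focal-point idea from the comment above (the `uFocalLength` uniform is hypothetical; the cube's half size is 1 in its local space):

#version 400

layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aNor;

uniform mat4 uProj;
uniform mat4 uView;
uniform mat4 uModel;
uniform float uFocalLength; // hypothetical: how far behind each face the focal point sits

out vec3 vRayPos;
out vec3 vRayDir;

void main(void)
{
  gl_Position = uProj * uView * uModel * vec4(aPos, 1);
  // Translate the cube center (the origin) along the face normal by
  // half size + focal length to get a per-face focal point.
  vec3 focalPoint = aNor * (1.0 + uFocalLength);
  vRayPos = aPos;
  vRayDir = vRayPos - focalPoint; // rays fan out from the focal point, not along the normal
}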

1 Answer


Your rays should be calculated by taking the direction from the camera position to a given fragment. (In view space, the camera is at the origin.) The vertex normals have absolutely nothing to do with it.

You can technically calculate rays in the vertex shader and pass them to the fragment shader as interpolants. However, this can give incorrect results, because the interpolation is linear: an interpolated direction, for example, is no longer unit length and has to be renormalized per fragment.

A better approach is to output the view space position of your vertex in the vertex shader. In the fragment shader, calculate a ray from the origin to the fragment's view space position. Then, perform your ray intersection tests using that ray. The rasterizer will correctly interpolate the view space position. You could also calculate that yourself in the fragment shader, but the hardware is pretty good at this so it makes sense to let it do that for you.
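
A minimal sketch of that approach, reusing the question's uniform names (the `vViewPos` varying and `uSphereCenterView` uniform are illustrative, not from the question):

Vertex shader:

#version 400

layout(location = 0) in vec3 aPos;

uniform mat4 uProj;
uniform mat4 uView;
uniform mat4 uModel;

out vec3 vViewPos; // view space position, interpolated by the rasterizer

void main(void)
{
  vec4 viewPos = uView * uModel * vec4(aPos, 1);
  gl_Position = uProj * viewPos;
  vViewPos = viewPos.xyz;
}

Fragment shader:

#version 400

in vec3 vViewPos;

uniform vec3 uSphereCenterView; // illustrative: sphere center transformed into view space

out vec4 oFrag;

void main(void)
{
  // In view space the camera sits at the origin, so the ray through this
  // fragment runs from the origin to its interpolated view space position.
  vec3 rayPos = vec3(0.0);
  vec3 rayDir = normalize(vViewPos);
  // ... ray-sphere intersection against uSphereCenterView as in the question ...
  oFrag = vec4(rayDir * 0.5 + 0.5, 1.0); // placeholder output
}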

Having said all of that, the major issue with your current implementation is using the vertex normals to calculate rays. That's wrong. All you need is the camera position and the fragment position. If you look carefully at your video, you'll see that the same thing is being drawn on all sides, regardless of position relative to the camera.

For a simple sphere, all you need is the camera-fragment ray. Calculate the distance from the line containing that ray to the center of the sphere. If it is less than the radius of the sphere, it's a hit.
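
In GLSL, that test could look like this (a sketch; `sphereCenter` and `sphereRadius` must be expressed in the same space as the ray, and `rayDir` must be normalized):

vec3 toCenter = sphereCenter - rayPos;                   // ray origin to sphere center
float along = dot(toCenter, rayDir);                     // projection onto the ray
float distSq = dot(toCenter, toCenter) - along * along;  // squared distance from the line
bool hit = distSq < sphereRadius * sphereRadius;         // within the radius means a hit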

  • I do not think OP is going for a traditional ray tracer, in which case your answer would be relevant. It's a different effect with 6 (or 3) cameras, not one ... – Spektre Jul 22 '19 at 21:13
  • @Spektre That's not what his video indicates, and that is not mentioned in the question or the comments. Regardless of how many cameras are in use, the ray needs to point from the camera to the fragment. The vertex normals are not useful here unless the camera was at the center of the cube. They fan out, not in. – 3Dave Jul 22 '19 at 21:16
  • It's in the comments: `The ray pos/dir are not the usual kind where you render a big screen-filling quad (I have done that a lot before). This is different: I have a scene inside a cube that is rendered in the usual way. Every side of the cube is the camera lens itself, viewing in the opposite direction of the normal of that side.` My understanding is that it's the opposite of an environment cubemap ... projecting the scene onto it instead of it onto the objects of the scene. – Spektre Jul 22 '19 at 21:19
  • @Spektre I see where you're coming from, but OP also says: `I should see the sphere in the middle all the time and that is not the case on those sides.` This sounds like a classic ray-traced volume. If `Every side of the cube is the camera lens itself, viewing in the opposite direction of the normal of that side` is really the case, OP should choose a point along that face's normal - NOT the vertex normal - and calculate the ray from that position to the fragment, and proceed in the usual way. The question is a bit poorly phrased. :| – 3Dave Jul 22 '19 at 21:27
  • Yep, I agree; that is why I did not answer in the first place ... and am just commenting/hinting. – Spektre Jul 22 '19 at 21:29
  • @Spektre And if the latter is the case, OP should just make a texture with a big red dot in the middle and apply it to all sides. :) – 3Dave Jul 22 '19 at 21:30
  • Maybe, or maybe not ... the centered ball might just be a wrong assumption on the OP's side about the effect's behavior. – Spektre Jul 22 '19 at 21:32
  • You are both right. Your discussion has made me realize that I don't know what I want, and I am solving problem B while it is A I need. The reason I started this project is that I want to have some kind of 3D particles based on mathematical calculations (I started with a sphere because it is easy). I am abusing the cube to have depth, because a billboard will be invisible when rotated around the Y axis. I now believe it is just normal ray tracing I need, but my screen quad is limited to the rasterized surface of the cube, and the object is always relative to the cube. I'll report back. – scippie Jul 22 '19 at 22:41
  • @scippie You can totally raytrace within a cube. It's been done many times over, and it's a valid approach used in many applications. Grab some paper, draw an overhead view of your rotated cube, put a dot somewhere for the camera, draw a line from the camera to a point on a side of the cube and figure out what your ray needs to be, for the effect you're trying to accomplish. This method has been documented out the yin-yang. You're on the right path, assuming your latest comment is indicative of what you want to draw. – 3Dave Jul 23 '19 at 01:04
  • @3Dave after clarification, it looks like you were more right :) +1 – Spektre Jul 23 '19 at 07:03
  • I accepted this answer as it gives all the correct information needed to get started. Thanks! – scippie Jul 23 '19 at 19:34