I'm writing a basic ray tracer in an effort to better understand the whole thing. I've come across an issue that has been holding me back for a little while now: diffuse shading of a sphere. I've used the formulas from the following source to calculate the sphere intersections and the diffuse shading.
http://www.ccs.neu.edu/home/fell/CSU540/programs/RayTracingFormulas.htm
My code which calculates the shading (an attempted replication of the source code at the link) is shown below. For the most part the calculations appear correct on some spheres, but depending on the light's position, other spheres' shading comes out more or less broken.
TVector intersect(ray.getRayOrigin().getVectX() + t * (ray.getRayDirection().getVectX() - ray.getRayOrigin().getVectX()),
                  ray.getRayOrigin().getVectY() + t * (ray.getRayDirection().getVectY() - ray.getRayOrigin().getVectY()),
                  ray.getRayOrigin().getVectZ() + t * (ray.getRayDirection().getVectZ() - ray.getRayOrigin().getVectZ()));
// Calculate the normal at the intersect point
TVector NormalIntersect(intersect.getVectX() - (position.getVectX()/r),
                        intersect.getVectY() - (position.getVectY()/r),
                        intersect.getVectZ() - (position.getVectZ()/r));
NormalIntersect = NormalIntersect.normalize();
// Find unit vector from intersect(x,y,z) to the light(x,y,z)
TVector L1(light.GetPosition().getVectX() - intersect.getVectX(),
           light.GetPosition().getVectY() - intersect.getVectY(),
           light.GetPosition().getVectZ() - intersect.getVectZ());
L1 = L1.normalize();
double Magnitude = L1.magnitude();
TVector UnitVector(L1.getVectX() / Magnitude,
                   L1.getVectY() / Magnitude,
                   L1.getVectZ() / Magnitude);
//Normalized or not, the result is the same
UnitVector = UnitVector.normalize();
float Factor = (NormalIntersect.dotProduct(UnitVector));
float kd = 0.9; // diffuse coefficient
float ka = 0.1; // ambient coefficient
Color pixelFinalColor(kd * Factor * color.getcolorRed() + ka * color.getcolorRed(),
                      kd * Factor * color.getcolorGreen() + ka * color.getcolorGreen(),
                      kd * Factor * color.getcolorBlue() + ka * color.getcolorBlue(), 1);
As you can see from the picture, some spheres appear to be shaded correctly, while others are completely broken. At first I thought the issue might lie with the UnitVector calculation, but when I looked it over I was unable to find the problem. Can anyone see what's causing it?
Note: I'm using OpenGL to render my scene.
Update: I'm still having a few problems, but I think they've mostly been solved thanks to your help and a few alterations to how I calculate the unit vector. The updates are shown below. Many thanks to everyone who gave their answers.
TVector UnitVector(light.GetPosition().getVectX() - intersect.getVectX(),
                   light.GetPosition().getVectY() - intersect.getVectY(),
                   light.GetPosition().getVectZ() - intersect.getVectZ());
UnitVector = UnitVector.normalize();
float Factor = NormalIntersect.dotProduct(UnitVector);
// Set pixel final color
Color pixelFinalColor(min(1, kd * Factor * color.getcolorRed()) + (ka * color.getcolorRed()),
                      min(1, kd * Factor * color.getcolorGreen()) + (ka * color.getcolorGreen()),
                      min(1, kd * Factor * color.getcolorBlue()) + (ka * color.getcolorBlue()), 1);