
I'm trying to simulate a lens distortion effect for my SLAM project. A scanned color 3D point cloud is already given and loaded in OpenGL. What I'm trying to do is render the 2D scene at a given pose and do some visual odometry between the real image from a fisheye camera and the rendered image. Since the camera has severe lens distortion, it has to be taken into account in the rendering stage too.

The problem is that I have no idea where to put the lens distortion. Shaders?

I've found some open-source code that applies the distortion in the geometry shader. But I suspect its distortion model differs from the lens distortion models used in the computer vision community, where distortion is usually applied on the projected plane.

This one is quite similar to my work, but they didn't use a distortion model.

Anyone have a good idea?

I just found another implementation. It applies the distortion in both the fragment shader and the geometry shader, but only the fragment shader version is applicable to my situation. Thus, I guess the following will work:

# vertex shader
p' = T.view * T.model * p
p_f = FisheyeProjection(p') // custom fisheye projection
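To make the pseudocode above concrete, here is a minimal CPU-side sketch (in Python rather than GLSL) of what such a FisheyeProjection could compute, assuming the common equidistant model r = f·θ. The focal length and principal point below are made-up placeholders; the actual model and coefficients must come from your camera calibration:

```python
import math

def fisheye_project(x, y, z, f=300.0, cx=320.0, cy=240.0):
    """Equidistant fisheye projection (r = f * theta) of a camera-space point.

    f, cx, cy are hypothetical intrinsics -- substitute your calibration values.
    """
    theta = math.atan2(math.hypot(x, y), z)  # angle between the ray and the optical axis
    phi = math.atan2(y, x)                   # azimuth of the ray around the axis
    r = f * theta                            # equidistant mapping: radius grows linearly with angle
    return cx + r * math.cos(phi), cy + r * math.sin(phi)
```

A point on the optical axis lands on the principal point, and a ray 45° off-axis lands f·π/4 pixels away from it, which is an easy sanity check against your calibration toolbox.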
Chanoh Park
    Another way it's often done is by rendering to a texture which you then map onto a rectangle, which you then distort the shape of – p10ben Jun 11 '17 at 23:43

2 Answers


Lens distortion usually turns straight lines into curves. When rasterizing lines and triangles with OpenGL, however, the primitives' edges stay straight, no matter how you transform the vertices.

If your models are tessellated finely enough, incorporating the distortion into the vertex transformation is viable. It also works if you're rendering only points.

However, if you're aiming for general applicability, you have to deal with the straight-edged primitives somehow. One way is to use a geometry shader to further subdivide incoming models; or you can use a tessellation shader.

Another method is rendering into a cubemap and then using a shader to create a lens equivalent from that. I'd actually recommend this for generating fisheye images.

The distortion itself is usually represented by a polynomial of order 3 to 5, mapping the undistorted angular distance from the optical axis to the distorted angular distance.
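As a sketch of such a polynomial angle mapping (here in Python, following the OpenCV fisheye convention θ_d = θ·(1 + k1·θ² + k2·θ⁴ + k3·θ⁶ + k4·θ⁸); the coefficients below are arbitrary placeholders, not calibrated values):

```python
def distort_angle(theta, k=(0.05, 0.01, 0.0, 0.0)):
    """Map the undistorted angle theta (radians off the optical axis) to the
    distorted angle via an odd polynomial in theta.

    The k values are placeholders; real ones come from calibration.
    """
    t2 = theta * theta
    return theta * (1.0 + k[0]*t2 + k[1]*t2**2 + k[2]*t2**3 + k[3]*t2**4)
```

The same function evaluated in a shader gives you the distorted radial distance for a cubemap lookup direction.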

datenwolf
  • Hi! Thanks for the ideas. I'm using a surfel-based map representation where each point has a disk with a fixed radius of around 1 cm or less, which is small enough. So I guess your first solution, without subdividing, will work for me. The second idea also sounds good; I will check out the cubemap. Thanks! – Chanoh Park Jun 12 '17 at 07:06
  • Hi, what do you mean by using a shader to create a lens equivalent for the cubemap? – HelloGoodbye Feb 19 '23 at 23:43
  • @HelloGoodbye: see https://stackoverflow.com/a/36211337/524368 – datenwolf Feb 20 '23 at 07:59
  • That just looks like a general introduction to shaders, and I know what a shader is. What I meant was, what do you mean by _create a lens equivalent for the cubemap_? It seems to me like you need to expand more on what you mean, because you're being so brief that you're leaving out a lot of details and leaving people to guess what you mean. – HelloGoodbye Feb 20 '23 at 12:47
  • 1
    @HelloGoodbye cliff's note version: Cubemaps are addressed by a direction vector. So the problem is to come up with a function that maps a screen space location to the direction that a ray would be sent into if a fisheye lens would be used to project an image located in the back focal plane of such a lens. The default perspective transformation that's resembling a pinhole camera is purely linear. So what you need is some mapping that's nonlinear. The lowest hanging fruit is using a polynomial to map an "flat" radial distance into a distorted one. – datenwolf Feb 20 '23 at 17:53
  • 1
    @HelloGoodbye: you can then use this vector for a cubemap lookup. By choosing the coefficients for fhe polynomial appropriately, you can create many kinds of lens distortions, including pin cushion and barrel. – datenwolf Feb 20 '23 at 18:04
  • 1
    @HelloGoodbye: So say you've got a fragment shader uniform vec2 pv that has the viewport center at 0,0, and the corners at (+/-1,+/-1), then you can arrive at a radial distortion by putting the length of this vector through the a 4th order polynomial with coefficients vec4 a by evaluating float `r = length(pv); r = a[0] + r*(a[1] + r*(a[2] +r*a[4]));`. We want to arrive at a direction vector of length 1; coefficients are chosen so that the new `r < 1`, so to obtain a unit length vector the 3rd component would be `sqrt(1-r*r)`, i.e. `vec3 dir = vec3(normalize(pv)*r, sqrt(1-r*r));` – datenwolf Feb 20 '23 at 18:14
  • @datenwolf Aha, that was a very good explanation! Thank you! – HelloGoodbye Feb 23 '23 at 14:04

Inspired by the VR community, I implemented the distortion via vertex displacement. For high resolutions this is computationally more efficient, but it requires a mesh with sufficient vertex density. You might want to apply tessellation before distorting the image.

Here is the code that implements the OpenCV rational distortion model (see https://docs.opencv.org/4.0.1/d9/d0c/group__calib3d.html for the formulas):

#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec3 normal_in;
layout (location = 2) in vec2 texture_coordinate_in;
uniform mat4 model_matrix;
uniform mat4 view_matrix;
uniform float dist_coeffs[8];
uniform mat4 projection_matrix;
uniform vec3 light_position;
out vec2 texture_coordinate;
out vec3 normal;
out vec3 light_direction;

// distort the real world vertices using the rational model
vec4 distort(vec4 view_pos)
{
  // normalize
  float z = view_pos[2];
  float z_inv = 1 / z;
  float x1 = view_pos[0] * z_inv;
  float y1 = view_pos[1] * z_inv;
  // precalculations
  float x1_2 = x1*x1;
  float y1_2 = y1*y1;
  float x1_y1 = x1*y1;
  float r2 = x1_2 + y1_2;
  float r4 = r2*r2;
  float r6 = r4*r2;
  // rational distortion factor
  float r_dist = (1 + dist_coeffs[0]*r2 +dist_coeffs[1]*r4 + dist_coeffs[4]*r6) 
    / (1 + dist_coeffs[5]*r2 + dist_coeffs[6]*r4 + dist_coeffs[7]*r6);
  // full (rational + tangential) distortion
  float x2 = x1*r_dist + 2*dist_coeffs[2]*x1_y1 + dist_coeffs[3]*(r2 + 2*x1_2);
  float y2 = y1*r_dist + 2*dist_coeffs[3]*x1_y1 + dist_coeffs[2]*(r2 + 2*y1_2);
  // denormalize for projection (which is a linear operation)
  return vec4(x2*z, y2*z, z, view_pos[3]);
}

void main()
{
  vec4 local_pos = vec4(position, 1.0);
  vec4 world_pos  =  model_matrix * local_pos;
  vec4 view_pos = view_matrix * world_pos;
  vec4 dist_pos = distort(view_pos);
  gl_Position = projection_matrix * dist_pos;
  // lighting on world coordinates not distorted ones
  normal = mat3(transpose(inverse(model_matrix))) * normal_in;
  light_direction = normalize(light_position - world_pos.xyz);
  texture_coordinate = texture_coordinate_in;
}

It is important to note that the distortion is calculated in z-normalized coordinates but is denormalized back into view coordinates in the last line of distort. This makes it possible to use a projection matrix like the one from this post: http://ksimek.github.io/2013/06/03/calibrated_cameras_in_opengl/
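For debugging, the shader's distort() can be mirrored on the CPU and compared against OpenCV's projectPoints. This Python sketch assumes dist_coeffs uses the OpenCV ordering (k1, k2, p1, p2, k3, k4, k5, k6), which is what the shader's indices imply:

```python
def distort_view_pos(x, y, z, d):
    """CPU mirror of the vertex shader's distort().

    d = (k1, k2, p1, p2, k3, k4, k5, k6), the OpenCV rational-model ordering
    assumed by the shader's dist_coeffs indices.
    """
    x1, y1 = x / z, y / z                  # z-normalize
    r2 = x1*x1 + y1*y1
    r4, r6 = r2*r2, r2*r2*r2
    # rational radial factor
    r_dist = (1 + d[0]*r2 + d[1]*r4 + d[4]*r6) / (1 + d[5]*r2 + d[6]*r4 + d[7]*r6)
    # radial + tangential distortion
    x2 = x1*r_dist + 2*d[2]*x1*y1 + d[3]*(r2 + 2*x1*x1)
    y2 = y1*r_dist + 2*d[3]*x1*y1 + d[2]*(r2 + 2*y1*y1)
    return x2*z, y2*z, z                   # denormalize back into view space
```

With all coefficients zero it reduces to the identity, which is a quick check that the normalization and denormalization cancel out.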

Edit: For anyone interested in seeing the code in context, I have published it in a small library; the distortion shader is used in this example.

Tuebel
  • Hello mate, I implemented the code in my shader and it works very well, except that when I get too close to the object I sometimes get weird artifacts. It's as if a vertex outside a certain domain gets projected wrongly. – user6138759 Apr 21 '20 at 19:14
  • Are you possibly closer to your object than the frustum allows? – Tuebel Apr 23 '20 at 05:44
  • No, it's not a frustum problem, because when I do the projection without distortion there are no problems. – user6138759 Apr 23 '20 at 07:04
  • I know that projectPoints in OpenCV has its limits: sometimes, when the projection of the undistorted points falls outside a certain ellipse around the image, the distortion gives weird results. So I think this is the same problem. – user6138759 Apr 23 '20 at 07:07