
Is it possible to "look at" an OpenGL scene which was rendered at e.g. 60 degrees vertical FoV through a frustum/corridor/hole that has a smaller FoV - and have that fill the resulting projection plane?

I think that's not the same thing as "zooming in".

Background:

I'm struggling with the OpenGL transform pipeline for an optical see-through AR display. It could be that my understanding of what the transform pipeline of my setup really needs is mixed up... I'm creating graphics that are meant to appear properly located in the real world when overlaid through AR glasses. The glasses are properly tracked in 3D space. For rendering, I'm using OpenGL's legacy fixed-function pipeline. Results are good, but I keep struggling with registration errors that seem to have their root in my combination of glFrustum() plus gluLookAt() not recreating the "perspective impression" correctly.

These AR displays usually don't fill the wearer's entire field of view; instead, the display area appears as a smaller "window" floating in space, usually ~3-6 feet in front of the user, pinned to head movement.

In OpenGL, I use a layout very similar to Bourke's where (I hope I summarize it correctly) the display's aspect ratio (e.g. 4:3), given by windowwidth and windowheight, defines the vertical field of view. So FoV forms a fixed link with the window dimensions and the "transform frustum" used by OpenGL - while I seem to need to combine two frustums (?).
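For clarity, here is a minimal sketch of that link - a symmetric glFrustum() derived from a vertical FoV and the window's aspect ratio (fovY_deg, aspect, zNear and zFar are placeholder names of mine, not Bourke's):

```c
#include <GL/gl.h>
#include <math.h>

/* Symmetric frustum from a vertical FoV (degrees) and aspect = width/height. */
void setSymmetricFrustum(double fovY_deg, double aspect,
                         double zNear, double zFar)
{
    double top   = zNear * tan(fovY_deg * M_PI / 360.0); /* = zNear * tan(fovY/2) */
    double right = top * aspect;                         /* e.g. aspect = 4.0/3.0 */

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-right, right, -top, top, zNear, zFar);
}
```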

My understanding is that the OpenGL scene must be rendered with parameters equivalent to "parameters" of the human eye in order to match up - as the AR glasses allow the user to look through. Let's assume the focal length of the human eye is 22mm (Clark, R.N. Notes on the Resolution and Other Details of the Human Eye. 2007.) and the eye's "sensor size" is 16mm w x 13mm h (my estimate). The calculated vertical FoV is then ~33 degrees - which we feed into the OpenGL pipeline.
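That number comes from a simple pinhole model, vertical FoV = 2 * atan(sensorHeight / (2 * focalLength)); a quick sanity check with the values above:

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    double focal_mm  = 22.0; /* focal length per Clark's estimate   */
    double height_mm = 13.0; /* my estimated vertical "sensor" size */

    /* pinhole model: vFoV = 2 * atan(h / (2 * f)) */
    double fovY_deg = 2.0 * atan(height_mm / (2.0 * focal_mm)) * 180.0 / M_PI;
    printf("vertical FoV: %.1f degrees\n", fovY_deg); /* prints ~32.9 */
    return 0;
}
```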

The output of such a pipeline is that I get either the application window filled with this "view", or a scaled-down version of it, depending on my glViewport() settings.
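i.e. something along these lines (a sketch; windowWidth/windowHeight stand for the application window's pixel size):

```c
#include <GL/gl.h>

void presentView(int windowWidth, int windowHeight, int scaledDown)
{
    if (!scaledDown)
        /* fill the entire application window with the rendered view */
        glViewport(0, 0, windowWidth, windowHeight);
    else
        /* same view, mapped into the lower-left quarter of the window */
        glViewport(0, 0, windowWidth / 2, windowHeight / 2);
}
```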

But as the AR glasses need input for only a sub-section, a "smaller window", of the wearer's whole field of view, I think I need a way to "look at" a smaller sub-area of the whole rendered scene - as if I were looking through a tiny hole onto the scene. These glasses, with their "display window", provide a vertical field of view of just under 20 degrees - but feeding that into the OpenGL pipeline would be wrong. So how can I combine these conflicting FoVs? ...or am I on the wrong track here?

isync
    "These glasses, with their "display window", provide a vertical field of view of around under 20 degrees - but feeding that into the OpenGL pipeline would be wrong." What makes you think so? You have to render the perspective with a FoV which matches the FoV which is displayed, the FoV of the eye is bigger because the glasses' displays don't cover the full area, and the "clipping area" you seek in the question is actually exactly this change in the FOV angle. "Cropping" a perspectively projected image _is_ a change in FoV. – derhass Nov 16 '20 at 14:34
  • @derhass ..because I tried both and both felt wrong. Admittedly, it could be something else producing these registration errors I'm seeing, but whatever FoV I tried, so far I wasn't able to match actual perspective with my overlay - and thought there was a more general design flaw. .. Also: I'm pondering whether the offset of the "virtual view plane" in front of the user has to be taken into account. So far the eye is the "pivot".. but shouldn't it be the actual display plane..? See, I'm getting more lost by the hour .. Are you talking from experience? Did you successfully implement such a system/design? – isync Nov 16 '20 at 17:27
  • The "virtual display plane" isn't really a thing in this context, as you want your objects to appear freely in 3D space. That plane is usually just a reference for the optical focus point, or just another means of expressing FoV in a more marketing-friendly way ("40 inch at 2m", which just means a tiny FoV). I have written software for VR powerwalls / CAVE environments as well as VR HMDs, and recently, I ported one of our apps to the Epson Moverio series AR glasses. – derhass Nov 16 '20 at 18:53
  • For a realistic AR overlay, the FoV is the least of your problems. You usually have to take the asymmetry of the projection, the user's _actual_ eye distance and the properties of the optics (distortion) into account, and this usually means you have to do some more or less elaborate calibration process; usually, the SDK for the particular AR solution does offer some insight on how to do that. – derhass Nov 16 '20 at 18:53
  • Also note that the link to Bourke you gave is totally valid, but only for cases where you look at a stereo display where the images for both eyes are presented on the very same display surface - like 3D TVs/projectors or 3D cinema. For HMDs / AR glasses, you typically have a separate display / display region per eye, and the actual field of view at which you look at each display is totally custom to the device, but assuming a symmetrical frustum usually is a good first bet. – derhass Nov 16 '20 at 19:02
  • If you apply Bourke's method there, you get complete crap, as the zero-parallax plane, which would be at the depth where the display is, would actually be at optical infinity for such HMDs / glasses. – derhass Nov 16 '20 at 19:03
  • @derhass Thank you so much!! Many important pointers in there already! For example, you've confirmed that an asymmetric frustum parallel axis projection is wrong for my use case. I had a hunch while studying Bourke's code but only found out after implementing it that in stereo view, on the display, it simply does not work. And got far better results with the simple symmetric frustum toe-in layout. Also the note about the irrelevance of the "virtual display plane" is gold. With that out of the way, I can now concentrate on fine-tuning FoV, Inter-Ocular Distance and implement some calibration. – isync Nov 16 '20 at 20:36
  • @derhass I've sent you an email. – isync Nov 16 '20 at 21:10
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/224642/discussion-between-derhass-and-isync). – derhass Nov 16 '20 at 21:43
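A small sketch of the relation derhass describes above - cropping a centered sub-region of a perspective render is the same as rendering with a correspondingly smaller FoV; the function name and numbers are illustrative only:

```c
#include <math.h>

/* For a centered crop: tan(fovCrop/2) = (cropH/imageH) * tan(fovFull/2). */
double croppedFovDeg(double fovFull_deg, double imageH_px, double cropH_px)
{
    double halfFullRad = fovFull_deg * M_PI / 360.0;  /* fovFull/2 in radians */
    double halfCropRad = atan((cropH_px / imageH_px) * tan(halfFullRad));
    return 2.0 * halfCropRad * 180.0 / M_PI;
}

/* e.g. croppedFovDeg(33.0, 768.0, 457.0) ~= 20.0 degrees */
```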

0 Answers