I have an OpenGL application that outputs stereoscopic 3D video to off-the-shelf TVs via HDMI, but it currently requires the display to support the pre-1.4a method of manually choosing the right format (side-by-side, top-bottom, etc.). However, I now have to support a device that ONLY accepts HDMI 1.4a 3D signals, which, as I understand it, is some kind of packet sent to the display that tells it what format the 3D video is in.

I'm using an NVIDIA Quadro 4000, and I would like to know if it's possible to output my video (or tell the video card how to) in a way that a standard 3DTV will detect the correct format automatically, the same as a 3D Blu-ray player or other 1.4a-compatible device, without having to manually select a 3D mode. Is this possible?
- I answered it here: http://stackoverflow.com/questions/6827737/how-do-i-output-3d-images-to-my-3d-tv/6828590#6828590 – datenwolf Aug 11 '11 at 23:14
- That question is not related to HDMI 1.4a; it only talks about using quad buffers, which I am not using. I am sending a left/right or top/bottom signal to the TV, but the TV needs to know somehow what format I am sending it, WITHOUT manual intervention (as described in the 1.4a standard). The video card has to give the display that information somehow, I just don't know how to do that or if it's possible. – bparker Aug 26 '11 at 05:12
- You're not supposed to do the frame stacking yourself. It's the task of the graphics card to generate an HDMI-1.4 frame-stacked signal from images rendered to quad buffers. The whole HDMI-1.4 thing is completely irrelevant to application programmers. It's a thing driver developers and electrical engineers have to care about. Not you. – datenwolf Aug 26 '11 at 08:15
- @datenwolf When I render to quad-buffers, the graphics card does not output an HDMI 1.4 signal... any idea what I'm doing wrong? – bparker Apr 07 '12 at 15:29
- Possible duplicate of [How do I output 3D images to my 3D TV?](https://stackoverflow.com/questions/6827737/how-do-i-output-3d-images-to-my-3d-tv) – user2284570 Jul 16 '17 at 10:19
3 Answers
I don't see a direct answer to the question here.

HDMI 1.4a defines metadata (carried in the HDMI Vendor Specific InfoFrame) that describes the 3D format: HDMI_Video_Format = 010 signals 3D content, and the 3D_Structure field selects the layout (0000 = frame packing, 0110 = top-and-bottom, 1000 = side-by-side half).

But if the driver doesn't expose an API for setting that, you need to change its code (assuming it's open source or you otherwise have access to it).
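For illustration only (this is not something you can do from OpenGL; it is what the driver/hardware has to emit), here is a rough sketch of how such an InfoFrame is laid out. The helper function and buffer handling are made up; only the byte layout follows my reading of the 1.4a spec:

```c
#include <stddef.h>
#include <stdint.h>

enum hdmi_3d_structure {
    HDMI_3D_FRAME_PACKING     = 0x0,
    HDMI_3D_TOP_AND_BOTTOM    = 0x6,
    HDMI_3D_SIDE_BY_SIDE_HALF = 0x8,
};

/* Build an HDMI Vendor Specific InfoFrame announcing a 3D layout.
 * Returns the total number of bytes written (header + payload). */
static size_t build_hdmi_vsi_3d(uint8_t *buf, enum hdmi_3d_structure s)
{
    uint8_t *p = buf;

    *p++ = 0x81;             /* packet type: Vendor Specific InfoFrame */
    *p++ = 0x01;             /* version 1                              */
    *p++ = 0x00;             /* payload length, patched below          */
    *p++ = 0x00;             /* checksum, patched below                */

    *p++ = 0x03;             /* IEEE OUI 0x000C03, least significant   */
    *p++ = 0x0C;             /*   byte first                           */
    *p++ = 0x00;
    *p++ = 0x2 << 5;         /* HDMI_Video_Format = 010b -> 3D present */
    *p++ = (uint8_t)s << 4;  /* 3D_Structure in the upper nibble       */
    if (s >= HDMI_3D_SIDE_BY_SIDE_HALF)
        *p++ = 0x0;          /* 3D_Ext_Data nibble (sub-sampling
                                method; exact value per the spec)      */

    buf[2] = (uint8_t)(p - buf - 4);   /* payload length               */

    uint8_t sum = 0;                   /* InfoFrame checksum: all      */
    for (uint8_t *q = buf; q < p; q++) /* bytes must sum to 0 mod 256  */
        sum += *q;
    buf[3] = (uint8_t)(0x100 - sum);

    return (size_t)(p - buf);
}
```

Whether you can get this packet onto the wire from user space depends entirely on the driver, which is why the caveat above matters.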

If your drivers allow it, you can create a quad-buffer stereo rendering context. This context has two back buffers and two front buffers, one pair for the left eye and one pair for the right. You render to one back buffer (GL_BACK_LEFT), then the other (GL_BACK_RIGHT), then swap them with the standard swap function.
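A minimal sketch of that per-frame flow, assuming a quad-buffered context is already current; draw_scene() and the eye enum are placeholders for your own code:

```c
#include <GL/gl.h>

enum eye { LEFT_EYE, RIGHT_EYE };
void draw_scene(enum eye e);   /* your renderer, called once per eye */

void render_stereo_frame(void)
{
    glDrawBuffer(GL_BACK_LEFT);                          /* left eye  */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(LEFT_EYE);

    glDrawBuffer(GL_BACK_RIGHT);                         /* right eye */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(RIGHT_EYE);

    /* Present both eyes together: SwapBuffers(hdc) on Windows,
     * glXSwapBuffers(dpy, win) on X11. */
}
```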
Creating a QBS context requires platform-specific coding. If you're on Windows, you need to pick a pixel format with quad-buffers.
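On Windows that boils down to asking for PFD_STEREO when you pick the pixel format. A sketch, with window creation and error handling omitted (hdc is assumed to be the device context of your GL window):

```c
#include <windows.h>

/* Request a quad-buffered (stereo) pixel format for the given DC.
 * Returns TRUE only if the driver actually granted stereo. */
BOOL setup_stereo_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                     PFD_DOUBLEBUFFER   | PFD_STEREO;    /* quad-buffered */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0 || !SetPixelFormat(hdc, fmt, &pfd))
        return FALSE;

    /* The driver may silently hand back a non-stereo format, so verify. */
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
    return (pfd.dwFlags & PFD_STEREO) != 0;
}
```

If the PFD_STEREO flag comes back cleared, the driver refused quad-buffering; on X11 the equivalent is asking for GLX_STEREO in the visual/FBConfig attributes.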
This is only possible if your drivers allow it. They may not. And if they don't, there is nothing you can do.

- I do not see how quad-buffering is related at all. HDMI 1.4a says that you have to specify the exact layout of your frames (side-by-side, top-bottom, frame sequential, etc.) inside packets that are sent to the monitor, which I assume only the video card driver gets to do. – bparker Aug 26 '11 at 05:09
- @bparker: All that HDMI 1.4a stuff is irrelevant if you can't actually render to multiple buffers so that the driver can then bundle them up the way that HDMI needs them to be. Or to put it another way: if you can make a QBS context, then the driver will figure out how to get it onto the screen. If you can't make a QBS context, then you will _not_ be able to get quad-buffering. Not through OpenGL. – Nicol Bolas Aug 26 '11 at 06:03
- @NicolBolas Simply rendering the left/right eye views via quad-buffering does not seem to be enough for the nvidia driver to output an HDMI 1.4 signal... any idea what I'm doing wrong? – bparker Apr 07 '12 at 15:28
If your OpenGL application happens to use a sufficiently simple subset of OpenGL, the following might work:

- Okay, so first you're advising to lose cross-platformness, and second to even lose cross-GPU'ness. Great answer. – Hi-Angel Mar 02 '16 at 12:52