How would it be possible to exclusively drive the HDMI output from an application, without allowing the OS to automatically configure it for display output?
For example, using the standard DVI/VGA port as the primary display, but sending MPlayer video output to the HDMI port via a device file.
It's a difficult question to answer via Google. Almost every result has to do with making audio work over HDMI.
Edit:
A comment below mentioned using separate Xorg servers. Although this is a helpful idea, it doesn't answer one question that I asked, and one that I implied:
1) How do I keep Linux from putting the console on that display when it loads before the other display, or when it's the only display (with logins happening only over SSH)?

2) What if there is no X at all? I want to drive graphics to the adapter directly. Can I do this from code using standard functionality, without interacting with the drivers directly (probably outdated, but something like SVGALib or another non-X graphics layer)?
Edit 2:
I looked at SVGALib (which is old) and SDL. The latter works both inside and outside of X, and even provides access to OpenGL. I found version 1.3 via a forum link somewhere, but both the website and the FTP only seem to have up to 1.2. SDL is a beautiful solution in general, but it has the following two specific disadvantages:
1) The general create-device call accepts a device index, but completely ignores it:
(src/video/bwindow/SDL_bvideo.cc)
BE_CreateDevice(int devindex)
The driver-specific call seems to have the same flaw. For example, DirectFB (which, I assume, provides graphics under the console):
(src/video/directfb/SDL_DirectFB_video.c)
DirectFB_CreateDevice(int devindex)
Neither function body appears to have any existing place to apply the device index, either, no doubt because the common interface they're built against doesn't support it.
2) On whatever adapter happens to be elected, SDL seems to automatically stitch all displays together. The example "testsprite2.c" (which ships with the library) accepts a "--display" parameter, which is processed in "common.c" (common functionality for all examples). You can see that all it does with the "--display" parameter is calculate the X/Y coordinate of that screen within one large, combined canvas:
if (SDL_strcasecmp(argv[index], "--display") == 0) {
    ++index;
    if (!argv[index]) {
        return -1;
    }
    state->display = SDL_atoi(argv[index]);
    if (SDL_WINDOWPOS_ISUNDEFINED(state->window_x)) {
        state->window_x = SDL_WINDOWPOS_UNDEFINED_DISPLAY(state->display);
        state->window_y = SDL_WINDOWPOS_UNDEFINED_DISPLAY(state->display);
    }
    if (SDL_WINDOWPOS_ISCENTERED(state->window_x)) {
        state->window_x = SDL_WINDOWPOS_CENTERED_DISPLAY(state->display);
        state->window_y = SDL_WINDOWPOS_CENTERED_DISPLAY(state->display);
    }
    return 2;
}
So there's no way to isolate one display from another if they're on the same adapter. SDL will not work.
Unless there's a comparable alternative to SDL, or it turns out to be trivial to apply the device index (devindex) in the appropriate place (which is probably not the case, and probably the reason it was left unimplemented), the best option for exclusive, completely dedicated use of the screen seems to be running a separate Xorg instance bound to the second device and writing your own window manager for it.
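For the separate-Xorg-instance route, the usual shape is a dedicated config that pins the second server to one card. The following is only a sketch: the driver name, BusID, file path, and VT number are all assumptions (get the real BusID from lspci), not a tested configuration:

```
# /etc/X11/xorg.hdmi.conf  (hypothetical path)
Section "Device"
    Identifier "HDMIHead"
    Driver     "radeon"          # assumption: substitute your card's driver
    BusID      "PCI:1:0:0"       # assumption: take the real ID from `lspci`
EndSection

Section "Screen"
    Identifier "HDMIScreen"
    Device     "HDMIHead"
EndSection

Section "ServerLayout"
    Identifier "HDMIOnly"
    Screen     "HDMIScreen"
EndSection

# Then, roughly:
#   X :1 vt8 -config xorg.hdmi.conf &
#   DISPLAY=:1 mplayer -fs video.mkv
```

With that in place, anything started with DISPLAY=:1 lands on the second server only, and the primary display's session never touches it.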