During initialization I would like to determine the maximum possible windowed size on the current display, so I can check whether the saved window settings are valid. The only way I could figure out to do this is to create a temporary maximized window and call SDL_GetWindowSize(). This works as expected on macOS, but fails on Linux.
Here is my code:
int max_width, max_height;
SDL_Window* test_window = SDL_CreateWindow(
    "test_window",
    SDL_WINDOWPOS_CENTERED_DISPLAY( current_display_id ),
    SDL_WINDOWPOS_CENTERED_DISPLAY( current_display_id ),
    20, 10,  // deliberately tiny; no display is this small
    SDL_WINDOW_RESIZABLE | SDL_WINDOW_MAXIMIZED
);
SDL_GetWindowSize( test_window, &max_width, &max_height );
On macOS, max_width and max_height are properly set to the maximum windowed size. On Linux, however, they are set to 20 and 10, the width and height I passed to SDL_CreateWindow().
Could anyone tell me why this doesn't work on Linux? Is there a better way to accomplish the same thing?
Edit: Maybe I wasn't clear enough about what I mean by "maximum windowed size". Most operating systems have some kind of taskbar/dock that limits the usable area of the display, so the maximum windowed size is not the same as the display size obtained with SDL_GetDesktopDisplayMode(). See the sketch below for what I mean.
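
To make the distinction concrete, here is a minimal standalone sketch. It assumes SDL 2.0.5 or newer (SDL_GetDisplayUsableBounds() is only available from that version) and uses display index 0 as a stand-in for current_display_id. It prints the full display size alongside the usable desktop area SDL reports; as far as I can tell, neither value accounts for window decorations, which is why I am asking whether measuring a maximized window is the right approach.

/* Sketch: full display size vs. usable desktop area.
   Assumptions: SDL 2.0.5+, display index 0 instead of current_display_id. */
#include <SDL.h>
#include <stdio.h>

int main( int argc, char* argv[] )
{
    (void)argc; (void)argv;

    if ( SDL_Init( SDL_INIT_VIDEO ) != 0 ) {
        fprintf( stderr, "SDL_Init failed: %s\n", SDL_GetError() );
        return 1;
    }

    // Full resolution of the display, including area covered by taskbars/docks.
    SDL_DisplayMode mode;
    if ( SDL_GetDesktopDisplayMode( 0, &mode ) == 0 ) {
        printf( "Full display size:   %d x %d\n", mode.w, mode.h );
    }

    // Desktop area excluding taskbars/docks (SDL 2.0.5+).
    SDL_Rect usable;
    if ( SDL_GetDisplayUsableBounds( 0, &usable ) == 0 ) {
        printf( "Usable desktop area: %d x %d\n", usable.w, usable.h );
    }

    SDL_Quit();
    return 0;
}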