I am working on a WPF application that gets monitor information for all attached displays. To do some quick unit testing, I spun up a console application project in the same solution. Here is the code I use to get the monitor info; it lives in a class library and is called the exact same way by both projects (a sketch of the call site follows the code).
[DllImport("user32.dll")]
static extern bool GetMonitorInfo(IntPtr hmon, ref MonitorInfo mi);

[DllImport("user32.dll")]
static extern bool EnumDisplayMonitors(IntPtr hdc, IntPtr lprcClip, MonitorEnumDelegate lpfnEnum, IntPtr dwData);

public DisplayInfoCollection GetDisplays()
{
    DisplayInfoCollection col = new DisplayInfoCollection();

    EnumDisplayMonitors(IntPtr.Zero, IntPtr.Zero,
        delegate(IntPtr hMonitor, IntPtr hdcMonitor, ref Rect lprcMonitor, IntPtr dwData)
        {
            MonitorInfo mi = new MonitorInfo();
            mi.size = (uint)Marshal.SizeOf(mi);
            bool success = GetMonitorInfo(hMonitor, ref mi);
            if (success)
            {
                DisplayInfo di = new DisplayInfo();
                di.ScreenWidth = (mi.monitor.right - mi.monitor.left);
                di.ScreenHeight = (mi.monitor.bottom - mi.monitor.top);
                di.MonitorArea = new Rectangle(mi.monitor.left, mi.monitor.top, di.ScreenWidth, di.ScreenHeight);
                di.Availability = mi.flags.ToString();
                col.Add(di);
            }
            return true;
        }, IntPtr.Zero);

    return col;
}
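The call site in both projects looks essentially like this (illustrative sketch only; "DisplayDetector" is a placeholder name for whatever type in the class library exposes GetDisplays(), not the real class name):

// "DisplayDetector" is a stand-in for the class library type that exposes
// GetDisplays(); the WPF app and the console app both call it this way.
var detector = new DisplayDetector();
foreach (var di in detector.GetDisplays())
{
    Console.WriteLine(di.ScreenWidth + "x" + di.ScreenHeight + " (" + di.Availability + ")");
}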
I noticed that the dimensions reported for my laptop screen (I have a laptop plus a secondary display) are different between the WPF app and the console app.
Laptop Resolution
- WPF - 1920x1080
- Console Application - 1536x864
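For what it's worth, the two readings differ by exactly the same factor in both dimensions: 1536 × 1.25 = 1920 and 864 × 1.25 = 1080, which is exactly what a 125% display-scaling factor would produce.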
The resolution shown in my Display --> Screen Resolution settings matches the WPF value. I imagine this has something to do with DPI, but I am confused as to why, since I have not set any DPI awareness on either application. Are WPF apps DPI aware by default to some extent? If not, what can I do to get the same output no matter what the consumer is?
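For reference, here is a minimal sketch of how I could check what DPI each process actually sees, using plain GDI (GetDC/GetDeviceCaps with the standard LOGPIXELSX index, 88). This is not part of the class library; it is just to illustrate what I mean by a process being told a different DPI:

// Minimal DPI check (sketch): a process that is not DPI aware is told 96
// regardless of the scaling setting, while a DPI-aware process sees the
// real value (e.g. 120 at 125% scaling).
[DllImport("user32.dll")]
static extern IntPtr GetDC(IntPtr hWnd);

[DllImport("user32.dll")]
static extern int ReleaseDC(IntPtr hWnd, IntPtr hDC);

[DllImport("gdi32.dll")]
static extern int GetDeviceCaps(IntPtr hdc, int nIndex);

const int LOGPIXELSX = 88; // horizontal logical DPI

static int GetScreenDpi()
{
    IntPtr hdc = GetDC(IntPtr.Zero); // DC for the whole screen
    try
    {
        return GetDeviceCaps(hdc, LOGPIXELSX);
    }
    finally
    {
        ReleaseDC(IntPtr.Zero, hdc);
    }
}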