In my opinion, a twip is a unit of measurement comparable to mm, cm, or inches. I thought it would be possible to use this unit to produce output on the screen that can be measured with a ruler.
In this example, my Windows system tells me the DPI of the monitor is 96. The monitor has a resolution of 3840 px x 1600 px, which is its native resolution. The output scaling factor is 100%.
Now I put a button with a width of 1890 twips on the screen. I measured the displayed width, which is in fact 126 px. This matches the conversion 1890 twips = 126 px * (1440 twips/inch / 96 px/inch).
When I convert 1890 twips to centimeters, I get roughly 3.33 cm. I hoped to find this measure on the screen, but I failed: when I measure the button width on the screen with a ruler, I get 2.85 cm.
So something is wrong here. The system is lying about the true DPI of my monitor, which is in fact 3.33 cm / 2.85 cm * 96 DPI = 112.16 DPI.
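To make the arithmetic reproducible, here is a minimal sketch of the conversions described above (the constants 1440 twips/inch and 2.54 cm/inch are standard; the measured 2.85 cm and the reported 96 DPI are the values from my setup):

```python
TWIPS_PER_INCH = 1440   # by definition
CM_PER_INCH = 2.54      # by definition

reported_dpi = 96       # what Windows reports for the monitor
twips = 1890            # requested button width

# Twips -> device pixels, using the DPI the system reports
px = twips / TWIPS_PER_INCH * reported_dpi
print(px)               # 126.0 px, matching the on-screen pixel count

# Twips -> centimeters, the size I expected to measure with a ruler
expected_cm = twips / TWIPS_PER_INCH * CM_PER_INCH
print(round(expected_cm, 2))   # ~3.33 cm

# What I actually measured on the glass
measured_cm = 2.85

# The physical DPI implied by the measurement:
# 126 px spread over 2.85 cm of glass
physical_dpi = px / (measured_cm / CM_PER_INCH)
print(round(physical_dpi, 1))  # ~112.3 DPI, not the reported 96
```

The small difference from the 112.16 figure above comes from rounding 3.33375 cm down to 3.33 cm before dividing.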
Can anybody explain these results? To achieve WYSIWYG, I thought I could measure directly from the screen.