16

Whenever a non-integer pixel value is used for the border of an element, the browser simply truncates the value to an integer. Why is this the case?

I'm aware that the border will not actually take up part of a pixel, but these types of values are sometimes used in combination with others to form full pixels. For example, left and right borders of 1.6px each should increase the total width of the element by 3.2px, which renders as roughly 3 whole pixels. This works because the full value is stored in memory and used for calculations.

However, this seems to not be the case when rendering the border even though width, padding, and margin all behave correctly.

// Each div's class name doubles as the name of the CSS property it overrides,
// so that property's computed value can be read back along with the height.
var div = document.getElementsByTagName('div'),
    len = div.length,
    style;
for (var i = 0; i < len; i++) {
    style = getComputedStyle(div[i]);
    div[i].innerHTML = div[i].className + ': ' + style.getPropertyValue(div[i].className) + '<br>height: ' + style.getPropertyValue('height');
}
div {
    width: 300px;
    border: 1px solid black;
    padding: 50px 0;
    text-align: center;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
div.width {
    width: 300.6px;
}
div.padding-top {
    padding-top: 50.6px;
}
div.margin-top {
    margin-top: 0.6px;
}
div.border-top-width {
    border-top-width: 1.6px;
}
<div class="width"></div>
<div class="padding-top"></div>
<div class="margin-top"></div>
<div class="border-top-width"></div>

When tested, the code consistently produced the same results (disregarding exact precision) in most major browsers (Chrome, Firefox, Opera). The exceptions were Safari 5.1, which rendered padding and margin similarly to border (though that is probably just due to the version), and Internet Explorer, which calculated border-top-width correctly.

Width, padding, and margin were all remembered as decimal values, which allowed the padding to affect the height accordingly, but border was not; it was truncated to an integer. Why is this the case only with border? Is there any way to have the border value retained at full precision so that the true height of the element can be retrieved using JavaScript?

  • Why not just do a calculation? – StackSlave Jun 14 '15 at 23:49
  • @PHPglue The border value is in an included css file that is occasionally overwritten. To do a calculation, I would need to know what the border value is. The problem is that it returns 1px when it should return roughly 1.6px (in terms of the example). – Anonymous Jun 14 '15 at 23:51
  • Don't use decimals for pixels. – StackSlave Jun 14 '15 at 23:59
  • @PHPglue A blanket statement like that can't be definitively made. It's similar to saying "don't parse HTML with regex". Like I said, "these types of values are sometimes used in combination with others to form full pixels". There can be cases where it makes sense and cases where it's ridiculous. – Anonymous Jun 15 '15 at 00:21
  • Are you sure it's not just the typo in your example (border-top-width vs. border-left-width) that's causing this? I tested it in IE10, and sub-pixel border-widths work as expected. – m69's been on strike for years Jun 17 '15 at 06:20
  • @m69 Good catch. Must have accidentally switched it during my testing. Same problem though. – Anonymous Jun 17 '15 at 10:36
  • @Anonymous _"Width, padding, and margin all were remembered as decimal values and allowed for padding to affect height accordingly, but border was not. It was truncated to an integer."_ Not certain interpret question correctly ? Where was `border` value returned as truncated to an integer ? – guest271314 Jun 21 '15 at 03:10
  • @Anonymous `border-top-width` appear to return `1.6px` ? – guest271314 Jun 21 '15 at 03:13
  • [Relevant discussion between Anonymous and guest271314](http://chat.stackoverflow.com/rooms/81098/discussion-between-anonymous-and-guest271314) – Anonymous Jun 21 '15 at 03:42

2 Answers

9

The simple explanation is that the browser uses integers for border widths internally (or at least exposes them publicly as such).

An example of this is the source code of Chrome (Chromium) which in the file ComputedStyle.h defines all border-widths as integers (line 508):

[source code snapshot]
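
If you would rather observe this from script than from the source, a minimal probe along these lines can make the discrepancy visible (this is only a sketch, not part of the question's test case; the element and values are made up, and the exact output depends on the browser and version):

var probe = document.createElement('div');
// Reset the box so page-level div rules can't interfere, then give it a
// fractional top border.
probe.style.cssText = 'box-sizing: content-box; height: 100px; padding: 0; ' +
                      'border: 0; border-top: 1.6px solid black;';
document.body.appendChild(probe);

// The computed style may echo the authored value, while the laid-out box only
// grows by a whole pixel in engines that store border widths as integers.
console.log(getComputedStyle(probe).borderTopWidth);  // e.g. "1.6px" or "1px"
console.log(probe.getBoundingClientRect().height);    // e.g. 101 rather than 101.6

document.body.removeChild(probe);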

There is not much we can do about that. As for why: the W3C specification for CSS Backgrounds and Borders gives very little information about border widths. It only states that the value is a <line-width>, with no unit, type, or definition of how to treat it, except that the computed value is an absolute length and negative values are not allowed:

Value: <line-width>
[...]
Computed value: absolute length; ‘0’ if the border style is ‘none’ or ‘hidden’

And:

The lengths corresponding to ‘thin’, ‘medium’ and ‘thick’ are not specified, but the values are constant throughout a document and thin ≤ medium ≤ thick. A UA could, e.g., make the thickness depend on the ‘medium’ font size: one choice might be 1, 3 & 5px when the ‘medium’ font size is 17px or less. Negative values are not allowed.

The same information is found in the box model document with no new details.

As all values eventually end up as pixels (our screens are pixel devices), a border width that comes in through em, vw, %, etc. also seems to end up as an integer, with no sub-pixel handling.
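
A quick way to check that is the same kind of probe with the border declared in a relative unit (again just a sketch; whether the fraction survives depends on the engine):

var probe = document.createElement('div');
// 0.1em at a 16px font-size works out to 1.6px before any rounding.
probe.style.cssText = 'font-size: 16px; border-left: 0.1em solid black;';
document.body.appendChild(probe);

console.log(getComputedStyle(probe).borderLeftWidth); // e.g. "1px" rather than "1.6px"

document.body.removeChild(probe);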

Not even transforms (scale) seem to affect this in the browsers which use integers for border widths.

In the end, it seems to be up to the browser vendor how to treat these values (the reasons could be aesthetic, performance-related, or something else entirely; we can only guess).

  • I doubt it's performance since it already maintains float values for other properties. As for aesthetics, the pixel value wouldn't necessarily change how it *looks*, just how it is remembered. I can't quite see any good reason for the inconsistency between property handling, but this answer is quite good otherwise. I'll leave the bounty open just to see if any other answers come, but this looks to be the unfortunate truth. – Anonymous Jun 17 '15 at 12:47
  • It's probably not performance, but more developer habits. Developers tend to favour the most efficient data type (with good reasons). And since the developers didn't see a reason to include non-integer values for borders, they chose integers. – light Jun 18 '15 at 11:22
  • it could be an artifact of html3/4 `border=3` attributes, which only took an integer. – dandavis Jun 23 '15 at 21:29
1

As far as I can tell from a few simple tests, subpixel border widths work exactly as they should. I think a typo in your example ("border-top-width" vs. "border-left-width") may be the cause of the discrepancy. This example works as expected for me:

// Append each div's computed width to the label text already in its markup.
var div = document.getElementsByTagName("div");
for (var i = 0; i < div.length; i++)
{
    div[i].innerHTML += getComputedStyle(div[i]).getPropertyValue("width");
}
DIV
{
    width: 300px;
    border: 1px solid black;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
DIV.subpixel
{
    border-width: 1.23px;
}
<DIV>border-width: 1px<BR>width: </DIV>
<DIV CLASS="subpixel">border-width: 1.23px<BR>width: </DIV>