50

I am struggling to understand exactly what the point size in UIFont means. It's not pixels, and it doesn't appear to match the standard definition of a point, which is 1/72 of an inch.

I worked out the pixel size of the system font at various point sizes using `-[NSString sizeWithFont:]` and got the following:

| Point Size | Pixel Size |
| ---------- | ---------- |
| 10.0       | 13.0       |
| 20.0       | 24.0       |
| 30.0       | 36.0       |
| 40.0       | 47.0       |
| 50.0       | 59.0       |
| 72.0       | 84.0       |
| 99.0       | 115.0      |
| 100.0      | 116.0      |

(I did [@"A" sizeWithFont:[UIFont systemFontOfSize:theSize]])

And looking at the 72.0 point size, that is not one inch: this is on a device with a DPI of 163, so one inch would be 163.0 pixels, right?

Can anyone explain what a "point" in UIFont terms actually is? That is, is my method above wrong, and if I measured something else would I find that some aspect of the font really is 163 pixels at 72 points? Or is a point defined relative to something else entirely?
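For completeness, here's a minimal sketch of the loop I used to produce the table above (the array of sizes and the logging are just for illustration):

```objc
// Sketch of the measurement behind the table. sizeWithFont: reports UIKit
// points; on this non-retina device the screen scale is 1.0, so points and
// pixels coincide.
NSArray *pointSizes = @[@10.0, @20.0, @30.0, @40.0, @50.0, @72.0, @99.0, @100.0];
for (NSNumber *pointSize in pointSizes) {
    UIFont *font = [UIFont systemFontOfSize:pointSize.doubleValue];
    CGSize size = [@"A" sizeWithFont:font];
    NSLog(@"%.1f pt -> %.1f x %.1f", pointSize.doubleValue, size.width, size.height);
}
```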

mattjgalloway
  • Did you check this already: http://stackoverflow.com/questions/1059101/font-size-in-pixels ? – tiguero Aug 02 '12 at 11:56
  • Yup and if you read that you'll see why I expected the value for 72 in my table above to come out as 163 pixels. But it doesn't, it's *way* off. – mattjgalloway Aug 02 '12 at 11:59
  • I don't follow your argument; measuring the size, in pixels, of the letter "A" in a 72-point font just shows that it takes up approximately half an inch. You seem to assume that the letter "A" should take up an inch of screen. – trojanfoe Aug 02 '12 at 13:05
  • 1
    The height in pixels from `sizeWithFont:` includes the ascender and descender, so it's the whole line height in pixels. Also, my question still stands - what is the relationship between point size and anything real world. Again note that the point and pixel size seem to be related in no sensible way. – mattjgalloway Aug 02 '12 at 13:27
  • 1
    **CRITICAL** ... http://stackoverflow.com/a/38315462/294884 – Fattie Jul 13 '16 at 18:40

5 Answers

13

A font has an internal coordinate system (think of it as a unit square) within which a glyph's vector coordinates are specified at whatever arbitrary size accommodates all the glyphs in the font, plus or minus any margin the font designer chooses.

At 72.0 points the font's unit square is one inch. Glyph x of font y has an arbitrary size in relation to this inch square. Thus a font designer can make a font that appears large or small in relation to other fonts. This is part of the font's 'character'.

So, drawing an 'A' at 72 points tells you that it will be twice as high as an 'A' drawn at 36 points in the same font - and absolutely nothing else about what the actual bitmap size will be.

That is, for a given font the only way to determine the relationship between point size and pixels is to measure it.
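If you want to poke at that internal coordinate system yourself, Core Text exposes it; here's a rough sketch (the exact numbers are, of course, font dependent):

```objc
#import <CoreText/CoreText.h>

// The em square is unitsPerEm font units on a side; at 72.0 points it maps
// onto 72 UIKit points (one logical inch). The ascent/descent values below
// are already scaled to the requested point size.
CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 72.0, NULL);
unsigned unitsPerEm = CTFontGetUnitsPerEm(font);
CGFloat ascent  = CTFontGetAscent(font);
CGFloat descent = CTFontGetDescent(font);
CGFloat leading = CTFontGetLeading(font);
NSLog(@"unitsPerEm = %u, ascent = %.1f, descent = %.1f, leading = %.1f",
      unitsPerEm, ascent, descent, leading);
CFRelease(font);
```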

hooleyhoop
  • 3
    All good explanations, but not what I am seeing. Notice that doubling the point size does not double the pixel size, which is contrary to what you're saying. Also, the size measured is the entire line height from what I can see, which I assumed is what the one inch in 72 points per inch refers to? What I measure is ~0.5 inches at 72 point font on an iPhone. – mattjgalloway Aug 02 '12 at 20:52
  • If anything is unclear please let me know, but no - 72 point/inch has nothing to do with line height, except relatively, as I have tried to explain. Also, allowing for anti-aliasing (±2 pixels) (remember it is the vector artwork size that is doubled), your table does show that doubling point size doubles pixel size. I don't understand how you are interpreting this differently. – hooleyhoop Aug 02 '12 at 21:20
  • 1
    I still don't quite understand what you mean. You say 72 point/inch has nothing to do with line height, but what *does* 72 point/inch relate to then? Also, notice the value for 99.0 - you'd expect it to be close to the one at 10 point multiplied by 10 by your explanation, but it's way off. – mattjgalloway Aug 03 '12 at 08:18
  • In fact I added one for 100.0 point and that is also way off the size at 10 point multiplied by 10. – mattjgalloway Aug 03 '12 at 08:25
  • As I said, 72 point/inch relates to the transform of the font's interior coordinate system. At 72 points, 1 font unit (the font coordinate system doesn't have a name) equals 72 points. The important bit is that there is no way to go from this information (I know that one font unit will be 72 points) to knowing what the pixel size of a glyph will be - that is entirely font dependent. Did you try different fonts? Try Geneva and Courier. – hooleyhoop Aug 03 '12 at 08:50
  • Also, at 13 pixels high, the height of your 10pt string is perhaps 20-30% anti-aliasing artefacts, which you are multiplying by 10, so no, 100pt won't be 130px high. You don't seem to get that it is the vector description that is scaled - then this is rasterized. I bet the size at 100pt is within the range (13 ± 2.5) × 10, i.e. 105-155px, right? Why don't you compare 50pt and 100pt (please add this to the table)? – hooleyhoop Aug 03 '12 at 09:02
  • Ah right yep that does make more sense now, thanks. OK so there really is no relation to physical height when rendered, it's basically just points are another unit of measure which are proportional to pixel size (give or take anti-aliasing) but the proportion is just system dependent and not defined at all? – mattjgalloway Aug 03 '12 at 09:08
  • No, points are a unit of measurement absolutely related to pixel size and the relationship is clearly defined (this is your screen resolution). It's just that the relationship between font size and character size is font dependent. 72pt "Lol" in Verdana is a different size than 72pt "Lol" in Courier. – hooleyhoop Aug 03 '12 at 09:20
  • I'm sorry then but I don't understand. What I want to know is, what can I absolutely guarantee about the pixel size of a 72 point font on a 163 DPI screen? Be that a certain character, a certain metric of the font, or whatever. What is it that is guaranteed to be a certain size? – mattjgalloway Aug 03 '12 at 09:39
  • OK, with a 72 point font on a 163 dpi screen, a unit square drawn in the font's internal coordinate system, before rasterisation, will be exactly 163 pixels high. Guaranteed. – hooleyhoop Aug 03 '12 at 09:59
  • Ok excellent. So why is it that Helvetica's line height comes out at 84 pixels? I don't see how any glyph is going to draw *that* much outside of the line such that the font's "unit square" is 1 inch. Put another way - what is the font's unit square defined as? – mattjgalloway Aug 03 '12 at 10:44
  • The unit square is the coordinate system 0 to 1 on the x-axis and 0 to 1 on the y-axis. A vertical line specified with coords (0.0,0.0) to (0.0,0.5) in a 72pt font on a 163dpi screen will be roughly (plus or minus font smoothing and anti-aliasing) 84 pixels high. – hooleyhoop Aug 03 '12 at 11:02
  • Ok, makes sense. So it's just that Helvetica decides that a "line" should be as high as ~0.5 font units then? It seems very small to me you see. But if that's the case, then that's the case. Also, is there anywhere I can find out the value of the line height in font units? – mattjgalloway Aug 03 '12 at 12:57
6

I am not sure how `-[NSString sizeWithFont:]` measures the height. Does it use the line height or the distance between the extremes of the Bézier curves? What text did you use?

I believe `-[UIFont lineHeight]` would be a better way to measure the height.

Edit: Also, note that none of the measurement methods return the size in pixels; they return the size in points. You have to multiply the result by `[UIScreen mainScreen].scale`.

Note the difference between the typographic points used when constructing the font and the points of the iOS default logical coordinate space. Unfortunately, the difference is not explained very clearly in the documentation.
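For example, something like this (a sketch; both `lineHeight` and the sizing methods report UIKit points, so converting to pixels is just a multiplication by the screen scale):

```objc
UIFont *font = [UIFont systemFontOfSize:72.0];
CGFloat scale = [UIScreen mainScreen].scale;   // 1.0 on non-retina, 2.0 on retina
CGFloat lineHeightInPoints = font.lineHeight;  // UIKit points, not pixels
CGFloat lineHeightInPixels = lineHeightInPoints * scale;
NSLog(@"line height: %.1f pt, %.1f px", lineHeightInPoints, lineHeightInPixels);
```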

Sulthan
  • `lineHeight` is also in points, so irrelevant to the question unfortunately. I used the text `@"A"`. But remember, that's also irrelevant to the question. – mattjgalloway Aug 02 '12 at 12:11
  • @mattjgalloway Edited the answer. There are two different units, both referred to as `point`. – Sulthan Aug 02 '12 at 12:19
  • But that's not right either surely? I know about points vs pixels in terms of the screen scale, this is very different. Notice my table of results - the ratio is tending toward 1.16 looking at the results as far as I went. Seems an odd ratio. – mattjgalloway Aug 02 '12 at 12:22
  • 1
    Weirdly the points => pixels of my results seems to be approximately `pixels = ROUND((1 + points) * (115 / 99))`. – mattjgalloway Aug 02 '12 at 12:28
4

I agree this is very confusing. I'll try to give some basic explanation here to make things clearer.

First, the DPI (dots per inch) measure comes from printing on physical paper, and so do fonts. The point was invented to describe the physical printed size of text, because an inch is too large a unit for typical text sizes; a point is 1/72 of an inch (the exact value evolved over history). So yes, if you are writing a document in Word or other word-processing software for printing, you will get text exactly one inch high if you use a 72pt font.

Second, the theoretical text height is usually different from the rendered strokes you can actually see with your eyes. The original idea of text height comes from the physical type used for printing: all letters are engraved on blocks that share the same body height, and it is that body height which the point size measures. However, depending on the letters and the font design, the actual visible part of the text may be a little shorter than the theoretical height. Helvetica Neue is actually very standard: if you measure from the top of a letter "k" to the bottom of a letter "p", it will match the font height.

Third, computer displays screwed up DPI, and the definition of the point along with it. The resolution of a computer display is described by its native pixels, such as 1024 x 768 or 1920 x 1080. Software doesn't actually care about the physical size of your monitor, because everything would be very fuzzy if it scaled screen content the way printing on paper does; the physical resolution simply isn't high enough to keep everything smooth and legible. Instead, software takes a very simple, blunt approach: a fixed DPI for whatever monitor you use. For Windows it's 96 DPI; for Mac it's 72 DPI. That is, no matter how many pixels actually make up an inch on your monitor, software just ignores it. When the operating system renders text at 72pt, it will always be 96px high on Windows and 72px high on Mac. (That's why Microsoft Word documents always look smaller on a Mac and you usually need to zoom to 125%.)

Finally, on iOS it's very similar. No matter whether it's an iPhone, iPod touch, iPad, or Apple Watch, iOS uses a fixed 72 DPI for non-retina screens, 144 DPI for @2x retina displays, and 216 DPI for the @3x retina display used on the iPhone 6 Plus.

Forget about the real inch. It only exists in physical printing, not on a display. For software displaying text on your screen, a point is just an artificial ratio to physical pixels.
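In code, that fixed ratio is just the screen's scale factor; a sketch (the 72-point value is only an example):

```objc
// A 72 pt tall box is always 72 logical points, whatever the physical size
// of the screen. Pixels are points multiplied by the scale factor:
//   1x: 72 pt -> 72 px,  2x: 72 pt -> 144 px,  3x: 72 pt -> 216 px
CGFloat pointHeight = 72.0;
CGFloat pixelHeight = pointHeight * [UIScreen mainScreen].scale;
NSLog(@"%.0f pt is %.0f px on this screen", pointHeight, pixelHeight);
```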

Richard Bao
3

I first wondered if this had something to do with the way CSS pixels are defined at 96 per "inch" while UI layout points are defined at 72 per "inch". (Where, of course, an "inch" has nothing to do with a physical inch.) Why would web standards factor into UIKit business? Well, you may note when examining stack traces in the debugger or crash reports that there's some WebKit code underlying a lot of UIKit, even when you're not using UIWebView. Actually, though, it's simpler than that.

First, the font size is measured from the lowest descender to the highest ascender in regular Latin text -- e.g. from the bottom of the "j" to the top of the "k", or for convenient measure in a single character, the height of "ƒ". (That's U+0192 "LATIN SMALL LETTER F WITH HOOK", easily typed with option-F on a US Mac keyboard. People used it to abbreviate "folder" way back when.) You'll notice that when measured with that scheme, the height in pixels (on a 1x display) matches the specified font size -- e.g. with [UIFont systemFontOfSize:14], "ƒ" will be 14 pixels tall. (Measuring the capital "A" only accounts for an arbitrary portion of the space measured in the font size. This portion may change at smaller font sizes; when rendering font vectors to pixels, "hinting" modifies the results to produce more legible onscreen text.)

However, fonts contain all sorts of glyphs that don't fit into the space defined by that metric. There are letters with diacritics above an ascender in eastern European languages, and all kinds of punctuation marks and special characters that fit in a "layout box" much larger. (See the Math Symbols section in Mac OS X's Special Characters window for plenty of examples.)

In the `CGSize` returned by `-[NSString sizeWithFont:]`, the width accounts for the specific characters in the string, but the height only reflects the number of lines. Line height is a metric specified by the font, and related to the "layout box" encompassing the font's largest characters.
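If you want to see those metrics for yourself, UIFont exposes them directly; a quick sketch (the values are font dependent, and the system font is just an example):

```objc
UIFont *font = [UIFont systemFontOfSize:14.0];
NSLog(@"pointSize  = %.2f", font.pointSize);
NSLog(@"ascender   = %.2f", font.ascender);   // baseline to top of tallest ascender
NSLog(@"descender  = %.2f", font.descender);  // baseline to lowest descender (negative)
NSLog(@"capHeight  = %.2f", font.capHeight);  // height of a flat capital like "H"
NSLog(@"xHeight    = %.2f", font.xHeight);
NSLog(@"lineHeight = %.2f", font.lineHeight); // roughly what sizeWithFont: reports per line
```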

rickster
  • Ok, great explanation. But I still don't quite fully understand what a "point" is in UIFont terms. What part of what font size will measure a certain size when on screen? I like the idea about ƒ though. I will play with that! – mattjgalloway Aug 03 '12 at 08:16
  • I tried an `ƒ` at 72.0 point and it wasn't 72.0 pixels high. Is it just a magic number at 14 where it works and then it diverges? Still begs the question - "what is a point". – mattjgalloway Aug 03 '12 at 08:30
  • A point as pertains to `UIFont` is the same as a point for elsewhere in UIKit geometry. How it corresponds to rendered text is what's variable. The "point size" of a font sets up a bounding height which is generally connected to the ascender-descender height of the glyphs (and a subrange of the total line height, some of which may be used by other glyphs). However, *font designers can do whatever they want within that geometry*, so glyph size may not match point size, and line height will always be (and should, within a given font, consistently be) a bit more than point size. – rickster Aug 03 '12 at 23:20
  • Thanks for this post; You may be interested in this question .. http://stackoverflow.com/questions/38203209/font-sizes-in-uiwebview-does-not-match-ios-is-72-96-magic-correction-best-sol @rickster – Fattie Jul 05 '16 at 12:24
0

The truth, as far as I have been able to ascertain, is that UIFont lies. All of UIKit takes liberties with fonts. If you want the truth you need to use Core Text, but in a lot of cases it will be slower! (So in the case of your pixel-height table, I think UIKit adds some sort of a + b·x factor, where x is the point size.)

So why does it do this? Speed! UIKit rounds things up and fiddles with spacing so that it can cache bitmaps. Or at least that was my takeaway!
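If you do need exact glyph geometry, Core Text will give you the actual ink bounds of a glyph rather than UIKit's layout metrics; a sketch (the font and character are arbitrary):

```objc
#import <CoreText/CoreText.h>

// Ink (vector) bounds of the glyph for 'A' at 72 pt, before any rounding
// or caching UIKit might do.
CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 72.0, NULL);
UniChar character = 'A';
CGGlyph glyph;
if (CTFontGetGlyphsForCharacters(font, &character, &glyph, 1)) {
    CGRect inkBounds = CTFontGetBoundingRectsForGlyphs(font, kCTFontOrientationDefault,
                                                       &glyph, NULL, 1);
    NSLog(@"ink bounds of 'A' at 72 pt: %@", NSStringFromCGRect(inkBounds));
}
CFRelease(font);
```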

idz