44

With the iPhone 4, the Retina display's resolution is so high that most people cannot distinguish the pixels from one another (supposedly). If this is the case, do apps that support the Retina display still need anti-aliasing to make fonts and images smooth, or is this no longer necessary?


Edit: I'm interested in more detailed information. Started a bounty.

Jake
  • 4,829
  • 2
  • 33
  • 44

8 Answers

34

There's no question at all: you still need to do the antialiasing mathematics, because of the complexity of curves, second-order curves, intersecting curves, and different types of joins.

(Note, too, that in the two years since this question appeared, Retina displays have become ubiquitous, and antialiasing is indeed done everywhere on every Retina display.)

Sure, straight lines (perhaps at 45 degrees) may conceivably test just as well without antialiasing in A/B tests. But just look at a shallower line, or at a curve whose slope keeps changing.

And wait, there's a knock-down argument here:

Don't forget that you can display typography really, really small on a Retina display!

One could say that you need antialiasing whenever letters are less than (let's say) 50 pixels high. Thus if you had a crappy 10-dot-per-inch display, but the letters were 80 feet high (9,600 pixels high), you would NOT need antialiasing. We've just proved you "don't need" antialiasing on a 10 ppi display.

Conversely, let's say Steve's next display has 1000 pixels per inch. You would STILL need antialiasing for very small type -- and any very small detail -- that is 50 pixels or less!

Furthermore: don't forget that the detail in type ... which is a vector image ... is infinite!

You might be saying, oh, the "body" of a Baskerville "M" looks fine with no antialiasing on a Retina display. Well, what about the curves of the serifs? What about the chipping on the ends of the serifs? And so on down the line.

Another way to look at it: OK, on your typical Mac display, you don't need antialiasing on flat lines, or maybe 45-degree lines. Further, on a Retina display you can get away with no antialiasing on maybe 22.5-degree lines, and even 12.25-degree lines.

But so what? If you add antialiasing on a Retina display, you can successfully draw ridiculously shallow lines, much shallower than on, for example, a pre-Retina MacBook display.

Once again, as in the previous example, say the next iPhone has one zillion pixels per inch. Still, adding antialiasing will let you have EVEN SHALLOWER good-looking lines. By definition, yes, it will always make things look better, because it will always improve detail.
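
To make the shallow-line argument concrete, here is a minimal Core Graphics sketch (an illustration added here, not part of the original answer; the function name is hypothetical) that strokes the same nearly horizontal line twice, once with antialiasing off and once with it on, so you can compare the two on a Retina screen:

```swift
import UIKit

// Renders one very shallow line twice: the top copy aliased, the bottom copy anti-aliased.
// Even at Retina densities the aliased copy shows visible steps where it crosses pixel rows.
func shallowLineComparison(size: CGSize = CGSize(width: 320, height: 40)) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { rendererContext in
        let ctx = rendererContext.cgContext
        ctx.setStrokeColor(UIColor.black.cgColor)
        ctx.setLineWidth(1)

        // A line that rises only 3 points over the full width: a very shallow angle.
        func strokeShallowLine(at y: CGFloat) {
            ctx.move(to: CGPoint(x: 0, y: y))
            ctx.addLine(to: CGPoint(x: size.width, y: y - 3))
            ctx.strokePath()
        }

        ctx.setShouldAntialias(false)   // hard, stepped edges
        strokeShallowLine(at: 12)

        ctx.setShouldAntialias(true)    // coverage-weighted "fuzz"
        strokeShallowLine(at: 28)
    }
}
```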

Note that the "eye resolution" business from the magazine articles is total and complete nonsense.

Even on, say, a 50 dpi display, you're only seeing a fuzzy amalgam created by the mathematics of the pixel display strategy.

If you don't believe this is so, look at this writing right now on your Mac and count the pixels in the letter "r". Of course, it's inconceivable that you could do that! You could maybe "resolve" pixels on a 10 dpi display. What matters is the mathematics of the fuzz created by the display strategy.

Antialiasing always creates "better fuzz," as it were. If you have more pixels to begin with, antialiasing just gives even better fuzz again. Again, simply consider even smaller features, and of course you'd want to antialias them.

That seems to be the state of affairs!

Fattie
  • 27,874
  • 70
  • 431
  • 719
  • I upvoted this answer. Someone else voted it down. I'd be interested to know why. – Jake Jan 19 '11 at 19:38
  • 3
    I hate those "strategic" downvotes; they go against the spirit of the forum. – Jake Jan 19 '11 at 19:57
  • 10
    There is no question that antialiasing will always produce more accurate results. The question is, at what resolution does the difference become indistinguishable to the human eye? There is a cost to doing antialiasing, and if it can be avoided you'll be better off. I'd agree with you that a very shallow angle line is the best test. – Mark Ransom Jan 20 '11 at 04:48
  • 1
    Nice answer. But practically speaking, will the end user be able to appreciate the effect of an anti-aliasing filter if the pixel density is considerable? (Won't the device, in this case the iPhone, perform preprocessing to improve sampling anyway?) – uncaught_exceptions Jan 20 '11 at 18:21
  • @Mark: I think that Joe's point is that it isn't the resolution that matters, it's the ratio between the size of the display and the resolution. In other words, what matters is the size of the pixel. – Bruno Brant Jan 20 '11 at 19:42
  • This is exactly the type of detailed information I was seeking. Thanks. – Jake Jan 20 '11 at 20:05
  • @Bruno, the term "resolution" can refer to either the total number of pixels or the pixels (dots) per inch. In this case I was referring to DPI. I've been using the term this way for over 20 years, but it seems the other definition is more common - I'll have to watch myself in future. – Mark Ransom Jan 22 '11 at 05:27
  • @Joe Blow, I agree with your answer for the iPhone 4, hence the upvote from me. But if a screen actually had a gazillion pixels per inch, no human could actually tell the difference. Our universe also sets a theoretical upper limit to the value of antialiasing: http://xkcd.com/878/ (Look for the Matryoshka Limit in the image.) – Prof. Falken Apr 07 '11 at 06:13
  • @Amiga - hi Amiga. You know what the thing is: really **shallow lines**. You can map pixels-per-inch to ability-to-do-shallow-lines. If you tell me "well, a zillion pixels per inch," I merely say "OK, make the line twice as shallow." You then say 100 zillion; I say, OK, another twice as shallow. Quite simply, "If you have more pixels to begin with, antialiasing just gives **even better again** fuzz." At 100 zillion PPI, antialiasing would give astoundingly more beautiful fuzz than non-anti-aliasing! Plus the shallow-lines observation. – Fattie Apr 07 '11 at 19:46
  • PS novel display concepts would have non-regular (not on an XY grid) pixels - they'd be way the hell over the place, randomly. This would end the special place of horizontal and vertical lines. Like in the real world everything could be slightly rotated everywhere. – Fattie Apr 07 '11 at 19:48
  • @Joe Blow, I have read about these screens, very interesting. But no practical way to manufacture or address these pixels yet. But I wouldn't be surprised to see it eventually. One approach is to make a "random" matrix of pixels and then map the screen out in a scanner. A ROM or something like it would be programmed with the unique "fingerprint" of the screen. – Prof. Falken Apr 07 '11 at 23:04
  • @Joe Blow, I don't know if you are joking or not, but some scifi authors I consider very much smarter than I. :-) – Prof. Falken Apr 08 '11 at 08:00
  • This answer doesn't seem to be complete. Obviously, mathematically the curve could always be smoother, but that *does not* mean you would be able to tell the difference with the human eye. If you printed two copies of a document, anyone would say they're the same, even though microscopically there would be differences in the size of the ink on the page. This doesn't matter because it's imperceptible. Is the Retina display that good? I don't know, but you could theoretically go high enough that anti-aliasing would be irrelevant to our ability to perceive corners in certain scenarios. (games) – KTF Sep 25 '12 at 16:23
  • hi KTF, it's already explained a number of times in the answer. simply consider the "ever shallower line" test. plus anti-aliasing is not about "making eyes not see steps". it is about creating a "fuzz" that the eye sees. an interesting point is that, now that retina displays are everywhere .... simply create a test to turn AA on and off and you'll see the answer. indeed all of this question could be replaced with a few words - "grab a retina display, test it, and you'll see the obvious answer" – Fattie Sep 27 '12 at 11:25
  • I cannot believe this was voted as best answer because it's so wrong. The purpose of antialiasing is to improve readability and appearance on low resolution displays but when we reach the resolution that a pixel is hardly observable then we don't need antialiasing at all. The mathematics behind antialiasing are not encoded into vector or font formats. It is dependent on applications or OS's to "smooth" the curves. – netrox Jul 04 '18 at 03:08
  • hi @netrox. I don't follow you. Very simply, **yes**, antialiasing **is done** as a matter of fact on current Retina iPhones! You said: *"when we reach the resolution that a pixel is hardly observable then we don't need antialiasing at all"*. That sentence is, very simply, wrong. If we had a trillion physical pixels per inch, you'd still antialias. – Fattie Jul 04 '18 at 15:26
  • @Fattie my point is that when you're looking at a display where the resolution is so high that you cannot perceive a single pixel, you just don't need antialiasing for that viewing. Fonts and vectors themselves do not carry information about antialiasing as far as I am aware, it's entirely the job of OS or application to anti-alias fonts and vectors. Even CSS has several antialiasing algorithms that can be selected for antialiasing fonts for low DPI monitors but it's unlikely mobile devices will support it given that mobiles have high DPI displays anyway. – netrox Jul 05 '18 at 02:07
  • @Fattie one more thing, all iOS devices and MacOS anti-alias fonts and vectors but on iOS, I am sure it uses "grayscale" anti-aliasing where surrounding pixels are given different shades to give an appearance of smooth edges. On MacOS, it uses subpixel antialiasing by default but can be changed to grayscale antialiasing which I use because it just looks better on Retina. I still can see a pixel on its own so it's not "high enough" yet. But I've seen mobiles with >400 dpi and I am thinking that having antialiasing on that screen would not make a difference. – netrox Jul 05 '18 at 02:19
  • It's just not correct, Netrox. "When you're looking at a display where the resolution is so high that you cannot perceive a single pixel" ... simply display *smaller type*, or straight lines at even *shallower angles*, and you need antialiasing. Indeed, antialiasing mathematics works *even better* on higher-resolution screens. You're simulating a light field. – Fattie Jul 05 '18 at 03:08
11

The resolution at which the eye/brain will detect a discontinuity or stair edge is higher than the resolution at which it can resolve individual pixels. The Retina display appears to be high enough for the latter.

But throw in image animation, hand motion, vehicle vibration, imperfect eyesight, display reflections, etc., and you may have to experiment to determine whether the former makes any difference in your particular application.

Moshe
  • 57,511
  • 78
  • 272
  • 425
hotpaw2
  • 70,107
  • 14
  • 90
  • 153
  • 1
    All of the criteria you listed should make anti-aliasing less necessary, not more. And you forgot to include the distance from the display to your head. I hope you're not really using the iPhone in your vehicle! – Mark Ransom Jan 18 '11 at 04:30
  • 1
    IMHO the most straight-to-the-point comment! Especially when people are saying that the resolution does not have any impact on the perceived quality of AAed vs non-AAed lines, which it obviously does. – Cray Jan 19 '11 at 23:08
6

I did some quick tests on a friend's iPhone 4 with an OpenGL application. Without multisampling, there were still stairs and other artifacts in the output; with multisampling, they were gone.

That's not really surprising, as you can still build hard edges with a lot of pixels, so just putting more pixels into one device won't solve the problem (however, it clearly can help reduce the need for multisampling).
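
For reference, a minimal sketch of the kind of toggle such a test uses, assuming a GLKit-based OpenGL ES setup (the class name is hypothetical; GLKView performs the multisample resolve itself):

```swift
import GLKit

class RenderViewController: GLKViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard let glkView = view as? GLKView,
              let context = EAGLContext(api: .openGLES2) else { return }
        glkView.context = context

        // Flip this to compare: .multisampleNone shows stair-stepped edges even on a
        // Retina screen; .multisample4X smooths them at some extra fill-rate cost.
        glkView.drawableMultisample = .multisample4X
    }
}
```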

JustSid
  • 25,168
  • 7
  • 79
  • 97
  • When you did the test, how close were you holding the iPhone? It makes a big difference. – Mark Ransom Jan 18 '11 at 04:25
  • @Mark Ransom: I hold it at a normal distance, which for me is about 40-50 cm, maybe a bit more; I'm really bad at guessing such things. – JustSid Jan 18 '11 at 11:26
  • Sid is (obviously) completely correct. Indeed, it's worth noting that even more pixels simply means ***even harder edges***! A lot of people seem to have totally the wrong idea about antialiasing. It takes 1 minute to do the test Sid did and "resolve" the issue. – Fattie Jan 19 '11 at 19:09
  • Harder does not mean more aliased! – Cray Jan 19 '11 at 23:11
  • @Cray, well, but as long as the pixels are visible, you can build aliased edges with them. You can of course reduce the size of the pixels until no one sees them anymore, but I guess you will have a hard time telling this to your customers. – JustSid Jan 20 '11 at 08:35
6

Build a test app with two images side by side, one antialiased and the other not. Let users pick the one they think looks better on their Retina display and draw your conclusions from the results. If a clear majority of participants pick the antialiased image, then you certainly have a significant difference; otherwise it's safe to assume the difference doesn't matter to the people who use the app.
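
As a minimal sketch of how the two comparison images could be produced from a single drawing routine, assuming the artwork is drawn with Core Graphics (the names `makeComparisonImages` and `drawSample` are hypothetical):

```swift
import UIKit

// Renders the same drawing twice, once aliased and once antialiased, for a side-by-side test.
func makeComparisonImages(size: CGSize,
                          drawSample: @escaping (CGContext) -> Void) -> (aliased: UIImage, antialiased: UIImage) {
    func render(antialiased: Bool) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { ctx in
            ctx.cgContext.setShouldAntialias(antialiased)
            ctx.cgContext.setAllowsAntialiasing(antialiased)
            drawSample(ctx.cgContext)
        }
    }
    return (render(antialiased: false), render(antialiased: true))
}
```

Present the pair side by side, randomize which side the antialiased image appears on, and record which one participants pick; the randomization avoids a position bias in the results.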

luvieere
  • 37,065
  • 18
  • 127
  • 179
5

Here's an article that suggests you need a resolution of 477 DPI to eliminate the ability to see pixels, higher than the 326 DPI of the iPhone 4 Retina display. Be sure also to follow the rebuttal link in the article. http://www.wired.com/gadgetlab/2010/06/iphone-4-retina/

I also remember reading an argument some time ago that anti-aliasing works better at higher resolutions up to a certain point; unfortunately I can't come up with a reference.

Edit: I still can't find the original reference I was thinking of, but John Gruber has compared the 326 DPI screen of the iPhone 4 to the 220 DPI of the Retina MacBook Pro and found the MacBook superior because of its text anti-aliasing. Look about halfway down the article: http://daringfireball.net/2012/08/pixel_perfect

Mark Ransom
  • 299,747
  • 42
  • 398
  • 622
  • This is completely wrong, Mark. Rather, antialiasing is the mathematics that allows the representation (photon generation) of smooth curves using fixed square pixels. It has nothing to do with resolution - it works at all resolutions. – Fattie Jan 19 '11 at 19:08
  • Furthermore, note that "resolution" simply has no relationship to antialiasing. An extremely simple example: look at **this type** on your Mac screen currently. Can you resolve one pixel within the type? Not a bat's chance in hell. ***Maybe*** at 10 pixels per inch, viewed from a foot away, you ***might*** be able to resolve individual pixels, if you have the world's best eyesight. Resolution simply has zero, no, none, relationship to antialiasing. It would be ***exactly like asking if color is better or worse with more resolution***. Antialiasing, like "color pixels", is technology that ... – Fattie Jan 19 '11 at 19:12
  • ... aims to improve the "fuzz" (photon generation) of displays. It applies (just like color) regardless of resolution. Another example from the old print days ... if anyone still knows about such things ... in "four colour" (aka "process") printing, it would be utterly absurd to suggest not using a stipple just because you have a high-resolution printer. This is a good example of the observation that just because someone gets an article into Wired (or _Nature_), it doesn't mean the article is not nonsensical or built on a completely pointless basis! – Fattie Jan 19 '11 at 19:14
  • 1
    @Joe, when I said "works better" I meant that the results would be more effective in creating an illusion of a sharp line with no visible stairstepping or fuzziness. Of course the math does not change. – Mark Ransom Jan 19 '11 at 20:30
2

At some point the DPI is high enough that a "pixelated" line will still look smooth. I'm not sure if Retina is that point. For applications like gaming, if you had a screen with 300 DPI or more, you would not need anti-aliasing for geometry (though things like textures and sprites would still need it, since as you approach objects in a 3D world, or look at them from different angles, the textures are stretched or shrunk).

Here's a great article on the subject: http://gamintafiles.wordpress.com/2012/03/12/when-anti-aliasing-is-no-longer-needed/
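
Even if geometry edges were acceptable without anti-aliasing at high DPI, minified or stretched textures still alias unless they are filtered, e.g. with mipmaps and (tri)linear filtering. A minimal sketch of that texture half of the point, assuming an OpenGL ES context is already current and `textureID` names a loaded texture (the function name is hypothetical):

```swift
import OpenGLES

// Enables mipmapped (trilinear) minification and linear magnification for a bound texture,
// which is what keeps textures from shimmering when they are shrunk or viewed at an angle.
func configureTextureFiltering(textureID: GLuint) {
    glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
    glGenerateMipmap(GLenum(GL_TEXTURE_2D))
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR_MIPMAP_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
}
```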

KTF
  • 1,349
  • 2
  • 14
  • 21
1

Yes, you still need it. If you really want to take advantage of the higher PPI, you will use antialiasing. The point of it is to provide the "bleed" that is necessary to make the image appear as good as it can in its analog form. The only reason the magical 300 PPI or DPI number makes a difference in print is that the dots bleed together somewhat. When you're dealing with the hard edges of an LCD pixel, you have to use antialiasing, or you're still left with a purely digital attempt to communicate something analog.

Since we're dealing with light-emitting pixels instead of light-reflecting pixels, the need is even greater, because the contrast at the hard boundaries on the screen is even more noticeable. Reflected light blends and bleeds together better than the same light intensity delivered directly from the emitting source.

Antialiasing will be needed until we have high-resolution, organic, non-grid-based displays, preferably reflective in nature.

Benjamin Anderson
  • 1,376
  • 10
  • 15
0

Nice question!

When I think of anti-aliasing, I think of a technique that was invented to compensate for pixels that are too big. Image details are spread to surrounding pixels because they would otherwise be cut off prematurely at the pixel edge. Since you cannot see individual pixels on the Retina display (from a certain distance, anyway), I think anti-aliasing becomes irrelevant by definition.

xastor
  • 482
  • 5
  • 17
  • This is wrong. "that was invented to compensate for pixels that were too big" ... rather, antialiasing is the mathematics that allows the representation (photon generation) of smooth curves using fixed square pixels. It has nothing to do with resolution - it works at all resolutions. – Fattie Jan 19 '11 at 19:08
  • Yes, it's just a big hack to convert the real world's floats to ints. – chunkyguy Mar 08 '12 at 00:45