
I'm not a huge fan of the iOS graphics APIs and their documentation, and I have been trying for a while now to form a high-level view of the rendering process, but I only have bits and pieces of information. Essentially, I am trying to understand (again, at a high level):

1) The role of the Core Graphics and Core Animation APIs in the rendering pipeline, all the way from a CGContext to the front frame buffer.

2) And along the way (this has been the most confusing and least elaborated part of the documentation), which tasks are performed by the CPU and which by the GPU.

With Swift and Metal out, I'm hoping the APIs will be revisited.

swiftlee
  • When you override `drawRect:`, it doesn't switch to software rendering. Your graphics hardware is still used for all the drawing functions you invoke in your implementation. However, a method is sometimes deliberately left unimplemented in a base class (I assume `drawRect:` is not implemented in UIView). When your views are being drawn, the library code can check whether your class implements that method and, if not, skip that step entirely. It's a low-tech sort of optimization (see the sketch after these comments). – Justin Johns Jun 10 '14 at 20:06
  • I was never sure about this part and assumed that the base view objects (views, controls, etc.) do not draw themselves using drawRect, but through some hidden optimization. This is all I could find: – swiftlee Jun 11 '14 at 21:16
  • "System views typically implement private drawing methods to render their content." https://developer.apple.com/library/ios/documentation/windowsviews/conceptual/viewpg_iphoneos/WindowsandViews/WindowsandViews.html So I assumed there must be some hidden optimization when they draw themselves, as opposed to overriding drawRect, which forces software drawing. – swiftlee Jun 11 '14 at 21:26

2 Answers


Have you started with the WWDC videos? They cover many of the details extensively. For example, this year's Advanced Graphics & Animations for iOS Apps is a good starting point. The Core Image talks are generally useful as well (I haven't watched this year's yet). I highly recommend going back to previous years, too; they've had excellent discussions about the CPU/GPU pipeline. The WWDC 2012 Core Image Techniques talk was very helpful. And of course learning to use Instruments effectively is just as important as understanding the implementations.

Apple does not typically provide low-level implementation details in the main documentation. The implementation details are not interface promises, and Apple changes them from time to time to improve performance for the majority of applications. This can sometimes degrade performance on corner cases, which is one reason you should avoid being clever with performance tricks.

But the WWDC videos have exactly what you're describing, and will walk you through the rendering pipeline and how to optimize it. The recommendations they make tend to be very stable from release to release and device to device.

Rob Napier

1) The role of the Core Graphics and Core Animation APIs in the rendering pipeline, all the way from a CGContext to the front frame buffer.

Core Graphics is a drawing library that implements the same primitives as PDF or PostScript. You feed it bitmaps and various kinds of paths, and it produces pixels.
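
As a small illustration of that paths-in, pixels-out model (a sketch, not from the answer), here Core Graphics rasterises a filled path into an offscreen bitmap; UIGraphicsImageRenderer is just a convenient way to obtain a CGContext:

```swift
import UIKit

// Rasterise a filled circle into a 100x100 bitmap, entirely on the CPU.
let renderer = UIGraphicsImageRenderer(size: CGSize(width: 100, height: 100))
let image = renderer.image { ctx in
    let cg = ctx.cgContext                      // the underlying CGContext
    cg.setFillColor(UIColor.systemBlue.cgColor)
    cg.addEllipse(in: CGRect(x: 10, y: 10, width: 80, height: 80))
    cg.fillPath()                               // path in, pixels out
}
// `image` now wraps the bitmap that Core Graphics produced.
```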

Core Animation is a compositor. It produces the screen display by compositing buffers (known as layers) from video memory. While compositing, it may apply a transform to each layer: moving it, rotating it, adding perspective, or making some other adjustment. It also has a timed animation subsystem that can make timed adjustments to any part of that transform without further programmatic intervention.
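
A minimal sketch of that model, assuming a bare CALayer: the layer's cached buffer is never redrawn; the animation subsystem just adjusts the transform that the compositor applies each frame:

```swift
import UIKit

// A layer whose contents are a plain colour, already resident as a buffer.
let layer = CALayer()
layer.frame = CGRect(x: 0, y: 0, width: 80, height: 80)
layer.backgroundColor = UIColor.systemBlue.cgColor

// Ask the animation subsystem to rotate the cached buffer over one second.
// No drawRect-style redraw occurs; the compositor re-applies the changing
// transform on every frame until the animation completes.
let spin = CABasicAnimation(keyPath: "transform.rotation.z")
spin.fromValue = 0
spin.toValue = CGFloat.pi
spin.duration = 1.0
layer.add(spin, forKey: "spin")
```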

UIKit wires things up so that you use Core Graphics to draw your view's contents into a layer whenever the contents themselves change; that primarily involves the CPU. For things like animations and transitions you then usually end up applying or scheduling compositing adjustments, which primarily involve the GPU.
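
A hedged sketch of that wiring (BadgeView is an illustrative name, not from the answer): the draw(_:) override fills the layer's backing store on the CPU when the content is invalidated, while an animation becomes a compositing adjustment of the cached store:

```swift
import UIKit

final class BadgeView: UIView {
    var count = 0 {
        // Content changed: mark the layer dirty so UIKit schedules a
        // CPU-side Core Graphics redraw of the backing store.
        didSet { setNeedsDisplay() }
    }

    override func draw(_ rect: CGRect) {
        // Runs on the CPU, drawing into the layer's backing store.
        ("\(count)" as NSString).draw(
            at: .zero,
            withAttributes: [.font: UIFont.boldSystemFont(ofSize: 24)])
    }
}

// A transition, by contrast, is a compositing adjustment: the cached
// backing store is moved by the compositor, with no call back into draw(_:).
func slideAside(_ view: BadgeView) {
    UIView.animate(withDuration: 0.3) {
        view.transform = CGAffineTransform(translationX: 100, y: 0)
    }
}
```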

2) And along the way (this has been the most confusing and least elaborated part of the documentation), which tasks are performed by the CPU and which by the GPU.

Individual layer drawing: CPU

Transforming and compositing layers to build up the display: GPU

iOS: why does overriding drawRect resort to software rendering?

It doesn't 'resort' to anything. The exact same pipeline is applied whether you wrote the relevant drawRect: or Apple did.

With Swift and Metal out, I'm hoping the APIs will be revisited.

Swift and Metal are completely orthogonal to this issue. The APIs are very well formed and highly respected. Your issues with them are — as you freely recognise — lack of understanding. There is no need to revisit them and Apple has given no indication that it will be doing so.

Tommy
  • Thanks. Please correct me if I am wrong: CG uses the CPU for rasterization (of offscreen or onscreen contexts), and the resulting layers are composited by the GPU into a back frame buffer, which is swapped and displayed at the next scan. So after the first draw call, these layers are cached (in iOS) in VRAM, and unless they are marked dirty (e.g. via setNeedsDisplay), drawRect is not called again, which would otherwise require sending the cached layer back to the CPU for redrawing and uploading it back to the GPU for caching and compositing? Also, what if I have to redraw every second? – swiftlee Jun 12 '14 at 20:26
  • There's no reason to assume a double buffer, but otherwise you're correct, with one additional observation: there's only one RAM chip. It's a shared-memory architecture. The CPU can draw directly into texture RAM, and I'd be surprised if Core Graphics doesn't do that. You can do it yourself via the Core Video / OpenGL bridge. It's definitely not a problem to redraw once a second; you can probably redraw thousands of times a second, depending on how complicated you make your Core Graphics calls. When you type text, for example, it's rasterising, caching and then displaying upon every input. – Tommy Jun 12 '14 at 21:56
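
For the redraw-once-a-second case the comments discuss, a common pattern (a sketch under assumed requirements; ClockView is an illustrative name) is to invalidate the view on a timer and let the usual rasterise-then-composite cycle run:

```swift
import UIKit

final class ClockView: UIView {
    private var timer: Timer?

    override func didMoveToWindow() {
        super.didMoveToWindow()
        timer?.invalidate()
        guard window != nil else { return }
        // Once a second, mark the layer dirty. UIKit coalesces this into
        // one CPU rasterisation plus one GPU composite per display pass.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            self?.setNeedsDisplay()
        }
    }

    override func draw(_ rect: CGRect) {
        let time = DateFormatter.localizedString(
            from: Date(), dateStyle: .none, timeStyle: .medium)
        (time as NSString).draw(
            at: .zero,
            withAttributes: [.font: UIFont.monospacedDigitSystemFont(ofSize: 20, weight: .regular)])
    }
}
```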