I'm looking for help with a performance issue in an Objective-C-based iOS app.

I have an iOS application that captures the screen's contents using CALayer's `renderInContext:` method. It attempts to capture enough frames to build a video with AVFoundation. The screen recording is then combined with other elements for usability research. While the screen is being captured, the app may also be displaying the contents of a UIWebView, going out over the network to fetch data, and so on. The content of the web view is not under my control; it is arbitrary content from the web.
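
For reference, the core of the capture looks roughly like this (a minimal sketch, not my exact code: it assumes a preconfigured `AVAssetWriterInputPixelBufferAdaptor` backed by a `kCVPixelFormatType_32BGRA` pixel buffer pool, and omits error handling):

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Minimal sketch: render a layer into a CVPixelBuffer and hand it to a
// preconfigured AVAssetWriterInputPixelBufferAdaptor.
- (void)captureLayer:(CALayer *)layer
             adaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
                time:(CMTime)frameTime
{
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                       adaptor.pixelBufferPool, &pixelBuffer);
    if (pixelBuffer == NULL) return;

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context =
        CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                              CVPixelBufferGetWidth(pixelBuffer),
                              CVPixelBufferGetHeight(pixelBuffer),
                              8, CVPixelBufferGetBytesPerRow(pixelBuffer),
                              colorSpace,
                              kCGBitmapByteOrder32Little |
                                  kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace);

    // Flip to match UIKit's top-left origin.
    CGContextTranslateCTM(context, 0, CVPixelBufferGetHeight(pixelBuffer));
    CGContextScaleCTM(context, 1.0, -1.0);

    // This is the expensive call that contends with the UI.
    [layer renderInContext:context];
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    if (adaptor.assetWriterInput.readyForMoreMediaData) {
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    }
    CVPixelBufferRelease(pixelBuffer);
}
```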

This setup works, but as you might imagine, it's not buttery smooth. Since the layer must be rendered on the main thread, there's more UI contention than I'd like. What I'd like is a setup in which the responsiveness of the UI is prioritized over the screen capture. For instance, if the user is scrolling the web view, I'd rather drop frames on the recording than deliver a terrible scrolling experience.

I've experimented with several techniques, from dispatch_source coalescing, to submitting the frame-capture requests as blocks to the main queue, to CADisplayLink. So far they all perform about the same. The frame capture is currently triggered from the `drawRect:` of the screen's main view.
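
For concreteness, the CADisplayLink variant is shaped like this (a sketch, assuming a `displayLink` property; `captureFrameAtTime:` is a placeholder for the `renderInContext:` work above). One wrinkle worth noting: adding the link only to NSDefaultRunLoopMode, rather than NSRunLoopCommonModes, means it stops firing while the run loop is in UITrackingRunLoopMode, so capture naturally sheds frames while the user is scrolling:

```objc
#import <QuartzCore/QuartzCore.h>

// Sketch of the CADisplayLink variant. Because the link is added only to
// NSDefaultRunLoopMode, it does not fire during scroll tracking
// (UITrackingRunLoopMode), so frames are dropped in favor of scrolling.
- (void)startDisplayLinkCapture
{
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureTick:)];
    self.displayLink.frameInterval = 2; // ~30 fps on a 60 Hz display
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSDefaultRunLoopMode];
}

- (void)captureTick:(CADisplayLink *)link
{
    // Placeholder for the renderInContext: capture shown earlier.
    [self captureFrameAtTime:link.timestamp];
}
```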

What I'm asking here is: given the above, what techniques would you suggest I try to achieve my goals? I realize the answer may be that there is no great answer... but I'd like to try anything, however wacky it might sound.

NOTE: Whatever techniques you suggest need to be App Store friendly. I can't use something like the CoreSurface hack that Display Recorder used/uses.

Thanks for your help!

Hunter
  • Is this being done in Objective-C? Regardless, would it be feasible to time the recording process and, if the last request took too long, stop the recording for X (pick a number) frames? I have no iOS exp, but this is the first thing that came to mind. – Jared Jan 19 '13 at 18:40
  • Hi - yes, all Obj-C... Since 'too long' varies significantly between devices due to CPU differences, this might end up being less precise than required. – Hunter Jan 19 '13 at 18:51
  • Possibly do something similar to a JavaScript debounce? https://github.com/pixelspring/NKBlueprint-Obj-C-Mix-Edition Or possibly monitor the CPU usage before the call? http://stackoverflow.com/questions/12889422/ios-cpu-usage-for-each-process-using-sysctl – Jared Jan 19 '13 at 19:09
  • I'll look at the debounce thing - I should have mentioned that the web content in the app is not under my control. – Hunter Jan 19 '13 at 20:45
  • Yeah, and I didn't necessarily mean to do an actual JS debounce... rather, the concept. I'd also think (and again, this is with no iOS dev experience) that there would be a way to set up an array/enum with the max fps rate to use based on the device. Test by putting the device under moderate load and tune until acceptable. Again, this is from a web dev, not an app dev, but I'm assuming something similar should be possible. – Jared Jan 19 '13 at 21:26
  • You could try duplicating your root layer using `-presentationLayer`, then rendering that using `-renderInContext:` on a background thread... – nielsbot Jan 26 '13 at 07:10
  • The following may help you to give the video lower scheduling priority (see the sketch just below these comments): http://stackoverflow.com/questions/7356820/specify-to-call-someting-when-main-thread-is-idle – fishinear Jan 26 '13 at 11:33
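
A minimal sketch of the idle idea from the last comment above, assuming we only attempt a capture when the main run loop is about to go to sleep. It uses CFRunLoopObserverCreateWithHandler; `attemptCapture` is a placeholder for the frame-capture call:

```objc
#import <CoreFoundation/CoreFoundation.h>

// Sketch: observe kCFRunLoopBeforeWaiting on the main run loop, which fires
// just before the run loop goes to sleep, i.e. when the main thread is idle.
// Captures attempted here naturally yield to UI work such as scrolling.
- (void)installIdleCaptureObserver
{
    CFRunLoopObserverRef observer = CFRunLoopObserverCreateWithHandler(
        kCFAllocatorDefault,
        kCFRunLoopBeforeWaiting,
        true, // repeats
        0,    // order
        ^(CFRunLoopObserverRef obs, CFRunLoopActivity activity) {
            [self attemptCapture]; // placeholder for the frame capture
        });
    CFRunLoopAddObserver(CFRunLoopGetMain(), observer, kCFRunLoopCommonModes);
    CFRelease(observer);
}
```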

2 Answers


"Since the layer must be rendered on the main thread" this is not true, as long as you don't touch UIKit.

Please see https://stackoverflow.com/a/12844171/136305
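A minimal sketch of that approach (not taken verbatim from the linked answer): create the bitmap context with CoreGraphics only and call `renderInContext:` on a low-priority background queue. Whether reading the layer off the main thread is actually safe is debated in the comments below.

```objc
#import <QuartzCore/QuartzCore.h>

// Sketch: render a CALayer to a bitmap on a background queue using only
// CoreGraphics. No UIGraphics*/UIKit calls, per the caveat above.
- (void)captureLayerInBackground:(CALayer *)layer size:(CGSize)size
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context =
            CGBitmapContextCreate(NULL,
                                  (size_t)size.width, (size_t)size.height,
                                  8,
                                  0, // let CG pick the bytes-per-row
                                  colorSpace,
                                  kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);

        // Flip to match UIKit's top-left origin.
        CGContextTranslateCTM(context, 0, size.height);
        CGContextScaleCTM(context, 1.0, -1.0);

        [layer renderInContext:context];

        CGImageRef image = CGBitmapContextCreateImage(context);
        CGContextRelease(context);

        // Hand the CGImage to the video writer here, then release it.
        CGImageRelease(image);
    });
}
```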

Deniz Mert Edincik
  • Have you used this method with success yourself? – Hunter Jan 21 '13 at 23:28
  • My initial tests are suggesting that CALayer's renderInContext: does *not* work reliably on a background thread. Random crashes abound. – Hunter Jan 24 '13 at 17:44
  • I have had no troubles rendering in a background thread. In fact, rendering a CALayer in a background thread is standard practice for the CATiledLayer approach, used in many apps. Just make sure you stick to the CG... methods and avoid the UI... methods. But then again, I have never done a video in the background. – fishinear Jan 26 '13 at 11:22
  • @fishinear Good point, re: CATiledLayer. My first tests crashed and burned but I'll go again and ensure I'm not leaking any UIKit methods. By 'rendering in a background thread' you mean renderInContext (creating a bitmap of the layer), not some other CALayer drawing operation, yes? – Hunter Jan 31 '13 at 00:18
  • @Hunter. Correct. I use CGBitmapContextCreate to create the context. But I just realized that you need to render the same UI both to the main screen and in a background thread at the same time. I have never done that. – fishinear Jan 31 '13 at 11:44

Maybe you can record at half resolution to speed things up, if that fits the requirements?
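
A sketch of one way to do that with a plain CG context (`layer` is assumed to be the layer being recorded): halve the bitmap's pixel dimensions and scale the CTM to match, so `renderInContext:` draws the full layer into a quarter of the pixels. Whether this actually saves rendering time is exactly what the comments below debate.

```objc
// Sketch: render the layer at half resolution. The bitmap has half the
// pixel dimensions; the CTM is flipped for UIKit's top-left origin and
// scaled so the whole layer lands in the smaller bitmap.
CGSize layerSize = layer.bounds.size;
size_t width  = (size_t)(layerSize.width  / 2.0);
size_t height = (size_t)(layerSize.height / 2.0);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

CGContextTranslateCTM(context, 0, height);
CGContextScaleCTM(context, 0.5, -0.5); // flip and draw at half scale

[layer renderInContext:context];

CGImageRef halfResImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
// ... append halfResImage to the video, then:
CGImageRelease(halfResImage);
```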

  • How would you capture the screen at half resolution? – Hunter Jan 19 '13 at 20:44
  • @Hunter: if you're not calling `UIGraphicsBeginImageContextWithOptions` before you call `UIGraphicsGetCurrentContext`, then you're already capturing a non-retina (i.e. half resolution) image when running on a retina device. – MusiGenesis Jan 24 '13 at 20:16
  • @MusiGenesis Are you referring to the scale property? My tests seem to indicate that while the context may get different contents, CALayer's `renderInContext:` performs about the same whether the scale is 1.0 or 2.0. Do you have different information? – Hunter Jan 25 '13 at 23:13