
I'm running an mlmodel converted from Keras on an iPhone 6. The predictions often fail with the error Error computing NN outputs. Does anyone know what could cause this and whether there is anything I can do about it?

do {
    return try model.prediction(input1: input)
} catch let err {
    fatalError(err.localizedDescription) // Error computing NN outputs error
}

[Image: Model inputs and outputs]

EDIT: I tried Apple's sample project and that one works in the background, so the issue seems to be specific to either our project or our model type.

Simon Bengtsson
  • I've seen this reported by other people too (either here on Stack Overflow or the Apple Dev Forums). I have no idea what causes this, but note that you cannot run an app in the background indefinitely. – Matthijs Hollemans Jan 28 '18 at 14:02
  • We are developing an exercise app that tracks location and motion in the background which works great. – Simon Bengtsson Feb 23 '18 at 08:22

2 Answers


I got the same error myself at similarly "seemingly random" times. A bit of debug tracing established that it was caused by the app sometimes trying to load its Core ML model after it had been sent to the background, then crashing or freezing when brought back to the foreground.

The Error computing NN outputs message was preceded by:

Execution of the command buffer was aborted due to an error during execution. Insufficient Permission (to submit GPU work from background) (IOAF code 6)

I didn't need (or want) the model to be used while the app was in the background, so I detected when the app was going into or out of the background, set a flag, and used a guard statement before attempting to call the model.

  1. Detect when the app is going into the background using applicationWillResignActive within the AppDelegate.swift file and set a Bool flag, e.g. appInBackground = true (see the sketch after this list). See this for more info: Detect iOS app entering background

  2. Detect when the app re-enters the foreground using applicationDidBecomeActive in the same AppDelegate.swift file, and reset the flag: appInBackground = false

  3. Then, in the function where you call the model, just before calling it, use a statement such as:

    guard appInBackground == false else { return } // new line to add

    guard let model = try? VNCoreMLModel(for: modelName.model) else {
        fatalError("could not load model") // original line to load the model
    }
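
For steps 1 and 2, here is a minimal sketch of the AppDelegate side, assuming a UIKit app; the flag name appInBackground and its placement as a static property are only illustrative (you might keep it on a dedicated state object instead):

import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    // Illustrative shared flag read by the guard in step 3.
    static var appInBackground = false

    func applicationWillResignActive(_ application: UIApplication) {
        // Step 1: the app is leaving the foreground, so block model calls.
        AppDelegate.appInBackground = true
    }

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Step 2: the app is active again, so allow model calls.
        AppDelegate.appInBackground = false
    }
}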

I doubt this is the most elegant solution, but it worked for me.

I haven't established why the attempt to load the model in the background only happens sometimes.

In the Apple example you link to, it looks like their app only ever calls the model in response to user input, so it will never try to load the model while in the background. Hence the difference in my case ... and possibly yours as well?

RayD

In the end it was enough for us to set the usesCPUOnly flag; using the GPU in the background appears to be prohibited on iOS, and Apple actually mentions this in their documentation as well. To specify this flag we couldn't use the generated model class anymore but had to call the raw Core ML classes instead. I can imagine this changing in a future version, however. The snippet below is taken from the generated model class, but with the added MLPredictionOptions specified.

let options = MLPredictionOptions()
options.usesCPUOnly = true // Can't use GPU in the background

// Copied from the generated model class
let input = model_input(input: mlMultiArray)
let output = try generatedModel.model.prediction(from: input, options: options)
let result = model_output(output: output.featureValue(for: "output")!.multiArrayValue!).output
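
For completeness, here is a hedged sketch of how the call above could be wrapped at the call site, optionally combined with the background flag from the other answer; generatedModel, model_input, model_output and appInBackground are the names used in the snippets on this page and stand in for your own:

import CoreML

func runPrediction(on mlMultiArray: MLMultiArray) -> MLMultiArray? {
    // Optional extra safety, following the other answer: skip model
    // calls entirely while the app is in the background.
    guard appInBackground == false else { return nil }

    // Force CPU-only execution so a transition to the background
    // cannot trigger the GPU permission error.
    let options = MLPredictionOptions()
    options.usesCPUOnly = true

    do {
        let input = model_input(input: mlMultiArray)
        let output = try generatedModel.model.prediction(from: input, options: options)
        return model_output(output: output.featureValue(for: "output")!.multiArrayValue!).output
    } catch {
        print("Prediction failed: \(error.localizedDescription)")
        return nil
    }
}
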
Simon Bengtsson
  • Totally missed that they had added that in the `Discussion` section; I knew from GPU profiling that a backgrounded GPU-based model would fail. – SushiHangover Mar 16 '18 at 13:41
  • @Simon Bengtsson I have this problem, but it crashes during initialization of the model, and I can only pass the CPU-only option at prediction time. Any ideas? – Spring Feb 09 '19 at 21:05
  • Do you get the `NN outputs` error? If not, you probably have a different problem than the one described here. – Simon Bengtsson Feb 10 '19 at 11:53