I want to know if there's a way to use iOS speech recognition in offline mode. I didn't see anything about it in the documentation (https://developer.apple.com/reference/speech).
-
You can dictate into a textField and textView using the mic button on the software keyboard. That works in offline mode. I don't think you can interpret speech commands though. – Magnas Mar 20 '17 at 10:08
-
@Magnas are you *pretty* sure that the mic button should be enabled even in offline mode? I tested on four different iPhones and it was disabled... – Ahmad F Mar 20 '17 at 10:22
-
Same for me @AhmadF, mic button is not available in my app when offline mode is on (http://imgur.com/GHZ9I6x). – Danyl Mar 20 '17 at 10:28
-
Hmmm it's on by default in mine. I'll go check more closely. – Magnas Mar 20 '17 at 11:34
2 Answers
I am afraid that there is no way to do it (however, please make sure to check the update at the end of the answer).
As mentioned in the Speech Framework official documentation:
Best Practices for a Great User Experience:
Be prepared to handle the failures that can be caused by reaching speech recognition limits. Because speech recognition is a network-based service, limits are enforced so that the service can remain freely available to all apps.
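For illustration, a minimal sketch of what handling that failure case could look like (assuming speechRecognizer, recognitionRequest, and recognitionTask are already set up as in a typical SFSpeechRecognizer flow):

recognitionTask = speechRecognizer.recognitionTask(with: recognitionRequest) { result, error in
    if let error = error {
        // e.g. no network connection, or the recognition limit was reached
        print("Recognition failed: \(error.localizedDescription)")
        return
    }
    if let result = result {
        print(result.bestTranscription.formattedString)
    }
}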
From an end user's perspective, trying to get Siri's help without a network connection displays a screen similar to:
Also, when trying to send a message, for example, you'll notice that the mic button is disabled if the device is not connected to a network.
Natively, iOS itself won't enable this feature without checking the network connection, so I assume the same applies to third-party developers using the Speech framework.
UPDATE:
After watching the Speech Recognition API session (especially the part from 03:00 to 03:25), I came up with the following:
The Speech Recognition API usually requires an internet connection, but some newer devices do support this feature all the time; you might want to check whether the given language is available or not.
Adapted from the SFSpeechRecognizer documentation:
Note that a supported speech recognizer is not the same as an available speech recognizer; for example, the recognizers for some locales may require an Internet connection. You can use the supportedLocales() method to get a list of supported locales and the isAvailable property to find out if the recognizer for a specific locale is available.
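As a rough sketch combining those two checks (the locale identifier below is just an example, not something taken from the documentation):

import Speech

// Check both that the locale is supported and that its recognizer is usable right now.
let locale = Locale(identifier: "en-US")
if SFSpeechRecognizer.supportedLocales().contains(locale),
   let recognizer = SFSpeechRecognizer(locale: locale),
   recognizer.isAvailable {
    print("Recognizer for \(locale.identifier) can be used right now")
} else {
    print("Locale not supported, or recognizer currently unavailable (e.g. no internet connection)")
}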
-
Ok, thank you for your answer @AhmadF! Unfortunately, I had come to the same conclusion before asking. :/ Is it because the Speech framework is new since iOS 10 and will perhaps be improved in the future? Let's wait lel... – Danyl Mar 20 '17 at 10:25
-
@DanylS I hope so... however, it does support offline mode, but I think that only applies to a few cases; check my updated answer :) – Ahmad F Mar 20 '17 at 12:56
-
I just took a look at it, thanks @AhmadF, it's very useful for me. Could you tell me more about which iPhone models are supported? – Danyl Mar 20 '17 at 13:21
-
@DanylS Honestly, I have no idea for now :D, I'll update my answer once I get information about it; the thing is you should check the device's supported locales before checking whether there is an internet connection. – Ahmad F Mar 21 '17 at 08:34
-
@AhmadF Thank you for this helpful summary. It worked on iPhone 6s for English. Not sure about others. (Maybe iPhone 6s and later are supported.) – Mohammad Zaid Pathan Oct 31 '17 at 10:37
-
@ZaidPathan glad to help, so far there are no updates regarding this; however, I could mention you in a comment in case I update it :) – Ahmad F Oct 31 '17 at 10:44
-
@ZaidPathan it seems this is the case I mentioned in the "UPDATE"! – Ahmad F Oct 31 '17 at 10:46
Offline transcription will be available starting in iOS 13. You enable it with requiresOnDeviceRecognition.
Example code (Swift 5):
// Create and configure the speech recognition request.
recognitionRequest = SFSpeechAudioBufferRecognitionRequest()
guard let recognitionRequest = recognitionRequest else { fatalError("Unable to create a SFSpeechAudioBufferRecognitionRequest object") }
recognitionRequest.shouldReportPartialResults = true

// Keep speech recognition data on device
if #available(iOS 13, *) {
    recognitionRequest.requiresOnDeviceRecognition = true
}
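If you only want to require offline transcription when the device actually supports it, one option (a sketch, assuming speechRecognizer is your SFSpeechRecognizer instance) is to check supportsOnDeviceRecognition first:

// Require on-device (offline) recognition only when the recognizer supports it
if #available(iOS 13, *), speechRecognizer.supportsOnDeviceRecognition {
    recognitionRequest.requiresOnDeviceRecognition = true
}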
