
My iOS app connects to the intranet and controls other hardware that is used during patient surgery. Since it's difficult to operate the app by touch during surgery, voice commands look like the better option.

I have gone through the Speech framework's SFSpeechRecognizer and can detect words without an internet connection.

However, this service cannot run for long: SFSpeechRecognizer only allows a recognition task to run for about a minute, while a surgery may last more than an hour. I want to create a mechanism that triggers the app to start recognizing voice commands (similar to "OK Google" on Android), so that the app only captures commands after being triggered.
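
For reference, here is a minimal sketch of what I have so far, assuming on-device recognition (iOS 13+) and restarting the task whenever it hits the limit; authorization and audio-session setup are elided, and the class name is just illustrative:

```swift
import Speech
import AVFoundation

// Sketch only: restart on-device recognition whenever a task ends,
// since a single SFSpeechRecognitionTask is limited to about a minute.
// SFSpeechRecognizer.requestAuthorization and the AVAudioSession
// setup are omitted for brevity.
final class CommandListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
    private let audioEngine = AVAudioEngine()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.requiresOnDeviceRecognition = true  // offline, iOS 13+

        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer.recognitionTask(with: request) { [weak self] result, error in
            if let result = result {
                print("Heard: \(result.bestTranscription.formattedString)")
            }
            // The task ends when it hits the time limit or errors out;
            // tear down the tap and start a fresh task.
            if error != nil || result?.isFinal == true {
                self?.audioEngine.inputNode.removeTap(onBus: 0)
                self?.audioEngine.stop()
                try? self?.start()
            }
        }
    }
}
```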

I'm looking for suggested readings or solutions.

– Dipesh Pokhrel
  • "Hey Siri" works in the same way as "Ok Google", Both just trigger the voice assistant. To be able to perform custom tasks you can create a Siri extension, or maybe use Siri shortcuts but that might be more limited. You cannot just trigger your app directly from voice. You need to use Siri as that is 'always listening' – Scriptable Aug 16 '19 at 09:07
  • "OK Google" works offline as well; however, I could not get "Hey Siri" to work offline. – Dipesh Pokhrel Aug 16 '19 at 09:12
  • Yes, that's true, I don't think Siri works offline, but there are no other options (that I know of) for listening to the user in the background and having your app react to it. Allowing that would let developers bypass Siri, which Apple would not permit; imagine MS or Google installing their voice assistants on Apple devices. It would compete directly with Apple's own. – Scriptable Aug 16 '19 at 09:18
  • You are missing what I am saying: from what I know, *you cannot do it that way*. There is no background mode for using the microphone that I am aware of. You need to use Siri. – Scriptable Aug 16 '19 at 09:31
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/198036/discussion-between-dipesh-and-scriptable). – Dipesh Pokhrel Aug 16 '19 at 09:37
  • Siri offline support in iOS 13 works perfectly for this. – Dipesh Pokhrel Aug 19 '19 at 05:40

2 Answers


"Hey Siri" works in the same way as "Ok Google", Both just trigger the voice assistant. To be able to perform custom tasks you can create a Siri extension, or maybe use Siri shortcuts but that might be more limited. You cannot just trigger your app directly from voice. You need to use Siri as that is 'always listening'.

There are some older answers suggesting that if you start recording in the foreground, then switch to the background and request extended time, you can finish your recording; but even then I think it would only give you a short amount of time. In recent releases I believe this has been restricted further (probably for privacy reasons).

iOS Background audio recording
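
For reference, the extended-time request those answers rely on looks roughly like this; it's a sketch only, and iOS grants just a short window (roughly tens of seconds on recent releases):

```swift
import UIKit

// Sketch of the "request extra time" pattern from those older answers.
// iOS decides how long you actually get, and may end the task early.
var backgroundTask: UIBackgroundTaskIdentifier = .invalid

func finishRecordingInBackground() {
    backgroundTask = UIApplication.shared.beginBackgroundTask {
        // Expiration handler: time is up, stop recording and clean up.
        UIApplication.shared.endBackgroundTask(backgroundTask)
        backgroundTask = .invalid
    }
    // ... wrap up the recording, then call endBackgroundTask yourself ...
}
```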

So to answer your question, I think you would need to create a Siri extension that can trigger these actions in your app.
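
A rough sketch of the Shortcuts route, assuming you donate an activity the user can attach a Siri phrase to; the activity type and strings here are illustrative, and the type must also be listed under NSUserActivityTypes in Info.plist:

```swift
import UIKit
import Intents

// Donate a "start listening" activity so the user can record a Siri
// phrase for it (via Settings or the Shortcuts app). The identifier
// "com.example.surgery.startListening" is hypothetical.
func makeStartListeningActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.surgery.startListening")
    activity.title = "Start voice commands"
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Start voice commands"
    return activity
}

// Donate it from the screen where listening begins:
//   userActivity = makeStartListeningActivity()
//   userActivity?.becomeCurrent()

// In your AppDelegate, handle the launch when Siri fires the phrase:
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    guard userActivity.activityType == "com.example.surgery.startListening" else { return false }
    // Kick off the speech recognizer here.
    return true
}
```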

– Scriptable

> I want to create a mechanism where we can trigger the app to start recognizing the voice command.

The simplest solution is to use the Voice Control feature (new in iOS 13), which works natively on every iOS 13 device and can, among many other things:

  • Be turned off ("go to sleep") and turned back on ("wake up") once activated (Accessibility - Voice Control switch ON).
  • Open a specific app when its icon is on screen (say "open yourapp").
  • Perform every action that is possible with touches.

Activate this feature in your device settings under Accessibility - Voice Control as follows: [screenshot of the Voice Control settings]

Take a look at the Customize Commands menu to explore the voice commands that can be customized as well.

This amazing new iOS 13 feature allows motor-impaired people to navigate their Apple mobile devices and computers autonomously: that's definitely what you're looking for... even if your target users aren't in this group. ;o)

You can control all the functionality of your device with your voice, exactly the same as with touches.

The only thing you may need to do as a developer is adapt the accessibilityUserInputLabels property if you need specific names for some items, as in the sketch below.
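
For instance, a minimal sketch giving a hypothetical button alternative spoken names (the button and label strings are illustrative):

```swift
import UIKit

// Give a control alternative spoken names for Voice Control (iOS 13+).
let captureButton = UIButton(type: .system)
captureButton.setTitle("●", for: .normal)
// A Voice Control user can now say "Tap Start capture" or "Tap Begin".
captureButton.accessibilityUserInputLabels = ["Start capture", "Begin"]
```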

– XLE_22