
I have a Google Home app that uses the Actions on Google client JavaScript library:

https://github.com/actions-on-google/actions-on-google-nodejs

The speech recognition is extremely good, but sometimes I run into a recognition context that it has a little trouble with. Is there any way to tell the speech recognition facility to boost the probability of certain words or phrases on a per-interaction basis? For example, if the user is being asked for a date, could the months of the year be given a higher probability than normal?

In some speech recognition engines you can provide vocabulary lists (also known as grammars) to the engine. Is there a way to do this with an Actions on Google app?

I am aware of Google's Speech Recognition API:

Google Speech Recognition API

But I don't know if that API is exposed or available through the Actions on Google service, or if that API supports grammars or context lists.

Robert Oschler

2 Answers


EDIT: It looks like Actions on Google does have a way to expect certain user input. See:

https://developers.google.com/actions/reference/rest/Shared.Types/QueryPatterns

from this question:

google action package how to define custom slot types?
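
If you go the Actions SDK route, those query patterns live in your action package. Purely as a rough sketch (the intent name, parameter name, and webhook URL below are made up, and the exact type names should be checked against the QueryPatterns reference above), it could look something like this:

```json
{
  "actions": [
    {
      "description": "Ask the user for a date",
      "name": "GET_DATE",
      "fulfillment": {
        "conversationName": "date_conversation"
      },
      "intent": {
        "name": "com.example.intent.GET_DATE",
        "trigger": {
          "queryPatterns": [
            "book an appointment for $SchemaOrg_Date:appointmentDate",
            "schedule something on $SchemaOrg_Date:appointmentDate"
          ]
        }
      }
    }
  ],
  "conversations": {
    "date_conversation": {
      "name": "date_conversation",
      "url": "https://example.com/webhook"
    }
  }
}
```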

Original Answer:

You can't change the way Google Home perceives voice input; it simply listens with its predefined vocabulary. You can, however, use https://dialogflow.com/ to require certain parameters from the conversation.

Using Dialogflow (formerly API.AI) you can make Date a required parameter before the conversation continues, and if the Home app consistently hears the same wrong input, you can define that input as a synonym for the value you actually wanted to receive.

Example: Google Home asks for the date, the user says "October", but the Home always hears "somethingelse". You can then set "somethingelse" as a synonym for "October" in your entity and handle it from there.
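
As a rough sketch of the required-parameter approach on the fulfillment side, assuming the actions-on-google v2 Node.js client and a Dialogflow intent named provide_date (name made up) with a required @sys.date parameter called date:

```javascript
'use strict';

// Sketch only: assumes the actions-on-google v2 client library and a
// Dialogflow intent named 'provide_date' (hypothetical) with a required
// @sys.date parameter called 'date'.
const {dialogflow} = require('actions-on-google');
const express = require('express');
const bodyParser = require('body-parser');

const app = dialogflow();

// Dialogflow only routes the request here once the required 'date'
// parameter has been filled; it re-prompts the user itself otherwise.
app.intent('provide_date', (conv, {date}) => {
  conv.ask(`Got it, ${date}. What time works for you?`);
});

// Standard Express hosting pattern from the library's README.
express()
  .use(bodyParser.json())
  .post('/fulfillment', app)
  .listen(3000);
```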

Besides that, there are a small number of helpers you can call in Actions on Google where it actually expects a certain kind of input (https://developers.google.com/actions/assistant/helpers#built-in_helper_intents), but even those won't boost the chance of the Home recognizing a particular phrase.
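
For completeness, here is a rough sketch of one of those helpers (the DateTime helper in the v2 Node.js client). The intent names are illustrative, and the follow-up intent has to be attached to the actions_intent_DATETIME event in the Dialogflow console:

```javascript
'use strict';

// Sketch only: intent names are made up; 'handle_datetime' must be wired
// to the actions_intent_DATETIME event in Dialogflow.
const {dialogflow, DateTime} = require('actions-on-google');

const app = dialogflow();

// Ask the Assistant itself to collect a date and time from the user.
app.intent('ask_for_datetime', (conv) => {
  conv.ask(new DateTime({
    prompts: {
      initial: 'When would you like to come in?',
      date: 'Which date works for you?',
      time: 'And what time?',
    },
  }));
});

// The helper's result arrives as the third handler argument.
app.intent('handle_datetime', (conv, params, datetime) => {
  conv.close(`Thanks, I got ${JSON.stringify(datetime)}.`);
});
```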

Bart

No, unfortunately. The speech recognition is totally abstracted away from you. I also had some challenges when trying to handle names and the like: it sometimes tries to map them to a similar-sounding English word.

Dialogflow (formerly API.AI), or whatever framework you use, comes later in the pipeline: it applies NLP (natural language processing) to the transcribed text to pull the keywords out of it.

So it's not really part of the speech-to-text stage, as far as I know.

So your Google Home takes care of the speech-to-text and sends the text to your NLP framework, which takes care of resolving the grammars. So far I have been able to solve all my issues with API.AI; it is really powerful.
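
A small sketch of what that pipeline means in practice: by the time your fulfillment runs, you only ever receive text and parsed parameters, never audio, so any correction of consistent mishearings has to happen on the text side, either as entity synonyms in Dialogflow or in code as below. The intent name, parameter name, and lookup table here are all made up:

```javascript
'use strict';

// Sketch only: 'provide_name' and its 'name' parameter are illustrative.
const {dialogflow} = require('actions-on-google');

const app = dialogflow();

// Transcripts the recognizer consistently gets wrong, mapped to what the
// user most likely meant. This is the code-side equivalent of adding
// entity synonyms in Dialogflow.
const KNOWN_MISHEARINGS = {
  'jon': 'John',
  'marc': 'Mark',
};

app.intent('provide_name', (conv, {name}) => {
  const heard = String(name).toLowerCase();
  const corrected = KNOWN_MISHEARINGS[heard] || name;
  conv.ask(`Nice to meet you, ${corrected}.`);
});
```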

Ayoub