
Background

I work on an app that can answer certain queries (phone-number queries, and perhaps others).

Google introduced a new feature on Android 6.0, called "Google Now On Tap" (AKA the "Assist API"), which allows the user to query about things shown on the screen (triggered by long-pressing the home button or by saying something), without the need to type anything.

Google provided a developer tutorial for it here.

The problem

I can't find any code snippet showing how to prepare an app for it.

The only thing I've noticed is that I can extend the Application class, add an OnProvideAssistDataListener inside, and register it.
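For reference, this is roughly what I imagine that registration would look like (a sketch only, assuming the standard `Application.registerOnProvideAssistDataListener()` API; the class name and the extra bundle key are my own placeholders):

```java
import android.app.Activity;
import android.app.Application;
import android.os.Bundle;

public class MyApplication extends Application
        implements Application.OnProvideAssistDataListener {

    @Override
    public void onCreate() {
        super.onCreate();
        // Implementing the interface alone is not enough; the listener
        // must also be registered explicitly.
        registerOnProvideAssistDataListener(this);
    }

    @Override
    public void onProvideAssistData(Activity activity, Bundle data) {
        // Called when the user triggers the assistant while one of this
        // app's activities is in the foreground. "extra_phone_query" is a
        // made-up key, just to show where extra context could go.
        data.putString("extra_phone_query", "+1-555-0100");
    }
}
```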

But it raises a lot of questions about how to do it.

Sadly, because this topic is so new, I can hardly find anything about it, so I'd like to ask my questions here.

The questions

1) Is there any sample, or at least a more detailed tutorial, for this new feature?

2) It is said in the docs:

In most cases, implementing accessibility support will enable the assistant to obtain the information it needs. This includes providing android:contentDescription attributes, populating AccessibilityNodeInfo for custom views, making sure custom ViewGroups correctly expose their children, and following the best practices described in “Making Applications Accessible”.
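If I understand that quote correctly, it means something like the sketch below: a custom view that exposes its content through the normal accessibility callbacks (assuming the standard `View` accessibility APIs; `PhoneNumberView` and the phone number are my own placeholders):

```java
import android.content.Context;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;

// A custom view that exposes its content to accessibility services --
// and therefore, per the docs, to the assistant as well.
public class PhoneNumberView extends View {

    private final String phoneNumber = "+1-555-0100";

    public PhoneNumberView(Context context) {
        super(context);
        // Equivalent to android:contentDescription in layout XML.
        setContentDescription("Phone number: " + phoneNumber);
    }

    @Override
    public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(info);
        // Populate the node that accessibility services (and, apparently,
        // the assistant) read from the view hierarchy.
        info.setText("Phone number: " + phoneNumber);
    }
}
```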

Why and how does this work with the app's accessibility features? What does it have to do with exposing child views (or views at all)? How could it even be about views, if the app isn't even running yet (since the feature can be triggered from any app, anywhere)?

What I think is that this callback is invoked only when my app is the foreground app; but if that's the case, how can I actually offer queries that appear for all apps, depending on what the input is?

3) Is the class that extends Application supposed to implement OnProvideAssistDataListener? If so, why does it also need to register it? If not, how could Google Now On Tap work with it? It can't just open every app that has such a class and check whether it registers...

4) The docs have a sample snippet which I didn't understand:

@Override
public void onProvideAssistContent(AssistContent assistContent) {
  super.onProvideAssistContent(assistContent);

  String structuredJson = new JSONObject()
       .put("@type", "MusicRecording")
       .put("@id", "https://example.com/music/recording")
       .put("name", "Album Title")
       .toString();

  assistContent.setStructuredData(structuredJson);
}

What does the new feature do with each key? Are they used by the app, or by Google Now On Tap? What are my options here? Is this where I define whether my app can handle the content the feature suggests? Is AssistContent supposed to be the input that I look at to decide whether my app can handle it or ignore it?
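To make sure I at least understand what the snippet produces, here is a self-contained reconstruction of the resulting JSON string (plain string concatenation, no Android SDK or org.json; the "@type"/"@id"/"name" keys appear to be schema.org vocabulary, and the values are the docs' own placeholders):

```java
public class StructuredDataDemo {

    // Builds the same structured-data JSON payload as the docs snippet,
    // so the exact shape of the string is easy to inspect.
    static String buildStructuredData(String type, String id, String name) {
        return "{\"@type\":\"" + type + "\","
                + "\"@id\":\"" + id + "\","
                + "\"name\":\"" + name + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(buildStructuredData(
                "MusicRecording",
                "https://example.com/music/recording",
                "Album Title"));
    }
}
```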

android developer
  • "Is there any sample" -- asking for off-site resources is considered off-topic for Stack Overflow. This question is also too broad. Beyond that, there are two sides to the communications: the assistant and the foreground activity for which the user wants the assistant's help. You seem to be conflating those two. `onProvideAssistContent()` is for the foreground activity, and most of what is documented is for the foreground activity. "how can I actually offer queries that appear for all apps" sounds like you want to write a replacement assistant, which is different. – CommonsWare Nov 29 '15 at 16:07
  • @CommonsWare I don't want to make a replacement. Isn't the assistant showing suggestions of what to search for, using the apps that are available? What exactly can be done using this new feature? For example, if there are phone numbers on the screen, and the app can search for phone numbers, wouldn't it make sense to show it and allow searching for them? – android developer Nov 29 '15 at 23:17
  • @CommonsWare Maybe the only part that should be interesting for the app is "Destination App" ? But this doesn't help much as well... – android developer Nov 29 '15 at 23:19
  • "Isn't the assistant showing suggestions of what to search, using apps that are available?" -- not Now On Tap. It shows suggestions from Google search, as I understand it. `onProvideAssistContent()` is for the foreground activity to augment what Now On Tap gets from the accessibility APIs, to use as search fodder. Where Now On Tap searches is completely up to Google and probably has nothing much to do with whatever else is on the device. – CommonsWare Nov 29 '15 at 23:26
  • @CommonsWare So apps don't really have a lot to offer to this feature? I thought that once you use it, it can suggest you which app to use based on the content that was scanned. – android developer Nov 30 '15 at 00:48
  • Quoting [Google](https://support.google.com/websearch/answer/6304517): "When you use Now on tap, you "copy and paste" what you're looking at into Google to get information that might be useful based on what you're doing". – CommonsWare Nov 30 '15 at 00:51
  • @CommonsWare So... you say it doesn't do much... What about the "Destination App" part in the link I've provided? Can this help in any way? – android developer Nov 30 '15 at 20:13
  • "you say it doesn't do much" -- no, I said it does a Google search. "What about the part..." -- sorry, I do not know what you are referring to. – CommonsWare Nov 30 '15 at 20:28
  • @CommonsWare Well, I meant that a Google search doesn't provide much for developers. About the part I've written, just read it in the link: http://developer.android.com/training/articles/assistant.html – android developer Nov 30 '15 at 22:12
  • Well, that paragraph seems self-explanatory. Google search via Now On Tap is no different than any other Google search on the device. – CommonsWare Nov 30 '15 at 22:16
  • @CommonsWare I see. Thank you. Why didn't you put it all into an answer, instead of comments? – android developer Nov 30 '15 at 22:56

0 Answers