
My application needs to know whether the phone is in a pocket or in the user's hand; based on that, it sets a few user-specific parameters before moving on to its next tasks.

I have read various blogs and the Android developer documentation for SensorManager, but none of them helped. The only related question I found on Stack Overflow is this one, with no solution, though a comment on it suggests using the Awareness API. I am going through it; my understanding is that User Activity is the relevant context for this, but I may be wrong. If anyone has worked on this or is researching it, please share your observations so I can make further progress.

Is there any way to find out whether the phone is in a pocket or not? If yes, can somebody tell me how to do that?

Any guidance/links to the concepts are helpful.

Thanks.

Phantômaxx
Shree

3 Answers


I implemented this in my project. I got readings from the light sensor, accelerometer, and proximity sensor. Keep in mind that it only approximately detects whether the device is in a pocket.

Getting the current readings from the sensors (accelerometer, proximity, and light):

// Fields, initialized to -1 until the first reading arrives:
// float[] g; int inclination = -1; float rp = -1, rl = -1;

@Override
public void onSensorChanged(SensorEvent event) {

    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        g = event.values.clone();

        // Normalize the gravity vector
        double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);

        g[0] = (float) (g[0] / norm_Of_g);
        g[1] = (float) (g[1] / norm_Of_g);
        g[2] = (float) (g[2] / norm_Of_g);

        // Tilt of the device in degrees (0 = flat, 90 = vertical, 180 = face down)
        inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
        accReading.setText("XYZ: " + round(g[0]) + ",  " + round(g[1]) + ",  " + round(g[2]) + "  inc: " + inclination);
    }
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        proximityReading.setText("Proximity Sensor Reading: " + event.values[0]);
        rp = event.values[0];
    }
    if (event.sensor.getType() == Sensor.TYPE_LIGHT) {
        lightReading.setText("LIGHT: " + event.values[0]);
        rl = event.values[0];
    }
    // Only classify once all three sensors have reported at least once
    if ((rp != -1) && (rl != -1) && (inclination != -1)) {
        main.detect(rp, rl, g, inclination);
    }
}

Then, based on this data, I decide whether or not the device is in a pocket:

public void detect(float prox, float light, float[] g, int inc) {
    // In a pocket: proximity sensor covered, dark, facing into the body,
    // and roughly vertical (inclination between 75 and 100 degrees)
    if (prox < 1 && light < 2 && g[1] < -0.6 && (inc > 75 && inc < 100)) {
        pocket = 1;
        // IN POCKET
    }
    if (prox >= 1 && light >= 2 && g[1] >= -0.7) {
        if (pocket == 1) {
            playSound();
            pocket = 0;
        }
        // OUT OF POCKET
    }
}

Keep in mind that it's not fully accurate.
Code: https://github.com/IvanLudvig/PocketSword
Blog post: https://ivanludvig.github.io/blog/2019/06/21/detecting-device-in-a-pocket-android.html
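The inclination step in the code above can be sanity-checked in isolation. Here is a minimal pure-Java sketch of the same computation (the class and method names are mine, not from the original project):

```java
public class InclinationDemo {
    // Tilt angle of the device in degrees, from a raw accelerometer vector:
    // 0 = flat with screen up, 90 = vertical, 180 = flat with screen down.
    static int inclination(float x, float y, float z) {
        double norm = Math.sqrt(x * x + y * y + z * z);
        return (int) Math.round(Math.toDegrees(Math.acos(z / norm)));
    }

    public static void main(String[] args) {
        System.out.println(inclination(0f, 0f, 9.81f));  // flat on a table
        System.out.println(inclination(0f, 9.81f, 0f));  // held upright
        System.out.println(inclination(0f, 0f, -9.81f)); // face down
    }
}
```

Normalizing first means the result is independent of the accelerometer's magnitude, so the same code works whether the device reports in m/s² or g-units.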


The only way to get somewhat close to a solution is to use the sensors below. The Google Awareness API won't solve the problem, as it has an entirely different purpose.

  • Light sensor (environment sensor)
  • Proximity sensor (position sensor)

The Android platform provides four sensors that let you monitor various environmental properties. You can use these sensors to monitor

  • relative ambient humidity
  • illuminance
  • ambient pressure
  • ambient temperature

All four environment sensors are hardware-based and are available only if a device manufacturer has built them into a device. With the exception of the light sensor, which most device manufacturers use to control screen brightness, environment sensors are not always available on devices. Because of this, it's particularly important that you verify at run time whether an environment sensor exists before you attempt to acquire data from it.

The light sensor can be used to measure light intensity. For example, many phones have an auto-brightness mode; this feature uses the light sensor to adjust the screen brightness according to ambient light. There are several units for measuring light intensity, such as lux, candela, and lumen.

Considering this, there will be a considerable difference in light intensity when your phone is in a pocket versus outside it.

However, the same readings will occur when you are operating the phone in a dark room, or anywhere the light intensity is quite low. Distinguishing between such cases is the real challenge. You can use the other environment sensors in combination with the light sensor to reach a more reliable outcome, but I assume a fully accurate solution is dicey.
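The combination idea above can be sketched as a plain decision function: requiring both a covered proximity sensor and darkness avoids the dark-room false positive that light alone would produce. The thresholds below are illustrative assumptions, not calibrated values:

```java
public class PocketHeuristic {
    // Rough heuristic: in a pocket only if the proximity sensor is covered
    // AND the surroundings are dark. Thresholds (5 cm, 10 lx) are
    // illustrative assumptions; many devices report proximity as a
    // binary near/far value rather than a true distance.
    static boolean likelyInPocket(float proximityCm, float lightLux) {
        boolean covered = proximityCm < 5f;
        boolean dark = lightLux < 10f;
        return covered && dark; // a dark room alone won't trigger it
    }

    public static void main(String[] args) {
        System.out.println(likelyInPocket(0f, 2f));   // covered and dark
        System.out.println(likelyInPocket(8f, 2f));   // dark room, uncovered
        System.out.println(likelyInPocket(0f, 300f)); // covered, but bright
    }
}
```

Feeding in the accelerometer inclination as a third signal, as the accepted answer does, tightens this further at the cost of more false negatives.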

To study these sensors further, refer to the following links:

https://developer.android.com/guide/topics/sensors/sensors_environment.html
https://developer.android.com/guide/topics/sensors/sensors_position.html

The Google Awareness API won't work for this case, as it provides an entirely different kind of solution. It offers two APIs:

  • Fence API
  • Snapshot API

You can use the Snapshot API to get information about the user's current environment. Using the Snapshot API, you can access a variety of context signals:

  • Detected user activity, such as walking or driving.
  • Nearby beacons that you have registered.
  • Headphone state (plugged in or not)
  • Location, including latitude and longitude.
  • Place where the user is currently located.
  • Weather conditions in the user's current location.

Using the Fence API, you can define fences based on context signals such as:

  • The user's current location (lat/lng)
  • The user's current activity (walking, driving, etc.).
  • Device-specific conditions, such as whether the headphones are plugged in.
  • Proximity to nearby beacons.
Amardeep
  • Thanks for your response. Can you tell me which cases can be considered to detect that the phone is in a pocket? Because if one has set a longer screen timeout, we definitely can't use screen brightness as the main parameter. – Shree Jan 30 '18 at 11:22
  • As I said, the light sensor measures the intensity of light in the nearby environment, so the screen timeout won't affect it. You can start developing your solution around the environment and position sensors. As I also said, a perfect outcome is dicey, because even devices that offer a pocket mode as a feature can be fooled. – Amardeep Jan 30 '18 at 11:26
  • Your suggestion helped me a lot. Thanks. – Shree Mar 20 '18 at 06:25

For a cross-platform solution, you can now use the NumberEight SDK for this task.

It performs a wide variety of context recognition tasks on both iOS and Android including:

  • Real-time physical activity detection
  • Device position detection (i.e. presence in pocket)
  • Motion detection
  • Reachability
  • Local weather

It can also record user context for reports and analysis via the online portal.

How to detect whether a phone is in a pocket:

For example, to subscribe to device-position updates in Kotlin, you would do:

val ne = NumberEight()

ne.onDevicePositionUpdated { glimpse ->
    if (glimpse.mostProbable.state == State.InPocket) {
        Log.d("MyApp", "Phone is in a pocket!")
    }
}

or in Java:

NumberEight ne = new NumberEight();

ne.onDevicePositionUpdated(
    new NumberEight.SubscriptionCallback<NEDevicePosition>() {
        @Override
        public void onUpdated(@NonNull Glimpse<NEDevicePosition> glimpse) {
            if (glimpse.mostProbable.state == State.InPocket) {
                Log.d("MyApp", "Phone is in a pocket!");
            }
        }
    });

Here are some iOS and Android example projects.

Disclosure: I'm one of the developers.

Chris Watts