
I would like to know if there are any libraries/algorithms/techniques that help extract the user's context (walking/standing) from accelerometer data (collected from any smartphone)?

For example, I would collect accelerometer data every 5 seconds for a definite period of time and then identify the user's context (e.g. for the first 5 minutes the user was walking, then he was standing for a minute, and then he continued walking for another 3 minutes).
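For reference, this is roughly how I would collect the raw samples on Android (just a sketch; the class and field names are illustrative, and the sampling rate actually delivered is device-dependent):

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import java.util.ArrayList;
import java.util.List;

public class AccelerometerLogger implements SensorEventListener {
    private final SensorManager sensorManager;
    private final Sensor accelerometer;
    private final List<float[]> samples = new ArrayList<>();

    public AccelerometerLogger(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    public void start() {
        // SENSOR_DELAY_NORMAL is coarse; the real rate varies per device.
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values = {x, y, z} acceleration in m/s^2 (gravity included).
        samples.add(new float[]{event.values[0], event.values[1], event.values[2]});
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }

    public List<float[]> getSamples() {
        return samples;
    }
}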

Thank you very much in advance :)

duncanportelli

4 Answers


Check the new activity recognition APIs: http://developer.android.com/google/play-services/location.html
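For example, once updates are delivered to a PendingIntent, the result can be read roughly like this (a sketch only; how you request updates with the activity recognition client differs between Play Services versions):

import android.app.IntentService;
import android.content.Intent;
import android.util.Log;

import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityRecognitionService extends IntentService {

    public ActivityRecognitionService() {
        super("ActivityRecognitionService");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // The Play Services API delivers a result inside the Intent.
        if (ActivityRecognitionResult.hasResult(intent)) {
            ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
            DetectedActivity activity = result.getMostProbableActivity();
            if (activity.getType() == DetectedActivity.WALKING) {
                Log.d("ActivityRecognition", "Walking, confidence " + activity.getConfidence());
            } else if (activity.getType() == DetectedActivity.STILL) {
                Log.d("ActivityRecognition", "Standing still, confidence " + activity.getConfidence());
            } else {
                Log.d("ActivityRecognition", "Other activity type: " + activity.getType());
            }
        }
    }
}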

Jambaaz

It's still a research topic; take a look at this paper, which discusses an algorithm for it:

http://www.enggjournals.com/ijcse/doc/IJCSE12-04-05-266.pdf
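This is not the paper's algorithm, but to illustrate the general window-based idea such work builds on, a toy variance-threshold classifier over the accelerometer magnitude could look like this (the threshold is a placeholder and needs tuning on real data):

public class SimpleActivityClassifier {

    public enum Activity { WALKING, STANDING }

    // Assumed tuning parameter, not taken from the paper.
    private final double varianceThreshold;

    public SimpleActivityClassifier(double varianceThreshold) {
        this.varianceThreshold = varianceThreshold;
    }

    /** Classifies one window of magnitude samples sqrt(x^2 + y^2 + z^2). */
    public Activity classify(double[] magnitudes) {
        double mean = 0.0;
        for (double m : magnitudes) {
            mean += m;
        }
        mean /= magnitudes.length;

        double variance = 0.0;
        for (double m : magnitudes) {
            variance += (m - mean) * (m - mean);
        }
        variance /= magnitudes.length;

        // Walking produces much larger fluctuations than standing still.
        return variance > varianceThreshold ? Activity.WALKING : Activity.STANDING;
    }
}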

nayab

I don't know of any such library.

It is a very time-consuming task to write such a library. Basically, you would build a database of the "user contexts" that you wish to recognize.

Then you collect data and compare it to those in the database. As for how to compare, see Store orientation to an array - and compare; the same holds for the accelerometer.
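A rough sketch of that compare step, using nearest-neighbour matching of a recorded window against labelled reference windows (the feature set and the Euclidean distance here are my placeholders, not a recommendation from the linked question):

import java.util.List;

public class TemplateMatcher {

    public static class LabeledWindow {
        public final String label;      // e.g. "walking", "standing"
        public final double[] features; // e.g. mean, variance, dominant frequency

        public LabeledWindow(String label, double[] features) {
            this.label = label;
            this.features = features;
        }
    }

    /**
     * Returns the label of the closest reference window by Euclidean distance.
     * Assumes all feature vectors have the same length.
     */
    public static String classify(double[] features, List<LabeledWindow> database) {
        String best = null;
        double bestDistance = Double.MAX_VALUE;
        for (LabeledWindow reference : database) {
            double distance = 0.0;
            for (int i = 0; i < features.length; i++) {
                double d = features[i] - reference.features[i];
                distance += d * d;
            }
            if (distance < bestDistance) {
                bestDistance = distance;
                best = reference.label;
            }
        }
        return best;
    }
}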

Ali

Walking/running data is analogous to heart-rate data in a lot of ways. In terms of filtering the noise and getting smooth peaks, look into noise-filtering and peak-detection algorithms. The following is used to obtain heart-rate information for heart patients and should be a good starting point: http://www.docstoc.com/docs/22491202/Pan-Tompkins-algorithm-algorithm-to-detect-QRS-complex-in-ECG

Think about how you want to filter out the noise and detect peaks; the filters will depend on the raw data you gather, but it's good to have a general idea of what kind of filtering you want to apply. Then think about what needs to be done once you have the filtered data: in your case, how you would design an algorithm that determines when the data indicates activity (walking, running, etc.) and when it shows the user being stationary. This is a fairly challenging problem once you consider the dynamics of the device itself (how it's positioned while the user walks or runs) and the fact that there are very few, if any, benchmarked algorithms that do this with raw smartphone data.
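As a toy illustration of those two steps (smoothing, then peak detection) on the accelerometer magnitude signal (this is not the Pan-Tompkins algorithm; window sizes and thresholds are assumptions that need tuning on real phone data):

public class StepPeakDetector {

    /** Simple moving-average filter to suppress high-frequency noise. */
    public static double[] smooth(double[] signal, int windowSize) {
        double[] out = new double[signal.length];
        for (int i = 0; i < signal.length; i++) {
            int start = Math.max(0, i - windowSize + 1);
            double sum = 0.0;
            for (int j = start; j <= i; j++) {
                sum += signal[j];
            }
            out[i] = sum / (i - start + 1);
        }
        return out;
    }

    /** Counts local maxima that exceed a threshold above the mean (roughly, steps). */
    public static int countPeaks(double[] smoothed, double threshold) {
        double mean = 0.0;
        for (double v : smoothed) {
            mean += v;
        }
        mean /= smoothed.length;

        int peaks = 0;
        for (int i = 1; i < smoothed.length - 1; i++) {
            boolean localMax = smoothed[i] > smoothed[i - 1] && smoothed[i] >= smoothed[i + 1];
            if (localMax && smoothed[i] - mean > threshold) {
                peaks++;
            }
        }
        return peaks;
    }
}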

Start with determining the appropriate algorithms, and then tackle the complexities (mentioned above) one by one.

aspen100