
I am creating an application in which an iPhone is placed on a golf club (racket); a separate cover is made for it. I want to get an array of points that describes the path of the racket's movement.

For example, I start collecting data when the racket is on the ground. The user then prepares for the shot: he takes the racket back and then hits the shot by moving the racket forward. I want to capture all these points in 3D and plot them on screen as a 2D projection. I have seen many similar questions and the accelerometer and Core Motion framework documentation, but I could not find a way of doing this.

I hope I have explained the question properly. Can you suggest a formula, or a way to process the data, to achieve this?

Thanks in advance.

Apurv
  • This is a bit of a 'big' question, i.e. too broad. Make a start and break it down into simple, solvable problems. – Widor Feb 23 '12 at 17:54
  • @Widor Thanks. I want to convert the readings into physical points in the 3D world. Can you suggest a starting point? – Apurv Feb 23 '12 at 17:57
  • Well I don't know what you already tried so I'd be stabbing in the dark. First step for me would be to get your hands on the iOS dev tools. – Widor Feb 23 '12 at 17:58

2 Answers


You cannot track these movements in 3D space. Position would have to come from integrating the acceleration twice, and the sensor noise and bias that get integrated along with it make the accumulated position error blow up within seconds.

But you can track the orientation of the racket and that should work well.

I have implemented a sensor fusion algorithm for the Shimmer platform, and it is not a trivial task. I would use Core Motion and would not try to create my own sensor fusion algorithm.
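
A minimal sketch of what that looks like (modern Swift syntax shown; the 60 Hz update rate is an arbitrary choice):

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startTrackingOrientation() {
    guard motionManager.isDeviceMotionAvailable else { return }

    // 60 Hz is arbitrary; pick whatever rate your plot needs.
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0

    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Core Motion has already fused the gyroscope and
        // accelerometer readings; attitude holds the result.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```

Core Motion does the sensor fusion internally, so there is no formula for you to implement.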

Hope this helps, good luck!

Ali
  • Thanks Ali, +1 for that. Can you please provide some sample code or a formula for tracking the orientation? – Apurv Feb 24 '12 at 04:35
  • @Apurv Sorry, I have never programmed any iOS device. You do not need any formula; the computation is done in Core Motion. In particular, I would start with `CMAttitude`. – Ali Feb 24 '12 at 09:11
  • I also arrived at the same point. Thank you very much for the help. – Apurv Feb 24 '12 at 11:45

I tried the sensor fusion algorithm developed by Madgwick, but on my device its output is similar to the Core Motion attitude output. I don't have the possibility to test the attitude output on other iPhones, but in my case the problem is the yaw angle: even when the iPhone is lying still on a table, the yaw angle tends to be unstable, probably due to the distinct chip placement of the z-axis gyro.
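
The drift is easy to reproduce by logging yaw while the phone lies still (a minimal Swift sketch; the 10 Hz rate is arbitrary):

```swift
import CoreMotion

let manager = CMMotionManager()

func logYawDrift() {
    manager.deviceMotionUpdateInterval = 1.0 / 10.0  // arbitrary rate
    // The default reference frame does not use the magnetometer, so yaw
    // has no absolute reference and wanders even when the phone is still.
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let m = motion else { return }
        print("t=\(m.timestamp) yaw=\(m.attitude.yaw)")
    }
}
```

Requesting a magnetometer-referenced frame with `startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to:withHandler:)` gives yaw an absolute heading reference, at the cost of requiring magnetometer calibration.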

Batti
  • I would not use yaw, pitch and roll. [They are unstable](http://stackoverflow.com/a/5578867/341970) and it is not your phone's fault. – Ali Feb 24 '12 at 09:13
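
For completeness, reading the quaternion instead of Euler angles might look like this (a minimal Swift sketch; `attitude.rotationMatrix` works equally well):

```swift
import CoreMotion

let manager = CMMotionManager()

func trackAttitudeWithoutEulerAngles() {
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let q = motion?.attitude.quaternion else { return }
        // Work with the quaternion (or attitude.rotationMatrix) directly;
        // converting to yaw/pitch/roll reintroduces the instability.
        print("q = (\(q.x), \(q.y), \(q.z), \(q.w))")
    }
}
```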