
I am working on an app that requires the user to aim their iPhone at the sun in order to trigger a special event.

I can retrieve the device's 3D orientation quaternion from the gyroscope via the CoreMotion framework, and from it derive the yaw, pitch and roll angles. I can also compute the sun's azimuth and zenith angle from the current date and time (GMT) and the device's latitude and longitude. What I am trying to figure out next is how to compare these two sets of values (phone orientation and sun position) to accurately detect when the device is aligned with the sun.
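
For reference, this is roughly how I read the device attitude at the moment (a simplified sketch, not the actual app code; the function and variable names are just from my test setup):

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startTrackingAttitude() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    // A reference frame that includes the compass, so yaw is measured
    // against true north and can be compared with the sun's azimuth.
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical, to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let quaternion = attitude.quaternion   // full 3D orientation
        let yaw = attitude.yaw                 // radians
        let pitch = attitude.pitch
        let roll = attitude.roll
        // ...this is where I'd like to compare against the sun's azimuth / zenith angle
        _ = (quaternion, yaw, pitch, roll)
    }
}
```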

Any ideas on how to achieve that?

Broco
  • Care to share what you have so far? Is this an augmented reality app, and will the sun's orbit path be drawn on the screen hence why you need to detect the alignment? – Pavan Jan 21 '14 at 01:13
  • I'm not planning on drawing the actual sun's orbit I just want to display a circle on a UIView that moves around as the user is moving his device. The drawn circle would get closer to the center of the screen as the phone is aimed towards the sun. This would be visually similar to the bubble level app on iOS7. – Broco Jan 21 '14 at 10:10
  • Seems like a great question for the maths folk at maths.stackexchange.com :P Please do post a forward link so that I can check up on the answers submitted, as I'm interested to see how it will work out for you :) – Pavan Jan 21 '14 at 16:11
  • 1
    I finally managed to get it working! I came across the Apple code sample 'pARk' which is a simple augmented reality app. I basically took their implementation of the camera and projection matrix and all I pretty much had to do is multiply these matrices in a certain order with the sun's 3D position vector (translated from spherical coordinates to cartesian) and I could retrieve screen coordinates. My implementation seems to be working pretty well and I'm gonna post a more elaborate answer later on. – Broco Feb 03 '14 at 09:59
  • That is very nice of you to do. I would be more than grateful if you could also post a GitHub link to a project that does simply what you described with the sun's coordinates. I would love to test your solution and see your blob hovering over the sun when I point my device at the sun, and still see the blob when I aim down at the earth at night time! Thanks for getting back, and I truly look forward to your implementation Broco – Pavan Feb 03 '14 at 18:20
  • No problem mate! I'm gonna set up a small project on Github just for this feature as I can't share the whole current app due to NDA restrictions. – Broco Feb 03 '14 at 20:24
  • Dude, that's great, I hope you can get this done asap, as I'm eagerly waiting for this exciting project of yours and to test it out on my device! I once saw an app that was able to do what I wrote in my previous comment and thought it was super cool. Do you reckon you will be able to get a sample project running shortly? – Pavan Feb 03 '14 at 21:28
  • Also, I agree, just a small project that highlights what we discussed up above in our comments is more than enough. You did mention that you're under NDA restriction? Lol, dude, what features are you using that you're restricted by NDA? I'm a licensed developer too, but can't see any features from the next release that you might be using in your project that would put you under NDA restriction. Is there a link to some features coming up in the next release that were significant enough for you to implement in your project? Excuse my curiosity. Perhaps a link and I can log into my dev account – Pavan Feb 03 '14 at 21:31
  • Ahah, no it's not Apple's NDA, just the client I'm building the app for... Obviously I can't share the whole app because of the sensitive data that could be in it. But yeah, the code sample should be up on GitHub tomorrow if I get the time :) – Broco Feb 03 '14 at 23:28
  • Ahah! I was going to say, man, you had me excited thinking there was a new Apple NDA, which of course means new content for developers ;) you had me searching on the dev site for at least a few seconds :P I look forward to your git repository Broco! – Pavan Feb 04 '14 at 04:02
  • Just posted an answer and put the code on my Github: https://github.com/brocoo/SunTracker – Broco Feb 04 '14 at 10:43

1 Answer


I finally managed to solve the problem.

To achieve this I used an augmented reality code sample available in the Apple developer resources: pARk.

The idea was first to convert the sun's spherical coordinates (azimuth and zenith angle) to cartesian coordinates in order to get its position in the sky as a simple direction vector: {x, y, z}. The formula is available on Wikipedia: Spherical coordinate system. As the distance to the sun (the radius in spherical coordinates) doesn't really matter here, I used 1.
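
With the zenith angle measured down from the vertical and both angles in radians, the conversion boils down to something like this (a sketch; the function name is mine, and the exact mapping of the azimuth onto the x/y axes has to match whatever world frame you use):

```swift
import Foundation
import simd

// Sun direction as a unit vector, from the zenith angle (theta, measured
// down from straight up) and the azimuth (phi), both in radians.
// The radius is 1 since only the direction matters.
func sunDirection(zenith theta: Double, azimuth phi: Double) -> simd_double3 {
    return simd_double3(sin(theta) * cos(phi),
                        sin(theta) * sin(phi),
                        cos(theta))
}
```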

Using the device gyroscope and the CoreMotion framework I was then able to get the iPhone's rotation matrix. Using the geometry functions from the 'pARk' code sample I could then compute the camera projection matrix. I multiply the projection matrix by the rotation matrix, and then multiply the resulting matrix by the sun's coordinate vector.
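
In rough steps the pipeline looks like this (a simplified sketch using simd instead of the 4x4 matrix helpers that ship with pARk; the helper names are mine, and depending on your axis conventions and the interface orientation you may need an extra transpose or rotation):

```swift
import CoreMotion
import Foundation
import simd

// 1. The device attitude from CoreMotion, promoted to a 4x4 matrix.
func rotationMatrix(from attitude: CMAttitude) -> simd_double4x4 {
    let r = attitude.rotationMatrix
    return simd_double4x4(rows: [
        simd_double4(r.m11, r.m12, r.m13, 0),
        simd_double4(r.m21, r.m22, r.m23, 0),
        simd_double4(r.m31, r.m32, r.m33, 0),
        simd_double4(0,     0,     0,     1)
    ])
}

// 2. A basic perspective projection built from the camera's vertical field
//    of view and the screen's aspect ratio (the kind of matrix pARk's
//    geometry helpers produce).
func projectionMatrix(fovyRadians fovy: Double, aspect: Double,
                      near: Double = 0.1, far: Double = 10.0) -> simd_double4x4 {
    let f = 1.0 / tan(fovy / 2.0)
    return simd_double4x4(rows: [
        simd_double4(f / aspect, 0, 0, 0),
        simd_double4(0, f, 0, 0),
        simd_double4(0, 0, (far + near) / (near - far), (2 * far * near) / (near - far)),
        simd_double4(0, 0, -1, 0)
    ])
}

// 3. Projection x rotation, applied to the sun's direction vector.
func projectSun(_ sunDirection: simd_double3, attitude: CMAttitude,
                projection: simd_double4x4) -> simd_double4 {
    let camera = projection * rotationMatrix(from: attitude)
    return camera * simd_double4(sunDirection.x, sunDirection.y, sunDirection.z, 1)
}
```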

This gives a vector with screen coordinates for the sun. By positioning a UIView at these x and y values I can finally see the sun marker moving around as I move the device.
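
That last step is just a perspective divide followed by a mapping into the view's coordinate space, something like this (again a sketch with hypothetical helper names; whether you need to flip the y axis depends on your conventions):

```swift
import UIKit
import simd

// Convert the projected (clip space) sun vector to a point inside a
// container view and move a marker view there. Only meaningful while
// w > 0, i.e. while the sun is in front of the camera.
func updateSunMarker(_ marker: UIView, clip: simd_double4, in container: UIView) {
    guard clip.w > 0 else {
        marker.isHidden = true          // the sun is behind the device
        return
    }
    marker.isHidden = false
    // Perspective divide gives normalized device coordinates in -1...1,
    // which are then mapped onto the view's bounds.
    let ndcX = CGFloat(clip.x / clip.w)
    let ndcY = CGFloat(clip.y / clip.w)
    let size = container.bounds.size
    marker.center = CGPoint(x: (ndcX * 0.5 + 0.5) * size.width,
                            y: (1.0 - (ndcY * 0.5 + 0.5)) * size.height)
}
```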

The code for this feature is available on my GitHub; feel free to use and share!

Broco