I need to get a CLLocation object whose latitude and longitude have a precision of only 6 decimal places. But when I create a CLLocation object from floating-point values, I get a CLLocation object whose latitude and longitude have greater precision (up to 15 decimal places).
What I have is latitude = 10.268408, longitude = 76.353965
I use the following code to create a CLLocation object with the above coordinates.
CLLocation *createdLocation = [[CLLocation alloc] initWithLatitude:latitude
                                                          longitude:longitude];
After creating the above object, I printed the value of createdLocation.coordinate. What I get is (latitude = 10.268408101671659, longitude = 76.353965649742264).
So how and why does iOS automatically extend my 6-decimal-place values to 15 decimal places?
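For reference, here is roughly how I create and print the location. This is only a minimal sketch; the variable declarations are added here for completeness, and in my actual code latitude and longitude are plain float variables.

#import <CoreLocation/CoreLocation.h>

float latitude  = 10.268408f;
float longitude = 76.353965f;

CLLocation *createdLocation = [[CLLocation alloc] initWithLatitude:latitude
                                                          longitude:longitude];

// Printing with the default %f shows only 6 decimal places...
NSLog(@"latitude = %f, longitude = %f",
      createdLocation.coordinate.latitude,
      createdLocation.coordinate.longitude);

// ...but asking for more digits (or inspecting coordinate in the debugger)
// shows the extra precision described above.
NSLog(@"latitude = %.15f, longitude = %.15f",
      createdLocation.coordinate.latitude,
      createdLocation.coordinate.longitude);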
Update
Now I have inserted the latitude and longitude into a Core Data database. I then fetch the latitude and longitude back from the DB and create a CLLocation object from them, and now the precision is exactly what I wanted (shown below):
latitude = 10.268408, longitude = 76.353965
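Roughly, the Core Data round trip looks like this. It is a simplified sketch: the entity name "Place", the attribute names "latitude"/"longitude", and the existing NSManagedObjectContext called context are placeholders for my actual setup.

#import <CoreData/CoreData.h>
#import <CoreLocation/CoreLocation.h>

// Saving the values
NSManagedObject *place = [NSEntityDescription insertNewObjectForEntityForName:@"Place"
                                                        inManagedObjectContext:context];
[place setValue:@(latitude) forKey:@"latitude"];
[place setValue:@(longitude) forKey:@"longitude"];

NSError *saveError = nil;
[context save:&saveError];

// Fetching them back and recreating the CLLocation
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Place"];
NSError *fetchError = nil;
NSManagedObject *fetched = [[context executeFetchRequest:request error:&fetchError] firstObject];

CLLocationDegrees fetchedLatitude  = [[fetched valueForKey:@"latitude"] doubleValue];
CLLocationDegrees fetchedLongitude = [[fetched valueForKey:@"longitude"] doubleValue];

CLLocation *locationFromDB = [[CLLocation alloc] initWithLatitude:fetchedLatitude
                                                         longitude:fetchedLongitude];

// Logging this one gives the 6-decimal-place values shown above.
NSLog(@"latitude = %f, longitude = %f",
      locationFromDB.coordinate.latitude,
      locationFromDB.coordinate.longitude);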
What difference did writing to the DB make? Why does this not work with float variables that have 6 decimal places of precision?