I design databases and have been studying this question for a while. We use an off-the-shelf application with an Oracle backend where the data fields were defined to allow 17 decimal places. Ridiculous! That's precision on the order of a picometer; no GPS instrument in the world is anywhere near that accurate. So let's put aside 17 decimal places and deal with the practical. The Government guarantees their system is good to "a 'worst case' pseudorange accuracy of 7.8 meters at a 95% confidence level," but then goes on to say that actual FAA measurements (using their high-quality instruments) show GPS readings are usually good to within a meter.
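To put decimal places of a degree into real-world distances, here's a quick back-of-the-envelope sketch (my own illustration, not tied to any particular application). It uses the equatorial circumference, so the numbers are the east-west resolution at the equator; longitude spacing shrinks toward the poles, while latitude spacing stays roughly constant.

```python
# Approximate size of one unit in the n-th decimal place of a degree,
# measured along the equator (~40,075 km circumference / 360 degrees).
METERS_PER_DEGREE = 40_075_000 / 360  # roughly 111,319 m

for decimals in (4, 5, 6, 17):
    resolution_m = METERS_PER_DEGREE * 10 ** -decimals
    print(f"{decimals:2d} decimal places ~= {resolution_m:.3g} m")

# Output (rounded):
#  4 decimal places ~= 11.1 m       (cell phone / web-map territory)
#  5 decimal places ~= 1.11 m       (good consumer GPS)
#  6 decimal places ~= 0.111 m      (high-quality instruments)
# 17 decimal places ~= 1.11e-12 m   (a picometer -- physically meaningless)
```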
So you have to ask yourself two questions:
1) What is the source of your values?
2) What will the data be used for?
Cell phones are not particularly accurate, and Google/MapQuest readings are probably only good to 4 or 5 decimal places. A high-quality GPS instrument might get you 6 (within the United States). Capturing more than that is a waste of typing and storage space. Furthermore, if any searches are done on the values, it helps the user to know that 6 is the most he or she should ever need to enter (and obviously any search value should first be rounded to the same precision as the data being searched, as in the sketch below).
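For example, here's a minimal sketch of that search-side rounding (the function name and the 5-decimal precision are my own assumptions; match the precision to whatever your columns actually store):

```python
STORED_DECIMALS = 5  # assumed storage precision; use 6 if your data warrants it


def normalize_coord(value: float, decimals: int = STORED_DECIMALS) -> float:
    """Round a latitude/longitude to the precision actually stored."""
    return round(value, decimals)


# A user pastes an over-precise coordinate from some other tool:
search_lat = 40.7127753123456
search_lon = -74.0059728123456
print(normalize_coord(search_lat), normalize_coord(search_lon))
# -> 40.71278 -74.00597  (now it matches what a 5-decimal column would hold)
```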
And if all you're going to do is view a location in Google Maps or put it into a GPS to get there, four or five decimal places are plenty.
I have to laugh at people around here entering all those digits. And where exactly are they taking that measurement? Front door knob? Mailbox out front? Center of building? Top of cell tower? AND... is everyone consistently taking it at the same place?
For good database design, I would accept user input with maybe a few more than five decimal places, then round and store only five for consistency [maybe six if your instruments are good and your end use warrants it]. A sketch of that ingest-side cleanup follows.
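Here's a minimal ingest-side sketch of that rule (illustrative names only; the 5-decimal precision and the range checks are assumptions, not taken from any particular application):

```python
STORED_DECIMALS = 5  # bump to 6 only if your instruments and end use justify it


def clean_coordinate(lat: float, lon: float, decimals: int = STORED_DECIMALS):
    """Validate range and round to the precision we actually store."""
    if not -90.0 <= lat <= 90.0:
        raise ValueError(f"latitude out of range: {lat}")
    if not -180.0 <= lon <= 180.0:
        raise ValueError(f"longitude out of range: {lon}")
    return round(lat, decimals), round(lon, decimals)


# A reading with far more digits than the device could possibly justify:
lat, lon = clean_coordinate(38.889484712345678, -77.035278998765432)
print(lat, lon)  # 38.88948 -77.03528
```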