
I'm trying to detect the timezone on Google App Engine, based on the latitude and longitude returned in the X-AppEngine-CityLatLong request header.
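For reference, reading the header in a webapp2 handler looks roughly like this (the handler name is mine; the header is only present on real App Engine traffic, not on the local dev server):

    import webapp2

    class TimezoneHandler(webapp2.RequestHandler):
        def get(self):
            # App Engine adds this header to inbound requests; it is absent
            # on the local dev server, hence the fallback for testing.
            latlong = self.request.headers.get('X-AppEngine-CityLatLong')
            if latlong:
                lat, lng = [float(part) for part in latlong.split(',')]
                self.response.write('%.4f, %.4f' % (lat, lng))
            else:
                self.response.write('no location header')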

Solution tried so far:

I've been using pytzwhere on local dev to get the timezone by lat/long (hardcoding a local X-AppEngine-CityLatLong value for a given location). This works great; I get the timezone ids I need.
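The lookup itself is roughly this (a sketch; the helper name and sample coordinates are illustrative):

    from tzwhere import tzwhere

    # Building the lookup object parses all the tz_world polygons, which is
    # slow and memory-hungry, so do it once at module load.
    tz_lookup = tzwhere.tzwhere()

    def timezone_for(lat, lng):
        # Returns an Olson id such as 'Europe/London', or None if the
        # point falls outside every tz_world polygon (e.g. open ocean).
        return tz_lookup.tzNameAt(lat, lng)

    # timezone_for(51.48, 0.0) -> 'Europe/London'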

However, pytzwhere uses tz_world data (converted to JSON, in the format pytzwhere needs), and the files are too big to upload to App Engine. I zipped up the required tz_world data (to allow the upload) and unzip it when needed, but this results in exceeding the soft private memory limit.
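By "unzip it when needed" I mean roughly this (the archive and member names are illustrative):

    import json
    import zipfile

    # Read the converted tz_world JSON straight out of the deployed zip.
    # Decompressing and parsing it in memory is what blows past the
    # soft private memory limit on a small frontend instance.
    with zipfile.ZipFile('tz_world_compact.zip') as archive:
        tz_data = json.loads(archive.read('tz_world_compact.json'))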

Would it be better to upload the files to blobstore/Cloud Storage and unzip from there, or would that also exceed the soft private memory limit? I'd prefer not to bump up the frontend instance size (to get more memory) because I don't really need the extra computing power (though maybe that's an option).

I haven't done much work with geospatial data, but would storing the tz_world data in the datastore (with geomodel) and writing an implementation of pytzwhere that queries the data from the datastore be a feasible solution?

Other options: web services

I'd prefer not to use a web service to retrieve the timezone (e.g. the Google Timezone API) because of rate limits, but perhaps that's the easiest way for now. This related question is great, but doesn't have any App Engine-specific implementation (other than the web service options).

– Rob Curtis

1 Answer


I would suggest you use the Google Timezone API as your primary source of data, but cache the results in the datastore, and only use the API to look up lat/longs that you do not yet have in the datastore.
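Something along these lines, as a minimal sketch (the model name, helper, and choice of the raw 'lat,long' string as the key name are illustrative, not a prescribed implementation):

    import json
    import time

    from google.appengine.api import urlfetch
    from google.appengine.ext import ndb

    TIMEZONE_API = 'https://maps.googleapis.com/maps/api/timezone/json'

    class CachedTimezone(ndb.Model):
        # Keyed by the raw 'lat,long' header value, so a hit is a cheap
        # get-by-key; ndb also caches entities in memcache automatically.
        timezone_id = ndb.StringProperty(indexed=False)

    def timezone_for(latlong, api_key):
        cached = CachedTimezone.get_by_id(latlong)
        if cached:
            return cached.timezone_id
        url = '%s?location=%s&timestamp=%d&key=%s' % (
            TIMEZONE_API, latlong, int(time.time()), api_key)
        result = json.loads(urlfetch.fetch(url).content)
        if result.get('status') == 'OK':
            # timeZoneId (e.g. 'Europe/London') is stable for a location,
            # so it is safe to cache indefinitely.
            CachedTimezone(id=latlong, timezone_id=result['timeZoneId']).put()
            return result['timeZoneId']
        return None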

– IanGSY
  • Ya, that's a method (using the Google API) that's been discussed. I'm looking specifically to host the solution myself so that I don't have the lag of calling another web service. Perhaps the caching approach will mitigate that lag. – Rob Curtis Aug 19 '14 at 08:37
  • Any ideas if using the Google API on the client side only (without an API key) would still get rate-limited? – Rob Curtis Aug 19 '14 at 08:41
  • Ended up doing as you suggested and used the Timezone API. The lag isn't bad, and caching the results works well too. – Rob Curtis Aug 19 '14 at 11:03
  • @Rob Curtis if you are currently manually caching... just use Python's ndb instead of db. ndb automatically caches your results in memcache. – Patrice Aug 19 '14 at 16:48
  • @Julldar ya, using ndb. Also using the city lat/long as the keyname, so getting by key is super quick (and it's saved in memcache as you suggest). – Rob Curtis Aug 19 '14 at 17:12