
It seems GAE assigns very high IDs to my entities. When I download them, some entries have very large numeric IDs, even though they were auto-generated in the first place. Downloading them as CSV works fine, but deleting the existing data and re-uploading the same data throws an exception:

Exceeded maximum allocated IDs

Traceback:

Traceback (most recent call last):
  File "/opt/eclipse/plugins/org.python.pydev_2.7.5.2013052819/pysrc/pydevd.py", line 1397, in <module>
    debugger.run(setup['file'], None, None)
  File "/opt/eclipse/plugins/org.python.pydev_2.7.5.2013052819/pysrc/pydevd.py", line 1090, in run
    pydev_imports.execfile(file, globals, locals) #execute the script
  File "/home/kave/workspace/google_appengine/appcfg.py", line 171, in <module>
    run_file(__file__, globals())
  File "/home/kave/workspace/google_appengine/appcfg.py", line 167, in run_file
    execfile(script_path, globals_)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 4247, in <module>
    main(sys.argv)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 4238, in main
    result = AppCfgApp(argv).Run()
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 2396, in Run
    self.action(self)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 3973, in __call__
    return method()
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 3785, in PerformUpload
    run_fn(args)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/appcfg.py", line 3676, in RunBulkloader
    sys.exit(bulkloader.Run(arg_dict))
  File "/home/kave/workspace/google_appengine/google/appengine/tools/bulkloader.py", line 4379, in Run
    return _PerformBulkload(arg_dict)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/bulkloader.py", line 4244, in _PerformBulkload
    loader.finalize()
  File "/home/kave/workspace/google_appengine/google/appengine/ext/bulkload/bulkloader_config.py", line 384, in finalize
    self.increment_id(high_id_key)
  File "/home/kave/workspace/google_appengine/google/appengine/tools/bulkloader.py", line 1206, in IncrementId
    unused_start, end = datastore.AllocateIds(high_id_key, max=high_id_key.id())
  File "/home/kave/workspace/google_appengine/google/appengine/api/datastore.py", line 1965, in AllocateIds
    return AllocateIdsAsync(model_key, size, **kwargs).get_result()
  File "/home/kave/workspace/google_appengine/google/appengine/api/apiproxy_stub_map.py", line 612, in get_result
    return self.__get_result_hook(self)
  File "/home/kave/workspace/google_appengine/google/appengine/datastore/datastore_rpc.py", line 1863, in __allocate_ids_hook
    self.check_rpc_success(rpc)
  File "/home/kave/workspace/google_appengine/google/appengine/datastore/datastore_rpc.py", line 1236, in check_rpc_success
    raise _ToDatastoreError(err)
google.appengine.api.datastore_errors.BadRequestError: Exceeded maximum allocated IDs

Usually my IDs are around 26002, but since a few days ago the new IDs are as large as 4948283361329150, and these are now causing problems. (If I change them to lower values, everything works fine, but I didn't generate these IDs in the first place.) Why does GAE have such problems with its own generated IDs?

Many Thanks

Houman
  • Can you please clarify a few points: 1. You are exporting the data along with `__key__`, then deleting the existing App Engine data before importing it back, and using the ID saved in the CSV to add the modified data back? 2. Are you using the import transform `transform.create_foreign_key(Kind)`? – tony m Jul 12 '13 at 14:01
  • Hi Tony, yes, I am exporting the data along with the key, and before importing the data back to GAE I delete the existing data using the ID saved in the CSV (which was produced by GAE in the first place). – Houman Aug 05 '13 at 09:04
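For context, a typical bulkloader.yaml fragment that round-trips the key this way uses the standard transforms from `google.appengine.ext.bulkload`; the kind name `Entry` and the column name `key` below are placeholders:

```yaml
transformers:
- kind: Entry                # placeholder kind name
  connector: csv
  property_map:
    - property: __key__
      external_name: key     # CSV column holding the entity's ID
      export_transform: transform.key_id_or_name_as_string
      import_transform: transform.create_foreign_key('Entry')
```

With this configuration, re-importing preserves the original (GAE-generated) IDs, which is exactly the path that triggers the allocation error described above.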

1 Answer


This is a known issue, fixed in SDK 1.8.2 and later.

Note that if you use the bulkloader against the dev appserver, those SDKs (1.8.2, 1.8.3) unfortunately have a separate bulkloader issue with that use case (see appcfg-py-upload-data-fails-in-google-app-engine-sdk-1-8-2), but it does not occur in production.

Rohwer