
To be specific, this question is about how to get the specified quota raised or lifted, not how to be more efficient within the existing quota limit.

When running a MapReduce job on GAE, I hit the quota limit listed below: 100 GB of "file bytes received" per day, which, as far as I can tell, means file bytes received from the Blobstore. Increasing my budget has no effect on this 100 GB/day limit. I'd like the limit lifted entirely, with the ability to pay for what I use.

Output in logs:

The API call file.Open() required more quota than is available.
Traceback (most recent call last):
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1511, in __call__
    rv = self.handle_exception(request, response, e)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1505, in __call__
    rv = self.router.dispatch(request, response)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1253, in default_dispatcher
    return route.handler_adapter(request, response)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 1077, in __call__
    return handler.dispatch()
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 547, in dispatch
    return self.handle_exception(e, self.app.debug)
  File "/base/python27_runtime/python27_lib/versions/third_party/webapp2-2.3/webapp2.py", line 545, in dispatch
    return method(*args, **kwargs)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/base_handler.py", line 68, in post
    self.handle()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/handlers.py", line 168, in handle
    for entity in input_reader:
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/mapreduce_pipeline.py", line 109, in __iter__
    for binary_record in super(_ReducerReader, self).__iter__():
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/input_readers.py", line 1615, in __iter__
    record = self._reader.read()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/records.py", line 335, in read
    (chunk, record_type) = self.__try_read_record()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/records.py", line 292, in __try_read_record
    header = self.__reader.read(HEADER_LENGTH)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 569, in read
    with open(self._filename, 'r') as f:
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 436, in open
    exclusive_lock=exclusive_lock)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 269, in __init__
    self._open()
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 393, in _open
    self._make_rpc_call_with_retry('Open', request, response)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 397, in _make_rpc_call_with_retry
    _make_call(method, request, response)
  File "/base/data/home/apps/s~utest-appgraph/69.358421800203055451/mapreduce/lib/files/file.py", line 243, in _make_call
    rpc.check_success()
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 558, in check_success
    self.__rpc.CheckSuccess()
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_rpc.py", line 133, in CheckSuccess
    raise self.exception
OverQuotaError: The API call file.Open() required more quota than is available.
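For context, a MapReduce handler can at least fail fast and log clearly when this happens, rather than dying with a raw traceback. Below is a minimal defensive sketch; the `OverQuotaError` class and `guarded_read` helper here are stand-ins I made up for illustration (the real exception lives in `google.appengine.runtime.apiproxy_errors`). Note that retrying is pointless for a daily quota, since it only resets at the start of the next quota day:

```python
import logging

class OverQuotaError(Exception):
    """Stand-in for google.appengine.runtime.apiproxy_errors.OverQuotaError."""

def guarded_read(read_fn):
    """Run a Files API read; on quota exhaustion, log and abort cleanly.

    A daily quota does not recover within the request, so there is no
    point in retrying -- surface the condition and let the shard stop.
    """
    try:
        return read_fn()
    except OverQuotaError:
        logging.error("file.Open() is over the daily quota; aborting this "
                      "shard until the quota resets.")
        return None

# Hypothetical usage: a reader that immediately trips the quota.
def failing_read():
    raise OverQuotaError("The API call file.Open() required more quota "
                         "than is available.")

print(guarded_read(failing_read))        # None
print(guarded_read(lambda: b"record"))   # b'record'
```

This only contains the failure; it does not raise the limit, which is what the question is actually asking for.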
Shay Erlichmen
joweeba

2 Answers


It seems you need to talk to Google directly: on the quotas page there is a link to a form for requesting a quota increase: http://support.google.com/code/bin/request.py?&contact_type=AppEngineCPURequest

Peter Knego

I've hit this error too. We are using the experimental backup feature of App Engine, which in turn runs a MapReduce to back up all App Engine data to Google Cloud Storage. However, the backup currently fails with this error:

OverQuotaError: The API call file.Open() required more quota than is available.

And in the quota dashboard we see:

Other Quotas With Warnings
These quotas are only shown when they have warnings
File Bytes Sent 100%    107,374,182,400 of 107,374,182,400  Limited

So apparently there is a hidden quota, "File Bytes Sent", which we have hit. But it is not documented anywhere, so we could never have known we would hit it. Now we're stuck.
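As a sanity check, the odd-looking number in the dashboard is exactly 100 GiB expressed in bytes, i.e. the same 100 GB/day figure as the documented "File Bytes Received" quota:

```python
# The dashboard's "File Bytes Sent" limit, in bytes:
limit_bytes = 107374182400

# 100 GiB = 100 * 1024^3 bytes -- the same 100 GB/day cap as the
# documented quota, just shown in raw bytes.
print(limit_bytes == 100 * 1024 ** 3)  # True
print(limit_bytes // 1024 ** 3)        # 100
```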

TjerkW
  • Hi, I've run into the same issue when exporting GAE data into Google Cloud Storage. Did you find a solution to this? How to raise this limit? When is the quota reset? I've tried configuring the task queue to 100msg/s. I've tried contacting Google and no response so far. – Ricardo Cabral Sep 04 '14 at 16:20