
I've never written a recursive Python script before. I'm used to splitting a monolithic function into smaller AWS Lambda functions. However, this particular script I am working on is challenging to break up.

Here is the code I am currently using, for context. I use one API request to return a list of the objects within a table.

import requests

url_pega_EEvisa = requests.get(
    'https://cloud.xxxx.com:443/prweb/api/v1/data/D_pxCaseList?caseClass=xx-xx-xx-xx',
    auth=(username, password)
)
pega_EEvisa_raw = url_pega_EEvisa.json()
pega_EEvisa = pega_EEvisa_raw['pxResults']

This returns every object (primary key) within a particular table as a list. For example:

['XX-XXSALES-WORK%20PO-1', 'XX-XXSALES-WORK%20PO-10', 'XX-XXSALES-WORK%20PO-100', 'XX-XXSALES-WORK%20PO-101', 'XX-XXSALES-WORK%20PO-102', 'XX-XXSALES-WORK%20PO-103', 'XX-XXSALES-WORK%20PO-104', 'XX-XXSALES-WORK%20PO-105', 'XX-XXSALES-WORK%20PO-106', 'XX-XXSALES-WORK%20PO-107']

I then use this list in a for loop to make further GET requests, which fetch all the data per object.

data = []
for t in caseid:
    # Fetch the full details for each case ID
    case = requests.get('https://cloud.xxxx.com:443/prweb/api/v1/cases/{}'.format(t),
                        auth=(username, password)).json()
    data.append(case)

This particular Lambda function takes about 15 minutes, which is the limit for one AWS Lambda invocation. Ideally, I'd like to split the list into smaller parts and run the same process on each. I am struggling to mark the point where it last ran before failure and to pass that information on to the next function.
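One way to frame the "where did it leave off" problem is to carry a resume cursor alongside each batch, so any follow-up invocation knows where to continue. A minimal sketch of the batching part (the batch size and the `start`/`case_ids` field names are assumptions for illustration, not anything from the Pega API):

```python
def split_into_batches(case_ids, batch_size):
    """Split the full case-ID list into payloads that each carry
    their own resume point (the start index)."""
    batches = []
    for start in range(0, len(case_ids), batch_size):
        batches.append({
            'start': start,  # resume cursor: where this batch begins
            'case_ids': case_ids[start:start + batch_size],
        })
    return batches

# Example: ten case IDs split into batches of four
payloads = split_into_batches(['PO-%d' % i for i in range(1, 11)], 4)
# Each payload could then be handed to a fresh Lambda invocation,
# and the 'start' field tells you where that batch sits in the full list.
```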

Any help is appreciated!

Bigmoose70
  • I don't quite follow what you're trying to do, but instead of using recursive Lambda functions, have you considered pushing work to an Amazon SQS queue, and having it trigger Lambda functions for each message? – John Rotenstein Feb 17 '20 at 23:31
  • Hi John, that is the end goal! Our architect is working on using SQS queues. I am trying to split a list of entries into smaller lists and pass the remaining entries to another lambda function. – Bigmoose70 Feb 19 '20 at 03:37
  • @Bigmoose70 is my answer below what you're looking for, or was I off base? Just want to know if I missed something :) – Travis Haby Feb 24 '20 at 19:05
  • @TravisHaby I agree w/ the pseudocode, however, I am not too sure how to pull it off. This type of methodology using recursive lambda functions is probably not the best way. – Bigmoose70 Mar 04 '20 at 02:03
  • I think there are cases where it can be really useful, don't know enough about your specific use case to have an opinion. Check out this talk about using recursive calls to create a video transcoding system as an example of where it makes sense :) https://acloud.guru/series/serverlessconf-nyc-2019/view/solving-big-problems – Travis Haby Apr 24 '20 at 13:45
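For reference, the SQS route the comments point toward would look roughly like this. The `queue_url` parameter and batch size are assumptions; the queue would be configured as a Lambda event source so each message triggers a worker function independently:

```python
import json

def batch_messages(case_ids, batch_size=10):
    """Serialize each batch of case IDs as one SQS message body."""
    for start in range(0, len(case_ids), batch_size):
        yield json.dumps(case_ids[start:start + batch_size])

def enqueue_batches(case_ids, queue_url, batch_size=10):
    """Push each batch onto SQS; a Lambda subscribed to the queue
    then processes each message on its own 15-minute clock."""
    import boto3  # imported lazily so the helper above runs without AWS libraries
    sqs = boto3.client('sqs')
    for body in batch_messages(case_ids, batch_size):
        sqs.send_message(QueueUrl=queue_url, MessageBody=body)
```

This decouples "list the cases" from "fetch each case", so no single function has to finish the whole table.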

1 Answer


I'm not sure if I entirely understand what you want to do with the data once you've fetched all the information about the cases, but in terms of breaking up the work one Lambda is doing into many Lambdas, you should be able to chunk the list of cases and pass the chunks to new invocations of the same Lambda. Python pseudocode below; hopefully it helps illustrate the idea. I borrowed the chunks method from this answer, which breaks a list into batches.

import json

import boto3
import requests

client = boto3.client('lambda')

def chunks(lst, n):
    # Yield successive n-sized chunks from lst
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

def handler(event, context):
    url_pega_EEvisa = requests.get(
        'https://cloud.xxxx.com:443/prweb/api/v1/data/D_pxCaseList?caseClass=xx-xx-xx-xx',
        auth=(username, password)
    )
    pega_EEvisa_raw = url_pega_EEvisa.json()
    pega_EEvisa = pega_EEvisa_raw['pxResults']

    # Fan out: each chunk of ten cases goes to its own Lambda invocation
    for chunk in chunks(pega_EEvisa, 10):
        client.invoke(
            FunctionName='lambdaToHandleBatchOfTenCases',
            Payload=json.dumps(chunk)
        )
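The receiving side isn't shown above; here is a sketch of what `lambdaToHandleBatchOfTenCases` might look like. The function name comes from the answer; everything else (the handler shape, where the credentials come from, what you do with the results) is an assumption:

```python
def build_case_url(case_id):
    """Build the per-case details URL used in the question."""
    return 'https://cloud.xxxx.com:443/prweb/api/v1/cases/{}'.format(case_id)

def batch_handler(event, context):
    """Receives one chunk of case IDs (the invoke Payload arrives as
    the event) and fetches the details for each case in the chunk."""
    import requests  # imported lazily so the URL helper above stands on its own
    data = []
    for case_id in event:
        resp = requests.get(build_case_url(case_id), auth=(username, password))
        data.append(resp.json())
    # ...persist `data` somewhere durable (S3, DynamoDB, etc.), since the
    # return value of an async-invoked Lambda is discarded
    return {'processed': len(data)}
```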

Hopefully that helps! Let me know if this was not on target.

Travis Haby