
I want to implement a Slack slash command that has to run a function pipeline which takes roughly 30 seconds to process. Since Slack slash commands only allow 3 seconds to respond, how do I go about implementing this? I referred to this but don't know how to implement it.

Please bear with me; I am doing this for the first time. This is what I have tried. I know how to respond with an ok status within 3 seconds, but I don't understand how to then invoke the pipeline.

from bottle import route, run, request

from S3_download import s3_download
from index import main_func

@route('/action')
def action():
    # pipeline() blocks for ~30 seconds, so Slack times out
    # long before this "ok" is returned.
    pipeline()
    return "ok"

def pipeline():
    s3_download()
    p = main_func()
    print(p)

if __name__ == "__main__":
    run(host='0.0.0.0', port=8082, debug=True)

I came across this article. Is using AWS Lambda the only solution? Can't we do this entirely in Python?

prashantitis
  • Have you tried using queues? In your request handler you can retrieve the request parameters and store them in a queue, and a worker process can consume the queue and reply with the expected response. Failing that, we could try a non-blocking framework like Tornado in place of Bottle. – Sirius Oct 07 '16 at 12:44
  • Can you explain your first approach with a snippet or pseudocode? – prashantitis Oct 07 '16 at 12:47

2 Answers


Something like this:

import json

from boto import sqs
from boto.sqs.message import Message

@route('/action', method='POST')
def action():
    # Retrieve the request parameters the worker will need,
    # e.g. the response_url Slack sends with every slash command.
    params = {'response_url': request.forms.get('response_url')}

    conn = sqs.connect_to_region('us-east-1')  # pick your region
    sqs_queue = conn.get_queue(queue_name)     # queue_name defined elsewhere
    message_object = Message()
    message_object.set_body(json.dumps(params))
    sqs_queue.write(message_object)
    return "request under process"

and you can have another process that consumes the queue and calls the long-running function:

import json
from boto import sqs

sqs_queue = sqs.connect_to_region('us-east-1').get_queue(queue_name)
while True:
    # Long-poll the queue, up to 10 messages at a time.
    for sqs_msg in sqs_queue.get_messages(10, wait_time_seconds=5):
        params = json.loads(sqs_msg.get_body())
        response = pipeline(params)
        if response:
            sqs_queue.delete_message(sqs_msg)  # only delete on success

You can run this second process in a separate standalone Python file, as a daemon process or via cron.
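
Once the pipeline finishes, the worker can deliver the result by POSTing JSON to the response_url stored in the queue message; Slack accepts delayed responses at that URL after the original slash command has been acknowledged. A minimal sketch, using a hypothetical post_result helper:

import requests

def post_result(response_url, text):
    # Slack displays this as the slash command's delayed response.
    requests.post(response_url, json={"text": text})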

I've used Amazon SQS here, but other queueing options are available.

Sirius

You have an option or two for doing this in a single process, but it's fraught with peril. If you spin up a new Thread to handle the long process, you might end up deploying or crashing in the middle and losing it.
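
For illustration, here is a minimal sketch of that thread approach with the question's Bottle app (assuming pipeline is adapted to accept the response_url; the durability caveat above still applies):

import threading
from bottle import route, request

@route('/action', method='POST')
def action():
    response_url = request.forms.get('response_url')
    # Hand the slow work to a background thread so Slack gets its
    # acknowledgement well inside the 3-second window.
    t = threading.Thread(target=pipeline, args=(response_url,))
    t.daemon = True  # don't block interpreter shutdown
    t.start()
    return "request under process"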

If durability is important to you, look into background-task workers like SQS, Lambda, or even a Celery task queue backed by Redis. A separate task has some interesting failure modes, and these tools will help you deal with them better than just spawning a thread.
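
As a sketch of the Celery route, a hypothetical tasks.py assuming a local Redis broker (s3_download and main_func come from the question):

from celery import Celery
from S3_download import s3_download
from index import main_func

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def run_pipeline(response_url):
    # Runs in a separate `celery -A tasks worker` process,
    # so the web handler can return to Slack immediately.
    s3_download()
    return main_func()

The web handler would then enqueue the job with run_pipeline.delay(response_url) and return right away; because Celery persists the task in Redis, a crash or redeploy of the web process doesn't lose the work.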

Ben Straub