
I've done a little bit of research, but I cannot seem to find a way to handle the case where two users have sent data in, but the Python script can only handle one at a time.

This is typically how the script works:

1) User enters data '123'
2) Python listener executes on the data:
   • Sends requests to the server and retrieves data (typically ~1 min)
   • Script writes to HTML files
3) Finishes writing to the files, waits for more user input

Now the problem is that if another user enters data between steps 2 and 3, the script is no longer listening and will not do anything with this data.

Is there any way that I can have it always listen for a change, and once it detects one, pass it on to a class or another instance of itself so it can continue to listen for another asynchronous change?

EDIT:

  • The User enters the Data on a website, which is consequently written to a text file.
  • The Python script currently checks the last modified line in this file to see if it differs from the previous check. If the line differs, the script executes the class with the modified line.
  • 2
    Where is the user entering data '123'? How is your Python listener actually doing its "listening"? It sounds like you want to use the threading or multiprocessing modules here, but some clarification would help. – dano Apr 30 '14 at 19:33
  • the tag [tag:asynchronous] suggests that you already know the solution. Do you need to decide what specific asynchronous solutions could you use and how to use them in your case: threads, multiprocessing, gevent, asyncio, twisted, tornado? – jfs Apr 30 '14 at 19:38
  • @dano Yes, I have updated the thread. Sorry for the confusion :) –  Apr 30 '14 at 19:43
  • @J.F.Sebastian Yes, I understand the definition of asynchronous, but am not quite sure how to apply it to code. –  Apr 30 '14 at 19:44
  • 2
    @jayumz Do you have control over the server running on the website that writes to the text file? Having the server itself handle all this work seems more logical to me than having a separate script watching a file the server writes to. However, if you have to go that route, the multiprocessing module should help you. You would have the main process watch for file changes, and every time one is found, have a worker subprocess actually handle the processing work. – dano Apr 30 '14 at 19:49
  • @dano Yes, it is a Linux server running Apache 2. And the PHP on the website takes the post data from the form and writes it to a text file. How could the server handle the write to the file, when the POST is client-side? And how would I be able to execute a worker process from the main process? –  Apr 30 '14 at 19:55
  • Oh, the file is written client-side? Sorry, I was thinking the server was writing the file. My mistake. – dano Apr 30 '14 at 20:04
  • @dano The User uses a form to post their data **>** The PHP takes this post and writes it to a file on the server **>** The Python script detects this change in the file and executes using this change **>** Python creates a '.html' file for the user –  Apr 30 '14 at 20:06

2 Answers


You are describing a client-server architecture.

As you expect multiple clients using the same service, you have to allow for some sort of concurrent processing.

A web server is a typical example, but there are many others.

In your task, the following roles are expected:

  • client: sends a request and expects a response
  • server: accepts requests, manages their processing, and sends responses back to clients
  • worker: a component on the server that does the "real work"

What you describe seems like a mixture of all of these. When you write the code, you usually start thinking in terms of a script, which is what later ends up as the worker.

When designing your solution, you have to decide on a communication technology. There are many options, including:

  • http - typical with web servers, Python offers many frameworks
  • TCP sockets - rather low level, but also well supported in Python
  • zeromq - based on TCP or unix sockets, supported by pyzmq package

You will have to write all three parts: the client, the server, and the worker.
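As a concrete illustration of the three roles, here is a minimal sketch using TCP sockets from the standard library (the port, the one-line message format, and the `do_real_work` placeholder are all assumptions, not part of the original answer):

```python
import socket
import socketserver
import threading

def do_real_work(data):
    """Worker: stands in for the slow processing (here it just uppercases the input)."""
    return data.upper()

class Handler(socketserver.StreamRequestHandler):
    """Server: reads one request line, delegates to the worker, writes the reply."""
    def handle(self):
        request = self.rfile.readline().decode().strip()
        self.wfile.write((do_real_work(request) + "\n").encode())

def send_request(port, message):
    """Client: sends one request and waits for the response."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall((message + "\n").encode())
        return sock.makefile().readline().strip()

if __name__ == "__main__":
    # ThreadingTCPServer handles each connection in its own thread,
    # so a second client is served while the first is still being processed.
    server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), Handler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(send_request(port, "abc"))  # prints "ABC"
    server.shutdown()
```

Because the server threads, two users submitting at the same time no longer block each other, which is exactly the problem in the question.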

One quick example of a client-server solution based on zeromq is in my answer to Distributed lock manager.

Jan Vlcinsky
  • I don't actually use any frameworks inside of my code. It is a little bit messy, but I create and modify a '.html' template inside of the python listener, and once it is finished, replace the 'loading' page of the user with the actual content. –  Apr 30 '14 at 19:51

Although I'm still not exactly sure why you don't have the server itself handle this, my suggestion to handle it from the Python script would be to use the multiprocessing module. Here's a really basic way to handle this using a single worker process:

from multiprocessing import Process, Event

def worker(e):
    while True:
        e.wait()   # Wait to be told the file has changed. This will block.
        e.clear()  # Clear the flag so the main process can set it again if changes happen while we process
        # Send request to server, retrieve data
        # Write to HTML files

def watch_file_for_changes(e):
    while True:
        if file_changed:  # Use whatever watching mechanism you already have for this; inotify, etc.
            e.set()  # Tell the worker to process the file. This unblocks e.wait()

if __name__ == "__main__":
    e = Event()
    # Start a worker process to do the slow work off the main process.
    p = Process(target=worker, args=(e,))
    p.start()
    watch_file_for_changes(e)

This is completely untested, and needs some cleaning up, but should give you the general idea. This also assumes your worker is smart enough to figure out if more than one new entry has been added to the text file.

dano