
How can I take a request to my web server and turn it into, perhaps, several requests to other web servers on other hosts, and so on, until one or more of those web servers obtains a response, which is then returned to the client that initiated the request?


1 Answer


It sounds like what you are asking for is this:

[client] --> [your webserver] ----> [multitude of servers]

  1. Client initiates a request to your webserver
  2. Your web server issues a request to one or more remote webservers
  3. When at least one of the remote webservers answers back or when the timeout runs out, your webserver sends a response back to your client.

If that is what you are asking, then the scope is too broad. I.e., what is your implementation sitting between? Is this really a Python question? :)

If all of the multitude of servers on the backend are the same and are controlled by you, then the server sitting in front is basically going to be a load balancer or proxy server, à la HAProxy, Nginx, Apache, Varnish, etc.

If you are talking about setting up your site to do partial proxy or web scraping or data capture, then that is something else entirely.

In either case, your question/case is too vague, as posted.

EDIT: In response to the comment.

That's not particularly difficult. What you need to do is set up a Python daemon process that listens on a web port. Depending on how you want the file-reading behaviour to work, you can either read the file in at startup, so that it is in memory, or you can take the performance hit and read the file on each request.
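
For illustration, here is a minimal sketch of such a daemon in Python 2 (the version the linked docs below target), reading the file once at startup. The file name SOL.txt comes from your comment; the port number is an arbitrary choice:

    # Minimal sketch only: serves the contents of SOL.txt on every GET.
    import BaseHTTPServer

    # Read the file once at startup so each request is served from memory.
    with open('SOL.txt') as f:
        CONTENT = f.read()

    class Handler(BaseHTTPServer.BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain')
            self.end_headers()
            self.wfile.write(CONTENT)

    if __name__ == '__main__':
        # Port 8000 is an assumption; use whatever port suits your setup.
        BaseHTTPServer.HTTPServer(('', 8000), Handler).serve_forever()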

Initiating a connection to the remote sites is a matter of using the appropriate Python library.
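
As a rough sketch of that piece, using httplib and urlparse from the standard library and the URLS.txt/SOL.txt convention from your comment; the starting URL and the 5-second timeout are illustrative assumptions, not fixed values:

    # Sketch: return SOL.txt from a host if present, else follow URLS.txt.
    import httplib
    import urlparse

    def fetch(url, timeout=5):
        # Return the response body for url, or None on error/non-200.
        parts = urlparse.urlparse(url)
        try:
            conn = httplib.HTTPConnection(parts.netloc, timeout=timeout)
            conn.request('GET', parts.path or '/')
            resp = conn.getresponse()
            if resp.status == 200:
                return resp.read()
        except (httplib.HTTPException, IOError):
            pass
        return None

    def search(base):
        # Depth-first search over the URLS.txt graph for a SOL.txt.
        solution = fetch(base + '/SOL.txt')
        if solution is not None:
            return solution
        listing = fetch(base + '/URLS.txt')
        if listing is not None:
            # Assumes URLS.txt lists one URL per line.
            for next_base in listing.split():
                result = search(next_base)
                if result is not None:
                    return result
        return None

    print search('http://example.com')  # example.com is a placeholder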

To create the Python web server:

http://docs.python.org/2/library/simplehttpserver.html

To initiate a connection to remote sites:

Using the Python HTTP libraries: http://docs.python.org/2/library/httplib.html

Building a Python Web Client: http://python.about.com/od/networkingwithpython/ss/beg_web_client_all.htm

Python web scraping: Scrapy, http://scrapy.org/

What you are describing is not an ideal way to perform the task; you are significantly better off with HAProxy/Nginx in this regard. However, given that the requirement is Python, the above resource links will point you in the right direction.

  • Yes! That is the question, according to the first part of your reconstruction. Each server may contain one of two files, say one called URLS.txt or one called SOL.txt. If it contains URLS.txt, then the application must continue the search at each of the URLs in URLS.txt. If the host contains the file named SOL.txt, then its content is sent to the client as the response. Why Python? It's just a requirement. – user2250116 Apr 17 '13 at 17:53
  • I've edited my answer. There are plenty of resources that describe how to implement each part of what you are looking for. – Wing Tang Wong Apr 17 '13 at 19:10