The situation: I have a website that allows people to execute arbitrary code in a different language (specifically, an esolang I created), using a Python interpreter on a shared-hosting server. I run this code in a separate process which is given a time limit of 60 seconds.

The problem: You can do stuff like (the Python equivalent of) 10**(10**10), which rapidly consumes far more memory than I've been allotted. It also apparently locks up Apache - or makes it take too long to respond - so I have to restart it.

I have seen this question, but the given answer uses Perl, which I don't know at all, so I'd like an answer in Python instead. My OS is Linux as well, though.

Specifically, I want the following characteristics:

  1. Runs automatically
  2. Force-kills any process that exceeds some memory limit, such as 1 MB or 100 MB
  3. Kills any process spawned by my code that is more than 24 hours old

I use this piece of code (in a Django view) to create the process and run it (proxy_prgm is a multiprocessing Manager proxy, so I can retrieve data from the program that's interpreting the esolang code):

prgmT[uid] = multiprocessing.Process(
    target=proxy_prgm.runCatch,
    args=(steps,),
    name="program run")

prgmT[uid].start()
prgmT[uid].join(60)  # wait up to 60 seconds for the run to finish

if prgmT[uid].is_alive():  # still running after the timeout: kill it
    prgmT[uid].terminate()
    proxy_prgm.stop()

If you need more details, don't hesitate to ask, and I'll edit them in.

El'endia Starman
  • Some ideas: You could use ulimit to set the maximum memory of the spawned process. You could use psutil to monitor memory use and kill it if it gets out of hand. – dbn Dec 01 '15 at 08:05
  • I agree you should use your operating system's `ulimit` to watch over resources. It can be used to make sure you don't consume other resources like file locks. – Maciek Dec 01 '15 at 13:15
  • Just a side note - a view doesn't seem to be the right place for such code to me. – David Ferenczy Rogožan Dec 14 '15 at 17:47
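
A minimal sketch of the psutil monitoring approach from the first comment, assuming you record the PID of each spawned run (e.g. prgmT[uid].pid) in a list; reap, MEM_LIMIT_BYTES, and MAX_AGE_SECONDS are hypothetical names:

import time
import psutil

MEM_LIMIT_BYTES = 100 * 1024 * 1024  # e.g. 100 MB, per the question
MAX_AGE_SECONDS = 24 * 60 * 60       # requirement 3: nothing older than 24 hours

def reap(pids):
    # Kill any tracked interpreter process that is too big or too old.
    for pid in pids:
        try:
            proc = psutil.Process(pid)
            too_big = proc.memory_info().rss > MEM_LIMIT_BYTES
            too_old = (time.time() - proc.create_time()) > MAX_AGE_SECONDS
            if too_big or too_old:
                proc.kill()
        except psutil.NoSuchProcess:
            pass  # the process already exited on its own

Calling reap() from a cron job or a small background loop would cover requirement 1 (it runs automatically) without touching the Django view.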

1 Answer

Another approach that might work: resource.setrlimit() (more details in this other Stack Overflow answer). It seems that you can use it to set a memory limit on a process and its subprocesses, though you'll have to figure out how to handle what happens when the limit is hit. I don't have personal experience using it, but hopefully it would stop Apache from locking up on you.
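
A minimal sketch of that idea, reusing steps from the question; run_limited, interpret(), and MEM_LIMIT_BYTES are hypothetical names standing in for whatever actually executes the esolang code:

import multiprocessing
import resource

MEM_LIMIT_BYTES = 100 * 1024 * 1024  # e.g. 100 MB

def run_limited(steps):
    # Cap this process's address space before running untrusted code;
    # the limit is inherited by any subprocess it spawns.
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_BYTES, MEM_LIMIT_BYTES))
    try:
        interpret(steps)  # hypothetical stand-in for the esolang interpreter
    except MemoryError:
        pass  # the program blew the limit; report the failure however you like

p = multiprocessing.Process(target=run_limited, args=(steps,), name="program run")
p.start()
p.join(60)  # keep the question's 60-second timeout
if p.is_alive():
    p.terminate()

One caveat: since proxy_prgm is a Manager proxy, its methods actually execute in the manager's server process, so the limit has to be set in whichever process really runs the interpreter, not just in the Process shown in the question.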

John O'Brien