4

I've been trying to write a basic terminal emulation script, because for some reason I have no terminal access on my Mac. But for writing game engine scripts in Blender, the console, which usually opens in the terminal you started Blender from, is crucial.
For simple things like deleting, renaming etc. I used to execute commands with `stream = os.popen(command)` and then `print(stream.read())`. That works fine for most things, but not for anything interactive.
Recently I discovered a new way:
`sp = subprocess.Popen(["/bin/bash", "-i"], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)` and then `print(sp.communicate(command.encode()))`. That should spawn an interactive shell that I can use like a terminal, shouldn't it?
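
In full, what I am trying looks roughly like this (a stripped-down sketch; `'ls /'` is just an example command):

import subprocess

command = 'ls /'  # example command

sp = subprocess.Popen(["/bin/bash", "-i"],
                      stdout=subprocess.PIPE,
                      stdin=subprocess.PIPE,
                      stderr=subprocess.PIPE)
# send the command and print the resulting (stdout, stderr) tuple
print(sp.communicate(command.encode()))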

But either way I can't keep the connection open, and using the last example I can call `sp.communicate` only once, giving me the following output (in this case for `ls /`) and some errors:
`(b'Applications\n[...]usr\nvar\n', b'bash: no job control in this shell\nbash-3.2$ ls /\nbash-3.2$ exit\n')`. The second time it gives me a `ValueError: I/O operation on closed file`. Sometimes (like for `ls`) I only get this: `b'ls\nbash-3.2$ exit\n'`.

What does that mean? How can I emulate a terminal with Python that lets me control an interactive shell, or run Blender and communicate with its console?

lucaba
  • 175
  • 1
  • 7
  • Does Blender not allow you to open up its own terminal window from the running process? – JAB Jul 24 '12 at 16:13
  • As far as I know, on Mac if you need the console you need to start Blender by directly opening the executable, which by default starts Terminal. But I can't open any terminal windows anyway, because parental controls are activated on my Mac, but I'm pretty sure scripting in Blender is not what the admin wants to restrict – lucaba Jul 24 '12 at 16:19
  • 1
    You can probably use or obtain Terminal.app for your Mac. – user1277476 Jul 24 '12 at 17:04

3 Answers

10

Assuming you want an interactive shell that keeps asking for input, you could try the following:

import subprocess
import re

while True:
    # prevents lots of python error output
    try:
        s = raw_input('> ')
    except:
        break

    # check if you should exit
    if s.strip().lower() == 'exit':
        break

    # try to run command
    try:
        cmd = subprocess.Popen(re.split(r'\s+', s), stdout=subprocess.PIPE)
        cmd_out = cmd.stdout.read()

        # Process output
        print cmd_out

    except OSError:
        print 'Invalid command'
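
If the command you launch keeps running and prints as it goes, `cmd.stdout.read()` will not return until the process exits. A variation that echoes output line by line as it arrives is sketched below; it assumes the child process actually flushes its output to the pipe:

import subprocess
import re
import sys

s = raw_input('> ')
cmd = subprocess.Popen(re.split(r'\s+', s), stdout=subprocess.PIPE)

# print each line as soon as readline() returns it
for line in iter(cmd.stdout.readline, ''):
    sys.stdout.write(line)
    sys.stdout.flush()

cmd.wait()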
ryucl0ud
  • 622
  • 4
  • 7
  • Nice. Although you could replace `s == 'exit'` with `s.startswith('exit')` (with possibly a `s.lower()` in there too). This will catch the case when the user types `exit ` (notice the space). – Chris Jul 24 '12 at 16:22
  • Thanks, I tried it, but actually I need the output of the console for debugging scripts. But this script only executes commands. – lucaba Jul 24 '12 at 16:26
  • Good tip. I believe the code is still simple enough to understand with these changes. – ryucl0ud Jul 24 '12 at 16:27
  • I don't know if I've understood something wrong, or just didn't explain it well, but I've managed to do something like this. My problem is I don't know how to communicate with Blender. When I start Blender this way it just blocks the script, but no debug data is printed – lucaba Jul 24 '12 at 16:56
  • @ryucl0ud I just found out, your script actually works, but the information is only printed when I quit Blender. Do you know how to fix this? thx – lucaba Jul 24 '12 at 17:17
  • @lucaba I'm not really familiar with how Blender deals with Python scripts. Perhaps it does some weird form of buffering. You may be able to get it to work by performing `import sys` and `sys.stdout.flush()` after printing inside the loop. – ryucl0ud Jul 24 '12 at 17:26
  • @ryucl0ud The problem lies in the `cmd_out = cmd.stdout.read()` part. I think it somehow waits for the process to finish before the read call returns. Anyway, I tried changing the stdout reader to unbuffered like this `gc.garbage.append(cmd.stdout);cmd.stdout = os.fdopen(cmd.stdout.fileno(), "rb", 0)` as seen in [this](http://stackoverflow.com/questions/107705/python-output-buffering) thread, but it doesn't read anything. Anyway, thanks for your effort, ryucl0ud. Does anyone know whether you can set a **file** object **non-blocking**, as with sockets? – lucaba Jul 24 '12 at 19:58
5

Here is something I worked on to do what you want on Windows, a much more difficult problem because Windows doesn't follow any standard but its own. A slight modification of this code should give you exactly what you are looking for.

'''
Created on Mar 2, 2013

@author: rweber
'''
import subprocess
import Queue
from Queue import Empty
import threading


class Process_Communicator():

    def join(self):
        self.te.join()
        self.to.join()
        self.running = False
        self.aggregator.join()

    def enqueue_in(self):
        while self.running and self.p.stdin is not None:
            while not self.stdin_queue.empty():
                s = self.stdin_queue.get()
                self.p.stdin.write(str(s) + '\r\n')
            pass

    def enqueue_output(self):
        if not self.p.stdout or self.p.stdout.closed:
            return
        out = self.p.stdout
        for line in iter(out.readline, b''):
            self.qo.put(line)

    def enqueue_err(self):
        if not self.p.stderr or self.p.stderr.closed:
            return
        err = self.p.stderr
        for line in iter(err.readline, b''):
            self.qe.put(line)

    def aggregate(self):
        while (self.running):
            self.update()
        self.update()

    def update(self):
        line = ""
        try:
            while True:  # get_nowait() raises Empty once the queue is drained
                line = self.qe.get_nowait()  # or q.get(timeout=.1)
                self.unbblocked_err += line
        except Empty:
            pass

        line = ""
        try:
            while True:  # get_nowait() raises Empty once the queue is drained
                line = self.qo.get_nowait()  # or q.get(timeout=.1)
                self.unbblocked_out += line
        except Empty:
            pass

        while not self.stdin_queue.empty():
                s = self.stdin_queue.get()
                self.p.stdin.write(str(s) + '\r\n')

    def get_stdout(self, clear=True):
        ret = self.unbblocked_out
        if clear:
            self.unbblocked_out = ""
        return ret

    def has_stdout(self):
        ret = self.get_stdout(False)
        if ret == '':
            return None
        else:
            return ret

    def get_stderr(self, clear=True):
        ret = self.unbblocked_err
        if clear:
            self.unbblocked_err = ""
        return ret

    def has_stderr(self):
        ret = self.get_stderr(False)
        if ret == '':
            return None
        else:
            return ret

    def __init__(self, subp):
        '''This is a simple class that collects and aggregates the
        output from a subprocess so that you can more reliably use
        the class without having to block for subprocess.communicate.'''
        self.p = subp
        self.unbblocked_out = ""
        self.unbblocked_err = ""
        self.running = True
        self.qo = Queue.Queue()
        self.to = threading.Thread(name="out_read",
                                    target=self.enqueue_output,
                                    args=())
        self.to.daemon = True  # thread dies with the program
        self.to.start()

        self.qe = Queue.Queue()
        self.te = threading.Thread(name="err_read",
                                   target=self.enqueue_err,
                                   args=())
        self.te.daemon = True  # thread dies with the program
        self.te.start()

        self.stdin_queue = Queue.Queue()
        self.aggregator = threading.Thread(name="aggregate",
                                           target=self.aggregate,
                                           args=())
        self.aggregator.daemon = True  # thread dies with the program
        self.aggregator.start()
        pass
def write_stdin(p,c):
    while p.poll() is None:
        i = raw_input("send to process:")
        if i is not None:
            c.stdin_queue.put(i)


p = subprocess.Popen("cmd.exe", shell=True, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, stdin=subprocess.PIPE)
c = Process_Communicator(p)
stdin = threading.Thread(name="write_stdin",
                           target=write_stdin,
                           args=(p,c))
stdin.daemon = True  # thread dies with the program
stdin.start()
while p.poll() is None:
    if c.has_stdout():
        print c.get_stdout()
    if c.has_stderr():
        print c.get_stderr()

c.join()
print "Exit"
stonea
  • 405
  • 2
  • 17
Rusty Weber
  • 1,541
  • 1
  • 19
  • 32
1

It seems that you should run it on a new controlling terminal, allocated with `forkpty`. To suppress the "no job control in this shell" warning, you need to call `setsid`.
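
Pulling together the pieces from the comments below, a minimal sketch (the `"/bin/bash -i"` command string, the `"ls /"` input and the read size are just placeholders) could look like this:

import os
import pty

command = "/bin/bash -i"  # or, e.g., the path to the blender executable

# pty.fork() calls fork, openpty and setsid for you, so the child gets
# a new controlling terminal
pid, master = pty.fork()
if not pid:
    # child: never returns; the shell replaces this process
    os.execlp("/bin/sh", "/bin/sh", "-c", "cd $HOME && exec %s" % command)

# parent: the master fd behaves like the terminal the child is attached to
os.write(master, b"ls /\n")
print(os.read(master, 1024))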

kefir_
  • 11
  • 2
  • Thanks for your reply, but I'm not very used to these two commands, could you please explain them and why I should use them. thx – lucaba Jul 28 '12 at 11:31
  • import pty; pid, master = pty.fork(); if not pid: os.execlp("/bin/sh", "/bin/sh", "-c", "cd $HOME && exec %s" % command) – kefir_ Jul 29 '12 at 03:57
  • 1
    fork, openpty and setsid are called in pty.fork(). To get a response, os.read(master, length). – kefir_ Jul 29 '12 at 04:07
  • Thanks. But I still don't understand it. Please explain what this does. Anyways, I've upvoted because it helped me – lucaba Jul 29 '12 at 19:54