
This is my first post here, so if there are any questions or if something is unclear, don't hesitate to ask.

I am trying to use a dynamic inventory file so I can build multiple Vagrant machines without having to manage the hosts file first. This is what I found online:

    #!/usr/bin/env python
    # Adapted from Mark Mandel's implementation
    # https://github.com/ansible/ansible/blob/devel/plugins/inventory/vagrant.py
    import argparse
    import json
    import paramiko
    import subprocess
    import sys


    def parse_args():
        parser = argparse.ArgumentParser(description="Vagrant inventory script")
        group = parser.add_mutually_exclusive_group(required=True)
        group.add_argument('--list', action='store_true')
        group.add_argument('--host')
        return parser.parse_args()


    def list_running_hosts():
        cmd = "vagrant status --machine-readable"
        status = subprocess.check_output(cmd.split()).rstrip()
        hosts = []
        for line in status.split('\n'):
            (_, host, key, value) = line.split(',')
            if key == 'state' and value == 'running':
                hosts.append(host)
        return hosts


    def get_host_details(host):
        cmd = "vagrant ssh-config {}".format(host)
        p = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE)
        config = paramiko.SSHConfig()
        config.parse(p.stdout)
        c = config.lookup(host)
        return {'ansible_ssh_host': c['hostname'],
                'ansible_ssh_port': c['port'],
                'ansible_ssh_user': c['user'],
                'ansible_ssh_private_key_file': c['identityfile'][0]}


    def main():
        args = parse_args()
        if args.list:
            hosts = list_running_hosts()
            json.dump({'vagrant': hosts}, sys.stdout)
        else:
            details = get_host_details(args.host)
            json.dump(details, sys.stdout)

    if __name__ == '__main__':
        main()

However, when I run this I get the following error:

    ERROR! The file inventory/vagrant.py is marked as executable, but failed to execute correctly. If this is not supposed to be an executable script, correct this with `chmod -x inventory/vagrant.py`.
    ERROR! Inventory script (inventory/vagrant.py) had an execution error: Traceback (most recent call last):
      File "/home/sebas/Desktop/playbooks/inventory/vagrant.py", line 52, in <module>
        main()
      File "/home/sebas/Desktop/playbooks/inventory/vagrant.py", line 45, in main
        hosts = list_running_hosts()
      File "/home/sebas/Desktop/playbooks/inventory/vagrant.py", line 24, in list_running_hosts
        (_, host, key, value) = line.split(',')
    ValueError: too many values to unpack

    ERROR! inventory/vagrant.py:4: Expected key=value host variable assignment, got: argparse

Does anybody know what I did wrong? Thank you guys in advance!

  • did you check this http://stackoverflow.com/questions/5466618/too-many-values-to-unpack-iterating-over-a-dict-key-string-value-list – Frederic Henri May 24 '16 at 14:42

1 Answer


I guess the problem is that the `vagrant status` command works only inside a directory with a Vagrantfile, or when the ID of a target machine is specified.

To get the state of all active Vagrant environments on the system, `vagrant global-status` should be used instead. But `global-status` has a drawback: it serves results from a cache and does not actively verify the state of each machine.

So, to reliably determine the state, we first need to get the IDs of all VMs with `vagrant global-status`, and then check each of those IDs with `vagrant status ID`.
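The two-step approach could look something like the sketch below. Note the parsing of the `global-status` table is an assumption based on its human-readable layout (columns `id name provider state directory`), which may differ between Vagrant versions, and the sample output is illustrative, so treat this as a starting point rather than a drop-in replacement:

```python
import subprocess


def parse_global_status(output):
    """Extract (id, name) pairs from `vagrant global-status` output."""
    machines = []
    for line in output.splitlines():
        parts = line.split()
        if not parts:
            if machines:  # blank line after the table marks its end
                break
            continue
        # Data rows look like: id  name  provider  state  directory
        if len(parts) >= 5 and parts[0] != "id" and not parts[0].startswith("-"):
            machines.append((parts[0], parts[1]))
    return machines


def is_running(machine_id):
    """Ask Vagrant directly for the state, since global-status may be stale."""
    status = subprocess.check_output(
        ["vagrant", "status", machine_id, "--machine-readable"])
    for line in status.decode().splitlines():
        fields = line.split(",", 3)  # the data field may itself contain commas
        if len(fields) == 4 and fields[2] == "state":
            return fields[3] == "running"
    return False


# Illustrative `vagrant global-status` output (layout may differ by version):
sample = (
    "id       name   provider   state    directory\n"
    "---------------------------------------------------------\n"
    "a1b2c3d  web    virtualbox running  /home/sebas/playbooks\n"
    "\n"
    "The above shows information about all known Vagrant environments\n")
print(parse_global_status(sample))  # [('a1b2c3d', 'web')]
```

As a side effect, splitting the machine-readable lines with `split(',', 3)` may also avoid the original `too many values to unpack` error, since the last field of that output can contain commas of its own.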

wombatonfire
  • Didn't work :\ now I get File "/home/sebas/Desktop/playbooks/inventory/vagrant.py", line 21, in list_running_hosts status = subprocess.check_output(cmd.split(',')).rstrip() File "/usr/lib/python2.7/subprocess.py", line 567, in check_output process = Popen(stdout=PIPE, *popenargs, **kwargs) File "/usr/lib/python2.7/subprocess.py", line 711, in __init__ errread, errwrite) File "/usr/lib/python2.7/subprocess.py", line 1340, in _execute_child raise child_exception OSError: [Errno 2] No such file or directory – Sebastiaan Vroom May 27 '16 at 10:02
  • Oops, my initial assumption was completely wrong. Edited the answer. – wombatonfire May 27 '16 at 19:03