
OK, strange question. I have SSH agent forwarding working with Vagrant, but I'm trying to get it working when using Ansible as a Vagrant provisioner.

I found out exactly what Ansible is executing and tried it myself from the command line; sure enough, it fails there too.

[/common/picsolve-ansible/u12.04%]ssh -o HostName=127.0.0.1 \
 -o User=vagrant -o  Port=2222 -o UserKnownHostsFile=/dev/null \
 -o StrictHostKeyChecking=no -o PasswordAuthentication=no \
 -o IdentityFile=/Users/bryanhunt/.vagrant.d/insecure_private_key \
 -o IdentitiesOnly=yes -o LogLevel=FATAL \
 -o ForwardAgent=yes "/bin/sh  \
 -c 'git clone git@bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "
Permission denied (publickey,password).

But when I just run vagrant ssh, agent forwarding works correctly, and I can check out my Bitbucket project with read/write access.

[/common/picsolve-ansible/u12.04%]vagrant ssh
vagrant@vagrant-ubuntu-precise-64:~$ /bin/sh  -c 'git clone git@bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker'
Cloning into '/home/vagrant/poc_docker'...
remote: Counting objects: 18, done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 18 (delta 4), reused 0 (delta 0)
Receiving objects: 100% (18/18), done.
Resolving deltas: 100% (4/4), done.
vagrant@vagrant-ubuntu-precise-64:~$

Has anyone got any idea why one works and the other doesn't?

Update:

By means of ps awux I determined the exact command being executed by Vagrant.

I replicated it, and the git checkout worked.

 ssh vagrant@127.0.0.1 -p 2222 \
  -o Compression=yes \
  -o StrictHostKeyChecking=no \
  -o LogLevel=FATAL \
  -o StrictHostKeyChecking=no \
  -o UserKnownHostsFile=/dev/null \
  -o IdentitiesOnly=yes \
  -i /Users/bryanhunt/.vagrant.d/insecure_private_key \
  -o ForwardAgent=yes \
  -o LogLevel=DEBUG \
   "/bin/sh  -c 'git clone git@bitbucket.org:bryan_picsolve/poc_docker.git /home/vagrant/poc_docker' "
binarytemple_picsolve
  • Have you checked related questions http://stackoverflow.com/questions/11955525/how-to-use-ssh-agent-forwarding-with-vagrant-ssh?rq=1 and http://stackoverflow.com/questions/12923675/how-to-setup-vagrant-ssh-agent-forwarding?lq=1? – Vilsepi Jan 07 '14 at 14:46
  • I took a look, but they didn't directly address my issue. I've found out what was wrong. I'll post the solution now. – binarytemple_picsolve Jan 08 '14 at 15:33

6 Answers


As of Ansible 1.5 (devel aa2d6e47f0, last updated 2014/03/24 14:23:18 GMT +100) and Vagrant 1.5.1, this now works.

My Vagrant configuration contains the following:

config.vm.provision "ansible" do |ansible|
  ansible.playbook = "../playbooks/basho_bench.yml"
  ansible.sudo = true
  ansible.host_key_checking = false
  ansible.verbose = 'vvvv'
  ansible.extra_vars = { ansible_ssh_user: 'vagrant',
                         ansible_connection: 'ssh',
                         ansible_ssh_args: '-o ForwardAgent=yes' }
end
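
With that in place, re-provisioning should pick up the forwarded agent. A quick way to confirm (a minimal sketch, assuming the default Vagrant setup and a key already loaded in your local agent):

vagrant provision              # re-run the Ansible provisioner
vagrant ssh -c 'ssh-add -l'    # keys listed here come from the host's agent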

It is also a good idea to explicitly disable sudo for tasks that rely on the forwarded agent. For example, when using the Ansible git module, I do this:

- name: checkout basho_bench repository 
  sudo: no
  action: git repo=git@github.com:basho/basho_bench.git dest=basho_bench
binarytemple_picsolve
  • From my experience, I believe you have to manually specify an inventory for this to work. It didn't work for me if I just let Vagrant make the inventory. – btobolaski Jun 09 '14 at 17:44
  • If you destroy and re-create your Vagrant box, ssh-agent forwarding will be silently disabled, unless you pass an empty known hosts file, per Ben Darnell’s answer: http://stackoverflow.com/a/23704069/459442 – eager Aug 02 '14 at 19:08

The key difference appears to be the UserKnownHostsFile setting. Even with StrictHostKeyChecking turned off, ssh quietly disables certain features, including agent forwarding, when there is a conflicting entry in the known hosts file (these conflicts are common for Vagrant, since multiple VMs may have the same address at different times). It works for me if I point UserKnownHostsFile to /dev/null:

config.vm.provision "ansible" do |ansible|
  ansible.playbook = "playbook.yml"

  ansible.raw_ssh_args = ['-o UserKnownHostsFile=/dev/null']
end
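
Alternatively, you can clear the stale known_hosts entry before provisioning instead of bypassing the file entirely. A minimal sketch, assuming Vagrant forwarded SSH to the default port 2222:

ssh-keygen -R "[127.0.0.1]:2222"    # drop the old host key for the forwarded port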
Ben Darnell
  • Wow, how many ways can this stuff break? I had it working before, but it got broken again; I followed your suggestion and it worked. This stuff is very necessary but very brittle, not helped by the brittleness of the ssh command; that util is really showing its age/cruft. – bryan_basho Sep 21 '14 at 10:36

Here's a workaround:

Create an ansible.cfg file in the same directory as your Vagrantfile with the following lines:

[ssh_connection]
ssh_args = -o ForwardAgent=yes
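
To check that the forwarding actually makes it through Ansible's SSH connection, an ad-hoc command works. A minimal sketch, where web is a hypothetical inventory group name; adjust to your own inventory:

ssh-add -l                     # first confirm a key is loaded in the local agent
ansible web -a "ssh-add -L"    # should list the same key(s) from inside the guest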
Lorin Hochstein
  • That works when using Ansible without Vagrant (I use the same config), but not (if I recall correctly) when using it with Vagrant. IMHO the hassle involved in getting this stuff working is a weak point in an otherwise fantastic tool. – bryan_basho Jan 18 '15 at 21:03
  • Ansible uses ssh for provisioning and adds your Vagrant VM to ~/.ssh/known_hosts. Agent forwarding depends on verified host keys, so before you run Ansible to provision, remove any outdated key with: ssh-keygen -R "[127.0.0.1]:2222" – bbaassssiiee Jul 19 '21 at 18:35

You can simply add this line to your Vagrantfile to enable SSH agent forwarding:

config.ssh.forward_agent = true

Note: Don't forget to execute the task with become: false

Hope this helps.
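
To verify that forwarding reaches the guest after a vagrant reload, a quick check (a sketch, assuming a GitHub key is loaded in your local agent):

vagrant reload
vagrant ssh -c 'ssh -T git@github.com'    # should greet you by username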

Arbab Nazar
  • This is the only thing that worked for me, even though I set `ssh_args = -A` in the ansible.cfg (and do not forget to execute the task with `become: false`). – DevAntoine Jan 22 '19 at 14:28

I've found that I need to do two separate things (on Ubuntu 12.04) to get it working:

  • the -o ForwardAgent=yes option that @Lorin mentions
  • adding /etc/sudoers.d/01-make_SSH_AUTH_SOCK_AVAILABLE with these contents (a quick verification sketch follows this list):

    Defaults env_keep += "SSH_AUTH_SOCK"
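
A quick way to verify on the guest that the sudoers change took effect (a sketch):

sudo env | grep SSH_AUTH_SOCK    # should print the agent socket path
sudo ssh-add -l                  # should list the forwarded key(s)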
    
offby1
  • I'm pretty sure I've tried both. I'll try again next weekend with the latest Ansible and both your suggestions, thx, bryan – binarytemple_picsolve Mar 09 '14 at 22:25
  • @binarytemple_picsolve Beware the ControlMaster! As far as I can tell, that keeps your SSH connection alive for 60 seconds _even if Vagrant has stopped_. So if you make a change that affects how ssh works, it won't take effect if the original connection is running. I recommend that you delete the ControlMaster and ControlPersist options from the ansible.cfg, at least while you're debugging. – offby1 Mar 10 '14 at 02:06

I struggled with a very similar problem for a few hours (Vagrant 1.7.2, Ansible 1.9.4).

My symptoms:

failed: [vagrant1] => {"cmd": "/usr/bin/git ls-remote '' -h refs/heads/HEAD", "failed": true, "rc": 128}
stderr: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

msg: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

FATAL: all hosts have already failed -- aborting

SSH'ing into the guest, I found that my ssh-agent was forwarding as expected:

vagrant@vagrant-ubuntu-trusty-64:~$ ssh -T git@github.com
Hi baxline! You've successfully authenticated, but GitHub does not provide shell access.

However, from the host machine, I could not open the connection:

$ ansible web -a "ssh-add -L"
vagrant1 | FAILED | rc=2 >>
Could not open a connection to your authentication agent.

After confirming that my ansible.cfg file was set up, as @Lorin noted, and my Vagrantfile set config.ssh.forward_agent = true, I still came up short.

The solution was to delete all lines in my host's ~/.ssh/known_hosts file that were associated with my guest. For me, they were the lines that started with:

[127.0.0.1]:2201 ssh-rsa
[127.0.0.1]:2222 ssh-rsa
[127.0.01]:2222 ssh-rsa
[127.0.0.1]:2200 ssh-rsa

Note that the third line has a funny IP address. I'm not certain, but I believe that line was the culprit. These lines are created as I destroy and create Vagrant VMs.
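
Rather than hand-editing the file, ssh-keygen can remove the stale entries. A sketch covering the ports above:

ssh-keygen -R "[127.0.0.1]:2200"
ssh-keygen -R "[127.0.0.1]:2201"
ssh-keygen -R "[127.0.0.1]:2222"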

Brian