
I have the following inventory file:

[all]
192.168.1.107
192.168.1.108
192.168.1.109

I want to add the fingerprints for these hosts to the known_hosts file on the local machine. I know that I can use the ansible.builtin.known_hosts module, but based on the docs:

Name parameter must match with "hostname" or "ip" present in key attribute.

it seems like I must already have the keys generated, and I must have three sets of keys: one set per host. I would like to have just one key for all my hosts.
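(For reference, a known_hosts entry has the form `<name> <key-type> <base64-key>`, for example something like `192.168.1.107 ecdsa-sha2-nistp256 AAAA...`, which is why the name parameter has to match the first field of the key value.)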

Right now I can use this:

- name: accept new remote host ssh fingerprints at the local host
  shell: "ssh-keyscan -t ecdsa {{ item }} >> {{ ssh_dir }}known_hosts"
  with_inventory_hostnames:
    - all

but the problem with this approach is that it is not idempotent: if I run it three times it will append three duplicate lines to the known_hosts file.

Another solution would be to check the known_hosts file for the presence of a host IP and add it only if it is not present, but I could not figure out how to use variables in a when condition to check for more than one host; roughly what I have in mind is sketched below.
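Something like this (just a sketch; it fails when the known_hosts file does not exist yet, and I am not sure the file lookup is the right tool here):

- name: add fingerprint only when the host is not in known_hosts yet
  shell: "ssh-keyscan -t ecdsa {{ item }} >> {{ ssh_dir }}known_hosts"
  when: item not in lookup('file', ssh_dir + 'known_hosts')
  with_inventory_hostnames:
    - all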

So the question is: how can I add host fingerprints to the local known_hosts file, before generating a set of private/public keys, in an idempotent manner?

ruslaniv
  • Similar question here https://stackoverflow.com/questions/30226113/ansible-ssh-prompt-known-hosts-issue but the solutions provided are either not idempotent or require disabling host key checking. – ruslaniv Jun 14 '21 at 12:12

2 Answers


In my answer to "How to include all host keys from all hosts in group" I created a small Ansible lookup plugin host_ssh_keys to extract public SSH keys from the host inventory. Adding all hosts' public SSH keys to /etc/ssh/ssh_known_hosts is then as simple as this, thanks to Ansible's integration of loops with lookup plugins:

- name: Add public keys of all inventory hosts to known_hosts
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.host }}"
    key: "{{ item.known_hosts }}"
  with_host_ssh_keys: "{{ ansible_play_hosts }}"
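If you prefer to stick to built-in modules instead of a custom lookup plugin, a comparable two-step approach is to scan the keys first and then feed them to ansible.builtin.known_hosts. An untested sketch (task names are illustrative, and it assumes every host answers the scan):

- name: scan the ecdsa host key of every inventory host
  command: "ssh-keyscan -t ecdsa {{ item }}"
  register: keyscan
  changed_when: false
  with_inventory_hostnames:
    - all

- name: add the scanned keys to known_hosts idempotently
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.item }}"
    key: "{{ item.stdout }}"
  with_items: "{{ keyscan.results }}"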
Petr

For public SSH keys I use this one:

# read the public key of the local user once, on the control machine
- hosts: localhost
  tasks:
  - set_fact:
      linuxkey: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
    check_mode: no

# on every host: become the application user and append the key to
# authorized_keys, but only when grep does not find it there yet
- hosts: all
  tasks:

  - shell:
      cmd: "sudo su - {{ application_user }}"
      stdin: "[[ ! `grep \"{{ hostvars['localhost']['linuxkey'] }}\" ~/.ssh/authorized_keys` ]] && echo '{{ hostvars['localhost']['linuxkey'] }}' >> ~/.ssh/authorized_keys"
      warn: no
      executable: /bin/bash
    register: results
    # rc 1 just means the key was already present, so only rc > 1 fails
    failed_when: results.rc not in [0,1]

I think you can easily adapt it for the known_hosts file.
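For example, the same grep guard carried over to known_hosts could look like this (an untested sketch, run against localhost; it assumes the known_hosts file already exists):

- hosts: localhost
  tasks:
  # append the scanned fingerprint only when grep does not find the host yet
  - shell: "grep -q '{{ item }}' ~/.ssh/known_hosts || ssh-keyscan -t ecdsa {{ item }} >> ~/.ssh/known_hosts"
    with_inventory_hostnames:
      - all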

Wernfried Domscheit
  • Ok, so 1) on the localhost you set the `linuxkey` variable to the content of a public key, 2) for all hosts you run the following task: change the user, feed the line to stdin, save the result in the `results` variable and then compare the return code to `[0,1]`. Would you explain what's happening in the line that you feed to stdin? – ruslaniv Jun 14 '21 at 12:09
  • It's a standard Linux command. Debug it to see. Maybe have a look at [lineinfile](https://docs.ansible.com/ansible/latest/collections/ansible/builtin/lineinfile_module.html), it should be more flexible. However, when you want to add SSH keys you may face the problem that you cannot connect to the remote host because the SSH key has not been added yet, so you have a typical chicken-and-egg problem. That's the reason why I use this workaround with `su`. – Wernfried Domscheit Jun 14 '21 at 12:17