
How do I SSH to a node inside the cluster locally? I am using the Docker Edge version, which has Kubernetes built in. If I run `kubectl ssh node`, I get:

```
Error: unknown command "ssh" for "kubectl"

Did you mean this?
        set

Run 'kubectl --help' for usage.
error: unknown command "ssh" for "kubectl"

Did you mean this?
        set
```
Matt Clark
hemaamahad
  • [`kubectl ssh` is not a thing?](https://github.com/kubernetes/kubernetes/issues/3920) – Matt Clark May 09 '18 at 06:01
  • What's your KUBERNETES_PROVIDER? If you are on AWS, use this command to ssh: `ssh -i ~/.ssh/kube_aws_rsa admin@`. Check which key is being generated automatically. – papaya May 09 '18 at 06:11

4 Answers


There is no `ssh` command in kubectl yet, but there are plenty of options for getting a shell on a Kubernetes node.

If you are using a cloud provider, you can connect to nodes directly from the instance management interface.

For example, in GCP: select Menu -> Compute Engine -> VM instances, then press the SSH button in the row of the desired node instance.
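The same connection can be made from the command line; a minimal sketch, assuming the gcloud CLI is installed and authenticated (the node name and zone are illustrative):

```
# SSH to a GKE node through Google's tooling; name and zone are hypothetical
gcloud compute ssh gke-mycluster-default-pool-8c3f-x7kz --zone us-central1-a
```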

If you are using a local VM (VMware, VirtualBox), you can configure sshd before rolling out the Kubernetes cluster, or use the VM console, which is available from the management GUI.
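A minimal sketch of that setup, assuming a systemd-based guest and a hypothetical user/IP on the host-only network:

```
# inside the VM console: enable and start the SSH daemon
# (the unit is named "ssh" rather than "sshd" on Debian/Ubuntu)
sudo systemctl enable --now sshd
# from the host, with a hypothetical user and VM address:
ssh user@192.168.56.10
```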

Vagrant provides its own command to access VMs: `vagrant ssh`.
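For example, with a hypothetical machine name from the Vagrantfile:

```
# run from the directory containing the Vagrantfile
vagrant ssh node-1
```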

In case of using minikube, there is the `minikube ssh` command to connect to the minikube VM. There are also other options.
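For example:

```
minikube ssh
# you are now on the minikube node; e.g. list the containers kubelet runs
docker ps
```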

I found no simple way to SSH into the docker-for-desktop VM, but you can easily switch to minikube for experimenting with node settings.
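If all you need is a root shell on the docker-for-desktop node, one commonly used workaround (a sketch, not part of this answer: a privileged pod with `hostPID` that enters the host's namespaces; the pod name is illustrative) is:

```
# the container name in the overrides must match the run name for the merge to work
kubectl run node-shell --rm -it --image alpine --restart=Never \
  --overrides '{"spec":{"hostPID":true,"containers":[{"name":"node-shell","image":"alpine","stdin":true,"tty":true,"securityContext":{"privileged":true},"command":["nsenter","--target","1","--mount","--uts","--ipc","--net","--","sh"]}]}}'
```

`nsenter --target 1` joins PID 1's mount/UTS/IPC/network namespaces, which is effectively a node shell; exiting removes the pod because of `--rm`.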

VAS
  • I faced a lot of issues with minikube. It does not even work with a VPN, so I switched to docker-for-desktop. – hemaamahad May 10 '18 at 06:26
  • It is not related to the original question, but anyway, here are discussions that could help fix minikube networking after a VPN starts: https://github.com/kubernetes/minikube/issues/1099 https://gist.github.com/mowings/633a16372fb30ee652336c8417091222 – VAS May 10 '18 at 08:11
  • You switched to docker-for-desktop .. but... what is the alternative of `minikube ssh`? Not sure, but it can be `screen ~/Library/Containers/com.docker.docker/Data/vms/0/tty` – Abdennour TOUMI Nov 30 '18 at 05:56

How to ssh to the node inside the cluster in local

Kubernetes is aware of nodes only at the level of secure communication with the kubelets running on them (getting each node's hostname and IP), and as such it does not provide cluster-level SSH to nodes out of the box. Depending on your actual provider/setup there are different ways of connecting to nodes, but they all boil down to the same steps: locate your SSH key, open the appropriate ports in the firewall/security groups, and run `ssh -i key user@node_instance_ip` to access the node. If you are running locally with virtual machines, you can set up your own SSH key pairs and do the trick, as sketched below.
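A minimal sketch of the local-VM case (the user and IP are hypothetical):

```
# generate a key pair on your workstation
ssh-keygen -t ed25519 -f ~/.ssh/k8s_node
# install the public key on the node (user and address are hypothetical)
ssh-copy-id -i ~/.ssh/k8s_node.pub user@192.168.56.10
# connect
ssh -i ~/.ssh/k8s_node user@192.168.56.10
```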
Const

You can effectively shell into a pod using `kubectl exec` (I know it's not exactly what the question asks, but it might be helpful).

An example usage would be `kubectl exec -it name-of-your-pod -- /bin/bash`, assuming you have bash installed.
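A slightly fuller sketch (the pod name is hypothetical; minimal images often ship only `sh`):

```
# find a pod to shell into
kubectl get pods
# open an interactive shell in it; fall back to sh if bash is absent
kubectl exec -it my-pod -- /bin/bash
kubectl exec -it my-pod -- /bin/sh
```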

Hope that helps.

Winston Riley
  • Although helpful, this answer does not answer the original question, which asks about SSH to nodes, not pods. – Const May 09 '18 at 17:56

You first have to extend kubectl with plugins by adding https://github.com/luksa/kubectl-plugins.

Basically, to "install" ssh, e.g.:

```
wget https://raw.githubusercontent.com/luksa/kubectl-plugins/master/kubectl-ssh
```

Then make sure the file `kubectl-ssh` is executable and on your `PATH`.
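A sketch of the remaining steps (the install directory and node name are illustrative; kubectl discovers executables named `kubectl-*` on the `PATH` as plugins):

```
chmod +x kubectl-ssh
sudo mv kubectl-ssh /usr/local/bin/
# usage roughly per the plugin's README, with a hypothetical node name
kubectl ssh node my-node
```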

ntg