
I want a Kubernetes Deployment to have permission to restart itself from within the cluster.

I know I can create a ServiceAccount and bind it to the Pod, but I'm missing the name of the most specific permission (i.e. not just allowing '*') that allows the command

kubectl rollout restart deploy <deployment>

Here's what I have; ??? is what I'm missing:

apiVersion: v1
kind: ServiceAccount
metadata:
  name: restart-sa
---
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  namespace: default
  name: restarter
rules:
  - apiGroups: ["apps"]
    resources: ["deployments"]
    verbs: ["list", "???"]
---
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: testrolebinding
  namespace: default
subjects:
  - kind: ServiceAccount
    name: restart-sa
    namespace: default
roleRef:
  kind: Role
  name: restarter
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: v1
kind: Pod
metadata:
  name: example
spec:
  containers:
  - image: nginx
    name: nginx
  serviceAccountName: restart-sa
yspreen
  • You can likely confirm this with `kubectl --v=100`, or with the kube-apiserver's audit log for the operation, but I _believe_ it's just a `kubectl annotate deployment com.example/whatever=$(date +%s)` type deal: it merely bumps some meaningless piece of metadata, which then causes the Pods to roll over. Thus, I would guess the RBAC verb is either `label` or `annotate`, whichever that command uses under the hood – mdaniel Aug 29 '21 at 16:55
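
For what it's worth, a quick way to see which verb is involved is to bump kubectl's client verbosity and read the request it sends (a sketch; <deployment> is a placeholder):

# --v=8 logs the HTTP request; it should show a PATCH against
# /apis/apps/v1/namespaces/default/deployments/<deployment>
kubectl rollout restart deployment <deployment> -n default --v=8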

3 Answers


I believe the following are the minimum permissions required to restart a deployment:

rules:
  - apiGroups: ["apps", "extensions"]
    resources: ["deployments"]
    resourceNames: [$DEPLOYMENT]
    verbs: ["get", "patch"]
Gari Singh

If you want a Kubernetes Deployment to be able to restart itself from within the cluster, you need to grant that permission through RBAC authorization.

In your YAML file the Role is missing some specific permissions under rules; add them in the format verbs: ["get", "watch", "list"].

Instead of a bare Pod, use a Deployment in the YAML file.

Make sure you add serviceAccountName: restart-sa in the Deployment YAML under the pod template's spec, as a sibling of containers, as shown below:

apiVersion: apps/v1  
kind: Deployment
metadata:
  name: nginx-deployment
  labels:
    app: nginx
spec:
  replicas: 3
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
    spec:
      containers:
      - name: nginx
        image: nginx:1.14.2
        ports:
        - containerPort: 80
      serviceAccountName: restart-sa

Then you can restart the deployment using the below command:

$ kubectl rollout restart deployment [deployment_name]
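
Since the goal is to restart from within the cluster, here is a sketch of the equivalent API call made from inside the Pod using the mounted ServiceAccount token (the mount paths are the standard in-cluster locations; the deployment name nginx-deployment and the patch body are assumptions mirroring what kubectl rollout restart sends, and the request needs the patch verb on deployments):

# Run inside a container that uses the restart-sa ServiceAccount
APISERVER=https://kubernetes.default.svc
SA_DIR=/var/run/secrets/kubernetes.io/serviceaccount
TOKEN=$(cat $SA_DIR/token)
NAMESPACE=$(cat $SA_DIR/namespace)

# Strategic-merge patch that bumps the restartedAt annotation on the pod template
curl --cacert $SA_DIR/ca.crt \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/strategic-merge-patch+json" \
  -X PATCH \
  "$APISERVER/apis/apps/v1/namespaces/$NAMESPACE/deployments/nginx-deployment" \
  -d '{"spec":{"template":{"metadata":{"annotations":{"kubectl.kubernetes.io/restartedAt":"'"$(date -u +%Y-%m-%dT%H:%M:%SZ)"'"}}}}}'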

Bakul Mitra

In case someone else runs into this: I faced a similar issue where the service account still couldn't perform the rollout restart.

In the end I managed to fix it by adding replicasets to the list of resources:

kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  namespace: default
  name: restarter
rules:
  - apiGroups: ["apps"]
    resources: ["deployments", "replicasets", "pods"]
    verbs: ["get", "patch"]

Hope this helps you as well :)
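
If you are unsure what the ServiceAccount actually ended up being allowed to do, one way to check from an admin context (a sketch; requires impersonation rights):

# List the permissions the ServiceAccount has in the namespace
kubectl auth can-i --list \
  --as=system:serviceaccount:default:restart-sa -n default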