Reset a Kubernetes cluster using Kubespray
Sometimes you have one of those “Ooopppssss” moments, when you realize that what you just did wasn’t the smartest thing to do. I had one this morning, when I wanted to get rid of a failed installation of Kubestone, since the latest release had a bug.
My intention was to delete the clusterrolebindings that were created by the Kubestone installation, for which I used the following command:
kubectl delete clusterrolebindings.rbac.authorization.k8s.io -n kubestone-system --all
When used for pods, for example, this deletes all pods in the namespace kubestone-system. However, clusterrolebindings are cluster-scoped, not namespaced, so executing this command removed all clusterrolebindings in my Kubernetes cluster, including those required by Kubernetes itself.
Oooopppssss…
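Before running a destructive command with --all, it helps to check whether the resource type is actually namespaced. A quick sketch of what that check could look like (the Kubestone label selector below is an assumption, not something from the original post):

```shell
# Cluster-scoped resources ignore the -n flag, so --all applies
# cluster-wide. This lists non-namespaced resource types; if
# clusterrolebindings appears here, the -n flag has no effect.
kubectl api-resources --namespaced=false | grep clusterrolebindings

# A safer alternative: delete only the bindings that carry the
# installer's label (assumed label; check your manifests first).
kubectl delete clusterrolebindings -l app.kubernetes.io/name=kubestone
```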
In the past I might just have thrown away the cluster and deployed a new one; however, just for fun, I wanted to see whether there was an easier way.
Kubespray reset
After a bit of searching on Google, I found an article referencing the Kubespray reset.yml playbook. It basically resets all nodes, removes all configuration, and allows you to redeploy Kubernetes.
So I first executed the reset.yml playbook as follows:
ansible-playbook -i inventory/remko/hosts.yaml reset.yml --become --become-user=root
And then redeployed Kubernetes using the cluster.yml playbook:
ansible-playbook -i inventory/remko/hosts.yaml cluster.yml --become --become-user=root
After that I was back in action! Although I should note that all pods, deployments, etc. in the cluster were gone. Since this was a dev cluster, that didn’t matter much to me.
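To confirm the cluster is healthy again after a reset and redeploy, a few quick checks like these can help (a sketch; it assumes kubectl is already pointed at the freshly deployed cluster):

```shell
# All nodes should report Ready after cluster.yml finishes.
kubectl get nodes

# The system clusterrolebindings (system:*, cluster-admin, ...)
# should exist again, recreated by the fresh installation.
kubectl get clusterrolebindings

# Remember: a reset wipes all workloads, so only system pods
# (kube-system, etc.) will be present until you redeploy your apps.
kubectl get pods --all-namespaces
```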