I followed the steps you referenced and managed to generate the files, but I ran into the same issue before figuring out the cause. The problem is that there are a few possible causes, and the script fails silently without telling you which one you hit.
Here are a few suggestions:
First off, you need to configure your OAuth Consent Screen. You won't be able to create the credentials without it.
Make sure that you have the right credentials file. To generate it, go to the Credentials page in the Cloud Console; the docs say you need an OAuth client ID. Make sure that you have chosen the correct app at the top.

Then you will be prompted to choose an application type. According to the docs you shared, the type should be "Other", but that option is no longer available, so "Desktop app" is the closest equivalent if you're just running a local script.

After that you can just choose a name and create the credentials. You will be prompted to download the file afterwards.
- Check that the `credentials-sheets.json` file has that exact name.
- Make sure that the `credentials-sheets.json` file is located in the same directory where you're running your Python script or console commands.
- Check that you've enabled both the Sheets and Drive APIs in your GCP project.
- Python will try to set up a temporary server on `http://localhost:8080/` to retrieve the pickle files. If another application is using port 8080, the script will fail here too. In my case a previously failed Python script was holding onto that port. To find and close the processes using port 8080 you can refer to this answer for Linux/Mac or this other answer for Windows. Just make sure that the process is not something you're currently using.
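To save some trial and error, the checks above can be sketched as a small pre-flight script. The filename and port are the ezsheets defaults; `check_prerequisites` is just an illustrative helper, not part of the library:

```python
import os
import socket

def check_prerequisites(creds_name="credentials-sheets.json", port=8080):
    """Return (creds_ok, port_ok): whether the credentials file exists in
    the current directory, and whether the OAuth redirect port is free."""
    creds_ok = os.path.exists(creds_name)
    # Try to bind the port ourselves; if that fails, something else owns it.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind(("localhost", port))
            port_ok = True
        except OSError:
            port_ok = False
    return creds_ok, port_ok

creds_ok, port_ok = check_prerequisites()
if not creds_ok:
    print("credentials-sheets.json is missing from", os.getcwd())
if not port_ok:
    print("Port 8080 is busy; close whatever is holding it before retrying")
```

Run this from the same directory where you run `import ezsheets`; if both checks pass, the remaining suspects are the consent screen and the enabled APIs.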
I just used the single `import ezsheets` command to get the files, so after getting `token-sheets.pickle` I had to run it again to get `token-drive.pickle`, but after that the library should detect that you already have both files.
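A quick way to confirm you've reached that state (the token filenames are the ezsheets defaults; `missing_tokens` is a hypothetical helper for illustration):

```python
import os

def missing_tokens(directory="."):
    """Return which of the ezsheets token files are not yet in `directory`."""
    token_files = ["token-sheets.pickle", "token-drive.pickle"]
    return [name for name in token_files
            if not os.path.exists(os.path.join(directory, name))]

# If this prints an empty list, `import ezsheets` should start without
# prompting you to log in again.
print(missing_tokens())
```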