I am developing an Azure pipeline and want to create a repo in Databricks via the Repos API (https://docs.databricks.com/dev-tools/api/latest/repos.html#operation/create-repo) and save it to /Repos/sub_folder/repo_name
To test the commands in the pipeline, I am using the Databricks cli and repos API (as described in the link above) locally from my PC. This all works fine and all the repo files are saved into the subfolder under the root /Repos folder.
When I try this in the pipeline, running as the service principal, it fails when trying to create a subfolder under /Repos. In the pipeline, I am issuing the following Databricks CLI command:
databricks workspace mkdirs /Repos/sub_folder
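For context, after the folder exists I create the repo itself. The equivalent REST request (per the Repos API doc linked above) that the CLI issues is roughly the sketch below; the workspace host and Git URL are placeholders, not my real values:

```python
# Sketch of the Repos API create call; host and repo URL are placeholders.
import json

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
endpoint = f"{host}/api/2.0/repos"
payload = {
    "url": "https://github.com/my-org/repo_name",  # placeholder Git URL
    "provider": "gitHub",
    "path": "/Repos/sub_folder/repo_name",
}
# In the pipeline this would be POSTed with the service principal's AAD token, e.g.:
#   requests.post(endpoint, headers={"Authorization": f"Bearer {token}"}, json=payload)
print(endpoint)
print(json.dumps(payload, indent=2))
```

Locally, the same request authenticated with my personal access token succeeds; it is only under the service principal's token that the mkdirs step fails.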
The error shown is:
Error: Authorization failed. Your token may be expired or lack the valid scope
Is there some further configuration, or permission, required to allow the service principal to create a folder and save files under /Repos?
P.S. I would use the /Shared workspace instead of /Repos, but saving to /Shared does not seem to work with the "Files in Repos" feature (https://learn.microsoft.com/en-us/azure/databricks/repos/work-with-notebooks-other-files), which I need in order to access non-notebook files and run my actual model.
Any suggestions much appreciated...