
I am developing an Azure pipeline and want to create a repo in Databricks (https://docs.databricks.com/dev-tools/api/latest/repos.html#operation/create-repo) and save it to /Repos/sub_folder/repo_name.

To test the commands in the pipeline, I am using the Databricks CLI and Repos API (as described in the link above) locally from my PC. This all works fine: all the repo files are saved into the subfolder under the root /Repos folder.
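
For reference, the repo-creation call is essentially the Repos API create operation; a minimal sketch of what that request looks like (the workspace URL, Git URL, and provider below are placeholders, not my real values):

# create a repo under /Repos/sub_folder using the Repos API (placeholder values)
curl -X POST https://<databricks-instance>/api/2.0/repos \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "url": "https://github.com/my-org/repo_name.git",
        "provider": "gitHub",
        "path": "/Repos/sub_folder/repo_name"
      }'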

When I try this in the pipeline, running as the Service Principal, the pipeline fails when trying to create a subfolder under /Repos. In the pipeline, I am issuing the following Databricks CLI command:

databricks workspace mkdirs /Repos/sub_folder
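
(As I understand it, this CLI command corresponds to the Workspace API mkdirs endpoint; a sketch with a placeholder workspace URL:)

# create the /Repos/sub_folder directory via the Workspace API
curl -X POST https://<databricks-instance>/api/2.0/workspace/mkdirs \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"path": "/Repos/sub_folder"}'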

The error shown is

Error: Authorization failed. Your token may be expired or lack the valid scope

Is there some further configuration, or permissions, required to allow the Service Principal to create a folder/save files under /Repos?

PS. I would use the /Shared workspace instead of /Repos, but saving to /Shared does not seem to work with the "Files in Repos" feature (https://learn.microsoft.com/en-us/azure/databricks/repos/work-with-notebooks-other-files), which I need in order to access non-notebook files and run my actual model.

Any suggestions much appreciated...


1 Answer


The issue was two-fold. Firstly, the Service Principal was not configured with admin rights, so it could not create the sub-folder under /Repos. Once this was fixed, I got a different error when issuing the POST command to create the repo (in the newly created sub-folder). The error I got was:

{"error_code":"PERMISSION_DENIED","message":"Missing Git provider credentials. Go to User Settings > Git Integration to add your personal access token."}

The solution to this permissions issue has already been answered here.
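
For reference, one common way to register Git provider credentials for the identity that owns the Databricks token (so that the repo create call can authenticate to the Git provider) is the Git Credentials API; a rough sketch, where the provider, username, and PAT values are placeholders and $SP_DATABRICKS_TOKEN stands for the service principal's Databricks token:

# store a Git personal access token for the calling identity (placeholder values)
curl -X POST https://<databricks-instance>/api/2.0/git-credentials \
  -H "Authorization: Bearer $SP_DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "git_provider": "azureDevOpsServices",
        "git_username": "my-user@example.com",
        "personal_access_token": "<git-pat>"
      }'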