
I want to create an Azure Databricks repository that is linked with my Github repository.

This is what I did:

  1. Create a new GitHub repository with a Readme.md
  2. Create a GitHub authentication token and add it to Databricks
  3. In Databricks, enable all-file sync for repositories
  4. Clone the repository into Databricks > Repos > My Username
  5. Pull (this works fine)
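For reference, steps 4–5 can also be done programmatically via the Databricks Repos REST API (POST /api/2.0/repos). A minimal sketch, assuming placeholder values for the workspace URL and personal access token:

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL and Databricks PAT.
WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapiXXXXXXXX"

def repo_create_payload(git_url: str, path: str) -> dict:
    """Request body for POST /api/2.0/repos (clones the repo, as in step 4)."""
    return {"url": git_url, "provider": "gitHub", "path": path}

def clone_repo(git_url: str, path: str) -> dict:
    """Clone a GitHub repository into the Databricks workspace."""
    req = urllib.request.Request(
        f"{WORKSPACE}/api/2.0/repos",
        data=json.dumps(repo_create_payload(git_url, path)).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The path typically has the form /Repos/&lt;username&gt;/&lt;repo-name&gt;; the repo and username here are hypothetical.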

However, when I now add files to my Databricks repo and try to push, I get the following message:

Error pushing: Remote ref update was rejected. Make sure you have write access to this remote repository.

Same error when I try it on a newly created branch.

Does anyone know what could cause this error?

My authentication token has the scopes repo, admin:repo_hook and delete_repo.

It seems that I followed the Azure Databricks instructions 1:1, yet it does not work.
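As a sanity check (not part of the original question), GitHub reports a classic token's scopes in the X-OAuth-Scopes response header of any authenticated API call. A minimal sketch, with the token value as a placeholder:

```python
import urllib.request

def parse_scopes(header: str) -> set:
    """Parse the comma-separated X-OAuth-Scopes header into a set of scopes."""
    return {s.strip() for s in header.split(",") if s.strip()}

def token_scopes(token: str) -> set:
    """Ask the GitHub API which scopes a classic personal access token has."""
    req = urllib.request.Request(
        "https://api.github.com/user",
        headers={"Authorization": f"token {token}"},  # token is a placeholder
    )
    with urllib.request.urlopen(req) as resp:
        return parse_scopes(resp.headers.get("X-OAuth-Scopes", ""))
```

If "repo" is missing from the returned set, pushes will be rejected regardless of the Databricks side.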

  • Please re-check that you set the GitHub username and token via user settings, and that the credential is assigned to GitHub, not GitHub Enterprise. I'm using that integration regularly, and it just works – Alex Ott Jan 05 '22 at 18:12
  • @AlexOtt I have tried that, but still get the same error. Also created a new, empty repo, but no luck :/ – erocoar Jan 06 '22 at 10:33

2 Answers


The secret is, apparently, to use your GitHub username instead of your e-mail address.
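For illustration, the Git credential can be set through the Databricks Git credentials API (POST /api/2.0/git-credentials), where the git_username field must be the GitHub login, not an e-mail address. Workspace URL and tokens below are placeholders:

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL and Databricks PAT.
WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"
DATABRICKS_TOKEN = "dapiXXXXXXXX"

def git_credential_payload(github_username: str, github_pat: str) -> dict:
    """Request body for POST /api/2.0/git-credentials.

    Note: git_username is the GitHub login (e.g. "erocoar"),
    NOT an e-mail address -- using the e-mail triggers the push error.
    """
    return {
        "git_provider": "gitHub",
        "git_username": github_username,
        "personal_access_token": github_pat,
    }

def set_git_credential(github_username: str, github_pat: str) -> dict:
    """Register the GitHub credential with the Databricks workspace."""
    req = urllib.request.Request(
        f"{WORKSPACE}/api/2.0/git-credentials",
        data=json.dumps(git_credential_payload(github_username, github_pat)).encode(),
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```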

  • I am facing the same problem. Were you able to resolve this? Where should we use username instead of email? Thanks! – svakili Jun 23 '23 at 21:13

I was able to successfully push a .py file from Azure Databricks to Git with the same access permissions given in the question. I followed this documentation.

As per the official documentation, for non-notebook files in Databricks Repos you must be running Databricks Runtime 8.4 or above.

Enable support for arbitrary files in Databricks Repos:

Files in Repos lets you sync any type of file, such as .py files, data files in .csv or .json format, or .yaml configuration files. You can import and read these files within a Databricks repo. You can also view and edit plain text files in the UI.

If support for this feature is not enabled, you will still see non-notebook files in your repo, but you will not be able to work with them. Also refer to this SO question and its answers.
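Once Files in Repos is enabled, a synced non-notebook file can be opened like an ordinary file from the repo path on the cluster. A small sketch (the repo path below is hypothetical):

```python
import csv
import io

# Hypothetical path as seen on a cluster with Files in Repos enabled.
REPO_FILE = "/Workspace/Repos/my-user/my-repo/data.csv"

def parse_csv_rows(text: str) -> list:
    """Parse CSV text into a list of dicts, one per data row."""
    return list(csv.DictReader(io.StringIO(text)))

def read_repo_csv(path: str = REPO_FILE) -> list:
    """Read a .csv synced into the repo (works only on the cluster itself)."""
    with open(path, encoding="utf-8") as f:
        return parse_csv_rows(f.read())
```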

  • Thank you for trying it out. Sad that it doesn't want to work for me :/ Could there be any specific rights required outside of GitHub and DataBricks Resource Group? – erocoar Jan 11 '22 at 12:05
  • Have you tried with .py, .csv or .json file types ? – Abhishek K Jan 11 '22 at 12:19