
I am encountering a 503 error when pushing to GitHub:

$ git push github develop
Counting objects: 22, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (22/22), done.
Writing objects: 100% (22/22), 4.16 KiB | 0 bytes/s, done.
Total 22 (delta 16), reused 0 (delta 0)
error: RPC failed; HTTP 503 curl 22 The requested URL returned error: 503 Service Unavailable
fatal: The remote end hung up unexpectedly
fatal: The remote end hung up unexpectedly
Everything up-to-date

I've checked their status page and it says "All systems operational", so I'm thinking it must be something with my configuration. My .gitconfig file just has my name and email:

[user]
    name = Bradley Wogsland
    email = <omitted>

(I've omitted my real email here but in the actual file it's there).


4 Answers


I had the same issue and fixed it by increasing the Git buffer size to the largest individual file size of my repo:

git config --global http.postBuffer 157286400

Afterwards, I could execute the push request without any problems:

git push
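The exact value is not critical as long as it is at least as large as your biggest packed object (157286400 bytes is 150 MiB). If you are unsure what to pick, a rough sketch for a Linux/macOS shell is to look at the largest tracked files and at what is currently configured:

# Show the five largest tracked files; the 5th column of ls -l is the size in bytes
git ls-files -z | xargs -0 ls -l | sort -k5 -n -r | head -5

# Show the current buffer size (no output means the 1 MiB default is in effect)
git config --get http.postBuffer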

Here is a great explanation from Bitbucket Support:

Cause

The "Smart HTTP" protocol in Git uses "Transfer-Encoding: chunked" in POST requests when it contains packed objects greater than 1MB in size.

Some proxy servers, like Nginx, do not support this transfer encoding by default, and the requests will be rejected before they get to Bitbucket Server. Because of this, the Bitbucket Server logs will not show any extra information.

Another possible cause is a load balancer misconfiguration.

Workaround

  • Pushing a large amount of data (an initial push of a big repository, or a change with very big files) may require a higher http.postBuffer setting on your Git client (not the server). From https://www.kernel.org/pub/software/scm/git/docs/git-config.html

    http.postBuffer Maximum size in bytes of the buffer used by smart HTTP transports when POSTing data to the remote system. For requests larger than this buffer size, HTTP/1.1 and Transfer-Encoding: chunked is used to avoid creating a massive pack file locally. Default is 1 MiB, which is sufficient for most requests.

  • Configuration on your reverse proxy. With nginx, the parameter client_max_body_size is usually the blocker. The reverse proxy may also have a connection timeout that closes the connection (e.g. Timeout or ProxyTimeout in Apache, proxy_read_timeout in nginx). Try bypassing the proxy by pushing directly to the Bitbucket Server IP:port. If this works, it is highly likely that the proxy server is causing the early disconnect and needs to be tuned.

  • The user is using an outbound proxy on their machine that is causing the issue.
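For the proxy-related points above, a quick client-side check (only a sketch; the remote and branch names are taken from the question) is to see whether Git itself is routed through an outbound proxy and to retry the push without it:

# See whether Git is configured to use an outbound proxy
git config --global --get http.proxy
git config --global --get https.proxy

# Retry the push with the proxy overridden to empty for this one command,
# which normally makes Git connect directly
git -c http.proxy= push github develop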



I faced the same issue. I tried setting my global email to my account's email:

git config --global user.email MY_GIT_EMAIL

then tried

git push

and it worked for me.
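As a quick sanity check (not specific to this answer), you can print the value Git actually recorded:

# Print the configured global email to confirm it was set
git config --global user.email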


I had this problem when pushing a large number of files.

Increasing the http.postBuffer size didn't solve the problem.

The solution was to unset the http.postBuffer:

git config --global --unset http.postBuffer
git config --local --unset http.postBuffer
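
Afterwards you can confirm that no override is left in either scope; both commands should print nothing once the setting is gone:

git config --global --get http.postBuffer
git config --local --get http.postBuffer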

learn.microsoft.com - http.postBuffer questions


In my case it was a firewall rule. Once our network team allowed the traffic, the push went through.
