In my local Git repository, I have a directory called "s3" that contains a Terraform configuration for creating an AWS S3 bucket. When I try to push the directory to my remote GitHub repo, I keep getting this error:
$ git push origin repo
Uploading LFS objects: 100% (1/1), 1.3 KB | 0 B/s, done.
Enumerating objects: 22, done.
Counting objects: 100% (22/22), done.
Delta compression using up to 12 threads
Compressing objects: 100% (15/15), done.
Writing objects: 100% (21/21), 50.31 MiB | 1.12 MiB/s, done.
Total 21 (delta 4), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (4/4), completed with 1 local object.
remote: error: Trace: 2bcd19423f0176b3048b9144f26779fab7c903f263a60b8534632e7c697ec700
remote: error: See http://git.io/iEPt8g for more information.
remote: error: File s3/.terraform/providers/registry.terraform.io/hashicorp/aws/3.74.3/windows_amd64/terraform-provider-aws_v3.74.3_x5.exe is 244.70 MB; this exceeds GitHub's file size limit of 100.00 MB
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
To github.com:User/Remote-Repo.git
! [remote rejected] repo -> repo (pre-receive hook declined)
error: failed to push some refs to 'github.com:User/Remote-Repo.git'
I have Git LFS installed, which I thought would handle pushing large files, but it isn't helping.
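Judging by the "Uploading LFS objects: 100% (1/1), 1.3 KB" line, LFS is handling one small file but not the 244 MB provider .exe. My understanding is that a file only goes through LFS if its path matches a pattern in .gitattributes before the file is committed. Is that the problem, i.e. do I need something like the following? (The "s3/.terraform/**" pattern is just my guess at what would cover the provider binary.)

$ git lfs ls-files                    # check which files LFS is actually tracking
$ git lfs track "s3/.terraform/**"    # track the directory containing the big .exe
$ git add .gitattributes
$ git commit -m "Track Terraform provider binaries with LFS"

And if the .exe is already part of an earlier commit as a regular Git object, does it need git lfs migrate (or removing it from history entirely) before the push can succeed?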
What am I doing wrong?
Note: Some names in the error have been edited for privacy/security.