I am trying to git add, commit, and push an update to some Python code in which I changed the naming convention of the files. NB: I want my local branch to replace the remote version. I have also deleted these files from the data/ folder. However, both git push and git push --force yield the same error:
remote: error: File workers/compositekey_worker/compositekey/data/20210617-031807_dataset_.csv is 203.87 MB; this exceeds GitHub's file size limit of 100.00 MB
remote: error: File workers/compositekey_worker/compositekey/data/20210617-032600_dataset_.csv is 180.20 MB; this exceeds GitHub's file size limit of 100.00 MB
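For context, the sequence I ran was roughly the following (the -A flag and the commit message are just illustrative of what I did, not the exact commands):

$ git add -A                                    # stage the renames/deletions
$ git commit -m "Rename files to new convention"
$ git push origin simulate-data-tests           # rejected with the error above
$ git push --force origin simulate-data-tests   # same error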
The data/ folder, however, only contains example datasets from online sources:
$ ls
MFG10YearTerminationData.csv OPIC-scraped-portfolio-public.csv
Is the problem to do with caching? I have a limited understanding of this area.
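My guess (and it is only a guess) is that the large CSVs might still exist in one of the six unpushed commits even though they are gone from the working tree. I assume something like the following would show which of those commits still touch them (the glob pathspec is my own guess at the pattern):

$ git log --stat origin/simulate-data-tests..HEAD -- 'workers/compositekey_worker/compositekey/data/*dataset_*.csv'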
The output of git status:
On branch simulate-data-tests
Your branch is ahead of 'origin/simulate-data-tests' by 6 commits.
(use "git push" to publish your local commits)
nothing to commit, working tree clean
Running git rm --cached 20210617-031807_dataset_.csv gives:
fatal: pathspec '20210617-031807_dataset_.csv' did not match any files
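I assume this fails because the file has already been deleted and committed, so it is no longer in the index; if that is right, a check like the following (again, my own guess) should show whether git is still tracking it at all:

$ git ls-files --cached | grep dataset_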
And git log -- <filename>, run from inside data/:
$ git log -- 20210617-031807_dataset_.csv
commit 309e1c192387abc43d8e23f378fbb7ade45d9d3d
Author: ***
Date: Thu Jun 17 03:28:26 2021 +0100
Exception Handling of Faker methods that do not append to Dataframes. Less code, unqiueness enforced by 'faker.unique.<method>()'
commit 959aa02cdc5ea562e7d9af0c52db1ee81a5912a2
Author: ***
Date: Thu Jun 17 03:21:23 2021 +0100
Exception Handling of Faker methods that do not append to Dataframes. Less code, unqiueness enforced by 'faker.unique.<method>()'