I know this is an old thread, but I have to add something here.
After running
$ git fsck --cache --no-reflogs --lost-found --unreachable HEAD
you get a list of thousands of entries (as it was for me), and it's not feasible to check them all manually, so you can filter the list down to blob objects in whatever way suits you. In my case it was Python: I copied the git fsck output from the console into a file, then read that file to get a list of rows:
with open("lost_files.txt") as file:
    lost_files = file.readlines()
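For reference, each line of that fsck output has the form "unreachable <type> <hash>", so the hash is always the last whitespace-separated token. For example (using two of the hashes that show up below):

unreachable blob 5c2d8667ef8abbd7e114d2f9b61ee840b034e56f
unreachable blob 6dad86cd9c7a80ff5c3cd1d3222d2f8228dc18cf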
Then filter it to keep only the blob entries:
blob_hashes = [line.split(' ')[-1].strip() for line in lost_files if "blob" in line]
The result is a list of hashes.
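As a small optional helper, you can print the hashes already quoted for the shell, so they can be pasted straight into the FILES array in the next step (blob_hashes is the list from above):

print(" ".join(f"'{h}'" for h in blob_hashes))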
Then go to the console, declare a variable FILES, and paste in the list of hashes generated above:
FILES=('5c2d8667ef8abbd7e114d2f9b61ee840b034e56f' ....... '6dad86cd9c7a80ff5c3cd1d3222d2f8228dc18cf')
For me it was nearly 5K hashes.
Then just write a simple loop in the console (make sure the lost directory exists first, otherwise the redirect will fail):
mkdir -p lost
for lost_file in "${FILES[@]}"; do git show "$lost_file" > "lost/lost_file_$lost_file.txt"; done
As a result I got about 5K files in the lost directory.
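If you'd rather skip the copy-paste into the shell entirely, the same extraction can be done from Python with subprocess. This is just a sketch, assuming you run it from the repository root and that lost_files.txt holds the fsck output as above:

import os
import subprocess

with open("lost_files.txt") as f:
    blob_hashes = [line.split(' ')[-1].strip() for line in f if "blob" in line]

os.makedirs("lost", exist_ok=True)
for h in blob_hashes:
    # "git show <hash>" prints the raw content of the blob
    content = subprocess.run(["git", "show", h], capture_output=True, check=True).stdout
    with open(f"lost/lost_file_{h}.txt", "wb") as out:
        out.write(content)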
Then you need to open each file and grep for the data you want. For me it was the following script (I try to use meaningful naming when working on features, so I just filtered by a keyword):
import os

lost_files_names = os.listdir("lost")
names = []
for file in lost_files_names:
    # read as bytes, since recovered blobs may not be valid UTF-8
    with open(f"lost/{file}", "rb") as data:
        temp = data.readlines()
    for x in temp:
        if b"my_keyword" in x:
            names.append(file)
# a file can match on several lines, so deduplicate before printing
for x in set(names):
    print(x)
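Each printed name has the form lost_file_<hash>.txt, so you can open those files in the lost directory and copy the recovered code back where it belongs. (If you prefer the shell, a recursive grep over the lost directory does the same filtering.)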
Finally, I got an output of 10 files and managed to restore my stupidly lost uncommitted code in a short amount of time (thank God I had run git add).