I have a Zip archive with a large (important) file that will not extract. All Zip utilities that I've tried, including those that claim to recover/fix broken Zip archives, are unable to extract the file containing the corrupted zlib-compressed data. They get all the files in the archive except for the damaged entry, which gets skipped.
I've written a small utility app in C# that parses the zip archive, identifies each entry, parses out its fields, decrypts the data sections, and then decompresses them using a DeflateStream (from a .NET implementation of zlib). Everything works fine until I get to the damaged entry. The damaged entry decrypts successfully and fully (using AES in CTR mode), but the DeflateStream reader only gets about 40 MB into the decrypted data before throwing "Bad state (oversubscribed dynamic bit lengths tree)".
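For reference, the decompression step looks roughly like the sketch below. This is a simplified illustration using System.IO.Compression.DeflateStream (the real code uses the zlib-based DeflateStream), and `decryptedPath`/`outputPath` are placeholder names for the already-decrypted entry data and the recovery target:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class Decompressor
{
    static void Main(string[] args)
    {
        string decryptedPath = args[0];   // raw deflate data, after AES-CTR decryption
        string outputPath = args[1];      // where the recovered file should go

        using var input = File.OpenRead(decryptedPath);
        using var deflate = new DeflateStream(input, CompressionMode.Decompress);
        using var output = File.Create(outputPath);

        var buffer = new byte[64 * 1024];
        long total = 0;
        try
        {
            int read;
            while ((read = deflate.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
                total += read;
            }
        }
        catch (Exception ex)
        {
            // This is where it dies, roughly 40 MB in:
            // "Bad state (oversubscribed dynamic bit lengths tree)"
            Console.WriteLine($"Decompression failed after {total} bytes: {ex.Message}");
        }
    }
}
```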
Is it possible to somehow 'seek' past the damaged section and continue decompressing the data? I'd like to recover as much of the file as possible, even if there are some holes. The DeflateStream doesn't implement a Seek method, and if I attempt to create a new DeflateStream with the underlying FileStream positioned at the last read position, it throws the same "Bad state" exception.
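This is roughly what that resume attempt looks like (again a sketch, with `resumeOffset` being my best guess at the compressed offset where the first pass failed; the underlying stream position is only approximate because DeflateStream buffers its reads):

```csharp
using System;
using System.IO;
using System.IO.Compression;

class ResumeAttempt
{
    static void Main(string[] args)
    {
        string decryptedPath = args[0];
        long resumeOffset = long.Parse(args[1]);   // compressed offset near the failure point

        using var input = File.OpenRead(decryptedPath);
        input.Position = resumeOffset;

        // Start a fresh inflater from the middle of the deflate stream.
        using var deflate = new DeflateStream(input, CompressionMode.Decompress);
        var buffer = new byte[64 * 1024];
        try
        {
            int read = deflate.Read(buffer, 0, buffer.Length);
            Console.WriteLine($"Read {read} bytes after repositioning");
        }
        catch (Exception ex)
        {
            // Throws the same "Bad state" exception, presumably because a raw
            // deflate stream has no markers to re-synchronize on mid-stream.
            Console.WriteLine($"Still fails: {ex.Message}");
        }
    }
}
```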