
I'm using an API that expects me to pass in a file object for reading, but I want to pass in just a portion of an existing file so that the API reads just the first n bytes. I assume I could wrap 'file' in my own class, something like:

class FilePortion(file):
    def __init__(self, name, mode, lastByteToRead):
        self.LAST_BYTE = lastByteToRead
        super(FilePortion, self).__init__(name, mode)
    def read(self, size=None):
        position = self.tell()
        if position < self.LAST_BYTE:
            readSize = self.LAST_BYTE - position
            if size and size < readSize:
                readSize = size                
            return super(FilePortion, self).read(readSize)
        else:
            return ''

...except I'm not sure what other methods to override and how exactly. How would I override next(), for example? Is there a better way of doing this?
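For reference, a hedged Python 3 sketch of such a wrapper: since the `file` built-in no longer exists in Python 3, this version wraps an already-open binary file object by delegation instead of subclassing, and `next()` becomes `__next__` (paired with `__iter__`). The class name `FilePortion` and the idea of capping reads at a last byte are taken from the question; everything else is an illustrative assumption.

```python
import io

class FilePortion:
    """Expose only the first `last_byte` bytes of an open binary file.

    A delegation-based sketch (Python 3): `next()` from Python 2 becomes
    `__next__`, and iteration just reads capped lines until exhausted.
    """

    def __init__(self, fileobj, last_byte):
        self._f = fileobj
        self._last = last_byte

    def read(self, size=-1):
        remaining = self._last - self._f.tell()
        if remaining <= 0:
            return b""
        if size is None or size < 0 or size > remaining:
            size = remaining
        return self._f.read(size)

    def readline(self, size=-1):
        remaining = self._last - self._f.tell()
        if remaining <= 0:
            return b""
        if size < 0 or size > remaining:
            size = remaining
        return self._f.readline(size)

    def __iter__(self):
        return self

    def __next__(self):
        line = self.readline()
        if not line:
            raise StopIteration
        return line

# Quick demo with an in-memory stand-in for a real file:
portion = FilePortion(io.BytesIO(b"hello world"), 5)
print(portion.read())  # b'hello'
```

Depending on which calls the target API actually makes, more methods (`seek`, `close`, `readlines`, ...) may need the same capping treatment.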

Jegschemesch
  • You could write a generator to do this: http://stackoverflow.com/questions/519633/lazy-method-for-reading-big-file-in-python – Timmy O'Mahony Feb 17 '12 at 05:34

2 Answers


You could use the existing tarfile._FileInFile wrapper class. Don't reinvent the wheel :)
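A minimal sketch of this suggestion, with the caveat that `tarfile._FileInFile` is a private, undocumented class, so its constructor signature (here assumed to be `fileobj, offset, size`) may differ between Python versions:

```python
import io
import tarfile

# Stand-in for a real open file; any seekable binary file object works.
raw = io.BytesIO(b"hello world")

# Expose only bytes [0, 5) of the underlying file.
portion = tarfile._FileInFile(raw, 0, 5)
print(portion.read())  # b'hello'
```

Because it reads directly from the underlying file object on demand, this avoids copying the data into memory first.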

Vadim Fint
  • +1. It's undocumented, but has been part of Python since v2.5. Or you could just copy it from the [standard library source code](http://hg.python.org/cpython/file/2.5/Lib/tarfile.py). – Søren Løvborg Sep 09 '13 at 09:35

How about giving it a StringIO object? You can just read the first n bytes yourself and place them in the object's buffer.
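A sketch of this approach, using `io.BytesIO` (the bytes analogue of StringIO) and an in-memory stand-in for the real file; the byte string and `n` are illustrative only:

```python
import io

n = 5
src = io.BytesIO(b"hello world")  # stands in for the real open file

# Read the first n bytes once, then hand the API an in-memory file object.
buf = io.BytesIO(src.read(n))
print(buf.read())  # b'hello'
```

The trade-off, as the comments below note, is that the first n bytes must be read into memory up front.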

Amber
  • I considered this, but I'm dealing with fairly large files and would like to avoid the extra pass of reading them into memory that it would entail. – Jegschemesch Feb 17 '12 at 05:37
  • A wrapper object is probably your best bet, then. You might just check what calls the API actually uses; you may not need to perfectly implement everything. – Amber Feb 17 '12 at 06:03