
I am trying to create an NSInputStream from the AVURLAsset URL for a video file (or from a PHAsset URL for a photo) from the Photos framework. My code is as follows:

mAsset = [NSInputStream inputStreamWithFileAtPath:[murl path]];
[mAsset open];

The URL is file:///var/mobile/Media/DCIM/100APPLE/IMG_0519.JPG

Now when I do a read as

NSInteger readLength = [mAsset read:(uint8_t *)data maxLength:maxSize];

the readLength returned is -1. I think it has something to do with permissions for the iOS photo assets.

If this approach is not correct, is there a way I can stream data from a video or image file backed by a Photos framework asset? Any help will be appreciated.

bir433
1 Answer


Although the question is a bit old, I'm going to explain how I solved it, since I ran into the same problem and never found any working solution on the internet for the Photos Framework.

Because of how the Apple APIs are designed, it's indeed not possible to upload directly from ALAsset and PHAsset source files. So let me start by discussing how this problem was solved back in the day with the old (and now deprecated) API - AssetsLibrary.

ALAssetRepresentation has one awesome method, getBytes:fromOffset:length:error:, that directly translates to NSInputStream's read:maxLength:. This gives you a couple of options for producing a stream from an instance of ALAsset - you may either create a bound pair of input and output streams, or you may go down the slightly trickier path of subclassing NSInputStream, as sketched below.
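For example, here is a minimal sketch of the subclassing route, assuming a hypothetical ALAssetInputStream class; a production NSInputStream subclass also needs the run-loop scheduling overrides and delegate plumbing, which are omitted here:

    #import <AssetsLibrary/AssetsLibrary.h>

    // Hypothetical NSInputStream subclass backed by an ALAssetRepresentation.
    @interface ALAssetInputStream : NSInputStream
    - (instancetype)initWithRepresentation:(ALAssetRepresentation *)representation;
    @end

    @implementation ALAssetInputStream {
        ALAssetRepresentation *_representation;
        long long _offset;
        NSStreamStatus _status;
    }

    - (instancetype)initWithRepresentation:(ALAssetRepresentation *)representation {
        if ((self = [super init])) {
            _representation = representation;
        }
        return self;
    }

    - (void)open  { _status = NSStreamStatusOpen; }
    - (void)close { _status = NSStreamStatusClosed; }
    - (NSStreamStatus)streamStatus { return _status; }
    - (BOOL)hasBytesAvailable { return _offset < [_representation size]; }

    - (NSInteger)read:(uint8_t *)buffer maxLength:(NSUInteger)maxLength {
        NSError *error = nil;
        // getBytes:fromOffset:length:error: maps almost one-to-one onto read:maxLength:
        NSUInteger bytesRead = [_representation getBytes:buffer
                                               fromOffset:_offset
                                                   length:maxLength
                                                    error:&error];
        if (error != nil) {
            _status = NSStreamStatusError;
            return -1;
        }
        _offset += bytesRead;
        if (_offset >= [_representation size]) {
            _status = NSStreamStatusAtEnd;
        }
        return (NSInteger)bytesRead;
    }

    - (BOOL)getBuffer:(uint8_t **)buffer length:(NSUInteger *)length {
        return NO; // no internal buffer to expose
    }
    @end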

So with regard to the Photos Framework, this gives you the first option: you may try to get an ALAsset URL from a PHAsset and then fall back to creating a stream from the good old ALAssetRepresentation. Yes, this URL conversion is not documented, and yes, AssetsLibrary is now deprecated, but hey - it's an option. And there is an article on Medium that suggests it's indeed a working solution.
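For illustration, this is roughly what that undocumented fallback looks like. The assets-library:// URL is reconstructed from the PHAsset's localIdentifier, so treat both the URL format and the hard-coded file extension as assumptions that Apple can break at any time (and note the whole path runs through deprecated API); it uses the hypothetical ALAssetInputStream from the sketch above:

    #import <Photos/Photos.h>
    #import <AssetsLibrary/AssetsLibrary.h>

    // Somewhere inside a view controller or service object:
    - (void)streamFirstPhotoViaAssetsLibrary {
        PHAsset *asset = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage
                                                   options:nil].firstObject;

        // localIdentifier has the form "<UUID>/L0/001"; the legacy URL only needs the UUID.
        NSString *uuid = [asset.localIdentifier componentsSeparatedByString:@"/"].firstObject;
        NSString *extension = @"JPG"; // assumed; derive it from the asset's resources in real code
        NSURL *assetURL = [NSURL URLWithString:
            [NSString stringWithFormat:@"assets-library://asset/asset.%@?id=%@&ext=%@",
                                       extension, uuid, extension]];

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL resultBlock:^(ALAsset *alAsset) {
            ALAssetRepresentation *representation = [alAsset defaultRepresentation];
            NSInputStream *stream = [[ALAssetInputStream alloc] initWithRepresentation:representation];
            [stream open];
            // read from the stream as usual
        } failureBlock:^(NSError *error) {
            NSLog(@"Could not resolve ALAsset: %@", error);
        }];
    }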

Now let's move on to the Photos Framework.

With iOS 9, Apple introduced a new class, PHAssetResourceManager, that is suitable for our purposes. Its lengthy method requestDataForAssetResource:options:dataReceivedHandler:completionHandler: a) progressively provides you with chunks of the asset's data, and b) gives you direct access to the underlying data resources without requiring any additional file-system space when the photo is present on the phone (i.e. not only in iCloud). Side note: the claim in b) is not actually documented, but it has proved correct in real life - you can fill up the device's storage, invoke this method, and it will still work nicely.

However, there are a few caveats with PHAssetResourceManager: it delivers the data asynchronously, and the chunks of data are of arbitrary size. It's quite understandable why this new API looks the way it does - with the Photos Framework you use the same methods to work with both local and iCloud assets. But all in all, this new method doesn't translate to NSInputStream's interface as nicely as ALAssetRepresentation's getBytes:fromOffset:length:error: did.

Rest easy, though - there is one property of this method that we can exploit to make it consumer-friendly, so that it looks just like the old getBytes:fromOffset:length:error: method: requestDataForAssetResource:options:dataReceivedHandler:completionHandler: delivers its data on a serial queue, in sequential order. That means we can use a bounded blocking queue to build a synchronous method that looks like func nextChunk() throws -> Data?. And once we have such a method, getting the asset's bytes is super easy - see the sketch below.
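To make this concrete, here is a minimal Objective-C sketch of that blocking-queue bridge. The ChunkQueue class name is my own, the queue is unbounded for brevity, and cancellation and timeouts are omitted:

    #import <Photos/Photos.h>

    // Sketch: bridges the asynchronous
    // requestDataForAssetResource:options:dataReceivedHandler:completionHandler:
    // into a synchronous "give me the next chunk" call by buffering the delivered
    // chunks and blocking the consumer until one is available.
    @interface ChunkQueue : NSObject
    - (instancetype)initWithResource:(PHAssetResource *)resource;
    // Blocks until a chunk arrives; returns nil when the resource is exhausted
    // or an error occurred (in which case *error is populated).
    - (NSData *)nextChunkWithError:(NSError **)error;
    @end

    @implementation ChunkQueue {
        NSMutableArray<NSData *> *_chunks;
        NSCondition *_condition;
        BOOL _finished;
        NSError *_error;
    }

    - (instancetype)initWithResource:(PHAssetResource *)resource {
        if ((self = [super init])) {
            _chunks = [NSMutableArray array];
            _condition = [[NSCondition alloc] init];

            PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
            options.networkAccessAllowed = YES; // also fetch iCloud-only assets

            [[PHAssetResourceManager defaultManager]
                requestDataForAssetResource:resource
                                    options:options
                        dataReceivedHandler:^(NSData *data) {
                            // Chunks arrive in order on a serial background queue.
                            [_condition lock];
                            [_chunks addObject:[data copy]];
                            [_condition signal];
                            [_condition unlock];
                        }
                          completionHandler:^(NSError *error) {
                            [_condition lock];
                            _finished = YES;
                            _error = error;
                            [_condition signal];
                            [_condition unlock];
                          }];
        }
        return self;
    }

    - (NSData *)nextChunkWithError:(NSError **)error {
        [_condition lock];
        while (_chunks.count == 0 && !_finished) {
            [_condition wait];
        }
        NSData *chunk = nil;
        if (_chunks.count > 0) {
            chunk = _chunks.firstObject;
            [_chunks removeObjectAtIndex:0];
        } else if (_error != nil && error != NULL) {
            *error = _error;
        }
        [_condition unlock];
        return chunk;
    }
    @end

You would obtain the PHAssetResource via [PHAssetResource assetResourcesForAsset:asset] and drain the queue from a background thread (never the main thread, since nextChunkWithError: blocks). Wrapping this class in an NSInputStream subclass then works the same way as in the ALAssetRepresentation sketch above.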

And actually, that's exactly what I did in my library, PHAssetResourceInputStream. It takes care of all the heavy lifting behind getting the bytes of assets out of the Photos Framework and provides you with a simple API, so I hope it'll be helpful for someone who runs into the same problem.

TL;DR

PHAssetResourceManager will make you happy.

Alexander Dvornikov