Is it possible, and supported, to use the iOS hardware-accelerated H.264 decoding API to decode a local (not streamed) video file, and then composite other objects on top of it?
I would like to make an application that involves drawing graphical objects in front of a video, and use the playback timer to synchronize what I am drawing on top with what is being played in the video. Then, based on the user's actions, I would change what I am drawing on top (but not the video).
Coming from DirectX, OpenGL, and OpenGL ES on Android, I am picturing something like one of the following:

- rendering the video to a texture, using that texture to draw a full-screen quad, and drawing the rest of the objects as sprites on top (sketched below);
- writing an intermediate filter just before the renderer, so I can manipulate the individual output frames and draw my content into them; or
- drawing to a 2D layer composited on top of the video.
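For concreteness, here is the rough shape I would expect the first route to take. This is an untested sketch: AVPlayerItemVideoOutput and the calls on it are my guesses from skimming the AVFoundation reference, not code I have run.

```swift
import AVFoundation
import QuartzCore

// Untested sketch of route 1: pull decoded frames from a local file as
// CVPixelBuffers, so they can be uploaded as GL ES (or Metal) textures.
// Class and method names are my reading of the AVFoundation reference.
final class VideoFrameSource {
    private let player: AVPlayer
    private let output: AVPlayerItemVideoOutput

    init(fileURL: URL) {
        let item = AVPlayerItem(url: fileURL)
        // Ask for BGRA, which is convenient to upload as a texture.
        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        output = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
        item.add(output)
        player = AVPlayer(playerItem: item)
    }

    func play() { player.play() }

    // Called once per display-link tick; returns the decoded frame for
    // "now", or nil if no new frame is ready yet.
    func copyCurrentFrame() -> CVPixelBuffer? {
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
        return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
    }
}
```

From there I would expect to upload each CVPixelBuffer as a texture each frame (apparently there is a CVOpenGLESTextureCache for exactly this, though I have not tried it).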
It seems like AVFoundation or Core Media may help me do what I want, but before I dig into the details, I would like to know whether it is possible at all, and what my main routes to approach the problem are.
Please refrain from "this is too advanced for you, try Hello World first" answers. I know my stuff, and just want to know whether what I want to do is possible (and, most importantly, supported, so the app won't eventually get rejected) before I study the details myself.
Edit:
I am not knowledgeable in iOS development, but I professionally work with DirectX, OpenGL, and OpenGL ES on Android. I am considering making an iOS version of an Android application I currently have, and I just want to know whether this is possible. If it is, I have enough time to learn iOS development from scratch and work up to what I want to do. If it is not possible, then I will not invest time studying the entire platform at this time.
Therefore, this is a technical feasibility question. I am not requesting code. I am looking for answers of the type "Yes, you can do that. Just use A and B, use C to render into D, and draw your stuff with E", or "No, you can't. The hardware-accelerated decoding is not available to third-party applications" (which is what a friend told me). Just this, and I'll be on my way.
I have read the overview of the video technologies on page 32 of the iOS Technology Overview. It pretty much says that I can use:

- Media Player for the simplest playback functionality (not what I'm looking for);
- UIKit for embedding videos, with a little more control over the embedding, but not over the actual playback (not what I'm looking for);
- AVFoundation for more control over playback (maybe what I need, but most of the resources I find online talk about how to use the camera); or
- Core Media for full low-level control over video (probably what I need, but it is extremely poorly documented, and resources on playback are even scarcer than for AVFoundation).
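If AVFoundation is the answer, here is what I imagine the simplest layered version would look like: an AVPlayerLayer for the video with a plain view on top for my own drawing. Again, this is an untested sketch; AVPlayerLayer and the periodic time observer are my reading of the reference, not verified code.

```swift
import AVFoundation
import UIKit

// Untested sketch of the layered route: AVPlayerLayer plays the file,
// a plain UIView sits on top for custom drawing, and a periodic time
// observer keeps the overlay in sync with the playback clock.
final class OverlayPlayerView: UIView {
    private let player = AVPlayer()
    private let videoLayer = AVPlayerLayer()
    private var timeObserverToken: Any?
    let overlay = UIView()  // custom drawing goes here, above the video

    func load(fileURL: URL) {
        player.replaceCurrentItem(with: AVPlayerItem(url: fileURL))
        videoLayer.player = player
        layer.addSublayer(videoLayer)
        overlay.backgroundColor = .clear
        addSubview(overlay)

        // Fires on the item's timeline (not wall-clock time), so the
        // overlay should follow pauses and seeks automatically.
        let interval = CMTime(value: 1, timescale: 30)  // ~30 Hz
        timeObserverToken = player.addPeriodicTimeObserver(
            forInterval: interval, queue: .main
        ) { [weak self] time in
            self?.updateOverlay(at: CMTimeGetSeconds(time))
        }
        player.play()
    }

    private func updateOverlay(at seconds: Double) {
        // Reposition/redraw the overlay content for this playback time.
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        videoLayer.frame = bounds
        overlay.frame = bounds
    }
}
```

Since the observer is driven by the playback timeline, this seems to give exactly the synchronization between video and overlay that I described above; whether it is the supported way to do it is what I am asking.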
I am concerned that I may dedicate the next six months to learning iOS programming full time, only to find at the end that the relevant API is not available to third-party developers, and that what I want to do is unacceptable for App Store deployment. This is what my friend told me, but I can't seem to find anything relevant in the App Store Review Guidelines. Therefore, I came here to ask people who have more experience in this area whether or not what I want to do is possible. No more.
I consider this a valid high-level question, which could be misunderstood as an I-didn't-do-my-homework-plz-give-me-teh-codez question. If my judgment here is mistaken, feel free to delete or downvote this question to your heart's content.