TL;DR
Is it possible with JS to extract the individual frames and audio data from an uploaded video and stream them to a custom canvas/audio player? Why or why not?
I haven't tried reading or manipulating raw file data with JavaScript, but I wouldn't be surprised if it were possible. Below I explain why I'm considering something this extreme, and the workarounds I've already tried.
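For what it's worth, simply reading the raw bytes of an uploaded file from JS looks straightforward with the File API. Here is a minimal sketch (the `upload` input id is hypothetical); the hard part, demuxing/decoding those bytes into frames and audio, is exactly what I'm asking about:

```js
// Minimal sketch: read an uploaded video's raw bytes with the File API.
// Assumes an <input type="file" id="upload"> exists on the page (hypothetical id).
document.getElementById('upload').addEventListener('change', (event) => {
  const file = event.target.files[0];
  const reader = new FileReader();
  reader.onload = () => {
    // reader.result is an ArrayBuffer holding the raw container data (MP4, WebM, ...).
    const bytes = new Uint8Array(reader.result);
    console.log('Read ' + bytes.length + ' bytes; turning them into frames/audio is the open question');
  };
  reader.readAsArrayBuffer(file);
});
```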
Explanation
To my dismay, while building a video streaming application that must overlay elements onto the video, I discovered a problem with supporting the iPhone. I'll explain the problem and then ask about an unconventional possible workaround.
As some of you may already know, Apple has made it impossible to play HTML5 video on the iPhone without the video going fullscreen, and the iPhone doesn't support Flash either.
There are already some questions about it on SO:
- Can I avoid the native fullscreen video player with HTML5 on iPhone or android?
- Why does apple devices play html5 videos in its own player?
I've even developed a canvas player that hides the HTML5 video element and renders its frames onto an HTML5 canvas, but to my further dismay Apple has made this impossible as well. The gist of that player is sketched below.
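Roughly, the player does something like this (simplified; the element id and source URL are placeholders):

```js
// Simplified sketch of the canvas player: a hidden <video> is used only as a
// frame source, and each frame is copied onto a visible canvas.
const video = document.createElement('video');
video.src = 'movie.mp4';            // placeholder source
video.muted = true;
video.style.display = 'none';
document.body.appendChild(video);

const canvas = document.getElementById('player');  // assumes a <canvas id="player">
const ctx = canvas.getContext('2d');

function paint() {
  if (!video.paused && !video.ended) {
    // This is the call that iOS does not support with a video source on the iPhone.
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(paint);
  }
}

video.addEventListener('play', () => requestAnimationFrame(paint));
video.play();
```

Apple's Safari documentation is explicit about why this approach is a dead end on the iPhone: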
> Note: Video as a source for the canvas drawImage() method is not currently supported on iOS. Using video as a source for drawImage() involves a lot of system resources. Generally speaking, video is best displayed using the video element, not the canvas element. To composite canvas text or animations over moving video, it’s better to use a video element behind the canvas—the video shows through the transparent background of the canvas without the overhead of displaying video on the canvas itself.
>
> On iOS-based devices with small screens—such as iPhone and iPod touch—video always plays in fullscreen mode, so the canvas cannot be superimposed on playing video. On iOS-based devices with larger screens, such as iPad, you can superimpose canvas graphics on playing video, just as you can on the desktop.
>
> Use video as an image source for the canvas only when you need to access the pixel data of a video. Video on canvas is useful for things such as realtime image processing—green screen effects, extracting the average color value from a video frame, capturing a series of stills from a video—or special effects such as putting segments of a moving video image on tiles and moving them independently.
To which I respond: "YES, obviously video is best displayed in the video element, but you've broken the iPhone's support for the W3C HTML5 video spec and made it impossible for creative video applications to support one of the most popular mobile devices on Earth!" (rant out of the way)
So my next step is to display a screen to iPhone users explaining that our application doesn't work on the iPhone, "because Apple doesn't think W3C standards apply to iPhones." BUT first I want to explore one more possibility.
Is it in any way possible to extract the individual frames and audio data from an uploaded video for streaming? (I have WebRTC in mind for the transport.)
If so, it seems to me that the raw frame and audio data could be streamed and played back fairly easily on a canvas (with a custom audio path) without ever using the HTML5 video element as a source, along the lines sketched below.
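A rough sketch of what I imagine the playback side could look like, assuming some upstream process (server-side, or over a WebRTC data channel) has already delivered decoded RGBA frames and PCM audio chunks; the function names and data shapes here are hypothetical:

```js
// Hypothetical playback side: paint pre-decoded frames onto a canvas and push
// pre-decoded PCM audio through the Web Audio API.
const canvas = document.getElementById('player');   // assumes a <canvas id="player">
const ctx = canvas.getContext('2d');
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();

// Paint one decoded frame (RGBA bytes, width * height * 4 long) onto the canvas.
function paintFrame(rgbaBytes, width, height) {
  const imageData = new ImageData(new Uint8ClampedArray(rgbaBytes), width, height);
  ctx.putImageData(imageData, 0, 0);
}

// Play one chunk of decoded mono PCM samples (a Float32Array) through Web Audio.
function playAudioChunk(samples, sampleRate) {
  const buffer = audioCtx.createBuffer(1, samples.length, sampleRate);
  buffer.getChannelData(0).set(samples);
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();
}
```

The open question is whether the demuxing/decoding step that produces those frames and samples is feasible in the browser (or cheap enough to do server-side), and whether this would even be allowed and performant on the iPhone.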