14

I'm successfully sending a stream of NSData. The delegate method below receives that stream and appends the bytes to an NSMutableData, self.data. How do I take this data and turn it into a UIView/AVCaptureVideoPreviewLayer (which should show video)? I feel like I'm missing another conversion: AVCaptureSession > NSStream > MCSession > NSStream > ?

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch(eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if(!self.data) {
                self.data = [NSMutableData data];
            }
            uint8_t buf[1024];
            // read:maxLength: returns NSInteger (-1 on error), not unsigned int
            NSInteger len = [(NSInputStream *)stream read:buf maxLength:1024];
            if(len > 0) {
                [self.data appendBytes:(const void *)buf length:len];
            } else {
                NSLog(@"no buffer!");
            }

            // Code here to take self.data and convert the NSData to UIView/Video
            break;
        }
        default:
            break;
    }
}
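
For reference, here is a rough sketch of that missing conversion, assuming the capture output is configured for kCVPixelFormatType_32BGRA and that the receiver somehow knows the sender's width, height and bytesPerRow (for example because they are sent once up front). imageFromRawFrame:width:height:bytesPerRow: is a hypothetical helper name, not an existing API:

- (UIImage *)imageFromRawFrame:(NSData *)frameData
                         width:(size_t)width
                        height:(size_t)height
                   bytesPerRow:(size_t)bytesPerRow
{
    // Wait until at least one complete frame has been accumulated
    if (frameData.length < bytesPerRow * height) {
        return nil;
    }

    // Wrap the raw BGRA bytes in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate((void *)frameData.bytes,
                                                 width, height,
                                                 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace);
    if (!context) {
        return nil;
    }

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}

The resulting UIImage could be shown in a UIImageView (or set as a layer's contents); an AVCaptureVideoPreviewLayer only previews a local AVCaptureSession, so it cannot be fed this received data directly.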

I send the stream with this:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
//    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    // Copy the frame's raw pixel bytes into an NSData
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // A new output stream is started for every captured frame here
    NSError *error;
    self.oStream = [self.mySession startStreamWithName:@"videoOut" toPeer:[[self.mySession connectedPeers] objectAtIndex:0] error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];

    [self.oStream write:[data bytes] maxLength:[data length]];

    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
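
For illustration only (none of this is in the original code): a rough sketch of how the stream could be opened once and each frame prefixed with its dimensions, so the receiver knows how to rebuild it. FrameHeader, setupOutputStream and sendFrameBytes:width:height:bytesPerRow: are made-up names, and the sketch ignores endianness, struct packing and partial writes:

// Hypothetical header written before each frame's pixel bytes
typedef struct {
    uint32_t width;
    uint32_t height;
    uint32_t bytesPerRow;
} FrameHeader;

// Create and open the stream once (e.g. when the peer connects),
// rather than inside captureOutput:didOutputSampleBuffer:fromConnection:
- (void)setupOutputStream
{
    NSError *error = nil;
    self.oStream = [self.mySession startStreamWithName:@"videoOut"
                                                 toPeer:[[self.mySession connectedPeers] objectAtIndex:0]
                                                  error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
    [self.oStream open];
}

// Write the header, then the raw pixel bytes, for each captured frame
- (void)sendFrameBytes:(NSData *)pixels width:(size_t)width height:(size_t)height bytesPerRow:(size_t)bytesPerRow
{
    FrameHeader header = { (uint32_t)width, (uint32_t)height, (uint32_t)bytesPerRow };
    [self.oStream write:(const uint8_t *)&header maxLength:sizeof(header)];
    [self.oStream write:pixels.bytes maxLength:pixels.length];
}
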
Eric
  • You might want to see if you can use OpenGL. Take your data, convert it into GL textures, then use GL to show it. There's probably a higher-level API for this. The data isn't in any standard format? – nielsbot Mar 13 '14 at 21:01
  • What's the video format? A `UIView`? What's the link with a video? – Larme Mar 13 '14 at 21:03
  • Video format is AVCaptureSession – Eric Mar 13 '14 at 21:06
  • Do you have to lock/unlock the pixels each frame? I wonder if that will be costly time wise. – nielsbot Mar 17 '14 at 19:55
  • I was unaware that's what I was doing. Where do you see that in the code? – Eric Mar 17 '14 at 20:17
  • If your idea is to stream video, this is probably a very naive approach. Understand that CMSampleBufferGetImageBuffer() returns the raw data that makes up the image (4 bytes per pixel RGBA). So if you stream this, you are still missing the timing and description info (and audio) that make up a video. Also, keep in mind that you will need at a minimum 20 frames (images) per second to make it look smooth. If you intend to follow this path, your options are either to create a new file at the receiver using AVAssetWriter, or render each image using OpenGL. – MDB983 Mar 21 '14 at 14:39
  • The problem with creating a file is writing the whole file before displaying it. I want to stream the video over MCSession. I can't believe I need a custom API for this. AVCaptureSession > NSStream > MCSession > NSStream > ? I can't get the stream back to AV – Eric Mar 21 '14 at 16:03
  • Again, you're only passing raw image data... Here is a similar question: http://stackoverflow.com/questions/20150337/multipeer-connectivity-for-video-streaming-between-iphones . It contains a link to a github repo where the author has started on a similar project. Taking a quick look at the code, it doesn't appear to render video, but rather display images. It's not a complex task to add in the missing OpenGL render code. – MDB983 Mar 21 '14 at 23:17

2 Answers

1

I think you need AVCamCaptureManager (from Apple's AVCam sample code); see if the code below works for you.

AVCamCaptureManager *manager = [[AVCamCaptureManager alloc] init];
[self setCaptureManager:manager];

[[self captureManager] setDelegate:self];

if ([[self captureManager] setupSession]) {
     // Create video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
    UIView *view = self.videoPreviewView; // Add a view in the XIB where you want to show video
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];
    CGRect bounds = [view bounds];

    [newCaptureVideoPreviewLayer setFrame:bounds];

    [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];

    [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[[self captureManager] session] startRunning];
    });
}
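
This assumes the controller adopts the capture manager's delegate protocol and declares the properties used above. A minimal sketch of those declarations (MyViewController and the property names are illustrative, and AVCamCaptureManager / AVCamCaptureManagerDelegate come from Apple's AVCam sample code, not from the SDK itself):

#import <AVFoundation/AVFoundation.h>
#import "AVCamCaptureManager.h"

@interface MyViewController : UIViewController <AVCamCaptureManagerDelegate>

@property (nonatomic, strong) AVCamCaptureManager *captureManager;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;
@property (nonatomic, strong) IBOutlet UIView *videoPreviewView; // the view in the XIB that shows the video

@end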

Implement the delegate methods:

- (void)captureManager:(AVCamCaptureManager *)captureManager didFailWithError:(NSError *)error
{

}

- (void)captureManagerRecordingBegan:(AVCamCaptureManager *)captureManager
{

}

- (void)captureManagerRecordingFinished:(AVCamCaptureManager *)captureManager outputURL:(NSURL *)url
{



}

- (void)captureManagerStillImageCaptured:(AVCamCaptureManager *)captureManager
{



}

- (void)captureManagerDeviceConfigurationChanged:(AVCamCaptureManager *)captureManager
{

}

I hope it helps.

iphonic
  • None of the captureManager's delegate methods have a way to handle video. Am I missing something? – Eric Mar 23 '14 at 13:57
  • @Eric See this https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112-Intro-DontLinkElementID_2 if it helps. – iphonic Mar 24 '14 at 05:26
-2

You can make a UIImageView in your stream handler like this:

UIImageView *iv = [[UIImageView alloc] initWithImage:[UIImage imageWithData:self.data]];

Also, you can alloc the image view just once and just re-init it each time.

Each time you receive data from the stream, you set up the UIImageView and show it by adding it to a UIView.
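
A minimal sketch of that reuse, assuming you add a remoteImageView property (a made-up name). Note that imageWithData: only decodes encoded image formats such as JPEG or PNG, so the raw pixel bytes from the question's sender would first have to be converted (for example with the CGBitmapContext helper sketched in the question) or compressed before sending:

// Inside the NSStreamEventHasBytesAvailable case, once a full image has arrived.
// self.remoteImageView is a hypothetical UIImageView property, created once and reused.
if (!self.remoteImageView) {
    self.remoteImageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:self.remoteImageView];
}
self.remoteImageView.image = [UIImage imageWithData:self.data];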

Sorry for my English, I don't know if I have understood you correctly.

Rievo