
I am using AVFoundation to capture images and audio for making a video. The problem starts when I add the device for audio, like this:

AVCaptureDevice *audioDevice     = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeAudio];
AVCaptureDeviceInput * microphone_input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
AVCaptureAudioDataOutput * audio_output = [[AVCaptureAudioDataOutput alloc] init];
[self.captureSession2 addInput:microphone_input];
[self.captureSession2 addOutput:audio_output];
dispatch_queue_t queue2;
queue2 = dispatch_queue_create("Audio", NULL);
[audio_output setSampleBufferDelegate:self queue:queue2];
dispatch_release(queue2);

and the camera for images:

AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

//putting it on the input.
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];

//selecting the Output. 
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];

[self.captureSession addInput:captureInput];
[self.captureSession addOutput:captureOutput];
dispatch_queue_t    queue;
queue = dispatch_queue_create("cameraQueue", 0);
[captureOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

and, after all that, receiving the raw data through the delegate:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if ([captureOutput isKindOfClass:[AVCaptureAudioDataOutput class]])
        [self sendAudioRaw:sampleBuffer];
    if ([captureOutput isKindOfClass:[AVCaptureVideoDataOutput class]])
        [self sendVideoRaw:sampleBuffer];
}

The speed at which I get raw image data is very slow, around 2 images per second. How can I improve it? I am looking for around 10-12 images per second. Please help.

coneybeare
  • What is `[self sendVideoRaw:sampleBuffer]`? – Steve McFarlin Jul 07 '12 at 19:12
  • Also, in your captureOutput code you could simply compare the pointers rather than using isKindOfClass, e.g. `if (captureOutput == audio_output)`. You do have to be careful with isKindOfClass: it can return something you may not be expecting. This generally only occurs with the container classes; see this [post](http://stackoverflow.com/questions/1096772/is-it-safe-to-use-iskindofclass-against-an-nsstring-instance-to-determine-type) for a discussion. One last thing: you do not need to use two different capture sessions for audio and video. Both AV IO classes can be added to the same session. – Steve McFarlin Jul 07 '12 at 19:23
  • @SteveMcFarlin separating audio and image raw data for processing. – loading username..... Jul 09 '12 at 05:38
  • @SteveMcFarlin But the main problem is that getting video frames is very slow: the delegate function is called 2 times per second, and I am looking for around 8-10 times per second. – loading username..... Jul 09 '12 at 05:44
  • username - This is very strange. You should be able to achieve 30FPS with just a pure capture. The reason I asked what `[self sendVideoRaw:sampleBuffer]` was is that I thought perhaps this code was blocking the capture queue long enough to only allow 2FPS. If you uncomment that code what FPS do you get. I'll assume still 2FPS. It may be a configuration issue, but without more code I can not tell. You might take a look at the AVCamDemo project from Apple. – Steve McFarlin Jul 10 '12 at 17:57
  • `[self sendVideoRaw:sampleBuffer];` and `[self sendAudioRaw:sampleBuffer];` are two functions: if the data is audio it goes to the audio function, if video then to the video function. But the delegate function is being called very slowly. – loading username..... Jul 11 '12 at 05:56
  • @SteveMcFarlin How can I broadcast video from an iPhone to a server? I am also using the AVFoundation framework. – Rahul Juyal Oct 26 '12 at 06:46
  • @Ron Answering your question here is outside the scope of a comment. If you ask a question on SO you might consider narrowing your question(s). In other words ask something specific. – Steve McFarlin Oct 29 '12 at 18:13
  • @SteveMcFarlin http://stackoverflow.com/questions/12242513/how-to-get-real-time-video-stream-from-iphone-camera-and-send-it-to-server this is my question link please check it. – Rahul Juyal Jan 08 '13 at 11:32
  • Create a *global* queue, and don't release it until you deallocate the encapsulating object; specify 'serial' as the type of queue, and make the target the main queue. Use CMFormatDescription to determine the type of sample buffer sent to captureOutput. Instead of sending the sample buffers to another class by a method call, make the other class the delegate; you're doubling the work, otherwise. – James Bush Jan 21 '18 at 04:51

1 Answer


Do these four things to start:

Create a global queue, and don't release it until you deallocate the encapsulating object; specify 'serial' as the type of queue, and make the target the main queue:

_captureOutputQueue  = dispatch_queue_create_with_target("bush.alan.james.PhotosRemote.captureOutputQueue", DISPATCH_QUEUE_SERIAL, dispatch_get_main_queue());

Get the media type description from each sample buffer to determine whether the sample buffer contains audio or video data:

CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDescription);
if (mediaType == kCMMediaType_Audio) {
    // handle the audio sample buffer
} else if (mediaType == kCMMediaType_Video) {
    // handle the video sample buffer
}

Instead of sending the sample buffers to another class by a method call, make the other class the data output delegate; you're doubling the work, otherwise.
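A minimal sketch of that idea (the `SampleProcessor` class name and its method bodies are illustrative, not from the original post): make the processing class itself conform to both sample-buffer delegate protocols and hand it to both outputs, so each buffer is delivered directly with no extra method call.

```objc
// Hypothetical processing class that receives sample buffers directly.
@interface SampleProcessor : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate,
                                       AVCaptureAudioDataOutputSampleBufferDelegate>
@end

@implementation SampleProcessor
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Both protocols declare the same callback, so one method handles both outputs.
    CMFormatDescriptionRef desc = CMSampleBufferGetFormatDescription(sampleBuffer);
    switch (CMFormatDescriptionGetMediaType(desc)) {
        case kCMMediaType_Audio: /* process audio here */ break;
        case kCMMediaType_Video: /* process video here */ break;
    }
}
@end

// During session setup, the same processor serves both outputs:
// [captureOutput setSampleBufferDelegate:processor queue:queue];
// [audio_output  setSampleBufferDelegate:processor queue:queue];
```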

Finally, make sure you're running the AVSession in a queue of its own. Per Apple's documentation for AVCaptureSession:

The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam-iOS: Using AVFoundation to Capture Images and Movies for an implementation example.

That includes any calls made to methods that configure the camera and, in particular, any that call the startRunning or stopRunning methods of AVCaptureSession:

dispatch_async(self.sessionQueue, ^{
    [self configureSession];
});

dispatch_async(self.sessionQueue, ^{
    [self.session startRunning];
});

dispatch_async(self.sessionQueue, ^{
    [self.session stopRunning];
});
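For completeness, `sessionQueue` above is not defined in this answer; it would be a private serial queue created once during setup, along the lines of Apple's AVCam sample (the label string here is illustrative):

```objc
// One private serial queue for all session configuration and start/stop calls.
self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
```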

If you cannot set the delegate as the class that processes the sample buffers, you may consider putting them on a queue to which both classes have access, and then passing a key:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    static char kMyKey; // any address works as the key; its address must also be visible to the receiver
    // Pass the key--not the sample buffer--to the receiver; the queue retains the buffer for you
    dispatch_queue_set_specific(((AppDelegate *)[[UIApplication sharedApplication] delegate]).serialQueue,
                                &kMyKey,
                                (void *)CFRetain(sampleBuffer),
                                (dispatch_function_t)CFRelease);
}

In the receiver class:

dispatch_async(((AppDelegate *)[[UIApplication sharedApplication] delegate]).serialQueue, ^{
    CMSampleBufferRef sb = dispatch_get_specific(&kMyKey);
    NSLog(@"sb: %i", CMSampleBufferIsValid(sb));
});
James Bush