
I have two USB cameras: one is a low-cost webcam, the other a low-cost USB microscope; both were bought on eBay. The microscope is really just another webcam.

I want to use the USB microscope on Mac OS X 10.5 with QTKit. MyRecorder (the sample application from Apple's QTKit Application Tutorial) works fine with my low-cost webcam, but it displays only black video when I connect the microscope instead.

If I open QuickTime Player and create a movie recording, I get the error message: "Recording failed because no data was received. Make sure that the media input source is turned on and playing."

The sequence grabber demo works with both cameras.

miXscope also works with both cameras (it appears to use the sequence grabber as well).
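
For reference, the "sequence grabber" here is the legacy QuickTime capture API. A minimal, preview-only sketch of that path (error handling trimmed, a Carbon window assumed) looks roughly like this:

    #include <QuickTime/QuickTime.h>

    /* Preview video from the default capture device into a Carbon window.
       Returns the sequence grabber instance, or NULL on failure. */
    SeqGrabComponent StartSequenceGrabberPreview(WindowRef window)
    {
        SeqGrabComponent grabber = OpenDefaultComponent(SeqGrabComponentType, 0);
        SGChannel videoChannel = NULL;

        if (!grabber) return NULL;
        if (SGInitialize(grabber) != noErr) goto bail;
        /* Draw the preview into the window's graphics port. */
        if (SGSetGWorld(grabber, GetWindowPort(window), NULL) != noErr) goto bail;
        if (SGNewChannel(grabber, VideoMediaType, &videoChannel) != noErr) goto bail;
        if (SGSetChannelUsage(videoChannel, seqGrabPreview) != noErr) goto bail;
        if (SGStartPreview(grabber) != noErr) goto bail;
        return grabber;

    bail:
        CloseComponent(grabber);
        return NULL;
    }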

Here's the stripped-down MyRecorder (for a better overview):

- (void)awakeFromNib
{
    NSError *error;

    /* Create the capture session. */
    mCaptureSession = [[QTCaptureSession alloc] init];

    /* Find and open the default video device; fall back to a muxed
       (audio+video) device, e.g. a DV camera, if that fails. */
    QTCaptureDevice *videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    BOOL success = [videoDevice open:&error];
    if(!success)
    {
        videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeMuxed];
        success = [videoDevice open:&error];
    }
    if(!success) return;

    /* Add a device input for the video device to the session. */
    mCaptureVideoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
    success = [mCaptureSession addInput:mCaptureVideoDeviceInput error:&error];
    if(!success) return;

    /* If the video device doesn't also supply audio, add the default
       audio device as a second input. */
    if(![videoDevice hasMediaType:QTMediaTypeSound] && ![videoDevice hasMediaType:QTMediaTypeMuxed])
    {
        QTCaptureDevice *audioDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
        success = audioDevice && [audioDevice open:&error];
        if(success)
        {
            mCaptureAudioDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
            success = [mCaptureSession addInput:mCaptureAudioDeviceInput error:&error];
        }
    }

    /* Add a movie file output, wire up the preview view, and start the session. */
    mCaptureMovieFileOutput = [[QTCaptureMovieFileOutput alloc] init];
    success = [mCaptureSession addOutput:mCaptureMovieFileOutput error:&error];
    if(!success) return;
    [mCaptureMovieFileOutput setDelegate:self];
    [mCaptureView setCaptureSession:mCaptureSession];
    [mCaptureSession startRunning];
}

What do I need to add or change to get my microscope working with MyRecorder? (I've tried logging everything I could think of, but I receive no errors from any of the QTKit methods I invoke.)
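
For reference, this is roughly the kind of logging I've tried; a quick sketch that dumps every video-capable capture device QTKit can see, along with the formats each one claims to support:

    - (void)logCaptureDevices
    {
        /* List both plain video devices and muxed (audio+video) devices. */
        NSArray *devices = [[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]
            arrayByAddingObjectsFromArray:[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed]];
        for (QTCaptureDevice *device in devices)
        {
            NSLog(@"device: %@ (connected: %d, in use elsewhere: %d)",
                  [device localizedDisplayName],
                  [device isConnected],
                  [device isInUseByAnotherApplication]);
            /* Each QTFormatDescription describes one format the device offers. */
            for (QTFormatDescription *format in [device formatDescriptions])
                NSLog(@"    format: %@", [format localizedFormatSummary]);
        }
    }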

Note: I've gone through all the Stack Overflow questions I could find on the subject; two came close, but neither solves this issue.

  • As Stack Overflow made me mentally ill, I will now leave this community. – Oct 13 '15 at 02:46

1 Answer

  1. Find and open an audio input device.
  2. Create the capture session.
  3. Add a device input for the audio device to the session.
  4. Create an audio data output for reading captured audio buffers and add it to the capture session.
  5. Set a callback on the effect unit that will supply the audio buffers received from the audio data output.
  6. Start the capture session.

Check the following code:

    - (id)init
    {
        self = [super init];
        if (self) {
            [self setOutputFile:[@"~/Desktop/Audio Recording.aif" stringByStandardizingPath]];
        }
        return self;
    }

    - (void)awakeFromNib
    {
        BOOL success;
        NSError *error;

        /* Find and open an audio input device. */
        QTCaptureDevice *audioDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
        success = [audioDevice open:&error];
        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }

        /* Create the capture session. */
        captureSession = [[QTCaptureSession alloc] init];

        /* Add a device input for the audio device to the session. */
        captureAudioDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
        success = [captureSession addInput:captureAudioDeviceInput error:&error];
        if (!success) {
            [captureAudioDeviceInput release];
            captureAudioDeviceInput = nil;
            [audioDevice close];
            [captureSession release];
            captureSession = nil;
            [[NSAlert alertWithError:error] runModal];
            return;
        }

        /* Create an audio data output for reading captured audio buffers and add it to the capture session. */
        captureAudioDataOutput = [[QTCaptureDecompressedAudioOutput alloc] init];
        /* Captured audio buffers will be provided to the delegate via the
           captureOutput:didOutputAudioSampleBuffer:fromConnection: delegate method. */
        [captureAudioDataOutput setDelegate:self];
        success = [captureSession addOutput:captureAudioDataOutput error:&error];
        if (!success) {
            [captureAudioDeviceInput release];
            captureAudioDeviceInput = nil;
            [audioDevice close];
            [captureAudioDataOutput release];
            captureAudioDataOutput = nil;
            [captureSession release];
            captureSession = nil;
            [[NSAlert alertWithError:error] runModal];
            return;
        }

        /* Create an effect audio unit to add an effect to the audio before it is written to a file. */
        OSStatus err = noErr;
        AudioComponentDescription effectAudioUnitComponentDescription;
        effectAudioUnitComponentDescription.componentType = kAudioUnitType_Effect;
        effectAudioUnitComponentDescription.componentSubType = kAudioUnitSubType_Delay;
        effectAudioUnitComponentDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
        effectAudioUnitComponentDescription.componentFlags = 0;
        effectAudioUnitComponentDescription.componentFlagsMask = 0;
        AudioComponent effectAudioUnitComponent = AudioComponentFindNext(NULL, &effectAudioUnitComponentDescription);
        err = AudioComponentInstanceNew(effectAudioUnitComponent, &effectAudioUnit);
        if (noErr == err) {
            /* Set a callback on the effect unit that will supply the audio buffers received from the audio data output. */
            AURenderCallbackStruct renderCallbackStruct;
            renderCallbackStruct.inputProc = PushCurrentInputBufferIntoAudioUnit;
            renderCallbackStruct.inputProcRefCon = self;
            err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &renderCallbackStruct, sizeof(renderCallbackStruct));
        }
        if (noErr != err) {
            if (effectAudioUnit) {
                AudioComponentInstanceDispose(effectAudioUnit);
                effectAudioUnit = NULL;
            }
            [captureAudioDeviceInput release];
            captureAudioDeviceInput = nil;
            [audioDevice close];
            [captureAudioDataOutput release];
            captureAudioDataOutput = nil;
            [captureSession release];
            captureSession = nil;
            [[NSAlert alertWithError:[NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]] runModal];
            return;
        }

        /* Start the capture session. This will cause the audio data output delegate method to be called for each new audio buffer that is captured from the input device. */
        [captureSession startRunning];

        /* Become the window's delegate so that the capture session can be stopped and cleaned up immediately after the window is closed. */
        [window setDelegate:self];
    }

    - (void)windowWillClose:(NSNotification *)notification
    {
        [self setRecording:NO];
        [captureSession stopRunning];
        QTCaptureDevice *audioDevice = [captureAudioDeviceInput device];
        if ([audioDevice isOpen])
            [audioDevice close];
    }

    - (void)dealloc
    {
        [captureSession release];
        [captureAudioDeviceInput release];
        [captureAudioDataOutput release];
        [outputFile release];
        if (extAudioFile)
            ExtAudioFileDispose(extAudioFile);
        if (effectAudioUnit) {
            if (didSetUpAudioUnits)
                AudioUnitUninitialize(effectAudioUnit);
            AudioComponentInstanceDispose(effectAudioUnit);
        }
        [super dealloc];
    }

    #pragma mark ======== Audio capture methods =========

    /*
        Called periodically by the QTCaptureAudioDataOutput as it receives QTSampleBuffer objects containing audio frames captured by the QTCaptureSession.
        Each QTSampleBuffer will contain multiple frames of audio encoded in the canonical non-interleaved linear PCM format compatible with AudioUnits.
    */
    - (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputAudioSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
    {
        OSStatus err = noErr;
        BOOL isRecording = [self isRecording];

        /* Get the sample buffer's AudioStreamBasicDescription, which will be used to set the input format of the effect audio unit and the ExtAudioFile. */
        QTFormatDescription *formatDescription = [sampleBuffer formatDescription];
        NSValue *sampleBufferASBDValue = [formatDescription attributeForKey:QTFormatDescriptionAudioStreamBasicDescriptionAttribute];
        if (!sampleBufferASBDValue)
            return;
        AudioStreamBasicDescription sampleBufferASBD = {0};
        [sampleBufferASBDValue getValue:&sampleBufferASBD];

        if ((sampleBufferASBD.mChannelsPerFrame != currentInputASBD.mChannelsPerFrame) || (sampleBufferASBD.mSampleRate != currentInputASBD.mSampleRate)) {
            /* Although QTCaptureAudioDataOutput guarantees that it will output sample buffers in the canonical format, the number of channels or the sample rate of the audio can change at any time while the capture session is running. If this occurs, the audio unit receiving the buffers from the QTCaptureAudioDataOutput needs to be reconfigured with the new format. This also must be done when a buffer is received for the first time. */
            currentInputASBD = sampleBufferASBD;

            if (didSetUpAudioUnits) {
                /* The audio units were previously set up, so they must be uninitialized now. */
                AudioUnitUninitialize(effectAudioUnit);

                /* If recording was in progress, the recording needs to be stopped because the audio format changed. */
                if (extAudioFile) {
                    ExtAudioFileDispose(extAudioFile);
                    extAudioFile = NULL;
                }
            } else {
                didSetUpAudioUnits = YES;
            }

            /* Set the input and output formats of the effect audio unit to match that of the sample buffer. */
            err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &currentInputASBD, sizeof(currentInputASBD));
            if (noErr == err)
                err = AudioUnitSetProperty(effectAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &currentInputASBD, sizeof(currentInputASBD));
            if (noErr == err)
                err = AudioUnitInitialize(effectAudioUnit);
            if (noErr != err) {
                NSLog(@"Failed to set up audio units (%d)", err);
                didSetUpAudioUnits = NO;
                bzero(&currentInputASBD, sizeof(currentInputASBD));
            }
        }

        if (isRecording && !extAudioFile) {
            /* Start recording by creating an ExtAudioFile and configuring it with the same sample rate and channel layout as those of the current sample buffer. */
            AudioStreamBasicDescription recordedASBD = {0};
            recordedASBD.mSampleRate = currentInputASBD.mSampleRate;
            recordedASBD.mFormatID = kAudioFormatLinearPCM;
            recordedASBD.mFormatFlags = kAudioFormatFlagIsBigEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
            recordedASBD.mBytesPerPacket = 2 * currentInputASBD.mChannelsPerFrame;
            recordedASBD.mFramesPerPacket = 1;
            recordedASBD.mBytesPerFrame = 2 * currentInputASBD.mChannelsPerFrame;
            recordedASBD.mChannelsPerFrame = currentInputASBD.mChannelsPerFrame;
            recordedASBD.mBitsPerChannel = 16;

            NSData *inputChannelLayoutData = [formatDescription attributeForKey:QTFormatDescriptionAudioChannelLayoutAttribute];
            AudioChannelLayout *recordedChannelLayout = (AudioChannelLayout *)[inputChannelLayoutData bytes];

            err = ExtAudioFileCreateWithURL((CFURLRef)[NSURL fileURLWithPath:[self outputFile]], kAudioFileAIFFType, &recordedASBD, recordedChannelLayout, kAudioFileFlags_EraseFile, &extAudioFile);
            if (noErr == err)
                err = ExtAudioFileSetProperty(extAudioFile, kExtAudioFileProperty_ClientDataFormat, sizeof(currentInputASBD), &currentInputASBD);
            if (noErr != err) {
                NSLog(@"Failed to set up ExtAudioFile (%d)", err);
                ExtAudioFileDispose(extAudioFile);
                extAudioFile = NULL;
            }
        } else if (!isRecording && extAudioFile) {
            /* Stop recording by disposing of the ExtAudioFile. */
            ExtAudioFileDispose(extAudioFile);
            extAudioFile = NULL;
        }

        /* -[QTSampleBuffer numberOfSamples] corresponds to the number of CoreAudio audio frames. */
        NSUInteger numberOfFrames = [sampleBuffer numberOfSamples];

        /* In order to render continuously, the effect audio unit needs a new time stamp for each buffer. Use the number of frames for each unit of time. */
        currentSampleTime += (double)numberOfFrames;

        AudioTimeStamp timeStamp = {0};
        timeStamp.mSampleTime = currentSampleTime;
        timeStamp.mFlags |= kAudioTimeStampSampleTimeValid;

        AudioUnitRenderActionFlags flags = 0;

        /* Create an AudioBufferList large enough to hold the number of frames from the sample buffer in 32-bit floating point PCM format. */
        AudioBufferList *outputABL = calloc(1, sizeof(*outputABL) + (currentInputASBD.mChannelsPerFrame - 1) * sizeof(outputABL->mBuffers[0]));
        outputABL->mNumberBuffers = currentInputASBD.mChannelsPerFrame;
        UInt32 channelIndex;
        for (channelIndex = 0; channelIndex < currentInputASBD.mChannelsPerFrame; channelIndex++) {
            UInt32 dataSize = numberOfFrames * currentInputASBD.mBytesPerFrame;
            outputABL->mBuffers[channelIndex].mDataByteSize = dataSize;
            outputABL->mBuffers[channelIndex].mData = malloc(dataSize);
            outputABL->mBuffers[channelIndex].mNumberChannels = 1;
        }

        /*
            Get an audio buffer list from the sample buffer and assign it to the currentInputAudioBufferList instance variable.
            The effect audio unit render callback, PushCurrentInputBufferIntoAudioUnit(), can access this value by calling the currentInputAudioBufferList method.
        */
        currentInputAudioBufferList = [sampleBuffer audioBufferListWithOptions:QTSampleBufferAudioBufferListOptionAssure16ByteAlignment];

        /* Tell the effect audio unit to render. This will synchronously call PushCurrentInputBufferIntoAudioUnit(), which will feed the audio buffer list into the effect audio unit. */
        err = AudioUnitRender(effectAudioUnit, &flags, &timeStamp, 0, (UInt32)numberOfFrames, outputABL);
        currentInputAudioBufferList = NULL;

        if ((noErr == err) && extAudioFile) {
            err = ExtAudioFileWriteAsync(extAudioFile, (UInt32)numberOfFrames, outputABL);
        }

        for (channelIndex = 0; channelIndex < currentInputASBD.mChannelsPerFrame; channelIndex++) {
            free(outputABL->mBuffers[channelIndex].mData);
        }
        free(outputABL);
    }

    /* Used by PushCurrentInputBufferIntoAudioUnit() to access the current audio buffer list that has been output by the QTCaptureAudioDataOutput. */
    - (AudioBufferList *)currentInputAudioBufferList
    {
        return currentInputAudioBufferList;
    }

This came from the QTKit Application Tutorial; also check out the Audio capture methods #pragma mark further into the example code provided in the tutorial.

Hope this helps!

  • 1: There are several build errors. 2: There seems to be no QTCaptureDecompressedAudioOutput on my system. 3: After fixing the build errors, I still get no picture. (Note: it seems the last block containing `[captureAudioDeviceInput release];` is duplicated) – Oct 06 '15 at 20:49
  • The `QTKit Application Tutorial` can be found and downloaded as a PDF from Apple's own site; the Chinese site takes half an hour to load, then it crashes my browser. – Oct 06 '15 at 21:36
  • I wonder who voted this answer up; did you do that yourself? It really seems that you did not understand my question correctly: I'm asking how to get a picture from my microscope device, not how to record audio. – Oct 06 '15 at 21:38
  • @NoOne Sorry, it was a late night of coding and I mistook microscope for microphone. No, I didn't upvote my own answer. – ChrisHaze Oct 06 '15 at 23:12
  • I guess that can happen to us all; you're forgiven. ;) –  Oct 07 '15 at 03:08
  • @NoOne haha thanks for the forgiveness. Any luck with a solution? – ChrisHaze Oct 07 '15 at 05:17
  • Sadly, no - no luck yet. –  Oct 07 '15 at 21:12
  • @NoOne I am not as familiar with the QuickTime SDK, but have you tried it with AVFoundation? – ChrisHaze Oct 08 '15 at 18:30
  • AVFoundation is not available on Mac OS X 10.5, so no, I have not tried it. – Oct 09 '15 at 04:59
  • You'll receive the bounty automatically, even though your answer is not correct. Please fix the compile errors. – Oct 13 '15 at 02:47
  • @NoOne sorry for the delay. I have updated the Sample Code to include all the steps (in detail) as described from the linked tutorial. – ChrisHaze Oct 14 '15 at 01:26