I have been researching how to play a beep sound on the iPhone at a given frequency and decibel level.

Links I referred to:

http://developer.apple.com/library/ios/#samplecode/MusicCube/Introduction/Intro.html#//apple_ref/doc/uid/DTS40008978

http://www.politepix.com/2010/06/18/decibel-metering-from-an-iphone-audio-unit/

http://atastypixel.com/blog/using-remoteio-audio-unit/

Also the questions "How to play a sound of particular frequency" and "framework not found AudioUnit".

I have also used Flite to do text-to-speech in my application.

May I know: is it possible to play a beep sound on the iPhone at a given frequency and decibel level using Flite?

I know that it creates an audio file from the input (based only on the pitch, variance, speed, and the given string) and plays it through an audio player once created.

But it has no methods to set the frequency or decibel level.

So could anyone suggest a good way to do this on the iPhone?

Any help on this question is appreciated.

Thanks

Gabrielle
  • Decibels (dB) are used to express a *ratio* between two magnitudes. You probably mean `dB SPL` (dB Sound Pressure Level), which is what people usually mean when they talk about how loud a sound is in decibels. To generate a sound with a given dB SPL amplitude though you will need to be able to calibrate the hardware in some way. – Paul R Jun 08 '12 at 07:37
  • I also need the same: have to create a beep as per frequency and decibel. I am looking into it. – Shamsudheen TK Jun 09 '12 at 17:58

1 Answer

This class allows you to play a beep at a given frequency, and with a given amplitude. It uses AudioQueues from AudioToolbox.framework. It's just a sketch, many things should be refined, but the mechanism for creating the signal works.

The usage is pretty straightforward if you see the @interface.

#import <AudioToolbox/AudioToolbox.h>
#define TONE_SAMPLERATE 44100.

@interface Tone : NSObject {
    AudioQueueRef queue;
    AudioQueueBufferRef buffer;
    BOOL rebuildBuffer;
}
@property (nonatomic, assign) NSUInteger frequency;
@property (nonatomic, assign) CGFloat dB;

- (void)play;
- (void)pause;
@end


@implementation Tone
@synthesize dB=_dB,frequency=_frequency;

void handleBuffer(void *inUserData,
                  AudioQueueRef inAQ,
                  AudioQueueBufferRef inBuffer);

#pragma mark - Initialization and deallocation -

- (id)init
{
    if ((self=[super init])) {

        _dB=0.;
        _frequency=440;
        rebuildBuffer=YES;

        // TO DO: handle AudioQueueXYZ's failures!!

        // create a descriptor containing a LPCM, mono, float format
        AudioStreamBasicDescription desc;

        desc.mSampleRate=TONE_SAMPLERATE;
        desc.mFormatID=kAudioFormatLinearPCM;
        desc.mFormatFlags=kLinearPCMFormatFlagIsFloat;
        desc.mBytesPerPacket=sizeof(float);
        desc.mFramesPerPacket=1;
        desc.mBytesPerFrame=sizeof(float);
        desc.mChannelsPerFrame=1;
        desc.mBitsPerChannel=8*sizeof(float);

        // create a new queue
        AudioQueueNewOutput(&desc,
                            &handleBuffer,
                            self,
                            CFRunLoopGetCurrent(),
                            kCFRunLoopCommonModes,
                            0,
                            &queue);

        // and its buffer, ready to hold 1" of data
        AudioQueueAllocateBuffer(queue,
                                 sizeof(float)*TONE_SAMPLERATE,
                                 &buffer);

        // create the buffer and enqueue it
        handleBuffer(self, queue, buffer);

    }
    return self;
}

- (void)dealloc
{
    AudioQueueStop(queue, YES);
    AudioQueueFreeBuffer(queue, buffer);
    AudioQueueDispose(queue, YES);

    [super dealloc];
}

#pragma mark - Main function -

void handleBuffer(void *inUserData,
                AudioQueueRef inAQ,
                AudioQueueBufferRef inBuffer) {

    // this function takes care of building the buffer and enqueuing it.

    // cast inUserData type to Tone
    Tone *tone=(Tone *)inUserData;

    // check if the buffer must be rebuilt
    if (tone->rebuildBuffer) {

        // precompute some useful qtys
        float *data=inBuffer->mAudioData;
        NSUInteger max=inBuffer->mAudioDataBytesCapacity/sizeof(float);

        // multiplying the argument by 2pi changes the period of the cosine
        //  function to 1s (instead of 2pi). then we must divide by the sample
        //  rate to get TONE_SAMPLERATE samples in one period.
        CGFloat unit=2.*M_PI/TONE_SAMPLERATE;
        // this is the amplitude converted from dB to a linear scale
        CGFloat amplitude=pow(10., tone.dB*.05);

        // loop and simply set data[i] to the value of cos(...)
        for (NSUInteger i=0; i<max; ++i)
            data[i]=(float)(amplitude*cos(unit*(CGFloat)(tone.frequency*i)));

        // inform the queue that we have filled the buffer
        inBuffer->mAudioDataByteSize=sizeof(float)*max;

        // and set flag
        tone->rebuildBuffer=NO;
    }

    // reenqueue the buffer
    AudioQueueEnqueueBuffer(inAQ,
                            inBuffer,
                            0,
                            NULL);

    /* TO DO: the transition between two adjacent buffers (the same one actually)
              generates a "tick", even if the adjacent buffers represent a continuous signal.
              maybe using two buffers instead of one would fix it.
     */
}

#pragma mark - Properties and methods -

- (void)play
{
    // build a zeroed AudioTimeStamp
    //  (adapted from FillOutAudioTimeStampWithSampleTime)

    AudioTimeStamp time;

    time.mSampleTime=0.;
    time.mRateScalar=0.;
    time.mWordClockTime=0.;
    memset(&time.mSMPTETime, 0, sizeof(SMPTETime));
    time.mFlags = kAudioTimeStampSampleTimeValid;

    // TO DO: maybe it could be useful to check AudioQueueStart's return value
    AudioQueueStart(queue, &time);
}

- (void)pause
{
    // TO DO: maybe it could be useful to check AudioQueuePause's return value
    AudioQueuePause(queue);
}

- (void)setFrequency:(NSUInteger)frequency
{
    if (_frequency!=frequency) {
        _frequency=frequency;

        // we need to update the buffer (as soon as it stops playing)
        rebuildBuffer=YES;
    }
}

- (void)setDB:(CGFloat)dB
{
    if (dB!=_dB) {
        _dB=dB;

        // we need to update the buffer (as soon as it stops playing)
        rebuildBuffer=YES;
    }
}

@end
  • The class generates a cosine waveform oscillating at the given integer frequency (`amplitude*cos(2*pi*frequency*t)`); the whole job is done by `void handleBuffer(...)`, using an AudioQueue with a linear PCM, mono, float @44.1 kHz format. To change the signal shape, you can just change that line. For example, the following code will produce a square waveform:

    float x = fmodf(unit*(CGFloat)(tone.frequency*i), 2 * M_PI);
    data[i] = amplitude * (x > M_PI ? -1.0 : 1.0);
    
  • For floating-point frequencies, consider that there isn't necessarily an integer number of oscillations in one second of audio data, so the represented signal is discontinuous at the junction between two buffers and produces a strange 'tick'. For example, you could use fewer samples so that the junction falls at the end of a signal period.

  • As Paul R pointed out, you should first calibrate the hardware to get a reliable conversion between the value you set in your implementation and the sound produced by your device. The floating-point samples generated in this code range from -1 to 1, so I simply converted the amplitude value into dB (`20*log_10(amplitude)`).
  • Take a look at the comments for other details in the implementation and the "known limitations" (all those 'TO DO'). The functions used are well documented by Apple in their reference.
Pietro Saccardi
  • Doesn't work for me. Silence. Or I don't understand how to use this code. – Valeriy Van Jun 26 '13 at 20:34
  • @ValeriyVan Just tested it again and it does work. Linked against AudioToolbox.framework, just one view and one button that calls `[tone play];`. Can you be more specific? – Pietro Saccardi Jun 29 '13 at 13:22
  • I've built a clean test project to play with this and now it works. But what are those periodic clicks? I hear them in the Simulator as well as on a real device. – Valeriy Van Jun 29 '13 at 16:36
  • In the routine `handleBuffer` there's a comment that explains that. When AudioToolbox has finished playing a buffer, the buffer is dequeued, `handleBuffer` gets called and re-enqueues the same buffer. This introduces some ms of delay, during which there is no sound to reproduce; so you have 1 sec of audio and some ms of silence: that's the "tick". Probably using **two** buffers (with the same content for example) would fix that: while the second is playing, the first is re-enqueued, and when one has finished playing, the other is already in queue and to be reproduced "seamlessly". – Pietro Saccardi Jul 11 '13 at 09:52
  • how to modify as a square wave with series of pulses ? – Jeff Bootsholz Apr 25 '14 at 07:21
  • I modified the answer adding the code you need. You just change the mathematical function being sampled with whatever you want. – Pietro Saccardi Apr 26 '14 at 13:50
  • Here's a better Audio Queue example that uses a callback to avoid clicks: http://stackoverflow.com/questions/3326665/example-of-using-audio-queue-services – hotpaw2 Jan 23 '16 at 04:47
  • @PietroSaccardi Your example doesn't use AVAudioEngine or AUAudioUnit. Is it better to do it your way than to use AVAudioEngine as in the Apple sample https://developer.apple.com/library/archive/samplecode/AVAEMixerSample/Introduction/Intro.html#//apple_ref/doc/uid/TP40015134 or AUAudioUnit? – daniel Feb 08 '19 at 18:42
  • @DanielBrower I would have probably used it if it were available back then when I wrote the answer (`AVAudioEngine` requires iOS8+, which was released 2 years later). I guess that now this code is superseded. Here’s [another question](https://stackoverflow.com/q/28058777/1749822) that seems to be using it to generate tones. – Pietro Saccardi Feb 08 '19 at 18:58
  • @PietroSaccardi That post you gave a link for actually uses code to play a wav file. I actually needed your code that generates the sound directly from iOS. – daniel Feb 08 '19 at 20:39
  • @PietroSaccardi I'm new to Objective-C. Should your code be put in a .h file or a .m file? – daniel Feb 08 '19 at 20:44