
How can I get the equivalent of averagePowerForChannel with AVPlayer, in order to drive an audio visualization in my music app? I've already done the visualization part, but I'm stuck on its engine (real-time volume per channel).

I know that with AVAudioPlayer this can be done easily using the .meteringEnabled property, but for several reasons AVPlayer is a must in my app. I'm actually considering using AVAudioPlayer alongside AVPlayer to get the desired result, but that sounds like a messy workaround. How would that affect performance and stability? Thanks in advance.

rmaddy
Med Abida

2 Answers


I have had an issue with AVPlayer visualisation for about two years. In my case it involves HLS live streaming; with that, as far as I know, you won't get it running.

EDIT: This will not give you access to the averagePowerForChannel: method, but you will get access to the raw sample data, from which you can compute the desired information (for example with an FFT).

I got it working with local playback, though. You basically wait for the player's current item to have a track up and running. At that point you need to patch an MTAudioProcessingTap into the audio mix.

The processing tap runs callbacks you specify, in which you can process the raw audio data as you need.

Here is a quick example (sorry for having it in ObjC, though):

#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

// Tap callback stubs -- put your analysis code in `process`.
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {}
void finalize(MTAudioProcessingTapRef tap) {}
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {}
void unprepare(MTAudioProcessingTapRef tap) {}
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio through the tap; inspect bufferListInOut here.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
}

- (void)play {
    // player and item setup ...

    [[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
}

//////////////////////////////////////////////////////

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];

            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}

- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);

    if (err) {
        NSLog(@"error: %@", [NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];

    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    CFRelease(tap); // the input parameters retain the tap

    [audioMix setInputParameters:@[inputParams]];

    [[[self player] currentItem] setAudioMix:audioMix];
}

There is some discussion going on over on my question from over two years ago, so make sure to check it out as well.

Julian F. Weinert
  • Thanks, Objective-C is OK too. I found your old post when searching for a solution! The only problem is that I'm not using HLS live streaming; it's all about audio (mp3/wav/etc.) files. I'm working on a solution where I cache the file, save it as an mp3 file, then update the visualisation. I'll test your code and see if it can help. – Med Abida Aug 30 '16 at 09:16
  • That's fine, because my stuff does not work with streaming *but* with local files (what you asked for). Also check out my edits; I added more details in the code as well! – Julian F. Weinert Aug 30 '16 at 09:18
  • It turns out that MTAudioProcessingTap is not fully compatible with Swift. I'll keep digging and try to implement it using Objective-C (my project is in Swift). Anyway, you'll get the reward as soon as it fully works and no further answers are posted! You can still help by providing a Swift solution if it's possible! – Med Abida Aug 31 '16 at 12:55
  • Bummer. But that reflects the current status and production usability of Swift pretty well. Unfortunately I can't help out with Swift, but I would recommend a simple wrapper class that is easy to bridge to Swift (i.e. provide a better callback interface). You can use the contexts to get your class instance in any of the callbacks, so it shouldn't be too hard to accomplish. – Julian F. Weinert Aug 31 '16 at 13:44
  • Absolutely! Swift isn't (at least for now) fully waterproof. I'm working on a class to bridge it to Swift! I'll let you know when it's ready and post it here for reference! Thanks. – Med Abida Aug 31 '16 at 13:47
  • @MedAbida how is `MTAudioProcessingTap` not compatible with Swift? – Rhythmic Fistman Dec 06 '16 at 03:38
  • @MedAbida I'd like to know the same... It's simply a C interface which seems to be available in Swift, although it looks like there are some difficulties. Here is an example implementation I found: https://github.com/gchilds/MTAudioProcessingTap-in-Swift/blob/master/swift%20tap/AppDelegate.swift – Julian F. Weinert Dec 06 '16 at 07:51

You will need an audio processor class in combination with AVFoundation to visualize the audio samples, as well as to apply a Core Audio audio unit effect (a bandpass filter) to the audio data. You can find a sample by Apple here.

Essentially, you will need to add an observer to your AVPlayer like the one below:

// Notifications
let playerItem: AVPlayerItem!  = videoPlayer.currentItem
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.New, context:  nil);

NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: videoPlayer.currentItem, queue: NSOperationQueue.mainQueue(), usingBlock: { (notif: NSNotification) -> Void in
   self.videoPlayer.seekToTime(kCMTimeZero)
   self.videoPlayer.play()
   print("replay")
})

Then handle the notification in the overridden method below:

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if object === videoPlayer.currentItem && keyPath == "tracks" {
        if let playerItem = videoPlayer.currentItem {
            // MYAudioTapProcessor comes from Apple's AudioTapProcessor sample;
            // it installs an MTAudioProcessingTap and reports levels via its delegate.
            tapProcessor = MYAudioTapProcessor(AVPlayerItem: playerItem)
            playerItem.audioMix = tapProcessor.audioMix
            tapProcessor.delegate = self
        }
    }
}

Here's a link to a sample project on GitHub
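
Whichever approach you take, the raw per-buffer level usually needs mapping and smoothing before it drives a meter, or the visualisation will flicker. A small framework-independent sketch in C (the names, dB range, and coefficients are all illustrative, not from any framework):

```c
#include <math.h>

/* Map dBFS into 0..1 for a meter bar; -60 dB and below is treated as silence. */
double db_to_meter(double db) {
    if (db <= -60.0) return 0.0;
    if (db >= 0.0) return 1.0;
    return (db + 60.0) / 60.0;
}

/* One-pole smoothing with a fast attack and slow release, as meter UIs often use. */
typedef struct {
    double level;   /* current smoothed level, 0..1 */
    double attack;  /* coefficient applied when the input rises, e.g. 0.5 */
    double release; /* coefficient applied when the input falls, e.g. 0.05 */
} meter_smoother;

double meter_update(meter_smoother *m, double input) {
    double coeff = input > m->level ? m->attack : m->release;
    m->level += coeff * (input - m->level);
    return m->level;
}
```

Feed `meter_update` the mapped level once per buffer and render `m.level`; the asymmetric coefficients make peaks appear instantly and decay gradually.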

TechSeeko
  • Please provide a fully tested/working solution. I've stated that I want to use AVPlayer! You're giving no clue, man! – Med Abida Aug 29 '16 at 14:59
  • Preferably Swift! – Med Abida Aug 29 '16 at 20:07
  • Hey, thanks for your response!! As I mentioned in my post, I'm not planning to use AVAudioPlayer; the real challenge is to find a way to use AVPlayer to achieve this kind of thing! Sorry, but you're not getting any reward, because I've already stated that I know how to do it with AVAudioPlayer. PS: you are using AVAudioPlayer. – Med Abida Aug 30 '16 at 08:52
  • @MedAbida did you check the link above? – TechSeeko Aug 30 '16 at 12:12
  • Have you tested the code you provided? It's giving me an error no matter what I tried (Command failed due to signal: Segmentation fault: 11). – Med Abida Aug 31 '16 at 13:15
  • I have changed the link in the answer above after taking the liberty to make a couple of changes to the original demo to help you run the project. Use this project here, made especially for your needs: https://github.com/techseeko/AVPlayerAudioMeter – TechSeeko Aug 31 '16 at 14:20
  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. – [From Review](/review/low-quality-posts/13524663) – SurvivalMachine Aug 31 '16 at 14:39
  • @TechSeeko Easy, mate! We are not starting a debate on what's wrong and what's right, are we? Providing a link without any explanation or even a single word on how things could be done is not the purpose of SO; it's all about sharing knowledge here! Anyway, thanks for your help! I'll check the link and report back! – Med Abida Aug 31 '16 at 14:50
  • @SurvivalMachine Done, added some information and provided a link to an Apple sample just to make sure pieces of the answer will not be removed. Thanks for the help. – TechSeeko Aug 31 '16 at 14:55