
Is it possible to redirect audio output to the phone speaker and still use the headphones' microphone as input?

If I redirect the audio route to the phone speaker instead of the headphones, it also redirects the mic. This makes sense, but I can't seem to redirect only the output while leaving the mic input on the headphones. Any ideas?

Here is the code I'm using to redirect audio to the speaker:

UInt32 doChangeDefaultRoute = true;
propertySetError = AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
NSAssert(propertySetError == 0, @"Failed to set audio session property: OverrideCategoryDefaultToSpeaker");

UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
user744412
    You probably know this, but when you enable the speaker you also enable the mic on the phone. However, on the iPod, if you enable the speaker you will still get mic sound from the headset. Probably intended, since there is no mic on the iPod. I briefly got an iOS 4.3 SDK app to take mic input from the headphones while outputting to the speaker, by initializing the AUGraph again after the route change, but it only worked intermittently and now it does not happen at all (iOS 4.3+, Xcode 4+). – zeAttle Jan 10 '12 at 14:34

3 Answers


This is possible, but it's picky about how you set it up.

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

It's very important to use AVAudioSessionCategoryPlayAndRecord or the route will fail to go to the speaker. Once you've set the override route for the audio session, you can use an AVAudioPlayer instance and send some output to the speaker.
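
For context, here is a minimal sketch of that sequence (the `player` property and the bundled `beep.caf` file are hypothetical; error handling is omitted):

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

- (void)playThroughSpeaker {
    NSError *error = nil;

    // Category must be PlayAndRecord or the speaker override will not take effect
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [[AVAudioSession sharedInstance] setActive:YES error:&error];

    // Override the output route to the built-in speaker
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    // Play something; output should now come from the built-in speaker
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"beep" withExtension:@"caf"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    [self.player play];
}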

Hope that works for others like it did for me. The documentation on this is scattered, but the Skype app proves it's possible. Persevere, my friends! :)

Some Apple documentation here: http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/AudioSessionServicesReference/Reference/reference.html

Do a search on the page for kAudioSessionProperty_OverrideAudioRoute.

jocull
    Sadly, it looks like this no longer works as [the docs](https://developer.apple.com/documentation/audiotoolbox/kaudiosessionproperty_overrideaudioroute) now say: "If a headset is plugged in at the time you set this property’s value to kAudioSessionOverrideAudioRoute_Speaker, **the system changes the audio routing for input as well as for output: input comes from the built-in microphone; output goes to the built-in speaker.**" – Adam Jan 20 '22 at 17:53

It doesn't look like it's possible, I'm afraid.

From the Audio Session Programming Guide - kAudioSessionProperty_OverrideAudioRoute

If a headset is plugged in at the time you set this property’s value to kAudioSessionOverrideAudioRoute_Speaker, the system changes the audio routing for input as well as for output: input comes from the built-in microphone; output goes to the built-in speaker.

Possible duplicate of this question

Alex L
  • It might be worth following the same steps as Tommy did in [this question](http://stackoverflow.com/questions/4002133/forcing-iphone-microphone-as-audio-input) to find available `AVCaptureDevice`s – Alex L Jan 10 '12 at 16:09
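
For reference, a minimal sketch of enumerating the available audio capture devices along those lines (whether the headset microphone appears as a separate device is not guaranteed):

#import <AVFoundation/AVFoundation.h>

void logAudioCaptureDevices(void) {
    // List every device capable of capturing audio; on iOS this is often just the built-in mic
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Audio capture device: %@ (%@)", device.localizedName, device.uniqueID);
    }
}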

What you can do is force audio output to the speakers in any case:

From UI Hacker - iOS: Force audio output to speakers while headphones are plugged in

@interface AudioRouter : NSObject

+ (void) initAudioSessionRouting;
+ (void) switchToDefaultHardware;
+ (void) forceOutputToBuiltInSpeakers;

@end

and

#import "AudioRouter.h"
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@implementation AudioRouter

#define IS_DEBUGGING NO
#define IS_DEBUGGING_EXTRA_INFO NO

+ (void) initAudioSessionRouting {

    // Called once to route all audio through speakers, even if something's plugged into the headphone jack
    static BOOL audioSessionSetup = NO;
    if (audioSessionSetup == NO) {

        // set category to accept properties assigned below
        NSError *sessionError = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error: &sessionError];

        // Doubly force audio to come out of speaker
        UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
        AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

        // fix issue with audio interrupting video recording - allow audio to mix on top of other media
        UInt32 doSetProperty = 1;
        AudioSessionSetProperty (kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);

        // set active
        [[AVAudioSession sharedInstance] setDelegate:self];
        [[AVAudioSession sharedInstance] setActive: YES error: nil];

        // add listener for audio input changes
        AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange, onAudioRouteChange, nil );
        AudioSessionAddPropertyListener (kAudioSessionProperty_AudioInputAvailable, onAudioRouteChange, nil );

    }

    // Force audio to come out of speaker
    [[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];


    // set flag
    audioSessionSetup = YES;
}

+ (void) switchToDefaultHardware {
    // Remove forcing to built-in speaker
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
}

+ (void) forceOutputToBuiltInSpeakers {
    // Re-force audio to come out of speaker
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);


}

void onAudioRouteChange (void* clientData, AudioSessionPropertyID inID, UInt32 dataSize, const void* inData) {

    if( IS_DEBUGGING == YES ) {
        NSLog(@"==== Audio Harware Status ====");
        NSLog(@"Current Input:  %@", [AudioRouter getAudioSessionInput]);
        NSLog(@"Current Output: %@", [AudioRouter getAudioSessionOutput]);
        NSLog(@"Current hardware route: %@", [AudioRouter getAudioSessionRoute]);
        NSLog(@"==============================");
    }

    if( IS_DEBUGGING_EXTRA_INFO == YES ) {
        NSLog(@"==== Audio Harware Status (EXTENDED) ====");
        CFDictionaryRef dict = (CFDictionaryRef)inData;
        CFNumberRef reason = CFDictionaryGetValue(dict, kAudioSession_RouteChangeKey_Reason);
        CFDictionaryRef oldRoute = CFDictionaryGetValue(dict, kAudioSession_AudioRouteChangeKey_PreviousRouteDescription);
        CFDictionaryRef newRoute = CFDictionaryGetValue(dict, kAudioSession_AudioRouteChangeKey_CurrentRouteDescription);
        NSLog(@"Audio old route: %@", oldRoute);
        NSLog(@"Audio new route: %@", newRoute);
        NSLog(@"=========================================");
    }



}

+ (NSString*) getAudioSessionInput {
    UInt32 routeSize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
    CFDictionaryRef desc; // this is the dictionary to contain descriptions

    // make the call to get the audio description and populate the desc dictionary
    AudioSessionGetProperty (kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);

    // the dictionary contains 2 keys, for input and output; get the input array
    CFArrayRef inputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Inputs);

    // the input array contains 1 element - a dictionary
    CFDictionaryRef diction = CFArrayGetValueAtIndex(inputs, 0);

    // get the input type from the dictionary
    CFStringRef input = CFDictionaryGetValue(diction, kAudioSession_AudioRouteKey_Type);
    return [NSString stringWithFormat:@"%@", input];
}

+ (NSString*) getAudioSessionOutput {
    UInt32 routeSize;
    AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
    CFDictionaryRef desc; // this is the dictionary to contain descriptions

    // make the call to get the audio description and populate the desc dictionary
    AudioSessionGetProperty (kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);

    // the dictionary contains 2 keys, for input and output. Get output array
    CFArrayRef outputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Outputs);

    // the output array contains 1 element - a dictionary
    CFDictionaryRef diction = CFArrayGetValueAtIndex(outputs, 0);

    // get the output description from the dictionary
    CFStringRef output = CFDictionaryGetValue(diction, kAudioSession_AudioRouteKey_Type);
    return [NSString stringWithFormat:@"%@", output];
}

+ (NSString*) getAudioSessionRoute {
    /*
     returns the current session route:
     * ReceiverAndMicrophone
     * HeadsetInOut
     * Headset
     * HeadphonesAndMicrophone
     * Headphone
     * SpeakerAndMicrophone
     * Speaker
     * HeadsetBT
     * LineInOut
     * Lineout
     * Default
    */

    UInt32 rSize = sizeof (CFStringRef);
    CFStringRef route;
    AudioSessionGetProperty (kAudioSessionProperty_AudioRoute, &rSize, &route);

    if (route == NULL) {
        NSLog(@"Silent switch is currently on");
        return @"None";
    }
    return [NSString stringWithFormat:@"%@", route];
}

@end
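
A usage sketch, assuming the class above has been added to the project (the view-controller context is hypothetical):

#import "AudioRouter.h"

- (void)viewDidLoad {
    [super viewDidLoad];

    // Route all audio to the built-in speaker, even with headphones plugged in
    [AudioRouter initAudioSessionRouting];

    // Later, call [AudioRouter switchToDefaultHardware] to restore normal routing
}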
loretoparisi