
I have been trying to stream to my Icecast2 server from the microphone (or any other audio input) on my laptop.

Icecast communication:

  • Using the information gleaned from this thread I am able to connect to my server, create a mountpoint, and send the audio data of an MP3 file to the server with no problem (a sketch of that handshake is included below for reference).
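
For reference, here is a rough sketch of what that source handshake can look like over a plain TCP socket. The exact header set is my assumption based on the linked thread (newer Icecast versions also accept a PUT request); the host, port, mount point and base64-encoded "source:password" credentials are placeholders:

    #include <netdb.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>
    
    // Connect to Icecast and send the source-client handshake.
    // Returns the connected socket on success, -1 on failure.
    int connectToIcecast(const char *host, const char *port,
                         const char *mount, const char *base64SourceAuth)
    {
        struct addrinfo hints = {0}, *res = NULL;
        hints.ai_family   = AF_UNSPEC;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, port, &hints, &res) != 0) return -1;
    
        int sock = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (sock < 0 || connect(sock, res->ai_addr, res->ai_addrlen) != 0) {
            freeaddrinfo(res);
            if (sock >= 0) close(sock);
            return -1;
        }
        freeaddrinfo(res);
    
        // SOURCE request with Basic auth ("source:<password>", base64-encoded)
        char request[512];
        snprintf(request, sizeof(request),
                 "SOURCE %s HTTP/1.0\r\n"
                 "Authorization: Basic %s\r\n"
                 "Content-Type: audio/mpeg\r\n"
                 "ice-public: 0\r\n"
                 "\r\n",
                 mount, base64SourceAuth);
        send(sock, request, strlen(request), 0);
    
        // The server should answer with "HTTP/1.0 200 OK" before any data is streamed
        char response[256] = {0};
        recv(sock, response, sizeof(response) - 1, 0);
    
        return sock;
    }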

Audio input / conversion:

  • I am getting audio samples in the form of an AudioBufferList using the following method. I could not find much information on how to actually go about converting the AudioBufferList to an MP3 buffer, so any advice here is much appreciated (a LAME setup sketch follows the callback code below).

    // Trigger audio play through
    void playAudioFromDevice(AudioObjectID inDevice, void* inClientData)
    {
        // Register the device for process
        AudioDeviceIOProcID theIOProcID = NULL;
        OSStatus triggerAudioStatus = AudioDeviceCreateIOProcID(inDevice, audioIOProc, inClientData, &theIOProcID);
    
        // start the process
        triggerAudioStatus = AudioDeviceStart(inDevice, theIOProcID);
    }
    
    // Audio process callback
    OSStatus audioIOProc(AudioDeviceID              inDevice,
                         const AudioTimeStamp*      inNow,
                         const AudioBufferList*     inInputData,
                         const AudioTimeStamp*      inInputTime,
                         AudioBufferList*           outOutputData,
                         const AudioTimeStamp*      inOutputTime,
                         void*                      inClientData)
    {
        OSStatus audioProcStatus = 0;
    
        // Any tips on the mp3 conversion?
    
        const int MP3_SIZE = 8192;
        unsigned char mp3_buffer[MP3_SIZE];
    
        // lame_encode_buffer expects the number of samples per channel, not bytes
        const int numSamples = inInputData->mBuffers[0].mDataByteSize / sizeof(SInt16);
    
        // get left and right buffer (assumes non-interleaved 16-bit stereo input;
        // Core Audio often delivers Float32, which would need converting first)
        SInt16 *leftBuffer = (SInt16 *)inInputData->mBuffers[0].mData;
        SInt16 *rightBuffer = (SInt16 *)inInputData->mBuffers[1].mData;
    
        // returns the number of MP3 bytes written into mp3_buffer
        int mp3Bytes = lame_encode_buffer(lame_global, leftBuffer, rightBuffer,
                                          numSamples, mp3_buffer, MP3_SIZE);
    
        //
        // Send the mp3 encoded buffer here
        //
    
        return audioProcStatus;
    }
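
For context, the lame_global handle used above is created elsewhere; a minimal setup sketch (assuming libmp3lame is linked, with a placeholder sample rate and bitrate that need to match the capture device and the desired stream) looks like this:

    #include <lame/lame.h>
    
    lame_global_flags *lame_global = NULL;
    
    // Configure a LAME encoder for 16-bit stereo input at 44.1 kHz, 128 kbps output
    void setupLame(void)
    {
        lame_global = lame_init();
        lame_set_in_samplerate(lame_global, 44100); // must match the device's sample rate
        lame_set_num_channels(lame_global, 2);
        lame_set_brate(lame_global, 128);           // output bitrate in kbps
        lame_set_mode(lame_global, STEREO);
        lame_set_quality(lame_global, 2);           // 0 = best/slowest, 9 = worst/fastest
        lame_init_params(lame_global);
    }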
    

Streaming:

  • When I was streaming a static audio file to Icecast, I was using an NSInputStream initialised with an NSData object containing the contents of an MP3 file, which I then passed to [NSMutableURLRequest setHTTPBodyStream:]. I don't think it is possible to append to an NSInputStream, so I am confused about how I could get this data to Icecast continuously (one possible approach is sketched below).
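
One approach that avoids the NSInputStream problem entirely (sketched below, untested) is to keep the raw socket from the handshake above open and write each encoded chunk as the IOProc produces it; sendMP3Chunk and icecastSocket are hypothetical names:

    #include <sys/socket.h>
    #include <sys/types.h>
    
    // Hypothetical global holding the socket returned by connectToIcecast()
    extern int icecastSocket;
    
    // Push one encoded MP3 chunk to the open Icecast connection.
    // Returns 0 on success, -1 if the connection dropped.
    int sendMP3Chunk(const unsigned char *mp3_buffer, int mp3Bytes)
    {
        int totalSent = 0;
        while (totalSent < mp3Bytes) {
            ssize_t sent = send(icecastSocket, mp3_buffer + totalSent,
                                mp3Bytes - totalSent, 0);
            if (sent <= 0) return -1;  // connection closed or error
            totalSent += (int)sent;
        }
        return 0;
    }

In practice the chunk would probably need to be handed off to a separate thread or queue rather than sent from the IOProc itself, since the IOProc runs on a real-time audio thread and a blocking send() could cause glitches. This is essentially what a library like libshout does with shout_send().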

NOTES

I have looked into the source code of BUTT (broadcast using this tool) and found that it uses PortAudio to stream, which looks straightforward enough. I would try to do it using this library, but I am having trouble including it in my project.

  • What specifically are you having trouble with? Capturing the audio? Encoding it with LAME? Interfacing with the Icecast server? – Brad Apr 21 '14 at 03:20
  • The main issue is sending it to the server. Are you aware of an ideal method for streaming the data to the Icecast server? Since Icecast protocol documentation seems to be non-existent at this point, I am trying to guess. – IyadAssaf Apr 23 '14 at 23:49
  • See my post here for some rough documentation: http://stackoverflow.com/a/9985297/362536 You should use a packet sniffer such as Wireshark to debug your application. – Brad Apr 24 '14 at 00:06
