
I am trying to build an audio streaming app.

I am getting a response like the following:

Request Method: GET
Status Code: 206 Partial Content
Response Headers:

Accept-Ranges: bytes
Content-Length: 2728759
Content-Range: bytes 0-2097185/2728759
Content-Type: audio/mpeg
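
For reference, the response above can be inspected outside of any player with a plain URLSession request. A minimal sketch (the URL, token and byte range are placeholders) that sends the same Authorization and Range headers and logs the status code and Content-Range:

import Foundation

// Minimal sketch: issue the same ranged, authorised request the player would make,
// so the 206 response and its Content-Range header can be inspected directly.
// The URL, token and byte range below are placeholders.
let url = URL(string: "https://www.blablabla.com/bla/blablablablabla/")!
var request = URLRequest(url: url)
request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization")
request.setValue("bytes=0-2097185", forHTTPHeaderField: "Range")

let task = URLSession.shared.dataTask(with: request) { data, response, error in
    if let error = error {
        print("Request failed:", error)   // e.g. NSURLErrorDomain -1005
        return
    }
    guard let http = response as? HTTPURLResponse else { return }
    print("Status code:", http.statusCode)   // expecting 206 Partial Content
    print("Content-Range:", http.value(forHTTPHeaderField: "Content-Range") ?? "none")
    print("Bytes received:", data?.count ?? 0)
}
task.resume()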

I am using SwiftAudioPlayer, but I have also tried AVPlayer. The problem is that when I try to play some audio, it plays for approximately one minute, then it stops and I get the following error:

Task <45298B51-9535-4492-A6D1-7F48CE94E26B>.<6> finished with error [-1005] Error Domain=NSURLErrorDomain Code=-1005 "The network connection was lost." UserInfo={NSErrorFailingURLStringKey=https://www.blablabla.com/bla/blablablablabla/

From this post I understand that 206 Partial Content responses are not supported by AVPlayer on iOS (and, I presume, AVAudioEngine).

!! - Also, one important mention: I need to send an Authorization header, so players like VLC won't work for me; from my research I understand that VLC does not support custom headers.

The code I use to start an audio stream is quite simple:

func setupPlayer() {
    guard let audioPath = audioPath,
          let url = URL(string: "https://www.blabla.com/\(audioPath)"),
          isNewSetupNeeded else {
        return
    }

    let headers = ["Authorization": "Bearer \(token)"]

    SAPlayer.shared.HTTPHeaderFields = headers
    SAPlayer.shared.startRemoteAudio(withRemoteUrl: url)
}
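
For completeness, after setupPlayer() I just start playback through the shared player (a minimal sketch, no error handling):

// Called once setupPlayer() has configured the headers and remote URL.
SAPlayer.shared.play()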

// MARK: AVPlayer

func setupAvPlayer(with url: URL, headers: [String: String]) {
    // Note: "AVURLAssetHTTPHeaderFieldsKey" is not a documented AVURLAsset option key.
    let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
    let assetKeys = ["playable", "hasProtectedContent"]
    let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)

    self.audioPlayer = AVPlayer(playerItem: playerItem)
    self.audioPlayer.volume = 1.0
    self.playAudioBackground()
}

private func playAudioBackground() {
    do {
        // Configure and activate the audio session for background playback.
        try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playback, mode: AVAudioSession.Mode.default, options: [.mixWithOthers, .allowAirPlay])
        print("Playback OK")
        try AVAudioSession.sharedInstance().setActive(true)
        print("Session is Active")
    } catch {
        print(error)
    }
}
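
And this is roughly how the AVPlayer variant gets called (a minimal sketch; startAvPlayback() is just an illustrative helper, audioPath and token are the same properties used in setupPlayer(), and play() is called explicitly because setupAvPlayer(with:headers:) only prepares the player):

func startAvPlayback() {
    // Illustrative helper: build the URL and headers, prepare the player, then start it.
    guard let audioPath = audioPath,
          let url = URL(string: "https://www.blabla.com/\(audioPath)") else {
        return
    }
    let headers = ["Authorization": "Bearer \(token)"]
    setupAvPlayer(with: url, headers: headers)
    audioPlayer.play()   // setupAvPlayer(with:headers:) only builds the item; playback starts here
}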

EDIT

For anyone stumbling across this post: my solution was a completely different approach, and I ended up using StreamingKit. With this player everything works as expected.
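
In case it helps, the core of what I ended up with looks roughly like this (a minimal sketch from memory; the URL is a placeholder, and how the Authorization header is attached depends on the StreamingKit data source you use, so check the library's README for that part):

import StreamingKit

// Minimal StreamingKit sketch: STKAudioPlayer streams the remote file directly.
let player = STKAudioPlayer()
player.play("https://www.blabla.com/path/to/audio")   // placeholder URL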

  • Pretty sure I used `AVPlayer` to stream video from S3 bucket via Cloudfront. – pronebird Apr 27 '22 at 10:08
  • I wouldn't read too much into the post you linked to. It makes no logical sense! In any case, I recommend using a packet sniffer like Wireshark to see what's happening under the hood. Most players that use HTTP for streaming will do rate control by putting back pressure on the stream, which adjusts the TCP window size, eventually to zero. If the window stays shut for too long, the server may give up on you and close the connection. It's expected that the client then reconnect and request the later bytes with a ranged request. This can happen seamlessly... the user doesn't know. – Brad Apr 27 '22 at 14:40
