Friday 29 June 2018

iOS Swift playing audio (aac) from network stream

I'm developing an iOS application and I'm quite new to iOS development. So far I have implemented an H.264 decoder for a network stream using VideoToolbox, which was quite hard. Now I need to play an audio stream that comes from the network, with no file involved: just a raw AAC stream read directly from the socket. This stream comes from the output of an ffmpeg instance. The problem is that I don't know how to get started, and there seems to be little information on this topic. I have already tried AVAudioPlayer but got only silence. I think I first need to decompress the packets from the stream, just as I did with the H.264 decoder.
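For context, this is roughly how I'm pulling individual packets out of the byte stream (a sketch, assuming ffmpeg is emitting ADTS-framed AAC; the nextADTSFrame helper is just something I put together for illustration):

import Foundation

// Sketch: split the raw socket bytes into single AAC packets, assuming ADTS framing.
// Each ADTS header starts with a 0xFFF syncword and carries the total frame length
// (header included) in bits 30..42.
func nextADTSFrame(from stream: inout Data) -> Data? {
    // Need at least a full header (7 bytes, or 9 with CRC) and a valid syncword.
    guard stream.count >= 7,
          stream[stream.startIndex] == 0xFF,
          stream[stream.startIndex + 1] & 0xF0 == 0xF0 else { return nil }

    // protection_absent bit: 1 means no CRC (7-byte header), 0 means CRC present (9 bytes).
    let headerLength = (stream[stream.startIndex + 1] & 0x01) == 1 ? 7 : 9
    let frameLength = (Int(stream[stream.startIndex + 3] & 0x03) << 11)
                    | (Int(stream[stream.startIndex + 4]) << 3)
                    | (Int(stream[stream.startIndex + 5]) >> 5)
    guard stream.count >= frameLength else { return nil }  // wait for more bytes

    // Strip the ADTS header and hand back just the raw AAC packet.
    let frame = stream.subdata(in: (stream.startIndex + headerLength)..<(stream.startIndex + frameLength))
    stream.removeSubrange(stream.startIndex..<(stream.startIndex + frameLength))
    return frame
}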

I have also been trying AVAudioEngine and AVAudioPlayerNode, but with no success, same as with AVAudioPlayer. Can someone give me some guidance? Maybe AudioToolbox? AudioQueue?
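Roughly, the playback setup I've been testing is this (simplified sketch; the node graph and the PCM output format are just my current assumptions):

import AVFoundation

// Simplified sketch of the playback side: one AVAudioPlayerNode attached to an
// AVAudioEngine, onto which I schedule every decoded PCM buffer.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

// Decoded output format I expect: 44.1 kHz mono float PCM.
let pcmFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)!

engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: pcmFormat)

do {
    try engine.start()
    playerNode.play()
} catch {
    print("Could not start the engine: \(error)")
}

// For every AAC packet read from the socket:
// let pcmBuffer = decodeCompressedPacket(packet: packet)
// playerNode.scheduleBuffer(pcmBuffer, completionHandler: nil)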

Thank you very much for the help :)

Edit: I'm playing around with AVAudioCompressedBuffer and get no errors using AVAudioEngine and AVAudioPlayerNode. But I don't know what this output means:

inBuffer: <AVAudioCompressedBuffer@0x6040004039f0: 0/1024 bytes>

Does this mean that the buffer is empty? I have been trying to feed this buffer in several ways, but it always reports something like 0/1024. I think I'm not doing this right:

compressedBuffer.mutableAudioBufferList.pointee = audioBufferList

Any idea?

Thank you!
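In case it helps, my assumption is that the 0/1024 in the description is byteLength versus byteCapacity, i.e. zero valid bytes out of a 1024-byte capacity, which would mean the buffer really is still empty. This is what I'm printing to check:

print("byteLength: \(compressedBuffer.byteLength)")     // bytes actually written into the buffer
print("byteCapacity: \(compressedBuffer.byteCapacity)") // capacity in bytes allocated at init
print("packetCount: \(compressedBuffer.packetCount)")   // packets currently stored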

Edit 2: I'm editing to show my code for decompressing the buffer. Maybe someone can point me in the right direction. Note: the packet that this function ingests is actually passed without the ADTS header (9 bytes), but I have also tried passing it with the header.

func decodeCompressedPacket(packet: Data) -> AVAudioPCMBuffer {

    // Input format: raw AAC-LC, mono, 44.1 kHz, 1024 frames per AAC packet.
    var streamDescription = AudioStreamBasicDescription(mSampleRate: 44100, mFormatID: kAudioFormatMPEG4AAC, mFormatFlags: UInt32(MPEG4ObjectID.AAC_LC.rawValue), mBytesPerPacket: 0, mFramesPerPacket: 1024, mBytesPerFrame: 0, mChannelsPerFrame: 1, mBitsPerChannel: 0, mReserved: 0)
    let audioFormat = AVAudioFormat(streamDescription: &streamDescription)!
    // maximumPacketSize must be at least the size of the largest packet expected.
    let compressedBuffer = AVAudioCompressedBuffer(format: audioFormat, packetCapacity: 1, maximumPacketSize: 1024)

    print("packet count: \(packet.count)")

    // Earlier attempts (memcpy/memset into mutableAudioBufferList, assigning
    // mutableAudioBufferList.pointee directly) always left the buffer at 0 bytes.
    // Copy the packet bytes into the buffer's own storage instead...
    packet.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
        compressedBuffer.data.copyMemory(from: rawBuffer.baseAddress!, byteCount: packet.count)
    }
    // ...and record how much of that storage now holds valid data.
    compressedBuffer.packetCount = 1
    compressedBuffer.byteLength = UInt32(packet.count)
    compressedBuffer.packetDescriptions?.pointee = AudioStreamPacketDescription(mStartOffset: 0, mVariableFramesInPacket: 0, mDataByteSize: UInt32(packet.count))

    print("mBuffers: \(compressedBuffer.audioBufferList[0].mBuffers.mNumberChannels)")
    print("mBuffers: \(compressedBuffer.audioBufferList[0].mBuffers.mDataByteSize)")
    print("mBuffers: \(String(describing: compressedBuffer.audioBufferList[0].mBuffers.mData))")

    let uncompressedBuffer = uncompress(inBuffer: compressedBuffer)
    print("uncompressedBuffer: \(uncompressedBuffer)")
    return uncompressedBuffer
}
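
And for completeness, here is a hypothetical sketch of what the uncompress(inBuffer:) helper above could look like using AVAudioConverter; the output format and the one-packet-per-call handling are my own assumptions, not tested code:

func uncompress(inBuffer: AVAudioCompressedBuffer) -> AVAudioPCMBuffer {
    // Hypothetical decoder based on AVAudioConverter: AAC in, standard float PCM out.
    let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)!
    let converter = AVAudioConverter(from: inBuffer.format, to: outputFormat)!

    // One AAC packet decodes to 1024 PCM frames, so this capacity is enough per call.
    let outBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: 1024)!

    // The input block hands the compressed buffer to the converter exactly once.
    var fed = false
    let inputBlock: AVAudioConverterInputBlock = { _, outStatus in
        if fed {
            outStatus.pointee = .noDataNow
            return nil
        }
        fed = true
        outStatus.pointee = .haveData
        return inBuffer
    }

    var error: NSError?
    let status = converter.convert(to: outBuffer, error: &error, withInputFrom: inputBlock)
    if status == .error {
        print("Conversion failed: \(String(describing: error))")
    }
    return outBuffer
}

I suppose that in a real stream the converter should be created once and reused across packets, so the decoder keeps its internal state, rather than being created per call as in this sketch.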



