Wednesday, 23 October 2019

Redirect the output of an AVAudioEngine to the rendering block in an AUv3

I currently have an Xcode project, written in Swift, that is built around an AVAudioEngine.

At the head of the engine is an AVAudioPlayerNode used to schedule some LPCM audio buffers.
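For reference, a minimal sketch of that setup (with illustrative names, and a silent buffer standing in for the real audio) might look like this:

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

    // Player node at the head of the engine, feeding the main mixer.
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)

    // A one-second LPCM buffer of silence as a stand-in for the real audio.
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 44_100)!
    buffer.frameLength = buffer.frameCapacity

    do {
        try engine.start()
        player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        player.play()
    } catch {
        print("Failed to start engine: \(error)")
    }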

In order for an AUv3 to process the audio, it needs to override the following properties (summarized from Apple's AUAudioUnit documentation):

  1. Override the inputBusses getter method to return the app extension’s audio input connection points.

  2. Override the outputBusses getter method to return the app extension’s audio output connection points.

  3. Override the internalRenderBlock getter method to return the block that implements the app extension’s audio rendering loop.

    One must also override the allocateRenderResourcesAndReturnError: method, which the host app calls before it starts to render audio, and override the deallocateRenderResources method, which the host app calls after it has finished rendering audio.

Within the allocateRenderResourcesAndReturnError: and deallocateRenderResources overrides, one must also call the AUAudioUnit superclass implementation; a skeleton covering these overrides is sketched below.
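A minimal skeleton of those overrides, assuming a simple stereo-in/stereo-out unit with illustrative names (the render block here just writes silence), might look like this:

    import AVFoundation
    import AudioToolbox

    class MyAudioUnit: AUAudioUnit {

        private var _inputBusArray: AUAudioUnitBusArray!
        private var _outputBusArray: AUAudioUnitBusArray!

        override init(componentDescription: AudioComponentDescription,
                      options: AudioComponentInstantiationOptions = []) throws {
            try super.init(componentDescription: componentDescription, options: options)

            let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
            let inputBus = try AUAudioUnitBus(format: format)
            let outputBus = try AUAudioUnitBus(format: format)
            _inputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .input, busses: [inputBus])
            _outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [outputBus])
        }

        // 1. The extension's audio input connection points.
        override var inputBusses: AUAudioUnitBusArray { return _inputBusArray }

        // 2. The extension's audio output connection points.
        override var outputBusses: AUAudioUnitBusArray { return _outputBusArray }

        // Called by the host before it starts to render audio.
        override func allocateRenderResources() throws {
            try super.allocateRenderResources()
            // Allocate anything the render block needs here.
        }

        // Called by the host after it has finished rendering audio.
        override func deallocateRenderResources() {
            // Release render-time resources here.
            super.deallocateRenderResources()
        }

        // 3. The real-time rendering loop. Must not capture self,
        //    allocate memory, or block while it runs.
        override var internalRenderBlock: AUInternalRenderBlock {
            return { actionFlags, timestamp, frameCount, outputBusNumber,
                     outputData, realtimeEventListHead, pullInputBlock in
                // This skeleton just fills the output with silence.
                let abl = UnsafeMutableAudioBufferListPointer(outputData)
                for buffer in abl {
                    memset(buffer.mData, 0, Int(buffer.mDataByteSize))
                }
                return noErr
            }
        }
    }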

Given a working AVAudioEngine, where or how does one connect the inputBusses, outputBusses, and audio buffers to the internalRenderBlock in the AUv3?

I have a working prototype AUv3 that can load in a host like GarageBand.
What I am trying to do is pass the audio buffers from the AVAudioEngine into the internalRenderBlock of the AUv3 in order to complete the audio pipeline from the AUv3 to its host.
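One possible direction, offered only as an assumption rather than a confirmed solution, is to switch the AVAudioEngine into real-time manual rendering mode and drive it from the AUv3's internalRenderBlock through engine.manualRenderingBlock, which Apple documents as safe to call from a real-time context. A rough sketch, with hypothetical helper and variable names:

    import AVFoundation
    import AudioToolbox

    // Hypothetical helper: builds an engine with a player node at its head
    // and puts it in real-time manual rendering mode, so the engine produces
    // audio only when the returned block is called.
    func makeManualRenderingEngine(format: AVAudioFormat,
                                   maximumFrames: AVAudioFrameCount) throws
        -> (engine: AVAudioEngine, player: AVAudioPlayerNode, render: AVAudioEngineManualRenderingBlock) {

        let engine = AVAudioEngine()
        let player = AVAudioPlayerNode()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)

        // In manual rendering mode the engine is detached from the hardware.
        try engine.enableManualRenderingMode(.realtime,
                                             format: format,
                                             maximumFrameCount: maximumFrames)
        try engine.start()
        player.play()

        // Buffers are still scheduled on `player` from a non-real-time thread.
        return (engine, player, engine.manualRenderingBlock)
    }

    // Inside internalRenderBlock, the captured `render` block could then
    // replace the silence-filling code from the earlier skeleton, e.g.:
    //
    //     let status = render(frameCount, outputData, nil)
    //     return status == .success ? noErr : kAudioUnitErr_NoConnection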



