I have the following logic to record audio (sampleRate set to 16000 on the AudioContext, mono recording by using only the first channel):
- I set an isRecording parameter on the AudioWorkletNode, and depending on its value the AudioWorkletProcessor starts putting data into a buffer, as below:
process(inputs, outputs, parameters) {
  const isRecordingValues = parameters.isRecording;
  // Take the first channel of the first input (mono)
  const input0 = inputs[0];
  const inputChannel = input0[0];
  // Length 1 means the parameter value is constant for this render quantum
  if (isRecordingValues.length === 1) {
    const shouldRecord = isRecordingValues[0] === 1;
    if (!shouldRecord && !this._isBufferEmpty()) {
      this._flush();
      this._recordingStopped();
    }
    if (shouldRecord) {
      this._appendToBuffer(inputChannel);
    }
  }
  return true;
}
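Note that parameters.isRecording only exists if the processor declares it. Below is a minimal sketch of that declaration; the stub base class stands in for the one the real AudioWorkletGlobalScope provides, and the class name RecorderProcessor is illustrative, not from the original code. Declaring the parameter as k-rate also guarantees the length-1 array the process method checks for:

```javascript
// In a real worklet file, AudioWorkletProcessor and registerProcessor are
// provided by the AudioWorkletGlobalScope; this stub only makes the sketch
// runnable standalone.
class AudioWorkletProcessor {}

class RecorderProcessor extends AudioWorkletProcessor {
  // Without this getter the browser passes no `isRecording` entry in
  // `parameters` at all. 'k-rate' means one value per 128-frame block,
  // which is why isRecordingValues.length === 1 can be relied upon.
  static get parameterDescriptors() {
    return [{
      name: 'isRecording',
      defaultValue: 0,
      minValue: 0,
      maxValue: 1,
      automationRate: 'k-rate'
    }];
  }
}
```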
- _appendToBuffer is as follows:
_appendToBuffer(value) {
  if (this._isBufferFull()) {
    this._flush();
  }
  // _buffer is a Float32Array; despite its name, _bytesWritten counts samples
  this._buffer.set(value, this._bytesWritten);
  this._bytesWritten += value.length;
}
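One hazard worth noting in this append pattern: Float32Array.prototype.set throws a RangeError if the incoming block does not fit in the remaining space, which happens whenever the buffer length is not an exact multiple of the 128-sample render quantum. Here is a standalone sketch of an overflow-safe variant; makeRecorder and its internal names are hypothetical stand-ins for this._buffer / this._bytesWritten, not the original code:

```javascript
// Overflow-safe append: splits an incoming block across a flush boundary
// instead of letting Float32Array.prototype.set throw a RangeError.
function makeRecorder(bufferLength, flush) {
  const buffer = new Float32Array(bufferLength);
  let written = 0;
  return {
    append(block) {
      let offset = 0;
      while (offset < block.length) {
        // Flush first when no room remains for even one sample
        if (written === buffer.length) {
          // In a worklet, copy before posting asynchronously, since the
          // underlying buffer is reused for the next samples.
          flush(buffer.subarray(0, written));
          written = 0;
        }
        const n = Math.min(block.length - offset, buffer.length - written);
        buffer.set(block.subarray(offset, offset + n), written);
        written += n;
        offset += n;
      }
    },
    samplesWritten: () => written
  };
}
```

With a 200-sample buffer and two 128-sample blocks, this flushes once with the full 200 samples and keeps the remaining 56 for the next flush, rather than throwing on the second append.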
- In the _flush method I send the contents of _buffer to the main thread over the processor's MessagePort, as below:
const blob = this._exportWAV(buffer, this._bytesWritten);
this.port.postMessage({
  eventType: 'data',
  audioBuffer: blob
});
Here, buffer contains float samples between -1.0 and 1.0.
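A file that opens but plays for under a second usually points at the WAV header rather than the capture path: either the header advertises 16-bit PCM while the body holds raw 32-bit floats, or the data chunk size was computed from a sample count instead of a byte count, so the player stops after a fraction of the audio. The following is a hedged sketch of what an _exportWAV-style encoder needs to do for 16 kHz mono 16-bit PCM; encodeWAV is an illustrative stand-in, not the asker's actual _exportWAV:

```javascript
// Convert float samples in [-1, 1] to a complete 16-bit PCM mono WAV buffer.
// Sample rate and channel count are assumed (16000 Hz, mono).
function encodeWAV(samples, sampleRate = 16000) {
  const bytesPerSample = 2;                           // 16-bit PCM
  const dataSize = samples.length * bytesPerSample;   // bytes, not samples!
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);
  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);             // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                       // fmt chunk size
  view.setUint16(20, 1, true);                        // audio format 1 = PCM
  view.setUint16(22, 1, true);                        // mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * bytesPerSample, true); // byte rate
  view.setUint16(32, bytesPerSample, true);           // block align
  view.setUint16(34, 16, true);                       // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);                 // data size in BYTES
  // Clamp and scale each float sample to a signed 16-bit integer
  let offset = 44;
  for (let i = 0; i < samples.length; i++, offset += 2) {
    const s = Math.max(-1, Math.min(1, samples[i]));
    view.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7fff, true);
  }
  return buffer;
}
```

Note that dataSize is samples.length * 2 here; writing the raw sample count into the header instead would make a player believe the file is only half as long as it really is.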
- I receive the data on the main thread as an ArrayBuffer object and download it as a WAV file. Regardless of the file size, the file opens in Windows Media Player without error, but it plays for less than a second and then playback ends.
I believe I am doing something wrong in the process method and the data recorded in the buffer is not in the correct format.
What am I doing wrong here?
from How to record audio using audioWorklet and AudioWorkletProcessor in javascript?