HTML5 Audio API inputBuffer.getChannelData to audio Array buffer
Question
I am making an application where I take mic data from the inputBuffer, and I want to stream it to another client and play it. However, I cannot get it working.
My recording/capturing works fine, so I will skip to the relevant parts of the code:
function recorderProcess(e) {
    var left = e.inputBuffer.getChannelData(0);
    var convert = convertFloat32ToInt16(left);
    window.stream.write(convert);
    var src = window.URL.createObjectURL(lcm);
    playsound(convert);
    ss(socket).emit('file', convert, { size: src.size }, currentgame);
    ss.createBlobReadStream(convert).pipe(window.stream);
    //ss.createReadStream(f).pipe(window.stream);
}
function playsound(raw) {
    console.log("now playing a sound, that starts with", new Uint8Array(raw.slice(0, 10)));
    context.decodeAudioData(raw, function (buffer) {
        if (!buffer) {
            console.error("failed to decode:", "buffer null");
            return;
        }
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        source.start(0);
        console.log("started...");
    }, function (error) {
        console.error("failed to decode:", error);
    });
}
I am able to successfully create an array buffer using the convertFloat32ToInt16 function; however, when I call the playsound function I get a "null" error, meaning the ArrayBuffer will not decode into an audio stream. Has anyone else had this issue? I have scoured the internet without finding an answer on how to do this. I am trying to play it this way because ultimately I will be streaming from client to client, so I will be sending ArrayBuffers via sockets.
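The convertFloat32ToInt16 helper isn't shown in the question; a typical implementation (a hypothetical sketch, not necessarily the asker's exact version) scales each sample from the Web Audio float range [-1, 1] to signed 16-bit PCM, clamping out-of-range values:

```javascript
// Hypothetical sketch of the convertFloat32ToInt16 helper referenced above.
// Maps Web Audio samples in [-1, 1] to signed 16-bit PCM, clamping overflow.
function convertFloat32ToInt16(float32Samples) {
    var int16 = new Int16Array(float32Samples.length);
    for (var i = 0; i < float32Samples.length; i++) {
        // clamp to [-1, 1] to avoid integer wrap-around on loud samples
        var s = Math.max(-1, Math.min(1, float32Samples[i]));
        // negative samples scale by 0x8000, positive by 0x7FFF,
        // so -1 maps to -32768 and +1 maps to +32767
        int16[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }
    return int16;
}
```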
Thanks in advance.
Answer
If I'm understanding this correctly (there are some missing pieces in your code sample)...
decodeAudioData can only decode things like MP3 or WAV. It looks like you're passing it a raw Int16Array or Uint16Array. Because the underlying ArrayBuffer isn't a format that decodeAudioData understands, it gives up.
I think what you want to do is something like this:
function playsound(raw) {
    // i'll assume you know how to convert in this direction
    // since you have convertFloat32ToInt16
    var buffer = convertInt16ToFloat32(raw),
        src = context.createBufferSource(),
        audioBuffer = context.createBuffer(1, buffer.length, context.sampleRate);
    audioBuffer.getChannelData(0).set(buffer);
    src.buffer = audioBuffer;
    src.connect(context.destination);
    src.start(0);
}
Basically, you already have a way to create the raw Float32Array that the Web Audio API likes, so there's no need to decode (and you can't decode anyway, since your data isn't a valid file format). So you just convert back to a Float32Array, create your own AudioBuffer, write in the data from buffer, and go from there.