Live streaming using FFMPEG to web audio api


Problem description

I am trying to stream audio using node.js + ffmpeg to browsers connected over the LAN, using only the Web Audio API.

I am not using the <audio> element because it adds its own buffer of 8 to 10 seconds, and I want the lowest possible latency (around 1 to 2 seconds at most).

Audio plays successfully, but it is very choppy and noisy.

Here is my node.js (server-side) file:

var ws = require('websocket.io'), 
server = ws.listen(3000);
var child_process = require("child_process");
var i = 0;
server.on('connection', function (socket) 
{

console.log('New client connected');

var ffmpeg = child_process.spawn("ffmpeg",[
    "-re","-i",
    "A.mp3","-f",
    "f32le",
    "pipe:1"                     // Output to STDOUT
    ]);

 ffmpeg.stdout.on('data', function(data)
 {
    var buff = Buffer.from(data);        // new Buffer() is deprecated
    socket.send(buff.toString('base64'));
 });
});
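As a side note, sending raw PCM as base64 inflates every chunk by roughly a third. A minimal sketch of the round-trip the server performs (assuming Node.js Buffers; the variable names are illustrative):

```javascript
// Sketch of the base64 round-trip: a Float32Array of f32le PCM is
// serialized to base64 for socket.send(), and the client must reverse it.
// Note the ~33% size overhead (16 raw bytes -> 24 base64 characters).
var samples = new Float32Array([0.0, 0.5, -0.5, 1.0]);
var raw = Buffer.from(samples.buffer);          // 16 bytes of f32le PCM
var encoded = raw.toString('base64');           // what socket.send() ships
var decoded = Buffer.from(encoded, 'base64');   // what the client undoes
console.log(raw.length, encoded.length);        // 16 24
```

Binary WebSocket frames (sending the Buffer directly) would avoid both the overhead and the decode step, if the WebSocket library in use supports them.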

Here is my HTML (client-side script):

var audioBuffer = null;
var context = null;
window.addEventListener('load', init, false);
function init() {
    try {
        context = new webkitAudioContext();
    } catch(e) {
        alert('Web Audio API is not supported in this browser');
    }
}

var ws = new WebSocket("ws://localhost:3000/");

ws.onmessage = function(message)
{
    var d1 = base64DecToArr(message.data).buffer;
    var d2 = new DataView(d1);

    var data = new Float32Array(d2.byteLength / Float32Array.BYTES_PER_ELEMENT);
    for (var jj = 0; jj < data.length; ++jj)
    {
        data[jj] = d2.getFloat32(jj * Float32Array.BYTES_PER_ELEMENT, true);
    }

    var audioBuffer = context.createBuffer(2, data.length, 44100);
    audioBuffer.getChannelData(0).set(data);

    var source = context.createBufferSource(); // creates a sound source
    source.buffer = audioBuffer;
    source.connect(context.destination); // connect the source to the context's destination (the speakers)
    source.start(0);
};
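One hazard worth noting with this decode loop: ffmpeg's stdout chunks are not guaranteed to end on a 4-byte float boundary, so a single sample can be split across two WebSocket messages and decode as noise. A carry-buffer sketch (hypothetical `samplesFromChunk` helper, written against Node-style Buffers for illustration):

```javascript
// Hypothetical carry buffer: bytes left over from one chunk are kept so a
// Float32 sample split across two messages is not decoded as garbage.
var carry = Buffer.alloc(0);

function samplesFromChunk(chunk) {
  var buf = Buffer.concat([carry, chunk]);
  var usable = buf.length - (buf.length % Float32Array.BYTES_PER_ELEMENT);
  carry = buf.subarray(usable);                 // hold the partial sample
  var out = new Float32Array(usable / 4);
  for (var i = 0; i < out.length; i++) {
    out[i] = buf.readFloatLE(i * 4);            // f32le, as ffmpeg emits
  }
  return out;
}
```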

Can anyone advise what is wrong?

Regards, Nayan

Answer

I got it working! All I had to do was adjust the number of channels.

I set FFmpeg to output mono audio and it worked like a charm. Here is my new FFmpeg command:

var ffmpeg = child_process.spawn("ffmpeg",[
    "-re","-i",
    "A.mp3",
    "-ac","1","-f",              // -ac 1: downmix to mono
    "f32le",
    "pipe:1"                     // Output to STDOUT
    ]);
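On the client side, the buffer creation then has to match the mono stream: `context.createBuffer(1, ...)` instead of `2`. A sketch of the matching decode step (hypothetical `decodeMonoChunk` helper; Node's Buffer stands in for the browser's atob + Uint8Array here, and the Web Audio calls are shown only as comments since they exist only in a browser):

```javascript
// Hypothetical helper: turn one base64 message of mono f32le PCM back into
// a Float32Array. In the browser the result would feed a 1-channel buffer:
//   var audioBuffer = context.createBuffer(1, data.length, 44100);
//   audioBuffer.getChannelData(0).set(data);
function decodeMonoChunk(base64) {
  var bytes = Buffer.from(base64, 'base64');
  var view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  var data = new Float32Array(bytes.byteLength / 4);
  for (var i = 0; i < data.length; i++) {
    data[i] = view.getFloat32(i * 4, true);     // true = little-endian
  }
  return data;
}
```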
