How to send and receive a desktop capture stream generated via getUserMedia()
Question
I am making a screen-sharing app with WebRTC + Socket.io and am stuck at one point. I have connected two browsers using WebRTC + Socket.io and can send text.
I am following the codelab, but it does not cover streams. (If the solution is based on that link, that would be very helpful.)
How can I send a getUserMedia() stream:
dataChannel.send(stream);
and receive the same stream in channel.onmessage()? I am getting event.data as the string '[object MediaStream]', not the stream itself.
channel.onmessage = function(event) {
  // unable to get correct stream
  // event.data is "[object MediaStream]" in string
};
function createPeerConnection(isInitiator, config) {
  console.log('Creating Peer connection as initiator?', isInitiator, 'config:', config);
  peerConn = new RTCPeerConnection(config);

  // send any ice candidates to the other peer
  peerConn.onicecandidate = function (event) {
    console.log('onIceCandidate event:', event);
    if (event.candidate) {
      sendMessage({
        type: 'candidate',
        label: event.candidate.sdpMLineIndex,
        id: event.candidate.sdpMid,
        candidate: event.candidate.candidate
      });
    } else {
      console.log('End of candidates.');
    }
  };

  if (isInitiator) {
    console.log('Creating Data Channel');
    dataChannel = peerConn.createDataChannel("screen");
    onDataChannelCreated(dataChannel);
    console.log('Creating an offer');
    peerConn.createOffer(onLocalSessionCreated, logError);
  } else {
    peerConn.ondatachannel = function (event) {
      console.log('ondatachannel:', event.channel);
      dataChannel = event.channel;
      onDataChannelCreated(dataChannel);
    };
  }
}
It works for strings or JSON, i.e. dataChannel.send('Hello');
I have created a wiki page for the same: wiki
Please help.
Answer
Please try something like this (explanation at the end of the code):
var btnShareYourCamera = document.querySelector('#share-your-camera');
var localVideo = document.querySelector('#local-video');
var remoteVideo = document.querySelector('#remote-video');

var websocket = new WebSocket('wss://path-to-server:port/');
websocket.onmessage = function(event) {
  var data = JSON.parse(event.data);
  if (data.sdp) {
    if (data.sdp.type === 'offer') {
      getUserMedia(function(video_stream) {
        localVideo.srcObject = video_stream;
        answererPeer(new RTCSessionDescription(data.sdp), video_stream);
      });
    }
    if (data.sdp.type === 'answer') {
      offerer.setRemoteDescription(new RTCSessionDescription(data.sdp));
    }
  }
  if (data.candidate) {
    addIceCandidate((offerer || answerer), new RTCIceCandidate(data.candidate));
  }
};
var iceTransportPolicy = 'all';
var iceTransportLimitation = 'udp';

function addIceCandidate(peer, candidate) {
  if (iceTransportLimitation === 'tcp') {
    if (candidate.candidate.toLowerCase().indexOf('tcp') === -1) {
      return; // ignore UDP
    }
  }
  peer.addIceCandidate(candidate);
}
var offerer, answerer;

var iceServers = {
  iceServers: [{
    'urls': [
      'stun:stun.l.google.com:19302',
      'stun:stun1.l.google.com:19302',
      'stun:stun2.l.google.com:19302',
      'stun:stun.l.google.com:19302?transport=udp',
    ]
  }],
  iceTransportPolicy: iceTransportPolicy,
  rtcpMuxPolicy: 'require',
  bundlePolicy: 'max-bundle'
};

// https://cdn.webrtc-experiment.com/IceServersHandler.js
if (typeof IceServersHandler !== 'undefined') {
  iceServers.iceServers = IceServersHandler.getIceServers();
}
var mediaConstraints = {
  offerToReceiveAudio: true,
  offerToReceiveVideo: true
};
/* offerer */
function offererPeer(video_stream) {
  offerer = new RTCPeerConnection(iceServers);
  offerer.idx = 1;

  video_stream.getTracks().forEach(function(track) {
    offerer.addTrack(track, video_stream);
  });

  offerer.ontrack = function(event) {
    remoteVideo.srcObject = event.streams[0];
  };

  offerer.onicecandidate = function(event) {
    if (!event || !event.candidate) return;
    websocket.send(JSON.stringify({
      candidate: event.candidate
    }));
  };

  offerer.createOffer(mediaConstraints).then(function(offer) {
    offerer.setLocalDescription(offer).then(function() {
      websocket.send(JSON.stringify({
        sdp: offer
      }));
    });
  });
}
/* answerer */
function answererPeer(offer, video_stream) {
  answerer = new RTCPeerConnection(iceServers);
  answerer.idx = 2;

  video_stream.getTracks().forEach(function(track) {
    answerer.addTrack(track, video_stream);
  });

  answerer.ontrack = function(event) {
    remoteVideo.srcObject = event.streams[0];
  };

  answerer.onicecandidate = function(event) {
    if (!event || !event.candidate) return;
    websocket.send(JSON.stringify({
      candidate: event.candidate
    }));
  };

  answerer.setRemoteDescription(offer).then(function() {
    answerer.createAnswer(mediaConstraints).then(function(answer) {
      answerer.setLocalDescription(answer).then(function() {
        websocket.send(JSON.stringify({
          sdp: answer
        }));
      });
    });
  });
}
var video_constraints = {
  mandatory: {},
  optional: []
};

function getUserMedia(successCallback) {
  function errorCallback(e) {
    alert(JSON.stringify(e, null, ' '));
  }
  var mediaConstraints = {
    video: true,
    audio: true
  };
  navigator.mediaDevices.getUserMedia(mediaConstraints).then(successCallback).catch(errorCallback);
}
btnShareYourCamera.onclick = function() {
  getUserMedia(function(video_stream) {
    localVideo.srcObject = video_stream;
    offererPeer(video_stream);
  });
};
- You must attach the stream using peer.addTrack, as you can see in the example above.
- You must receive the remote stream using peer.ontrack, as you can see in the example above.
That is: use addTrack to attach your camera, and use ontrack to receive the remote camera.
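Since the original question is about desktop capture, note that a screen stream comes from navigator.mediaDevices.getDisplayMedia() rather than getUserMedia(); once you have the MediaStream, it is attached to the peer connection in exactly the same way. A minimal sketch, assuming the offerer/answerer setup above (the helper names attachStreamTracks and shareScreen are illustrative, not part of any API):

```javascript
// Attach every track of a MediaStream to a peer connection.
// Works identically for camera streams (getUserMedia) and
// screen-capture streams (getDisplayMedia).
function attachStreamTracks(pc, stream) {
  return stream.getTracks().map(function (track) {
    return pc.addTrack(track, stream);
  });
}

// Capture the desktop and attach it; getDisplayMedia() prompts the
// user to pick a screen or window (browser-only API).
function shareScreen(pc) {
  return navigator.mediaDevices.getDisplayMedia({ video: true })
    .then(function (screenStream) {
      attachStreamTracks(pc, screenStream);
      return screenStream;
    });
}
```

The remote side receives the screen stream through the same ontrack handler shown above; nothing about the receiving code changes.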
You must never send your stream using dataChannel.send. The two are totally different protocols: a MediaStream must be shared over RTP, not SCTP. RTP is used only when you attach your camera stream by calling peer.addTrack.
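The data channel is still the right transport for text and JSON control messages (chat, metadata, and so on). A small sketch of what legitimately goes over SCTP, using illustrative helper names that are not part of any API:

```javascript
// Data channels carry strings, Blobs, and ArrayBuffers over SCTP;
// serialize plain objects to JSON before sending.
function sendOverDataChannel(channel, payload) {
  channel.send(typeof payload === 'string' ? payload : JSON.stringify(payload));
}

// Parse a received data-channel message back into an object,
// falling back to the raw string when it is not JSON.
function parseDataChannelMessage(data) {
  try {
    return JSON.parse(data);
  } catch (e) {
    return data;
  }
}
```

Passing a MediaStream to channel.send only stringifies it to "[object MediaStream]", which is exactly the symptom described in the question.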
This process happens before you open or join a room.
See the single-page demo here: https://www.webrtc-experiment.com/getStats/
HTML for the above code snippet:
<button id="share-your-camera"></button>
<video id="local-video" controls autoplay playsinline></video>
<video id="remote-video" controls autoplay playsinline></video>