How to receive continuous chunks of video as a blob array and set them on a video tag dynamically over WebSocket


Problem Description

I am trying to build my own broadcasting architecture. In this system I am using WebSocket to transfer data, since I know it is suitable for continuous data transfer.

In my system there is a Host who initiates a live webcam broadcast. I use MediaStreamRecorder.js, which records a chunk of video every 5 s and sends it to the server through the WebSocket as a blob array.

The server simply receives the data and sends it to all the clients connected in that session.

When a client connects, it receives the continuous 5 s chunks of video as blob arrays through the WebSocket.

My main problem is on the client side: how can I set the video blob array as the HTML video source dynamically every 5 seconds, so that it plays each 5 s chunk of video data?
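One way to feed arriving chunks into a single video element without swapping its `src` on every message is the Media Source Extensions (MSE) API. This is a sketch, not a tested drop-in for this exact setup: the function name is hypothetical, and the codec string must match what the recorder actually produces.

```javascript
// Sketch: append incoming WebM chunks to one <video> via Media Source
// Extensions, instead of replacing video.src every 5 s.
function attachStream(video, socket, mimeType) {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    // mimeType is an assumption, e.g. 'video/webm; codecs="vp8,opus"'.
    const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
    const queue = [];

    // SourceBuffer rejects appends while busy, so queue chunks and
    // flush one at a time on 'updateend'.
    const pump = () => {
      if (queue.length && !sourceBuffer.updating) {
        sourceBuffer.appendBuffer(queue.shift());
      }
    };
    sourceBuffer.addEventListener('updateend', pump);

    socket.binaryType = 'arraybuffer';
    socket.onmessage = (event) => {
      queue.push(new Uint8Array(event.data));
      pump();
    };
  });
}
```

With this approach playback is continuous: the browser keeps decoding from one buffer rather than reloading the element for every chunk, so the current position is never reset.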

I am using Glassfish 4.0 as the server, and JavaScript on the Host and Client side. Browser: Chrome.

Source Code:

package websocket1;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Set;

import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint(value = "/liveStreamMulticast")
public class LiveStreamMultiCast {
    private static final Set<Session> sessions = Collections.synchronizedSet(new HashSet<Session>());

    @OnOpen
    public void whenOpening(Session session) {
        // session.setMaxBinaryMessageBufferSize(1024*512); // 512 KB
        sessions.add(session);
        System.out.println("You are connected!");
        System.out.println("Total connections: " + sessions.size());

    }

    @OnMessage
    public void handleVideo(byte[] videoData, Session hostSession) {
        try {
            if (videoData != null) {
                sendVideo(videoData, hostSession);
            }
        } catch (Throwable e) {
            System.out.println("Error sending message: " + e.getMessage());
        }
    }


    @OnClose
    public void onClosing(Session session) {
        System.out.println("Goodbye!");
        sessions.remove(session);
    }

    private void sendVideo(byte[] videoData, Session hostSession) throws IOException {
        // Iterating over a synchronizedSet still requires manual
        // synchronization to avoid ConcurrentModificationException.
        synchronized (sessions) {
            for (Session session : sessions) {
                if (!session.equals(hostSession)) {
                    session.getBasicRemote().sendBinary(ByteBuffer.wrap(videoData));
                }
            }
        }
    }
}



host.html

<html>
<head>
    <title>Demo</title>
    <script type="text/javascript" src="js/required/mediastream.js"></script>
</head>
<body>

<video id="video" autoplay=""></video>

<button id="stopButton" onclick="stop()">Stop</button>
<script type="text/javascript">

var url = "ws://localhost:8080/LiveTraining3Demo/liveStreamMulticast"; // 8080/application_name/value_given_in_annotation

var socket = new WebSocket(url);
    var video = document.querySelector('video');

socket.onopen = function(){

    console.log("Connected to Server!!");

}
socket.onmessage = function(msg){
    console.log("Message come from server");

}
/////////////////////////////////
var wholeVideo =[];
var chunks = [];
var mediaRecorder;
//////////////////////////////////////

  function gotMedia(stream) {
    video.srcObject = stream;
    mediaRecorder = new MediaStreamRecorder(stream);
    console.log("mediaRecorderCalled");
    mediaRecorder.mimeType = 'video/webm';
    mediaRecorder.start(5000);//
    console.log("recorder started");

    mediaRecorder.ondataavailable = (event) => {
        chunks.push(event.data);
        console.log("push B");
        wholeVideo.push(event.data);
        console.log("WholeVideo Size:");
        sendData(); // note: setTimeout(sendData(), 5010) would invoke sendData immediately
    }



  }


  function sendData(){ 
    //var byteArray = new Uint8Array(recordedTemp);
    const superBuffer =  new Blob(chunks, {
        type: 'video/webm'
        });

     socket.send(superBuffer);
     console.log("Send Data");
      console.table(superBuffer);
      chunks = [];

  }


  navigator.getUserMedia  = navigator.getUserMedia || 
                                     navigator.webkitGetUserMedia ||
                                      navigator.mozGetUserMedia || 
                                       navigator.msGetUserMedia;

  navigator.mediaDevices.getUserMedia({video: true , audio: true})
      .then(gotMedia)
      .catch(e => { console.error('getUserMedia() failed: ' + e); });
    </script>

</body>
</html>



client.html

<html>
<head>

<title>Recieve Video</title>

</head>
<body>
<video id="video" autoplay controls loop
    style="width: 700px; height: 500px; margin: auto">
    <source src="" type="video/webm">
</video>
<script>
    var url = "ws://localhost:8080/LiveTraining3Demo/liveStreamMulticast"; // 8080/application_name/value_given_in_annotation
    var check = true;
    var socket = new WebSocket(url);
    var videoData = [];
    var superBuffer = null;
    //var videoUrl;

    //socket.binaryType = 'arraybuffer';
    socket.onopen = function() {
        console.log("Connected!!");

    }

    socket.onmessage = function(videoStream) {

        var video = document.querySelector('video');
        var videoUrl = window.URL.createObjectURL(videoStream.data);
        video.src = videoUrl;
        video.load();
        video.onloadeddata = function() {
            URL.revokeObjectURL(video.src);
            video.play();
        }
        //video.srcObject

        //video.play();

        console.table(videoStream);

    }
    socket.onerror = function(err) {
        console.log("Error: " + err);
    }
</script>
</body>
</html>



When I try to run it, everything else looks fine, but in client.html only the video tag is displayed and no video plays.

I have been working on this for a week. Some part of my implementation might be wrong. I also know about WebRTC and Mauz WebRTC Broadcast, but I didn't want to go through that complexity if there is a simpler way to do it. I would rather not use a node.js server, since I have to build this web application with Spring. Any idea is appreciated. Thanks in advance!

Recommended Answer

On the client side you will get an array buffer, so you need to convert each array buffer into a blob and accumulate the blobs in an array.

 let video = document.querySelector('video');
 let blobArray = [];

 socket.binaryType = 'arraybuffer'; // receive binary frames as ArrayBuffer
 socket.onmessage = (event) => {
  blobArray.push(new Blob([new Uint8Array(event.data)], {'type': 'video/webm'}));
  let currentTime = video.currentTime;
  let blob = new Blob(blobArray, {'type': 'video/webm'});
  video.src = window.URL.createObjectURL(blob);
  video.currentTime = currentTime; // resume where the previous Blob left off
  video.play();
 };
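The accumulation step in this answer can be isolated into a small helper so the growing Blob is easy to inspect (a sketch; `makeAccumulator` and `addChunk` are hypothetical names, not part of the original answer):

```javascript
// Accumulate binary chunks into one growing Blob.
// Each incoming ArrayBuffer becomes one Blob part; the concatenated
// Blob is what gets handed to URL.createObjectURL().
function makeAccumulator(mimeType) {
  const parts = [];
  return function addChunk(arrayBufferData) {
    parts.push(new Uint8Array(arrayBufferData));
    return new Blob(parts, { type: mimeType });
  };
}

// Example: a 3-byte chunk followed by a 2-byte chunk yields a 5-byte Blob.
const addChunk = makeAccumulator('video/webm');
addChunk(new Uint8Array([1, 2, 3]).buffer);
const blob = addChunk(new Uint8Array([4, 5]).buffer);
// blob.size → 5
```

The key point is that the first chunk, which carries the WebM initialization segment, is always kept at the front of the array, so the concatenated Blob remains a playable file as it grows.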
