Render OpenGL scene in Qt and stream it to HTML5 interface


Problem description

I was wondering whether it is possible or not to render an OpenGL scene in Qt and stream it to an HTML5 interface in real time (I mean by that that the scene is generated on the spot).

I have been trying to find information about that and how to do it but I was not successful...

If it exists, is there any kind of existing mechanism to compress the image and optimize the bandwidth use? I am thinking of a solution along the lines of Citrix, but with an HTML5 client.

Solution

This answer explains how this task can be accomplished with OpenGL, Qt and GStreamer. But before I start, there are 2 issues that need to be addressed right away:

  • Streaming video to HTML5 is still problematic. I suggest using Ogg for encoding since it's better supported by modern browsers than H.264;
  • Encoding the video and streaming it over HTTP is quite a challenge without third-party libraries to help you. Take a good look at GStreamer (a cross-platform library for handling multimedia). It's what I use here to encode and stream frames from OpenGL's framebuffer;

What does a roadmap to implement something like this look like?

Start by capturing frames from the framebuffer. There are different methods that can be used for this purpose, and googling for opengl offscreen rendering will return several interesting posts and documents. I will not get into technical details since this subject has been covered extensively, but for educational purposes I'm sharing the code below to demonstrate how to retrieve a frame and save it as a jpg on the disk:

// GLWidget is a class based on QGLWidget.
void GLWidget::paintGL()
{
    /* Setup FBO and RBO */

    // Generate the FBO before binding it (binding an ungenerated id is invalid)
    glGenFramebuffersEXT(1, &_fb);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fb);

    glGenRenderbuffersEXT(1, &_color_rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, _color_rb);

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);
    // GL_BGRA is not a valid renderbuffer internal format; allocate GL_RGBA8
    // storage and read the pixels back as GL_BGRA below instead.
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, viewport[2], viewport[3]);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, _color_rb);

    /* Draw the scene (with transparency) */

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glLoadIdentity();
    glTranslatef(-2.0f, 0.0f, -7.0f);
    glRotatef(45, 1.0f, 1.0f, 0.0f);
    _draw_cube();

    glLoadIdentity();
    glTranslatef(2.0f, 0.0f, -7.0f);
    glRotatef(30, 0.5f, 1.0f, 0.5f);
    _draw_cube();

    glFlush();

    /* Retrieve pixels from the framebuffer */

    int imgsize = viewport[2] * viewport[3];
    std::cout << "* Viewport size: " << viewport[2] << "x" << viewport[3] << std::endl;

    glPixelStorei(GL_PACK_ALIGNMENT, 1); // glReadPixels() is governed by the pack alignment
    glReadBuffer(GL_COLOR_ATTACHMENT0);

    unsigned char* pixels = new unsigned char[sizeof(unsigned char) * imgsize * 4];
    glReadPixels(0, 0, viewport[2], viewport[3], GL_BGRA, GL_UNSIGNED_BYTE, pixels);

    // Use fwrite to dump data:
    FILE* fp = fopen("dumped.bin", "wb"); // binary mode, important on Windows
    fwrite(pixels, sizeof(unsigned char) * imgsize * 4, 1, fp);
    fclose(fp);

    // or use QImage to encode the raw data to jpg:
    QImage image((const unsigned char*)pixels, viewport[2], viewport[3], QImage::Format_RGB32);
    QImage flipped = image.mirrored();
    flipped.save("output2.jpg");

    // Disable FBO and RBO
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

    // Delete resources
    glDeleteRenderbuffersEXT(1, &_color_rb);
    glDeleteFramebuffersEXT(1, &_fb);
    delete[] pixels;
}

A QImage is used to convert the raw GL_BGRA frame to a jpg file, and the _draw_cube() method simply draws a colored cube with transparency.

The next step is to encode the frame and stream it over HTTP. However, you probably don't want to have to save every single frame from the framebuffer to the disk before being able to stream it. And you are right, you don't have to! GStreamer provides a C API that you can use in your application to perform the same operations that gst-launch (introduced below) does; a sketch of that approach is shown at the end of this answer. There's even a Qt wrapper for this library named QtGstreamer to make things even easier.

GStreamer 1.0 provides a command-line application named gst-launch-1.0 that can be used to test its features before you jump into coding. Developers usually play with it to assemble the pipeline of instructions that makes the magic happen before starting to code.

The following command shows how it can be used to decode a jpg, encode it to Ogg/Theora, and stream that single image over HTTP in a way that an HTML5 page can play it:

gst-launch-1.0.exe -v filesrc location=output.jpg ! decodebin ! imagefreeze ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080

The third and last step is to open an HTML5 page that was crafted to display the stream. This step must be executed while gst-launch is running, so copy and paste the code below into a file and open that page in your browser (I tested this on Chrome). The page connects to localhost on port 8080 and starts receiving the stream. You might have noticed that the gst-launch pipeline overlays a clock on top of the original image.

<html>
  <head>
    <title>A simple HTML5 video test</title>
  </head>
  <body>
    <video autoplay controls width=320 height=240>
      <source src="http://localhost:8080" type="video/ogg">
      Your browser doesn't support the <code>video</code> element.
    </video>
  </body>
</html>

I'm just trying to figure out exactly how GStreamer can convert a raw BGRA frame to jpg (or other formats) before it is streamed.

Update:

Problem solved! It's possible to encode a raw BGRA frame to jpg or Ogg and stream it directly without creating intermediate files on the disk. I took the liberty of setting the FPS limit at 15 and also decreased the standard quality of theoraenc by 50%:

gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! queue ! tcpserversink host=127.0.0.1 port=8080 sync-method=2

There are a few operations in this pipeline that you don't really need. Nevertheless, some of the things you can do to optimize bandwidth are scaling the frame to a smaller size (400x300), lowering the FPS, decreasing the quality of the encoded frames, and so on:

gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! videoscale ! video/x-raw,width=400,height=300 ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! tcpserversink host=127.0.0.1 port=8080 sync-method=2
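For completeness, below is a minimal, untested sketch (not part of the original answer) of how those frames could be fed to such a pipeline from inside the Qt application itself, using GStreamer's C API and an appsrc element in place of filesrc, so nothing is ever written to dumped.bin. The element name glsrc, the helper functions streamer_start()/streamer_push_frame() and the 800x600 BGRA caps at 15 fps are my own assumptions, chosen to match the examples above:

// Hedged sketch (not from the original answer): pushing the BGRA frames grabbed
// in paintGL() into a GStreamer pipeline through an appsrc element instead of
// reading dumped.bin with filesrc. Element name, helper names and caps are
// assumptions; error handling is kept to a minimum.
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static GstElement* pipeline = NULL;
static GstElement* appsrc   = NULL;

// Build and start the pipeline once, e.g. during application startup.
void streamer_start(int* argc, char*** argv)
{
    gst_init(argc, argv);

    GError* error = NULL;
    pipeline = gst_parse_launch(
        "appsrc name=glsrc is-live=true do-timestamp=true format=time "
        "caps=\"video/x-raw,format=BGRA,width=800,height=600,framerate=15/1\" "
        "! videoconvert ! videoflip method=vertical-flip "
        "! theoraenc quality=24 ! oggmux "
        "! tcpserversink host=127.0.0.1 port=8080 sync-method=2",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_clear_error(&error);
        return;
    }

    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "glsrc");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}

// Call this from paintGL() with the buffer filled by glReadPixels().
void streamer_push_frame(const unsigned char* pixels, gsize size)
{
    GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, pixels, size);
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer); // takes ownership of the buffer
}

A build of this sketch would link against gstreamer-1.0 and gstreamer-app-1.0 (e.g. via pkg-config), and gst-launch remains handy for prototyping the pipeline string; everything on the HTML5 side stays exactly the same.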
