How can I write OpenCV frames to an RTSP stream with GStreamer and C++?


Problem Description


I'm trying to take a video frame into OpenCV, do some processing on it (to be exact, aruco detection) and then package the resultant frame into a RTSP stream with GStreamer.


I've seen a Python solution to this problem, but I'm having trouble translating it to C++.


Here's my attempt at recreating the SensorFactory class:

#include <glib-object.h>
#include <iostream>
#include "SensorFactory.h"

SensorFactory::SensorFactory(std::string launch) {
    launchString = launch;
    cap = cv::VideoCapture(0);
    // should be incremented once on each frame for timestamping
    numberFrames = 0;

    // simple struct with only the cap (cv::VideoCapture*), lastFrame (cv::Mat*)
    // and numberFrames (int*) fields; fill the member, not a local copy
    cvData.cap = &cap;
    cvData.lastFrame = &lastFrame;
    cvData.numberFrames = &numberFrames;
}

GstFlowReturn SensorFactory::on_need_data(GstElement *src, CVData *datum) {
    if (datum->cap->isOpened()) {
        if (datum->cap->read(*(datum->lastFrame))) {
            cv::Mat *frame = datum->lastFrame;
            gsize size = frame->total() * frame->elemSize();
            GstBuffer *buf = gst_buffer_new_allocate(nullptr, size, nullptr);
            gst_buffer_fill(buf, 0, frame->data, size);
            buf->duration = static_cast<GstClockTime>(duration);
            GstClockTimeDiff timestamp = *(datum->numberFrames) * duration;
            buf->pts = buf->dts = static_cast<GstClockTime>(timestamp);
            buf->offset = static_cast<guint64>(timestamp);
            int *numf = datum->numberFrames;
            *numf += 1;
            g_signal_emit_by_name(src, "push-buffer", buf);
            gst_buffer_unref(buf);
            return GST_FLOW_OK;
        }
    }
    // reached if the capture is closed or the read fails
    return GST_FLOW_NOT_LINKED;
}

GstElement *SensorFactory::create_element(const GstRTSPUrl *url) { return gst_parse_launch(launchString.c_str(), nullptr); }

void SensorFactory::configure(GstRTSPMedia *rtspMedia) {
    numberFrames = 0;
    GstElement *appsrc;
    appsrc = gst_rtsp_media_get_element(rtspMedia);
    g_signal_connect(appsrc, "need-data", (GCallback) on_need_data, &cvData);
}


The header for SensorFactory is nothing special:

#include <gst/rtsp-server/rtsp-media-factory.h>
#include <gst/rtsp-server/rtsp-media.h>
#include <gst/app/gstappsrc.h>
#include <opencv2/videoio.hpp>

class SensorFactory : public GstRTSPMediaFactory {
public:
    typedef struct _CVData {
        cv::VideoCapture *cap;
        cv::Mat *lastFrame;
        int *numberFrames;
    } CVData;

    CVData cvData;
    std::string launchString;
    cv::VideoCapture cap;
    cv::Mat lastFrame;

    int numberFrames = 0;
    const static int framerate = 30;
    const static GstClockTimeDiff duration = GST_SECOND / framerate; // division last: 1 / framerate would truncate to 0

    explicit SensorFactory(std::string launch);

    static GstFlowReturn on_need_data(GstElement *src, CVData *datum);

    GstElement *create_element(const GstRTSPUrl *url);

    void configure(GstRTSPMedia *media);
};

Then main.cpp looks like this:

#include <gst/gst.h>
#include "src/SensorFactory.h"

int main() {
    gst_init(nullptr, nullptr);

    GstRTSPServer *server;
    server = gst_rtsp_server_new();

    SensorFactory sensorFactory("appsrc name=source is-live=true block=true format=GST_FORMAT_TIME "
                                "caps=video/x-raw,format=BGR ! "
                                "videoconvert ! video/x-raw,format=I420 ! "
                                "x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay name=pay0");

    g_print("setting shared\n");
    gst_rtsp_media_factory_set_shared(&sensorFactory, true);
    g_print("set shared\n");
    GstRTSPMountPoints *mounts;
    mounts = gst_rtsp_server_get_mount_points(server);
    gst_rtsp_mount_points_add_factory(mounts, "/test", &sensorFactory);

    GMainLoop *loop;
    loop = g_main_loop_new(nullptr, false);
    g_main_loop_run(loop);
}


The program compiles fine, and will even start running, but segfaults on gst_rtsp_media_factory_set_shared(&sensorFactory, true);. There isn't any other hacky memory management in this program.

Answer


Here is an alternative approach. Separate your SensorFactory from the RTSP code for now.

Launch your SensorFactory with the pipeline:

appsrc name=source is-live=true block=true format=GST_FORMAT_TIME caps=video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast tune=zerolatency ! udpsink port=5050


We end that pipeline by sending the H.264 output to a udpsink on port 5050.


Then compile the GStreamer RTSP server example linked here, and launch it with the pipeline:

./test-launch "( udpsrc port=5050 ! rtph264pay name=pay0 pt=96 )"


Assuming your SensorFactory works as you intend, this should give you an RTSP stream served at rtsp://localhost:8554/test.
