FFmpeg Javacv - Latency Issue


Problem Description

I am using an Android v21 device to stream data to a JavaFX application. It's working fine, but I have about 2 seconds of latency.

As of now the basic transport pipeline goes like this:

  1. Android webrtc/custom implementation: 16 ms
  2. Android packetizer (udp): 6 ms
  3. udp transport: assumed < 5 ms
  4. Windows depacketizer: no data buildup in buffers
  5. Windows ffmpeg framegrabber: unknown latency
  6. JavaFX ImageView: < 1 ms

My data stream to my desktop and my packetizer are much faster than my frame rate, and the packetizer is often just waiting. There is no buildup of data anywhere else, and therefore I assume none of my code is adding delay.

I tested my Android device by writing the YUV from the camera to a texture and timing how long the device takes to encode the frame into h264 and then how long until it is sent. So 16 + 6 = 22 ms.
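The per-stage numbers above come from simple wall-clock timing. As a minimal sketch of that kind of measurement (the class and method names here are hypothetical, not from the original code), one can lap a `System.nanoTime()` mark at each pipeline stage:

```java
public class StageTimer {
    private long mark = System.nanoTime();

    /** Milliseconds elapsed since the previous lap; resets the mark. */
    public long lapMillis() {
        long now = System.nanoTime();
        long ms = (now - mark) / 1_000_000;
        mark = now;
        return ms;
    }

    public static void main(String[] args) throws InterruptedException {
        StageTimer t = new StageTimer();
        Thread.sleep(16);                 // stand-in for the h264 encode stage
        long encodeMs = t.lapMillis();
        Thread.sleep(6);                  // stand-in for packetize + send
        long sendMs = t.lapMillis();
        System.out.println("encode=" + encodeMs + "ms send=" + sendMs + "ms");
    }
}
```

Inserting one lap per stage is enough to show where the 2 seconds are actually spent, since the measured stages only account for ~22 ms.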

I feel the problem is with the Javacv ffmpeg FrameGrabber. I'm studying this API in order to learn why this is occurring.

My major concern is that the FrameGrabber takes forever to start... around 4 seconds.

Once it starts I can clearly see how many frames I insert and how many it is grabbing, and it always lags by some large number, such as 40 up to 200.

Also, FrameGrabber.grab() is blocking and runs every 100 ms to match my frame rate no matter how fast I tell it to run, so I can never catch up.
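When the decoder falls behind like this, one common workaround is to drain the backlog and render only the newest frame. This is a hedged sketch of that pattern, not from the original post; the `BlockingQueue` stands in for frames that `grabber.grab()` would produce on another thread:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class LatestFrameDrain {

    // Block for at least one frame, then discard any backlog so the
    // consumer always renders the newest frame instead of lagging 40-200
    // frames behind the producer.
    static <T> T drainToLatest(BlockingQueue<T> queue) throws InterruptedException {
        T latest = queue.take();
        T next;
        while ((next = queue.poll()) != null) {
            latest = next;
        }
        return latest;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Integer> frames = new ArrayBlockingQueue<>(16);
        for (int i = 0; i < 5; i++) {
            frames.put(i); // simulate the decoder running 5 frames ahead
        }
        System.out.println(drainToLatest(frames)); // prints 4: only the newest survives
    }
}
```

This trades dropped frames for latency; it does not fix the underlying buffering, which the accepted answer below addresses.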

Do you have any suggestions?

I'm starting to think javacv is not a viable solution, because it seems many people struggle with this delay issue. If you have alternate suggestions, please advise.

My ffmpeg FrameGrabber:

    public RapidDecoder(final InputStream inputStream, final ImageView view)
    {
    System.out.println(TAG + " starting");

     grabber = new FFmpegFrameGrabber(inputStream, 0);
     converter = new Java2DFrameConverter();
     mView = view;


    emptyBuffer = new Runnable() {
        @Override
        public void run() {
            System.out.println(TAG + " emptybuffer thread running");
            try {

                grabber.setFrameRate(12);
                grabber.setVideoBitrate(10000);

                //grabber.setOption("g", "2");
               // grabber.setOption("bufsize", "10000");
                //grabber.setOption("af", "delay 20");
                //grabber.setNumBuffers(0);
                //grabber.setOption("flush_packets", "1");
                //grabber.setOption("probsize", "32");
                //grabber.setOption("analyzeduration", "0");
                grabber.setOption("preset", "ultrafast");

                grabber.setOption("fflags", "nobuffer");
                //grabber.setVideoOption("nobuffer", "1");
                //grabber.setOption("fflags", "discardcorrupt");
                //grabber.setOption("framedrop", "\\");
               //grabber.setOption("flags","low_delay");
                grabber.setOption("strict","experimental");
                //grabber.setOption("avioflags", "direct");
                //grabber.setOption("filter:v", "fps=fps=30");
                grabber.setVideoOption("tune", "zerolatency");
                //grabber.setFrameNumber(60);


                grabber.start();
            }catch (Exception e)
            {
                System.out.println(TAG + e);
            }

            while (true)
            {

                try{
                    grabFrame();
                    Thread.sleep(1);
                }catch (Exception e)
                {
                    System.out.println(TAG + " emptybuffer " + e);
                }

            }



        }
    };

    display = new Runnable() {
        @Override
        public void run() {

            System.out.println(TAG + " display thread running ");

            while(true)
            {

                try{
                    displayImage();
                    Thread.sleep(10);
                }catch (Exception e)
                {
                    System.out.println(TAG + " display " + e);
                }

            }


        }
    };




}


public void generateVideo()
{
    System.out.println(TAG + " genvid ");




    new Thread(emptyBuffer).start();
    new Thread(display).start();



}



public synchronized void grabFrame() throws FrameGrabber.Exception
{
           //frame = grabber.grabFrame();
        frame = grabber.grab();
    //System.out.println("grab");


}

public synchronized void displayImage()
{


    bufferedImage = converter.convert(frame);
    frame = null;
    if (bufferedImage == null) return;
    mView.setImage(SwingFXUtils.toFXImage(bufferedImage, null));
    //System.out.println("display");
}

Here you can see I draw the texture with the image and send it to the h264 encoder:

@Override public void onTextureFrameCaptured(int width, int height, int texId, float[] tranformMatrix, int rotation, long timestamp) { //Log.d(TAG, "onTextureFrameCaptured: ->");

            VideoRenderer.I420Frame frame = new VideoRenderer.I420Frame(width, height, rotation, texId, tranformMatrix, 0,timestamp);
            avccEncoder.renderFrame(frame);
            videoView.renderFrame(frame);
            surfaceTextureHelper.returnTextureFrame();

        }

Here you can see the webrtc encoding happen:

 @Override
    public void renderFrame(VideoRenderer.I420Frame i420Frame) {
        start = System.nanoTime();
        bufferque++;

        mediaCodecHandler.post(new Runnable() {
            @Override
            public void run() {
                videoEncoder.encodeTexture(false, i420Frame.textureId, i420Frame.samplingMatrix, TimeUnit.NANOSECONDS.toMicros(i420Frame.timestamp));
            }
        });


    }

    /**
     * Called to retrieve an encoded frame
     */
    @Override
    public void onEncodedFrame(MediaCodecVideoEncoder.OutputBufferInfo frame, MediaCodec.BufferInfo bufferInfo) {

        b = new byte[frame.buffer().remaining()];
        frame.buffer().get(b);
        synchronized (lock)
        {
            encodedBuffer.add(b);
            lock.notifyAll();
            if(encodedBuffer.size() > 1)
            {
                Log.e(TAG, "drainEncoder: too big: " + encodedBuffer.size(),null );

            }
        }
        duration = System.nanoTime() - start;
        bufferque--;
        calcAverage();
        if (bufferque > 0)
        {
            Log.d(TAG, "onEncodedFrame: bufferque size: " + bufferque);
        }
    }

Answer

I edited my question above as I solved the problem over the course of a few days, but let me give details for those who may need them.

Android - I ended up using this library: https://github.com/Piasy/VideoCRE It tears the webrtc function open and allows you to encode video frame by frame. That's how I benchmarked the frames at 16 ms for encoding on an old, terrible phone.

javacv ffmpeg - The solution was a buffering issue in the C++ avcodec. To prove it, try feeding every frame in twice or 10 times instead of once. It cuts down latency by the same factor, although the feed becomes useless as well. It also reduces the startup time of the video feed. However, on line 926 of FFmpegFrameGrabber in the javacv code I set the thread count from (0) to (1), per this link: https://mailman.videolan.org/pipermail/x264-devel/2009-May/005880.html

thread_count = 0 directs x264 to use enough threads to load all your CPU cores during encode. So you probably run the tests on a dual-core machine (2 cores will have 3 threads). To get x264 to encode without delay, set thread_count = 1.
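Rather than patching FFmpegFrameGrabber's source at line 926, it may be possible to pass the equivalent codec option through javacv's public API. This is only a hedged sketch: `"threads"` is the standard ffmpeg codec option, but whether `setVideoOption` actually delivers it to the decoder depends on the javacv/FFmpeg versions in use, and the helper name here is hypothetical:

```java
import java.io.InputStream;

import org.bytedeco.javacv.FFmpegFrameGrabber;

public class SingleThreadDecode {
    // Sketch: request single-threaded decode so avcodec does not queue
    // frames across worker threads (the thread_count = 1 fix quoted above).
    static FFmpegFrameGrabber openLowLatency(InputStream in) throws Exception {
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(in, 0);
        grabber.setVideoOption("threads", "1");  // ffmpeg codec option; thread_count = 1
        grabber.setOption("fflags", "nobuffer"); // demuxer flag the question already uses
        grabber.start();
        return grabber;
    }
}
```

If the option is silently ignored (as several options below were), editing the source remains the reliable route.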

You may find countless suggestions for setting options through javacv; however, javacv never rejected the options I set, and I learned many times that I was affecting the wrong factors. Here is a list of the things I tried:

                //grabber.setFrameRate(12);
                //grabber.setVideoBitrate(10000);

                //grabber.setOption("g", "2");
               // grabber.setOption("bufsize", "10000");
                //grabber.setOption("af", "delay 20");
                //grabber.setNumBuffers(0);
                //grabber.setOption("flush_packets", "1");
                //grabber.setOption("probsize", "32");
                //grabber.setOption("analyzeduration", "0");
                //grabber.setOption("preset", "ultrafast");

                //grabber.setOption("fflags", "nobuffer");
                //grabber.setVideoOption("nobuffer", "1");
                //grabber.setOption("fflags", "discardcorrupt");
                //grabber.setOption("framedrop", "\\");
               //grabber.setOption("flags","low_delay");
                //grabber.setOption("strict","experimental");
                //grabber.setOption("avioflags", "direct");
                //grabber.setOption("filter:v", "fps=fps=30");
                //grabber.setOptions("look_ahead", "0");
                //Map options = new HashMap();
                //options.put("tune", "zerolatency");
                grabber.setVideoOption("look_ahead", "0");
                //grabber.setFrameNumber(60);

None of them worked, and as you read the documentation you will understand that when ffmpeg starts up there are different contexts (avcontext, videocontext, audiocontext) which take different values, and there are different APIs, framegrabber and ffplay, which take different flags (I believe), so throwing things at the wall is rather futile.

Try adding the extra frames to your stream first. Also, if you only need a single image, just add a null packet to your input stream and it will flush the buffer.

If you need to stream video for robotic vision, check out my blog post: http://cagneymoreau.com/stream-video-android/
