Libstreaming errors (decoder buffer not big enough, decoder did not decode anything)

Question

I'm trying to use the libstreaming library from here: https://github.com/fyhertz/libstreaming

I'm following example 2 from here: https://github.com/fyhertz/libstreaming-examples

Trying to use this streaming library on a Galaxy Nexus.

If I use a small resolution (new VideoQuality(128,96,20,500000)), I get an error that the decoder did not decode anything:

06-09 19:59:31.531: D/libEGL(8198): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:59:31.632: D/OpenGLRenderer(8198): Enabling debug mode 0
06-09 19:59:33.773: D/MainActivity(8198): Start
06-09 19:59:33.773: D/MainActivity(8198): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{420285e0 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:59:33.789: I/dalvikvm(8198): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:59:33.789: W/dalvikvm(8198): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:59:33.789: D/dalvikvm(8198): VFY: replacing opcode 0x6e at 0x005e
06-09 19:59:33.789: I/MediaStream(8198): Phone supports the MediaCoded API
06-09 19:59:33.843: D/dalvikvm(8198): GC_CONCURRENT freed 65K, 2% free 9075K/9168K, paused 4ms+2ms, total 34ms
06-09 19:59:33.843: D/dalvikvm(8198): WAIT_FOR_CONCURRENT_GC blocked 15ms
06-09 19:59:34.750: V/VideoQuality(8198): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:59:34.750: V/VideoQuality(8198): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:59:35.140: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.171: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.179: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.211: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.242: W/ACodec(8198): Use baseline profile instead of 8 for AVC recording
06-09 19:59:35.242: I/ACodec(8198): setupVideoEncoder succeeded
06-09 19:59:35.515: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.515: E/OMXNodeInstance(8198): OMX_GetExtensionIndex failed
06-09 19:59:36.359: D/dalvikvm(8198): GC_CONCURRENT freed 156K, 3% free 9356K/9552K, paused 4ms+5ms, total 25ms
06-09 19:59:38.531: W/System.err(8198): java.lang.RuntimeException: The decoder did not decode anything.
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:799)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:59:38.546: W/System.err(8198):     at android.os.Looper.loop(Looper.java:137)
06-09 19:59:38.546: W/System.err(8198):     at android.os.HandlerThread.run(HandlerThread.java:60)

If I try with a bigger resolution (new VideoQuality(640,480,20,500000)), it complains that the decoder input buffer is not big enough:

06-09 19:51:51.054: D/libEGL(8096): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:51:51.062: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:51:51.070: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:51:51.164: D/OpenGLRenderer(8096): Enabling debug mode 0
06-09 19:51:53.054: D/MainActivity(8096): Start
06-09 19:51:53.054: D/MainActivity(8096): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{42031b00 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:51:53.062: I/dalvikvm(8096): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:51:53.062: W/dalvikvm(8096): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:51:53.062: D/dalvikvm(8096): VFY: replacing opcode 0x6e at 0x005e
06-09 19:51:53.070: I/MediaStream(8096): Phone supports the MediaCoded API
06-09 19:51:53.132: D/dalvikvm(8096): GC_CONCURRENT freed 103K, 2% free 9038K/9168K, paused 4ms+3ms, total 42ms
06-09 19:51:53.132: D/dalvikvm(8096): WAIT_FOR_CONCURRENT_GC blocked 28ms
06-09 19:51:54.039: V/VideoQuality(8096): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:51:54.039: V/VideoQuality(8096): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:51:54.468: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.500: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.515: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.554: D/dalvikvm(8096): GC_FOR_ALLOC freed 106K, 2% free 9210K/9344K, paused 18ms, total 18ms
06-09 19:51:54.554: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.458MB for 460816-byte allocation
06-09 19:51:54.578: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 9660K/9796K, paused 22ms, total 22ms
06-09 19:51:54.593: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 9660K/9796K, paused 3ms+2ms, total 20ms
06-09 19:51:54.656: D/dalvikvm(8096): GC_FOR_ALLOC freed <1K, 2% free 9660K/9796K, paused 13ms, total 13ms
06-09 19:51:54.656: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.897MB for 460816-byte allocation
06-09 19:51:54.671: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 10110K/10248K, paused 16ms, total 16ms
06-09 19:51:54.679: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.687: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 10110K/10248K, paused 2ms+1ms, total 13ms
06-09 19:51:54.703: W/ACodec(8096): Use baseline profile instead of 8 for AVC recording
06-09 19:51:54.703: I/ACodec(8096): setupVideoEncoder succeeded
06-09 19:51:55.257: D/dalvikvm(8096): GC_CONCURRENT freed 2K, 1% free 10501K/10576K, paused 4ms+2ms, total 32ms
06-09 19:51:55.359: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:55.359: E/OMXNodeInstance(8096): OMX_GetExtensionIndex failed
06-09 19:51:56.187: W/System.err(8096): java.lang.IllegalStateException: The decoder input buffer is not big enough (nal=91280, capacity=65536).
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.check(EncoderDebugger.java:838)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:753)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Looper.loop(Looper.java:137)
06-09 19:51:56.187: W/System.err(8096):     at android.os.HandlerThread.run(HandlerThread.java:60)

I've tried dozens of different combinations for resolution, framerate, and bitrate. Everything I try results in either "The decoder did not decode anything" or "The decoder input buffer is not big enough."

Does anyone have this library working out of the box? What are the causes of these errors and what are the solutions? If my search results are any indication, I seem to be the only person in the world having this problem. I appreciate any insight!

Here is the code from my MainActivity.java:

package com.cornet.cornetspydroid2;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.video.VideoQuality;
import android.app.Activity;
import android.app.Fragment;
import android.content.pm.ActivityInfo;
import android.os.Bundle;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceHolder;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;

public class MainActivity extends Activity implements Session.Callback, SurfaceHolder.Callback {

    private static final String TAG = "MainActivity";

    private static final String ip = "10.3.1.204";
    private static final VideoQuality VIDEO_QUALITY = new VideoQuality(128,96,20,500000);

    private Session mSession;
    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {

        super.onCreate(savedInstanceState);

        if (savedInstanceState == null) {
            getFragmentManager().beginTransaction().add(R.id.container, new PlaceholderFragment()).commit();
        }

        setContentView(R.layout.activity_main);

        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

    }

    public void start(View view) {

        if (mSession != null && mSession.isStreaming()) {
            Log.d(TAG, "Already streaming!");
            return;
        }

        Log.d(TAG, "Start");

        mSurfaceView = (SurfaceView)findViewById(R.id.surface);

        mSession = SessionBuilder.getInstance()
            .setCallback(this)
            .setSurfaceView(mSurfaceView)
            .setPreviewOrientation(90)
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_NONE)
            .setAudioQuality(new AudioQuality(16000, 32000))
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .setVideoQuality(VIDEO_QUALITY)
            .setDestination(ip)
        .build();

        mSurfaceView.getHolder().addCallback(this);

        if (!mSession.isStreaming()) {
            mSession.configure();
        }

    }

    public void stop(View view) {

        Log.d(TAG, "Stop");

        if (mSession != null) {
            mSession.stop();
        }

        if (mSurfaceView != null) {
            mSurfaceView.getHolder().removeCallback(this);
        }

    }

    @Override
    public void onDestroy() {

        super.onDestroy();

        if (mSession != null) {
            mSession.release();
        }

    }

    @Override
    public void onPreviewStarted() {
        Log.d(TAG,"Preview started.");
    }

    @Override
    public void onSessionConfigured() {
        Log.d(TAG,"Preview configured.");
        // Once the stream is configured, you can get an SDP-formatted session description
        // that you can send to the receiver of the stream.
        // For example, to receive the stream in VLC, store the session description in a .sdp file
        // and open it with VLC while streaming.
        Log.d(TAG, mSession.getSessionDescription());
        mSession.start();
    }

    @Override
    public void onSessionStarted() {
        Log.d(TAG,"Session started.");
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
        Log.d(TAG,"Bitrate: "+bitrate);
    }

    @Override
    public void onSessionError(int message, int streamType, Exception e) {
        if (e != null) {
            Log.e(TAG, e.getMessage(), e);
        }
    }

    @Override
    public void onSessionStopped() {
        Log.d(TAG,"Session stopped.");
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mSession.startPreview();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mSession.stop();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {

        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();
        if (id == R.id.action_settings) {
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container,
                    false);
            return rootView;
        }
    }

}

UPDATE: The MediaStream class in this library has a static initializer that looks for a class named "android.media.MediaCodec". When I force it to use sSuggestedMode = MODE_MEDIARECORDER_API instead of the MediaCodec mode, there are no errors regardless of the resolution I choose, and Wireshark sees packets flowing from the phone. But for some reason VLC cannot play this video stream (udp/h264://@10.3.1.204:16420). This seems to indicate that the resolution I choose is not the problem; at least not directly.
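
A plain udp:// address is usually not enough for VLC to play a raw RTP/H.264 stream; it generally needs the session description. Below is a minimal sketch of a variant of the onSessionConfigured() callback above that writes the output of mSession.getSessionDescription() to a .sdp file you can copy to the receiving machine and open in VLC. The file name and storage location are arbitrary choices, not anything libstreaming requires.

@Override
public void onSessionConfigured() {
    Log.d(TAG, "Preview configured.");
    try {
        // Write the session description to a .sdp file; open that file in VLC
        // on the receiving machine while the phone is streaming.
        java.io.File sdp = new java.io.File(getExternalFilesDir(null), "stream.sdp");
        java.io.FileWriter writer = new java.io.FileWriter(sdp);
        writer.write(mSession.getSessionDescription());
        writer.close();
        Log.d(TAG, "SDP written to " + sdp.getAbsolutePath());
    } catch (java.io.IOException e) {
        Log.e(TAG, "Could not write the SDP file", e);
    }
    mSession.start();
}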

The errors are occurring in the Session.syncConfigure() call (it's not even getting to Session.start()). It is able to configure the audio stream successfully, but the call to Stream.configure() for the video stream is failing. The syncConfigure() call eventually makes its way to H264Stream.testMediaCodecAPI(), which makes a call to EncoderDebugger.debug(). That debug() method is throwing the two original errors: input buffer not big enough, or decoder did not decode anything.

Something that may be revealing (included in the original logs I provided): I always seem to get a debug message from the "dalvikvm" tag on startup: "Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2". Immediately after that log entry there is a warning, again from the "dalvikvm" tag: "VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;". Could this have anything to do with it? Why is it able to find the MediaCodec class from the Class.forName() call in MediaStream, yet later, when it tries to access a documented method of MediaCodec (createInputSurface), it warns that it can't find that method? My AndroidManifest.xml files (in the main project and the libstreaming library project) both specify min SDK 16 and target SDK 19. The MediaCodec class was added in API level 16, so I should not be getting these warnings. Does this indicate that I'm misconfigured somehow? Could these warnings be related to the problem I'm having?
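
For reference, MediaCodec itself was added in API 16, but createInputSurface() only appeared in API 18 (Android 4.3), so on a 4.1/4.2 device the verifier logs exactly this kind of warning the first time VideoStream is loaded: the class resolves, the method does not. A hypothetical snippet you could drop into onCreate() to confirm this on a given device (the log messages are my own):

// Hypothetical diagnostic: MediaCodec exists since API 16, but
// createInputSurface() was only added in API 18, which matches the VFY warning.
try {
    Class<?> codec = Class.forName("android.media.MediaCodec");
    Log.d(TAG, "Found class " + codec.getName());
    codec.getMethod("createInputSurface");
    Log.d(TAG, "createInputSurface() is available (API 18+)");
} catch (ClassNotFoundException e) {
    Log.d(TAG, "MediaCodec class not found (API < 16)");
} catch (NoSuchMethodException e) {
    Log.d(TAG, "createInputSurface() missing on API " + android.os.Build.VERSION.SDK_INT);
}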

Answer

Actually, createInputSurface() is not mandatory to use libstreaming; it is only needed when using the mode MODE_MEDIACODEC_API_2. Libstreaming should work on Android 4.1 and 4.2 as long as the MediaCodec API is functioning properly on the phone.

Explanation:

When using libstreaming on Android 4.1 and 4.2, you will indeed see the VM complain in the logs that createInputSurface() does not exist. That won't make the app crash, because this method is never called (unless you try to force MODE_MEDIACODEC_API_2 somehow).

Now let me explain how the "decoder input buffer is not big enough" error can happen.

When libstreaming is used with MODE_MEDIACODEC_API at a resolution that has never been used before on the user's phone, it first tries to see whether at least one encoder accessible through the MediaCodec API works properly at that resolution. To do that, it tries to encode and decode a short test video using every encoder and decoder available on the phone. The error you are mentioning happens when a decoder has not been able to decode the H264 stream produced by an encoder.
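
As an illustration of the enumeration step (this is not the library's actual test code), here is a sketch that lists every AVC encoder and decoder the MediaCodec API exposes on a device, using the pre-API-21 MediaCodecList calls that match the API levels discussed here:

// Illustration only: list the "video/avc" encoders and decoders available
// through the MediaCodec API (pre-API-21 enumeration style).
for (int i = 0; i < android.media.MediaCodecList.getCodecCount(); i++) {
    android.media.MediaCodecInfo info = android.media.MediaCodecList.getCodecInfoAt(i);
    for (String type : info.getSupportedTypes()) {
        if ("video/avc".equalsIgnoreCase(type)) {
            Log.d(TAG, (info.isEncoder() ? "Encoder: " : "Decoder: ") + info.getName());
        }
    }
}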

If that test completes and no valid encoder/decoder pair has been found, the resolution is considered unsupported on the phone. Libstreaming will then try to fall back to the MODE_MEDIARECORDER_API mode.
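
If you would rather skip the MediaCodec path entirely than patch the static initializer in MediaStream, the mode can also be requested per stream. A hedged sketch, assuming your checkout of libstreaming exposes Session.getVideoTrack() and MediaStream.setStreamingMethod() (verify against the sources you are using); call it after the session has been built:

// Assumption: getVideoTrack()/setStreamingMethod() exist in your copy of
// libstreaming; this asks the video stream to use the MediaRecorder path.
net.majorkernelpanic.streaming.MediaStream videoTrack = mSession.getVideoTrack();
if (videoTrack != null) {
    videoTrack.setStreamingMethod(
            net.majorkernelpanic.streaming.MediaStream.MODE_MEDIARECORDER_API);
}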

To find out what those "modes" actually are, just read the main page of the project; I explain all of this at greater length there.

Important things to understand about that test:

The result of that test is stored in a SharedPreference, so if you want to retry, you can clear your app's cache or change the boolean DEBUG to true in EncoderDebugger.java. Also, if the version of Android changes (after an update, for example), the test will be run again.
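
During development you can also wipe the stored test results programmatically instead of clearing the app cache from the system settings. A blunt sketch, assuming the results live in the app's default SharedPreferences (fine for a test app; if you store your own settings there, clear only the keys written by EncoderDebugger, whose names are in EncoderDebugger.java):

// Development helper: clear the default SharedPreferences so the
// encoder/decoder test is run again on the next Session.configure().
android.content.SharedPreferences prefs =
        android.preference.PreferenceManager.getDefaultSharedPreferences(this);
prefs.edit().clear().commit();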

The reason behind this test is that the MediaCodec API is buggy as hell; if you have already tried to use it, you probably know that already.

(This test is actually implemented in the EncoderDebugger class; you can check it out on GitHub.)

So what exactly happened on his phone?

Well, his phone does not pass the test with Android 4.2, but it does with Android 4.3. The MediaCodec API was patched on that phone in between.

There may still be a way to improve that test to make it work on the Galaxy Nexus with Android 4.2, though. For example, it currently only supports the following color formats (a sketch for checking what your device's encoders advertise follows the list):


  • COLOR_FormatYUV420SemiPlanar
  • COLOR_FormatYUV420PackedSemiPlanar
  • COLOR_TI_FormatYUV420PackedSemiPlanar
  • COLOR_FormatYUV420Planar
  • COLOR_FormatYUV420PackedPlanar
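
To see what your device's AVC encoders actually advertise, and whether any of the formats above are among them, you can query each encoder's capabilities. A sketch using the API-16-era calls (the constants listed above live in MediaCodecInfo.CodecCapabilities; the log tag is assumed to be in scope):

// Print the color formats advertised by every AVC encoder on the device,
// to compare against the formats the test knows how to feed.
for (int i = 0; i < android.media.MediaCodecList.getCodecCount(); i++) {
    android.media.MediaCodecInfo info = android.media.MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!"video/avc".equalsIgnoreCase(type)) continue;
        android.media.MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
        for (int colorFormat : caps.colorFormats) {
            Log.d(TAG, info.getName() + " supports color format 0x"
                    + Integer.toHexString(colorFormat));
        }
    }
}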

(Disclaimer: I wrote the lib.)
