Asynchronous MFT is not sending MFTransformHaveOutput Event (Intel Hardware MJPEG Decoder MFT)


Problem Description

I'm developing a USB camera streaming desktop application using the Media Foundation SourceReader technique. The camera has USB 3.0 support and delivers the 1080p MJPG video format at 60 fps.
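
For context, the capture side looks roughly like the following minimal sketch, assuming the camera's IMFMediaSource has already been activated; the callback object, variable names, and the 1920x1080 @ 60 fps values are illustrative and error handling is abbreviated.

// Minimal sketch: create a SourceReader in asynchronous mode and select the
// camera's native MJPG 1920x1080 @ 60 fps media type.
IMFAttributes* pAttr = NULL;
IMFSourceReader* pReader = NULL;

HRESULT hr = MFCreateAttributes(&pAttr, 1);
if (SUCCEEDED(hr))
    hr = pAttr->SetUnknown(MF_SOURCE_READER_ASYNC_CALLBACK, pCallback);   // OnReadSample callback object
if (SUCCEEDED(hr))
    hr = MFCreateSourceReaderFromMediaSource(pSource, pAttr, &pReader);

// Walk the native types of the first video stream and pick MJPG 1080p60.
for (DWORD i = 0; SUCCEEDED(hr); ++i)
{
    IMFMediaType* pType = NULL;
    if (FAILED(pReader->GetNativeMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &pType)))
        break;

    GUID subtype = GUID_NULL;
    UINT32 width = 0, height = 0, frNum = 0, frDen = 1;
    pType->GetGUID(MF_MT_SUBTYPE, &subtype);
    MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &width, &height);
    MFGetAttributeRatio(pType, MF_MT_FRAME_RATE, &frNum, &frDen);

    if (subtype == MFVideoFormat_MJPG && width == 1920 && height == 1080 && frDen != 0 && frNum / frDen == 60)
    {
        hr = pReader->SetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pType);
        pType->Release();
        break;
    }
    pType->Release();
}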

I used the software MJPEG Decoder MFT to convert the MJPG frames to YUY2 and then converted those into RGB32 frames to draw on the window. With this software decoder I can only render 30 fps on the window instead of 60 fps. I had already posted a question on this site and got the suggestion to use the Intel Hardware MJPEG Decoder MFT to solve the frame-drop issue.
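
For the YUY2 to RGB32 step, one common option is the Color Converter DSP (CLSID_CColorConvertDMO), a synchronous MFT. The following is only a minimal sketch of its media-type setup; the 1920x1080 values are illustrative and error handling is abbreviated.

// Minimal sketch: configure the Color Converter DSP for YUY2 -> RGB32.
#include <wmcodecdsp.h>   // CLSID_CColorConvertDMO (link wmcodecdspuuid.lib)

IMFTransform* pColorConvert = NULL;
HRESULT hr = CoCreateInstance(CLSID_CColorConvertDMO, NULL, CLSCTX_INPROC_SERVER,
                              IID_PPV_ARGS(&pColorConvert));

IMFMediaType* pIn = NULL;
if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pIn);
if (SUCCEEDED(hr)) hr = pIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pIn->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_YUY2);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pIn, MF_MT_FRAME_SIZE, 1920, 1080);
if (SUCCEEDED(hr)) hr = pColorConvert->SetInputType(0, pIn, 0);

IMFMediaType* pOut = NULL;
if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pOut);
if (SUCCEEDED(hr)) hr = pOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pOut->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pOut, MF_MT_FRAME_SIZE, 1920, 1080);
if (SUCCEEDED(hr)) hr = pColorConvert->SetOutputType(0, pOut, 0);
// Each decoded YUY2 sample then goes through ProcessInput/ProcessOutput.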

To use this hardware MJPEG decoder, I followed the asynchronous MFT processing model and configured an asynchronous callback for IMFMediaEventGenerator through the IMFTransform interface.
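
For reference, the object that receives these callbacks is an IMFAsyncCallback implementation; a minimal skeleton of such a class looks like the sketch below. The class and member names are illustrative, and the Invoke body is the one shown later in this question.

// Minimal IMFAsyncCallback skeleton; Invoke() handles the MFT events.
class CMFTEventCallback : public IMFAsyncCallback
{
    long m_cRef = 1;
public:
    // IUnknown
    STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
    {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(IMFAsyncCallback))
        {
            *ppv = static_cast<IMFAsyncCallback*>(this);
            AddRef();
            return S_OK;
        }
        *ppv = NULL;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_cRef); }
    STDMETHODIMP_(ULONG) Release()
    {
        ULONG cRef = InterlockedDecrement(&m_cRef);
        if (cRef == 0) delete this;
        return cRef;
    }

    // IMFAsyncCallback
    STDMETHODIMP GetParameters(DWORD*, DWORD*) { return E_NOTIMPL; }  // use the default work queue
    STDMETHODIMP Invoke(IMFAsyncResult* pResult);                     // event handling shown below
};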

After calling ProcessMessage with MFT_MESSAGE_NOTIFY_START_OF_STREAM, I receive the METransformNeedInput event twice, but I never receive the METransformHaveOutput event from the MFT.
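
For context, the documented startup order for an asynchronous MFT negotiates media types first and then notifies streaming once, before any sample is fed. A minimal sketch, assuming pInputType and pOutputType describe the MJPG input and YUY2 output (those variables are illustrative):

// Minimal sketch of the async-MFT startup order (error handling abbreviated):
hr = pAttributes->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);   // unlock the async model

hr = m_pTransform->SetInputType(0, pInputType, 0);    // MJPG 1920x1080 @ 60 fps
hr = m_pTransform->SetOutputType(0, pOutputType, 0);  // YUY2 1920x1080 @ 60 fps

hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, 0);
hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0);

// From here on, call ProcessInput only after METransformNeedInput and
// ProcessOutput only after METransformHaveOutput.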

I have shared my code here for your reference:

IMFTransform* m_pTransform = NULL;

HRESULT EnumDecoderMFT ()
{
    HRESULT hr;
    IMFActivate** ppActivate = NULL;
    IMFAttributes* pAttributes = NULL;   // transform attribute store
    UINT32 numDecodersMJPG = 0;
    LPWSTR lpMFTName = NULL;

    MFT_REGISTER_TYPE_INFO inputFilter = {MFMediaType_Video,MFVideoFormat_MJPG};
    MFT_REGISTER_TYPE_INFO outputFilter = {MFMediaType_Video,MFVideoFormat_YUY2};

    UINT32 unFlags = MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_ASYNCMFT | MFT_ENUM_FLAG_LOCALMFT | MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER;

    hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER, unFlags, &inputFilter, &outputFilter, &ppActivate, &numDecodersMJPG);
    if (FAILED(hr)) return hr; 

    UINT32 cchName = 0;
    hr = ppActivate[0]->GetAllocatedString(MFT_FRIENDLY_NAME_Attribute, &lpMFTName, &cchName);
    if (FAILED(hr)) return hr;

    // Activate transform
    hr = ppActivate[0]->ActivateObject(__uuidof(IMFTransform), (void**)&m_pTransform);
    if (FAILED(hr)) return hr;

    hr = m_pTransform->GetAttributes(&pAttributes);
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);           
        if(FAILED(hr)) return hr;

        hr = pAttributes->SetUINT32(MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE,TRUE);        
        if(FAILED(hr)) return hr;

        hr = pAttributes->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);
        if(FAILED(hr)) return hr;

        hr = m_pTransform->QueryInterface(IID_IMFMediaEventGenerator,(void**)&m_pEventGenerator);           
        if(FAILED(hr)) return hr;

        hr = m_pEventGenerator->BeginGetEvent((IMFAsyncCallback*)this,NULL);
        if(FAILED(hr)) return hr;

        pAttributes->Release();
    }

    CoTaskMemFree(lpMFTName);

    SafeRelease(&ppActivate[0]);

    CoTaskMemFree(ppActivate);

    return hr;      
}

HRESULT Invoke(IMFAsyncResult *pResult)
{
    HRESULT hr = S_OK,hrStatus;
    MediaEventType meType = MEUnknown;  // Event type
    IMFMediaEvent *pEvent = NULL;

    // Get the event from the event queue.
    hr = m_pEventGenerator->EndGetEvent(pResult, &pEvent);      // Completes the asynchronous request for the next event in the queue.
    if(FAILED(hr)) goto done;

    // Get the event type.
    hr = pEvent->GetType(&meType);
    if(FAILED(hr)) goto done;

    hr = pEvent->GetStatus(&hrStatus);
    if(FAILED(hr)) goto done;

    if(SUCCEEDED(hrStatus))
    {
        if(meType == METransformNeedInput)
        {
            SetEvent(m_hNeedInputEvent);
        }
        else if(meType == METransformHaveOutput)
        {
            SetEvent(m_hHaveOutputEvent);
        }
        else if(meType == METransformDrainComplete)
        {
            hr = m_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_FLUSH,0);
            if(FAILED(hr)) goto done;
        }
        else if(meType == MEError)
        {
            // The event value carries the extended error code.
            PROPVARIANT pValue;
            PropVariantInit(&pValue);
            hr = pEvent->GetValue(&pValue);
            PropVariantClear(&pValue);
            if(FAILED(hr)) goto done;
        }

        // Request the next event so this callback keeps being invoked.
        hr = m_pEventGenerator->BeginGetEvent((IMFAsyncCallback*)this,NULL);
        if(FAILED(hr)) goto done;
    }

done:
    SafeRelease(&pEvent);
    return hr;
}

HRESULT CMFSourceReader::OnReadSample(
    HRESULT hrStatus,
    DWORD  dwStreamIndex ,
    DWORD  dwStreamFlags ,
    LONGLONG  llTimestamp ,
    IMFSample *pSample      // Can be NULL
    )
{
    HRESULT hr = S_OK;
    IMFMediaBuffer *pBuffer = NULL;
    DWORD dwcbTotLen = 0;           
    IMFSample *mftOutSample = NULL;

    EnterCriticalSection(&m_critsec);

    if (FAILED(hrStatus))
    {
        hr = hrStatus;
    }

    if (SUCCEEDED(hr))
    {
        if (pSample != NULL)
        {
            if(dwStreamIndex == 0)      //VideoStream
            {                   
                if(m_pTransform)
                {   
                    hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0);
                    if(FAILED(hr))  return hr;

                    m_dwWaitObj = WaitForSingleObject(m_hNeedInputEvent,INFINITE);
                    if(m_dwWaitObj == WAIT_OBJECT_0)
                    {                           
                        hr = ProcessInputSample(pSample);
                        if(FAILED(hr))  return hr;
                    }

                    m_dwWaitObj = WaitForSingleObject(m_hHaveOutputEvent,INFINITE);
                    if(m_dwWaitObj == WAIT_OBJECT_0)
                    {
                        hr = ProcessOutputSample(&mftOutSample);
                        if(FAILED(hr))  return hr;
                    }
                }
            }
        }
    }

    if(SUCCEEDED(hr))
    {
        if(m_pReader != NULL)
        {
            hr = m_pReader->ReadSample(
                (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                0,
                NULL,   // actual
                NULL,   // flags
                NULL,   // timestamp
                NULL    // sample
                );
            if(FAILED(hr)) return hr;
        }
    }

    SafeRelease(&mftOutSample);

    LeaveCriticalSection(&m_critsec);
    return hr; 
}

HRESULT ProcessOutputSample(IMFSample **pOutSample)
{
    HRESULT hr = S_OK;
    MFT_OUTPUT_DATA_BUFFER outputDataBuffer;
    DWORD processOutputStatus = 0,mftOutFlags = 0;
    MFT_OUTPUT_STREAM_INFO StreamInfo;
    IMFSample *mftOutSample = NULL;
    IMFMediaBuffer *pOutBuffer = NULL;

    if(m_pTransform != NULL)
    {   
        hr = m_pTransform->GetOutputStreamInfo(0, &StreamInfo);
        if(FAILED(hr)) return hr;

        DWORD status = 0;
        hr = m_pTransform->GetOutputStatus(&status);
        if (FAILED(hr)) return hr;

        hr = MFCreateSample(&mftOutSample);
        if(FAILED(hr)) return hr;

        hr = MFCreateMemoryBuffer(StreamInfo.cbSize, &pOutBuffer);
        if(FAILED(hr)) return hr;

        hr = mftOutSample->AddBuffer(pOutBuffer);
        if(FAILED(hr)) return hr;

        outputDataBuffer.dwStreamID = 0;
        outputDataBuffer.dwStatus = 0;
        outputDataBuffer.pEvents = NULL;
        outputDataBuffer.pSample = mftOutSample;

        hr = m_pTransform->ProcessOutput(0, 1, &outputDataBuffer, &processOutputStatus);            
        if(FAILED(hr)) return hr;

        hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM, 0);
        if (FAILED(hr))  return hr;

        hr = m_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_DRAIN, 0);
        if (FAILED(hr))  return hr;

        if(mftOutSample)
        {
            *pOutSample = mftOutSample;
            (*pOutSample)->AddRef();
        }

        ResetEvent(m_hHaveOutputEvent);
    }

    SafeRelease(&mftOutSample);
    SafeRelease(&pOutBuffer);

    return hr;
}

HRESULT ProcessInputSample(IMFSample *pInputSample)
{
    HRESULT hr;

    if(m_pTransform != NULL)
    {               
        hr = m_pTransform->ProcessInput(0, pInputSample, 0);
        if(FAILED(hr)) return hr;

        hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM,0);
        if(FAILED(hr)) return hr;

        ResetEvent(m_hNeedInputEvent);
    }

    return hr;
}

I also commented out the ProcessOutputSample() call in my code and checked: the MFT then continuously sends the METransformNeedInput event. After ProcessInput of a sample I called ProcessOutput anyway, but it returned an E_UNEXPECTED error. I read about this error on MSDN, which states that ProcessOutput on an asynchronous MFT must not be called without first receiving the METransformHaveOutput event.
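
For reference, a minimal sketch of the event-driven loop the asynchronous MFT model expects; the event handles follow the code above, while streaming, pNextSample, outputDataBuffer, and dwStatus are illustrative placeholders. Draining happens only once, at the actual end of the stream:

// Minimal sketch: ProcessInput only after METransformNeedInput, ProcessOutput
// only after METransformHaveOutput, and drain only once at end of stream.
while (streaming)
{
    HANDLE handles[2] = { m_hNeedInputEvent, m_hHaveOutputEvent };
    DWORD wait = WaitForMultipleObjects(2, handles, FALSE, INFINITE);

    if (wait == WAIT_OBJECT_0)            // METransformNeedInput
    {
        ResetEvent(m_hNeedInputEvent);
        hr = m_pTransform->ProcessInput(0, pNextSample, 0);
        // No MFT_MESSAGE_NOTIFY_END_OF_STREAM here.
    }
    else if (wait == WAIT_OBJECT_0 + 1)   // METransformHaveOutput
    {
        ResetEvent(m_hHaveOutputEvent);
        hr = m_pTransform->ProcessOutput(0, 1, &outputDataBuffer, &dwStatus);
    }
}

// Only when the source has no more samples:
hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM, 0);
hr = m_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_DRAIN, 0);
// ...then keep handling METransformHaveOutput until METransformDrainComplete.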

Am I missing anything? Can I use the Intel Hardware MJPEG Decoder MFT within Media Foundation? Can someone provide a sample that uses this decoder? I have been struggling with this issue for the past four days.

Thanks.

Answer

I too had the same E_UNEXPECTED ("Unspecified Error") returned by the transform's event generator immediately after feeding the first input sample on Intel hardware; further calls just reported that the transform needed more input, but no output was ever produced. The same code worked fine on Nvidia machines, though. After a lot of experimenting and researching, I figured out that I was creating too many D3D11 device instances: in my case, I created two to three devices, for capture, color conversion, and the hardware encoder respectively, whereas I could simply have reused a single D3D11 device instance. Creating multiple D3D11 device instances might still work on high-end machines, though. This is not documented anywhere, and I was unable to find even a clue about the cause of the E_UNEXPECTED error. There are many similar StackOverflow threads that remain unanswered; even Microsoft people were unable to point out the problem when given the complete source code.

Reusing a single D3D11Device instance solved the problem.
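
As a minimal sketch of what reusing one device can look like in Media Foundation terms: create a single ID3D11Device with multithread protection enabled, wrap it in one IMFDXGIDeviceManager, and hand that same manager to every hardware MFT. Variable names here are illustrative.

// Minimal sketch: one shared D3D11 device + DXGI device manager passed to
// every hardware MFT, instead of creating a device per component.
ID3D11Device*         pD3DDevice   = NULL;
ID3D11DeviceContext*  pD3DContext  = NULL;
IMFDXGIDeviceManager* pDXGIManager = NULL;
UINT resetToken = 0;

HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL,
                               D3D11_CREATE_DEVICE_VIDEO_SUPPORT,
                               NULL, 0, D3D11_SDK_VERSION,
                               &pD3DDevice, NULL, &pD3DContext);

// Hardware MFTs access the device from their own threads.
if (SUCCEEDED(hr))
{
    ID3D10Multithread* pMultithread = NULL;
    if (SUCCEEDED(pD3DDevice->QueryInterface(IID_PPV_ARGS(&pMultithread))))
    {
        pMultithread->SetMultithreadProtected(TRUE);
        pMultithread->Release();
    }
}

if (SUCCEEDED(hr)) hr = MFCreateDXGIDeviceManager(&resetToken, &pDXGIManager);
if (SUCCEEDED(hr)) hr = pDXGIManager->ResetDevice(pD3DDevice, resetToken);

// Give the SAME manager to the decoder (and any other hardware MFTs).
if (SUCCEEDED(hr))
    hr = m_pTransform->ProcessMessage(MFT_MESSAGE_SET_D3D_MANAGER,
                                      (ULONG_PTR)pDXGIManager);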

