How to capture live camera frames in RGB with DirectShow


Problem description

I'm implementing live video capture through DirectShow for live processing and display. (Augmented Reality app).

I can access the pixels easily enough, but it seems I can't get the SampleGrabber to provide RGB data. The device (an iSight -- running VC++ Express in VMWare) only reports MEDIASUBTYPE_YUY2.

After extensive Googling, I still can't figure out whether DirectShow is supposed to provide built-in color space conversion for this sort of thing. Some sites report that there is no YUV<->RGB conversion built in, others report that you just have to call SetMediaType on your ISampleGrabber with an RGB subtype.

Any advice is greatly appreciated, I'm going nuts on this one. Code provided below. Please note that


  • The code works, except that it doesn't provide RGB data
  • I'm aware that I can implement my own conversion filter, but this is not feasible because I'd have to anticipate every possible device format, and this is a relatively small project
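Even without a DirectShow conversion *filter*, the conversion can be done in application code after grabbing the YUY2 frame. A minimal sketch, assuming BT.601 video-range coefficients, an even frame width, and a top-down buffer (RGB32 here means BGRX byte order):

```cpp
#include <algorithm>
#include <cstdint>

static uint8_t Clamp8(int v) { return (uint8_t)std::min(255, std::max(0, v)); }

// Convert a YUY2 (Y0 U Y1 V per pixel pair) buffer into RGB32 (BGRX).
// Integer BT.601 coefficients; width must be even for YUY2.
void Yuy2ToRgb32(const uint8_t *src, uint8_t *dst, int width, int height)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; x += 2) {
            const uint8_t *p = src + (y * width + x) * 2; // 2 bytes/pixel in YUY2
            int y0 = p[0] - 16, u = p[1] - 128;
            int y1 = p[2] - 16, v = p[3] - 128;
            uint8_t *q = dst + (y * width + x) * 4;       // 4 bytes/pixel in RGB32
            for (int i = 0; i < 2; ++i) {
                int c = (i == 0) ? y0 : y1;
                q[i * 4 + 0] = Clamp8((298 * c + 516 * u + 128) >> 8);           // B
                q[i * 4 + 1] = Clamp8((298 * c - 100 * u - 208 * v + 128) >> 8); // G
                q[i * 4 + 2] = Clamp8((298 * c + 409 * v + 128) >> 8);           // R
                q[i * 4 + 3] = 255;                                              // X
            }
        }
    }
}
```

This handles only YUY2, so it does not solve the "every possible device format" problem, but it covers the format the iSight actually reports.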

// Playback
IGraphBuilder *pGraphBuilder = NULL;
ICaptureGraphBuilder2 *pCaptureGraphBuilder2 = NULL;
IMediaControl *pMediaControl = NULL;
IBaseFilter *pDeviceFilter = NULL;
IAMStreamConfig *pStreamConfig = NULL;
ISampleGrabber *pGrabber = NULL;
BYTE *videoCaps = NULL;
AM_MEDIA_TYPE **mediaTypeArray = NULL;

// Device selection
ICreateDevEnum *pCreateDevEnum = NULL;
IEnumMoniker *pEnumMoniker = NULL;
IMoniker *pMoniker = NULL;
ULONG nFetched = 0;

HRESULT hr = CoInitializeEx(NULL, COINIT_MULTITHREADED);

// Create CreateDevEnum to list device
hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER, IID_ICreateDevEnum, (PVOID *)&pCreateDevEnum);
if (FAILED(hr)) goto ReleaseDataAndFail;

// Create EnumMoniker to list devices 
hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEnumMoniker, 0);
if (FAILED(hr)) goto ReleaseDataAndFail;

pEnumMoniker->Reset();

// Find desired device
while (pEnumMoniker->Next(1, &pMoniker, &nFetched) == S_OK) 
{
  IPropertyBag *pPropertyBag;
  TCHAR devname[256];

  // bind to IPropertyBag
  hr = pMoniker->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pPropertyBag);

  if (FAILED(hr))
  {
    pMoniker->Release();
    continue;
  }

  VARIANT varName;
  VariantInit(&varName);
  hr = pPropertyBag->Read(L"DevicePath", &varName, 0);

  if (FAILED(hr))
  {
    pMoniker->Release();
    pPropertyBag->Release();
    continue;
  }

  char devicePath[DeviceInfo::STRING_LENGTH_MAX] = "";

  wcstombs(devicePath, varName.bstrVal, DeviceInfo::STRING_LENGTH_MAX);

  if (strcmp(devicePath, deviceId) == 0)
  {
    // Bind Moniker to Filter
    pMoniker->BindToObject(0, 0, IID_IBaseFilter, (void**)&pDeviceFilter);

    break;
  }

  pMoniker->Release();
  pPropertyBag->Release();
}

if (pDeviceFilter == NULL) goto ReleaseDataAndFail;

// Create sample grabber
IBaseFilter *pGrabberF = NULL;
hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, (void**)&pGrabberF);
if (FAILED(hr)) goto ReleaseDataAndFail;

hr = pGrabberF->QueryInterface(IID_ISampleGrabber, (void**)&pGrabber);
if (FAILED(hr)) goto ReleaseDataAndFail;

// Create FilterGraph
hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
                      IID_IGraphBuilder, (LPVOID *)&pGraphBuilder);
if (FAILED(hr)) goto ReleaseDataAndFail;

// create CaptureGraphBuilder2
hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC, IID_ICaptureGraphBuilder2, (LPVOID *)&pCaptureGraphBuilder2);
if (FAILED(hr)) goto ReleaseDataAndFail;

// set FilterGraph
hr = pCaptureGraphBuilder2->SetFiltergraph(pGraphBuilder);
if (FAILED(hr)) goto ReleaseDataAndFail;

// get MediaControl interface
hr = pGraphBuilder->QueryInterface(IID_IMediaControl, (LPVOID *)&pMediaControl);
if (FAILED(hr)) goto ReleaseDataAndFail;

// Add filters
hr = pGraphBuilder->AddFilter(pDeviceFilter, L"Device Filter");
if (FAILED(hr)) goto ReleaseDataAndFail;

hr = pGraphBuilder->AddFilter(pGrabberF, L"Sample Grabber");
if (FAILED(hr)) goto ReleaseDataAndFail;

// Set sample grabber options
AM_MEDIA_TYPE mt;
ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
mt.majortype = MEDIATYPE_Video;
mt.subtype = MEDIASUBTYPE_RGB32;
hr = pGrabber->SetMediaType(&mt);
if (FAILED(hr)) goto ReleaseDataAndFail;

hr = pGrabber->SetOneShot(FALSE);
if (FAILED(hr)) goto ReleaseDataAndFail;

hr = pGrabber->SetBufferSamples(TRUE);
if (FAILED(hr)) goto ReleaseDataAndFail;

// Get stream config interface
hr = pCaptureGraphBuilder2->FindInterface(NULL, &MEDIATYPE_Video, pDeviceFilter, IID_IAMStreamConfig, (void **)&pStreamConfig);
if (FAILED(hr)) goto ReleaseDataAndFail;

int streamCapsCount = 0, capsSize, bestFit = -1, bestFitPixelDiff = 1000000000, desiredPixelCount = _width * _height,
bestFitWidth = 0, bestFitHeight = 0;

float desiredAspectRatio = (float)_width / (float)_height;

hr = pStreamConfig->GetNumberOfCapabilities(&streamCapsCount, &capsSize);
if (FAILED(hr)) goto ReleaseDataAndFail;

videoCaps = (BYTE *)malloc(capsSize * streamCapsCount);
mediaTypeArray = (AM_MEDIA_TYPE **)malloc(sizeof(AM_MEDIA_TYPE *) * streamCapsCount);

for (int i = 0; i < streamCapsCount; i++)
{
  hr = pStreamConfig->GetStreamCaps(i, &mediaTypeArray[i], videoCaps + capsSize * i);
  if (FAILED(hr)) continue;

  VIDEO_STREAM_CONFIG_CAPS *currentVideoCaps = (VIDEO_STREAM_CONFIG_CAPS *)(videoCaps + capsSize * i);

  int closestWidth = MAX(currentVideoCaps->MinOutputSize.cx, MIN(currentVideoCaps->MaxOutputSize.cx, width));
  int closestHeight = MAX(currentVideoCaps->MinOutputSize.cy, MIN(currentVideoCaps->MaxOutputSize.cy, height));

  int pixelDiff = ABS(desiredPixelCount - closestWidth * closestHeight);

  if (pixelDiff < bestFitPixelDiff && ABS(desiredAspectRatio - (float)closestWidth / (float)closestHeight) < 0.1f)
  {
    bestFit = i;
    bestFitPixelDiff = pixelDiff;
    bestFitWidth = closestWidth;
    bestFitHeight = closestHeight;
  }
}

if (bestFit == -1) goto ReleaseDataAndFail;

AM_MEDIA_TYPE *mediaType;
hr = pStreamConfig->GetFormat(&mediaType);
if (FAILED(hr)) goto ReleaseDataAndFail;

VIDEOINFOHEADER *videoInfoHeader = (VIDEOINFOHEADER *)mediaType->pbFormat;
videoInfoHeader->bmiHeader.biWidth = bestFitWidth;
videoInfoHeader->bmiHeader.biHeight = bestFitHeight;
//mediaType->subtype = MEDIASUBTYPE_RGB32;
hr = pStreamConfig->SetFormat(mediaType);
if (FAILED(hr)) goto ReleaseDataAndFail;

pStreamConfig->Release();
pStreamConfig = NULL;

free(videoCaps);
videoCaps = NULL;
free(mediaTypeArray);
mediaTypeArray = NULL;

// Connect pins
IPin *pDeviceOut = NULL, *pGrabberIn = NULL;

if (FindPin(pDeviceFilter, PINDIR_OUTPUT, 0, &pDeviceOut) && FindPin(pGrabberF, PINDIR_INPUT, 0, &pGrabberIn))
{
  hr = pGraphBuilder->Connect(pDeviceOut, pGrabberIn);
  if (FAILED(hr)) goto ReleaseDataAndFail;
}
else
{
  goto ReleaseDataAndFail;
}

// start playing
hr = pMediaControl->Run();
if (FAILED(hr)) goto ReleaseDataAndFail;

hr = pGrabber->GetConnectedMediaType(&mt);

// Set dimensions
width = bestFitWidth;
height = bestFitHeight;
_width = bestFitWidth;
_height = bestFitHeight;

// Allocate pixel buffer
pPixelBuffer = (unsigned *)malloc(width * height * 4);

// Release objects
pGraphBuilder->Release();
pGraphBuilder = NULL;
pEnumMoniker->Release();
pEnumMoniker = NULL;
pCreateDevEnum->Release();
pCreateDevEnum = NULL;

return true;
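For completeness, once the graph is running, the buffered frames can be copied out through `ISampleGrabber::GetCurrentBuffer`. A sketch, reusing the `pGrabber` and `pPixelBuffer` variables from the code above (Windows-only, error handling kept minimal):

```cpp
// Copy the most recently buffered frame into pPixelBuffer.
// Returns false if no sample has arrived yet or the frame doesn't fit.
bool CopyCurrentFrame(ISampleGrabber *pGrabber, unsigned *pPixelBuffer, long bufferBytes)
{
    long size = 0;
    // First call with a NULL buffer asks for the required size.
    HRESULT hr = pGrabber->GetCurrentBuffer(&size, NULL);
    if (FAILED(hr) || size <= 0 || size > bufferBytes)
        return false;
    hr = pGrabber->GetCurrentBuffer(&size, (long *)pPixelBuffer);
    return SUCCEEDED(hr);
}
```

Note that `GetCurrentBuffer` fails until the first sample has been delivered, so the call should be retried (or gated on a timer) right after `IMediaControl::Run`.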


Answer

The stock colour space converter does not support YUY2 to RGB conversion. However, there are a number of apps and devices that install a converter of some sort, and if this is properly registered, dshow will use it automatically. That's why some people report that it just works. (of course some devices offer RGB, so no conversion is needed in those cases).

You can download a freely-available YUV conversion filter, "yuvxfm" from YUV Transform (at the bottom of the page). Register this on your system and it should allow capture in any reasonable RGB or YUV format.

G
