iOS Receiving video from Network

Problem Description

UPDATE - I have fixed some mistakes in the code below and the images are now displayed on the other device, but I still have another problem. While video capture is open, the "master" device sends data continuously; sometimes the capture appears on the "slave" device, and within a very short time the image "blinks" to blank, repeating this over and over for a short period. Any idea about this?

I'm working on an app that needs to send live camera capture and live microphone capture to another device on the network.

I have done the connection between the devices using a TCP server and published it with Bonjour; this works like a charm.
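
For context, advertising an already-listening TCP socket over Bonjour can look roughly like the sketch below (the service type "_myapp._tcp." and port 4999 are hypothetical placeholders, not necessarily what the app uses):

// Minimal sketch: advertise a listening TCP port over Bonjour.
// "_myapp._tcp." and 4999 are placeholder values.
NSNetService *service = [[NSNetService alloc] initWithDomain:@""  // default registration domains
                                                        type:@"_myapp._tcp."
                                                        name:@""  // default: the device name
                                                        port:4999];
[service setDelegate:self]; // implement NSNetServiceDelegate, e.g. netServiceDidPublish:
[service publish];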

The most important part is sending and receiving the video and audio from the "master" device and rendering them on the "slave" device.

First, here is the piece of code where the app gets the camera sample buffer and converts it to a UIImage:

@implementation AVCaptureManager (AVCaptureVideoDataOutputSampleBufferDelegate)

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
  // The sample buffer is only valid for the duration of this callback,
  // so the conversion is done synchronously on the main queue. Note that
  // dispatch_sync blocks the capture queue and can cause dropped frames.
  dispatch_sync(dispatch_get_main_queue(), ^{
    @autoreleasepool {
      UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
      NSData *data = UIImageJPEGRepresentation(image, 0.2); // low-quality JPEG for the network

      [self.delegate didReceivedImage:image]; // local preview test
      [self.delegate didReceivedFrame:data];  // JPEG bytes handed to the sender
    }
  });
}


- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
  CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

  CVPixelBufferLockBaseAddress(imageBuffer, 0);

  size_t width = CVPixelBufferGetWidth(imageBuffer);
  size_t height = CVPixelBufferGetHeight(imageBuffer);
  // Use the buffer's actual stride; rows are often padded, so assuming
  // width * 4 would skew or corrupt the image.
  size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

  CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

  void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

  // Assumes the capture output delivers kCVPixelFormatType_32BGRA,
  // which matches the BGRA bitmap layout described by the flags below.
  CGContextRef context = CGBitmapContextCreate(
                                               baseAddress,
                                               width,
                                               height,
                                               8,
                                               bytesPerRow,
                                               colorSpace,
                                               kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little
                                               );

  CGImageRef quartzImage = CGBitmapContextCreateImage(context);

  UIImage *image = [UIImage imageWithCGImage:quartzImage];
  CGImageRelease(quartzImage);
  CGColorSpaceRelease(colorSpace);
  CGContextRelease(context);
  CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

  return image;
}

@end

The message "[self.delegate didReceivedImage:image];" is just to test the image capture on the master device, and that image displays correctly on the capturing device.

Next, here is how I send it over the network:

- (void) sendData:(NSData *)data
{
  // Note: if the stream has no space right now, the frame is silently dropped.
  if(_outputStream && [_outputStream hasSpaceAvailable])
  {
    // -write:maxLength: can write fewer bytes than requested, so loop
    // until the whole frame has been written out.
    NSUInteger totalWritten = 0;
    while (totalWritten < [data length])
    {
      NSInteger bytesWritten = [_outputStream write:(const uint8_t *)[data bytes] + totalWritten
                                          maxLength:[data length] - totalWritten];
      if(bytesWritten < 0)
      {
        NSLog(@"[ APP ] Failed to write message");
        return;
      }
      totalWritten += bytesWritten;
    }
  }
}

Note that I'm using a run loop to write and read the streams; I think this is better than constantly opening and closing the streams.
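
For reference, the run-loop setup I mean is roughly this (a sketch, assuming _inputStream and _outputStream come from the established socket connection):

// Sketch: schedule both streams on the current run loop so stream events
// (NSStreamEventHasSpaceAvailable / NSStreamEventHasBytesAvailable) are
// delivered to the delegate instead of blocking on reads and writes.
[_inputStream setDelegate:self];
[_outputStream setDelegate:self];
[_inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[_outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[_inputStream open];
[_outputStream open];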

Next, I receive the "NSStreamEventHasBytesAvailable" event on the slave device; the piece of code that handles it is:

case NSStreamEventHasBytesAvailable:
    {
      /* A declaration can't directly follow a case label in C;
         wrapping the case body in braces (as here) solves that. */
      NSLog(@"[ APP ] stream handleEvent NSStreamEventHasBytesAvailable");
      NSInteger bytesRead;
      uint8_t buffer[BUFFER_SIZE];

      while ([_inputStream hasBytesAvailable])
      {
        bytesRead = [_inputStream read:buffer maxLength:BUFFER_SIZE];
        NSLog(@"[ APP ] bytes read: %ld", (long)bytesRead);

        // Append only the bytes actually read; using sizeof(buffer) here
        // would append up to BUFFER_SIZE bytes of garbage after short reads.
        if (bytesRead > 0)
          [data appendBytes:(const void *)buffer length:bytesRead];
      }

      [_client writeImageWithData:data];

      break;
    }

The value of BUFFER_SIZE is 32768. I think the while loop is not strictly necessary, but I use it so that if I can't read all the available bytes in the first iteration, I can read the rest in the next.

So, this is the point: the stream arrives correctly, but the image serialized in NSData seems to be corrupted. Next, I just send the data to the client...

[_client writeImageWithData:data];

... and create a UIImage from the data in the client class, as simply as this...

[camPreview setImage:[UIImage imageWithData:data]];

In camPreview (yes, it is a UIImageView) I have an image just to show a placeholder on the screen; when I get the image from the network and pass it to camPreview, the placeholder goes blank.

Another thing is about the output: when I start the capture, during the first moments of receiving data, I get these messages from the system:

Error: ImageIO: JPEG Corrupt JPEG data: 28 extraneous bytes before marker 0xbf

Error: ImageIO: JPEG Unsupported marker type 0xbf

After a little while, I don't get these messages anymore.

The point is to find out why the images are not displayed on the "slave" device.

Thanks.

Recommended Answer

I am not sure how often you are sending images, but even if it is not very often, I think I would scan for the SOI and EOI markers in the JPEG data to ensure you have all the data. Here is a post I quickly found.
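
As a rough illustration of that idea (a sketch, not a drop-in fix): keep one persistent NSMutableData per connection, append every read into it, and only hand a frame to UIImage once a complete SOI (0xFFD8) ... EOI (0xFFD9) pair is present. The helper name ExtractJPEGFrame is mine, not from your code:

// Sketch: pull one complete JPEG (SOI ... EOI) out of an accumulating buffer.
// Returns the frame, or nil if no complete frame has arrived yet;
// consumed bytes are removed from the buffer.
static NSData *ExtractJPEGFrame(NSMutableData *buffer)
{
  const uint8_t *bytes = [buffer bytes];
  NSUInteger length = [buffer length];
  NSInteger soi = -1, eoi = -1;

  for (NSUInteger i = 0; i + 1 < length; i++)
  {
    if (soi < 0 && bytes[i] == 0xFF && bytes[i + 1] == 0xD8)
      soi = i;                            // start-of-image marker
    else if (soi >= 0 && bytes[i] == 0xFF && bytes[i + 1] == 0xD9)
    {
      eoi = i + 2;                        // end-of-image marker, inclusive
      break;
    }
  }

  if (soi < 0 || eoi < 0)
    return nil;                           // incomplete frame, keep buffering

  NSData *frame = [buffer subdataWithRange:NSMakeRange(soi, eoi - soi)];
  [buffer replaceBytesInRange:NSMakeRange(0, eoi) withBytes:NULL length:0];
  return frame;
}

On each NSStreamEventHasBytesAvailable you would then loop, calling ExtractJPEGFrame and displaying only complete frames. That keeps a half-received JPEG from ever reaching imageWithData:, which is most likely what makes your preview blink to blank. One caveat: the 0xFFD8/0xFFD9 byte pairs can in principle also occur inside the entropy-coded data, so prefixing each frame with its length before sending is the more robust framing.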
