How to store video on iPhone while publishing video with RTMPStreamPublisher?


Question

Right now I am using RTMPStreamPublisher to publish video to a Wowza server. The upload works, but can anyone tell me how to store the same video on the iPhone while it is being uploaded to the server?

I am using https://github.com/slavavdovichenko/MediaLibDemos, but there is not much documentation available. If I can just store the data that is sent for publication, my work will be done.

Here is the method they are using to upload the stream, but I can't find a way to store the same video on my iPhone device:

// ACTIONS

-(void)doConnect {
#if 0 // use ffmpeg rtmp 
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url  resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];
    socket = [[RTMPClient alloc] init:host];
    btnConnect.title = @"Disconnect";     
    return;
#endif

#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];

#else // use outside RTMPClient instance

    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not been created"];
            return;
        }
        [socket spawnSocketThread];
    }
    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif

    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];
    upstream.delegate = self;
    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];
    btnConnect.title = @"Disconnect";     
}

I did find that the BroadcastStreamClient instance named upstream can give me the AVCaptureSession via the following line:

[upstream getCaptureSession];

How can I use this AVCaptureSession for recording the video on the iPhone?

Answer

Once you have the AVCaptureSession, you can add an instance of AVCaptureMovieFileOutput to it like this:

AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
if([captureSession canAddOutput:movieFileOutput]){
    [captureSession addOutput:movieFileOutput];
}

// Start recording
NSURL *outputURL = …
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
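
To actually receive the finished file, the object passed as recordingDelegate must adopt AVCaptureFileOutputRecordingDelegate. Here is a minimal sketch; the Documents-directory location and the file name publishedStream.mov are arbitrary choices for illustration, not something defined by MediaLibDemos:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Build a writable URL in the app's Documents directory.
// The file name "publishedStream.mov" is a hypothetical choice.
- (NSURL *)recordingURL {
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                          NSUserDomainMask, YES) firstObject];
    NSString *path = [docs stringByAppendingPathComponent:@"publishedStream.mov"];
    // Remove any stale file first; AVCaptureMovieFileOutput will not
    // overwrite an existing file at the output URL.
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    return [NSURL fileURLWithPath:path];
}

// AVCaptureFileOutputRecordingDelegate callback, invoked when the
// recording stops (or fails) and the movie file has been finalized.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        NSLog(@"Recording failed: %@", error);
        return;
    }
    NSLog(@"Video stored locally at %@", outputFileURL.path);
}
```

With this in place, outputURL in the snippet above would be the value returned by recordingURL, and the delegate reports where the local copy ended up.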

Source: https://www.objc.io/issues/23-video/capturing-video/

Also take a look at this in order to better understand how to use an AVCaptureFileOutput: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
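
One detail worth noting: when you tear down the stream (for example in the existing disconnect path of doConnect), the local recording must be stopped explicitly so the movie file is finalized. A hedged sketch, assuming movieFileOutput is kept as an instance variable:

```objectivec
// Stop the local recording before (or alongside) closing the RTMP
// connection; AVCaptureMovieFileOutput finalizes the file asynchronously
// and reports the result through the recording delegate's
// captureOutput:didFinishRecordingToOutputFileURL:fromConnections:error:.
[movieFileOutput stopRecording];
```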
