Smooth streaming via the Encoder SDK

Question

I am trying to encode a WMV file to a Smooth Streaming format file with C# and the Expression Encoder SDK.

I always get this error: Bad video profile: Invalid video profile settings for VC-1 Advanced profile

Here is my code. What am I doing wrong?

			// (mediaItem, path, and the CameraStream* / CameraBufferWindow /
			// CameraFrameRate values are set up elsewhere in my code.)
			using (Job job = new Job())
			{
				VideoProfile videoprofile = new VideoProfile();
				videoprofile.AdaptiveGop = false;
				videoprofile.AdaptiveQuantization = 16;
				videoprofile.BFrameNumber = 1;
				videoprofile.Bitrate = CameraStream1Bitrate * 1000;
				videoprofile.BufferWindow = CameraBufferWindow * 1000;
				videoprofile.ClosedGop = true;
				videoprofile.Codec = VideoCodec.VC1;
				videoprofile.CodecProfile = VideoCodecProfile.Advanced;
				videoprofile.DefaultFileOutputMode = FileOutputMode.IisSmoothStreamingMultipleFile;
				videoprofile.DenoiseFilter = true;
				videoprofile.DQuant = DQuant.IBPFrames;
				videoprofile.FrameRate = CameraFrameRate == -1 ? mediaItem.OriginalFrameRate : CameraFrameRate;
				videoprofile.Height = CameraStream1Height;
				videoprofile.InLoopFilter = true;
				videoprofile.InsertSkippedFrames = true;
				videoprofile.KeyFrameSpacing = CameraFrameRate;
				videoprofile.MaxBitrate = 5000 * 1000;
				videoprofile.MaximumQP = 30;
				videoprofile.MbrItems.Add(new VideoProfileMbrDelta
				{
					Width = CameraStream1Width,
					Height = CameraStream1Height,
					Bitrate = CameraStream1Bitrate * 1000
				});
				videoprofile.MbrItems.Add(new VideoProfileMbrDelta
				{
					Width = CameraStream2Width,
					Height = CameraStream2Height,
					Bitrate = CameraStream2Bitrate * 1000
				});
				videoprofile.MbrItems.Add(new VideoProfileMbrDelta
				{
					Width = CameraStream3Width,
					Height = CameraStream3Height,
					Bitrate = CameraStream3Bitrate * 1000
				});
				videoprofile.MbrItems.Add(new VideoProfileMbrDelta
				{
					Width = CameraStream4Width,
					Height = CameraStream4Height,
					Bitrate = CameraStream4Bitrate * 1000
				});
				videoprofile.Mode = VideoMode.Cbr;
				videoprofile.MotionChromaSearch = MotionChromaSearch.MacroblockAdaptiveWithTrueChroma;
				videoprofile.MotionMatchMethod = MotionMatchMethod.MacroblockAdaptiveSADHadamard;
				videoprofile.MotionSearchRange = MotionSearchRange.MacroblockAdaptive;
				videoprofile.NoiseEdgeRemovalFilter = true;
				videoprofile.OverlapFilter = true;
				videoprofile.SceneChangeDetection = true;
				videoprofile.VC1OutputMode = VC1OutputMode.ElementaryStreamSequenceHeader;
				videoprofile.VideoCodecPreset = VideoCodecPreset.AdaptiveStreaming;
				videoprofile.Width = CameraStream1Width;
				mediaItem.VideoProfile = videoprofile;

				mediaItem.TwoPassEncoding = false;
				mediaItem.AspectRatioMode = AspectRatioMode.Source;
				mediaItem.DeinterlaceMode = DeinterlaceMode.Off;

				job.MediaItems.Add(mediaItem);
				job.OutputDirectory = string.Format("{0}\\Session7\\Web\\", path);

				job.EncodeProgress += new EventHandler<EncodeProgressEventArgs>(OnProgress);

				job.Encode();
			}

Answer

One problem I see is that you're using the stream 1 settings for what is actually the second stream. In V2 the MbrItems collection contains only the extra streams: the first stream is governed by videoprofile.Width/Height/Bitrate, so when you add the first VideoProfileMbrDelta you are really adding the details of the second stream. I just ran the following and it worked for me. This code is the same as yours except that I commented out the first VideoProfileMbrDelta add. If you still have problems, let me know which other settings you're using.

int CameraStream1Bitrate = 500;
int CameraBufferWindow = 5;
int CameraFrameRate = 30;
int CameraStream1Width = 640;
int CameraStream1Height = 480;
int CameraStream2Bitrate = 450;
int CameraStream2Width = 620;
int CameraStream2Height = 460;
int CameraStream3Bitrate = 400;
int CameraStream3Width = 600;
int CameraStream3Height = 440;
int CameraStream4Bitrate = 380;
int CameraStream4Width = 580;
int CameraStream4Height = 420;
string path = @"C:\temp";
// The original answer does not show where mediaItem comes from; with the
// Encoder SDK a MediaItem is constructed from the source file path. The path
// below is only a placeholder:
MediaItem mediaItem = new MediaItem(@"C:\temp\input.wmv");
using (Job job = new Job())
{
    VideoProfile videoprofile = new VideoProfile();
    videoprofile.AdaptiveGop = false;
    videoprofile.AdaptiveQuantization = 16;
    videoprofile.BFrameNumber = 1;
    videoprofile.Bitrate = CameraStream1Bitrate * 1000;
    videoprofile.BufferWindow = CameraBufferWindow * 1000;
    videoprofile.ClosedGop = true;
    videoprofile.Codec = VideoCodec.VC1;
    videoprofile.CodecProfile = VideoCodecProfile.Advanced;
    videoprofile.DefaultFileOutputMode = 
        FileOutputMode.IisSmoothStreamingMultipleFile;
    videoprofile.DenoiseFilter = true;
    videoprofile.DQuant = DQuant.IBPFrames;
    videoprofile.FrameRate = CameraFrameRate == -1 ? 
        mediaItem.OriginalFrameRate : CameraFrameRate;
    videoprofile.Height = CameraStream1Height;
    videoprofile.InLoopFilter = true;
    videoprofile.InsertSkippedFrames = true;
    videoprofile.KeyFrameSpacing = CameraFrameRate;
    videoprofile.MaxBitrate = 5000 * 1000;
    videoprofile.MaximumQP = 30;
    // Stream 1 is already described by the profile's Width/Height/Bitrate, so
    // the first VideoProfileMbrDelta add (the problematic one) is left out:
    //videoprofile.MbrItems.Add(new VideoProfileMbrDelta
    //{
    //    Width = CameraStream1Width,
    //    Height = CameraStream1Height,
    //    Bitrate = CameraStream1Bitrate * 1000
    //});
    videoprofile.MbrItems.Add(new VideoProfileMbrDelta
    {
        Width = CameraStream2Width,
        Height = CameraStream2Height,
        Bitrate = CameraStream2Bitrate * 1000
    });
    videoprofile.MbrItems.Add(new VideoProfileMbrDelta
    {
        Width = CameraStream3Width,
        Height = CameraStream3Height,
        Bitrate = CameraStream3Bitrate * 1000
    });
    videoprofile.MbrItems.Add(new VideoProfileMbrDelta
    {
        Width = CameraStream4Width,
        Height = CameraStream4Height,
        Bitrate = CameraStream4Bitrate * 1000
    });
    videoprofile.Mode = VideoMode.Cbr;
    videoprofile.MotionChromaSearch = 
        MotionChromaSearch.MacroblockAdaptiveWithTrueChroma;
    videoprofile.MotionMatchMethod = 
        MotionMatchMethod.MacroblockAdaptiveSADHadamard;
    videoprofile.MotionSearchRange = 
        MotionSearchRange.MacroblockAdaptive;
    videoprofile.NoiseEdgeRemovalFilter = true;
    videoprofile.OverlapFilter = true;
    videoprofile.SceneChangeDetection = true;
    videoprofile.VC1OutputMode = 
        VC1OutputMode.ElementaryStreamSequenceHeader;
    videoprofile.VideoCodecPreset = 
        VideoCodecPreset.AdaptiveStreaming;
    videoprofile.Width = CameraStream1Width;
    mediaItem.VideoProfile = videoprofile;

    mediaItem.TwoPassEncoding = false;
    mediaItem.AspectRatioMode = AspectRatioMode.Source;
    mediaItem.DeinterlaceMode = DeinterlaceMode.Off;

    job.MediaItems.Add(mediaItem);
    job.OutputDirectory = string.Format("{0}\\Session7\\Web\\", path);

    job.EncodeProgress += 
        new EventHandler<EncodeProgressEventArgs>(OnProgress);

    job.Encode();
}
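
To restate the fix compactly: the first stream is described by the Width, Height, and Bitrate set directly on the profile, and MbrItems carries only the additional streams. A condensed sketch of that layout, reusing the sample values above (this sketch is not part of the original answer):

// Stream 1 lives on the profile itself; MbrItems holds streams 2..N only.
VideoProfile profile = new VideoProfile();
profile.Width = 640;              // stream 1
profile.Height = 480;
profile.Bitrate = 500 * 1000;

// Only the extra streams go here; adding a delta for stream 1 as well is the
// duplication the answer points out.
profile.MbrItems.Add(new VideoProfileMbrDelta { Width = 620, Height = 460, Bitrate = 450 * 1000 }); // stream 2
profile.MbrItems.Add(new VideoProfileMbrDelta { Width = 600, Height = 440, Bitrate = 400 * 1000 }); // stream 3
profile.MbrItems.Add(new VideoProfileMbrDelta { Width = 580, Height = 420, Bitrate = 380 * 1000 }); // stream 4

Both listings also wire up an OnProgress handler for Job.EncodeProgress that is not shown. A minimal sketch of such a handler, assuming EncodeProgressEventArgs exposes the current pass percentage through its Progress property:

// Hypothetical progress handler; assumes EncodeProgressEventArgs.Progress
// reports the percentage complete of the current encoding pass.
void OnProgress(object sender, EncodeProgressEventArgs e)
{
    Console.WriteLine("Encoding progress: {0:F1}%", e.Progress);
}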

