Get current frame Texture from VideoPlayer


Question

It is stated in this post, Using new Unity VideoPlayer and VideoClip API to play video, that one can "retrieve texture for each frame if needed".

What's the proper way to get the current frame as a Texture2D, please?

After the answer I did this, but it's not working:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;

public class AverageColorFromTexture : MonoBehaviour {

    public VideoClip videoToPlay;
    public Light lSource;

    private Color targetColor;
    private VideoPlayer videoPlayer;
    private VideoSource videoSource;
    private Renderer rend;
    private Texture tex;
    private AudioSource audioSource;

    void Start()
    {
        Application.runInBackground = true;
        StartCoroutine(playVideo());
    }

    IEnumerator playVideo()
    {

        rend = GetComponent<Renderer>();

        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        videoPlayer.source = VideoSource.VideoClip;
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        //Set video To Play then prepare Audio to prevent Buffering
        videoPlayer.clip = videoToPlay;
        videoPlayer.Prepare();

        //Wait until video is prepared
        while (!videoPlayer.isPrepared)
        {
            Debug.Log("Preparing Video");
            yield return null;
        }
        Debug.Log("Done Preparing Video");

        //Assign the Texture from Video to Material texture
        tex = videoPlayer.texture;
        rend.material.mainTexture = tex;

        //Enable new frame Event
        videoPlayer.sendFrameReadyEvents = true;

        //Subscribe to the new frame Event
        videoPlayer.frameReady += OnNewFrame;

        //Play Video
        videoPlayer.Play();

        //Play Sound
        audioSource.Play();

        Debug.Log("Playing Video");
        while (videoPlayer.isPlaying)
        {
            Debug.LogWarning("Video Time: " + Mathf.FloorToInt((float)videoPlayer.time));
            yield return null;
        }
        Debug.Log("Done Playing Video");
    }

    void OnNewFrame(VideoPlayer source, long frameIdx)
    {
        Texture2D videoFrame = (Texture2D)source.texture;

        targetColor = CalculateAverageColorFromTexture(videoFrame);
        lSource.color = targetColor;
    }


    Color32 CalculateAverageColorFromTexture(Texture2D tex)
    {
        Color32[] texColors = tex.GetPixels32();
        int total = texColors.Length;
        float r = 0;
        float g = 0;
        float b = 0;

        for(int i = 0; i < total; i++)
        {
            r += texColors[i].r;
            g += texColors[i].g;
            b += texColors[i].b;
        }
        return new Color32((byte)(r / total) , (byte)(g / total) , (byte)(b / total) , 0);
    }
}

Answer

You can do that properly in three steps:

  1. Enable frame-ready events by setting VideoPlayer.sendFrameReadyEvents to true.

  2. Subscribe to the VideoPlayer.frameReady event.

  3. The function you assign to the VideoPlayer.frameReady event will be called when a new frame is available. Just access the video frame from the VideoPlayer it passes into the parameter by casting VideoPlayer.texture to Texture2D.

That's it.

In code:

Add these before video.Play():

// Enable new frame Event
videoPlayer.sendFrameReadyEvents = true;

// Subscribe to the new frame Event
videoPlayer.frameReady += OnNewFrame;

This is your OnNewFrame function signature:

void OnNewFrame(VideoPlayer source, long frameIdx)
{
    Texture2D videoFrame = (Texture2D)source.texture;
    // Do anything with the videoFrame Texture.
}

It's worth noting that enabling that event is costly, so make sure you actually need every frame before doing this.
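Since the event is costly, one option is to unsubscribe and disable it once per-frame access is no longer needed. A minimal sketch, assuming the videoPlayer field and OnNewFrame handler from the code above (the StopFrameCapture name is hypothetical):

```csharp
// Call this when per-frame access is no longer needed,
// e.g. from OnDisable() or after enough frames were sampled.
void StopFrameCapture()
{
    // Unsubscribe the handler so OnNewFrame stops being invoked
    videoPlayer.frameReady -= OnNewFrame;

    // Disabling the event removes the per-frame overhead
    videoPlayer.sendFrameReadyEvents = false;
}
```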

Both Texture2D videoFrame = (Texture2D)source.texture; and Texture2D videoFrame = source.texture as Texture2D; failed.

I put Debug.Log(source.texture); inside the OnNewFrame function and got:

TempBuffer 294 320x240 (UnityEngine.RenderTexture)

So, it looks like the Video.texture property is returning a RenderTexture, not the Texture type it should.

We have to convert the RenderTexture to a Texture2D:

void Start()
{
    videoFrame = new Texture2D(2, 2);
    ...
}

//Initialize in the Start function
Texture2D videoFrame;

void OnNewFrame(VideoPlayer source, long frameIdx)
{
    RenderTexture renderTexture = source.texture as RenderTexture;

    if (videoFrame.width != renderTexture.width || videoFrame.height != renderTexture.height)
    {
        videoFrame.Resize(renderTexture.width, renderTexture.height);
    }
    RenderTexture.active = renderTexture;
    videoFrame.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    videoFrame.Apply();
    RenderTexture.active = null;

    targetColor = CalculateAverageColorFromTexture(videoFrame);
    lSource.color = targetColor;
}
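Note that ReadPixels synchronously copies the frame from the GPU back to the CPU on every frame, which is expensive. If only an average color is needed, one way to cut that cost is to downscale on the GPU first with Graphics.Blit into a small temporary RenderTexture and read back far fewer pixels. A sketch of that variant of the handler above (the 32x32 size is an arbitrary choice, not from the original answer):

```csharp
void OnNewFrame(VideoPlayer source, long frameIdx)
{
    // Downscale the frame on the GPU into a small temporary target
    RenderTexture small = RenderTexture.GetTemporary(32, 32);
    Graphics.Blit(source.texture, small);

    if (videoFrame.width != small.width || videoFrame.height != small.height)
    {
        videoFrame.Resize(small.width, small.height);
    }

    // Read back only 32x32 pixels instead of the full frame
    RenderTexture.active = small;
    videoFrame.ReadPixels(new Rect(0, 0, small.width, small.height), 0, 0);
    videoFrame.Apply();
    RenderTexture.active = null;
    RenderTexture.ReleaseTemporary(small);

    targetColor = CalculateAverageColorFromTexture(videoFrame);
    lSource.color = targetColor;
}
```

The average of the downscaled pixels approximates the average of the full frame, since Blit filters the texture when scaling it down.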

The complete code from your question:

public class AverageColorFromTexture : MonoBehaviour
{
    public VideoClip videoToPlay;
    public Light lSource;

    private Color targetColor;
    private VideoPlayer videoPlayer;
    private VideoSource videoSource;
    private Renderer rend;
    private Texture tex;
    private AudioSource audioSource;

    void Start()
    {
        videoFrame = new Texture2D(2, 2);
        Application.runInBackground = true;
        StartCoroutine(playVideo());
    }

    IEnumerator playVideo()
    {
        rend = GetComponent<Renderer>();

        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        videoPlayer.source = VideoSource.VideoClip;
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        //Set video To Play then prepare Audio to prevent Buffering
        videoPlayer.clip = videoToPlay;
        videoPlayer.Prepare();

        //Wait until video is prepared
        while (!videoPlayer.isPrepared)
        {
            Debug.Log("Preparing Video");
            yield return null;
        }
        Debug.Log("Done Preparing Video");

        //Assign the Texture from Video to Material texture
        tex = videoPlayer.texture;
        rend.material.mainTexture = tex;

        //Enable new frame Event
        videoPlayer.sendFrameReadyEvents = true;

        //Subscribe to the new frame Event
        videoPlayer.frameReady += OnNewFrame;

        //Play Video
        videoPlayer.Play();

        //Play Sound
        audioSource.Play();

        Debug.Log("Playing Video");
        while (videoPlayer.isPlaying)
        {
            Debug.LogWarning("Video Time: " + Mathf.FloorToInt((float)videoPlayer.time));
            yield return null;
        }
        Debug.Log("Done Playing Video");
    }

    //Initialize in the Start function
    Texture2D videoFrame;

    void OnNewFrame(VideoPlayer source, long frameIdx)
    {
        RenderTexture renderTexture = source.texture as RenderTexture;


        if (videoFrame.width != renderTexture.width || videoFrame.height != renderTexture.height)
        {
            videoFrame.Resize(renderTexture.width, renderTexture.height);
        }
        RenderTexture.active = renderTexture;
        videoFrame.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        videoFrame.Apply();
        RenderTexture.active = null;

        targetColor = CalculateAverageColorFromTexture(videoFrame);
        lSource.color = targetColor;
    }

    Color32 CalculateAverageColorFromTexture(Texture2D tex)
    {
        Color32[] texColors = tex.GetPixels32();
        int total = texColors.Length;
        float r = 0;
        float g = 0;
        float b = 0;

        for (int i = 0; i < total; i++)
        {
            r += texColors[i].r;
            g += texColors[i].g;
            b += texColors[i].b;
        }
        return new Color32((byte)(r / total), (byte)(g / total), (byte)(b / total), 0);
    }
}

