ArCore Sceneform: Play an .mp4 video when an image is detected


Problem description


When I find an image, I want to place a text view and a video above it. The text view is placed in the scene, but the video is not; it is just added to the middle of my main layout. I'm using the VideoView component, and I'm not sure whether that's the problem.

override fun onCreate(savedInstanceState: Bundle?) {
         (....)
        arFragment!!.arSceneView.scene.addOnUpdateListener { this.onUpdateFrame(it) }
        arSceneView = arFragment!!.arSceneView

}

private fun onUpdateFrame(frameTime: FrameTime) {
    val frame = arFragment!!.arSceneView.arFrame

    val augmentedImages = frame.getUpdatedTrackables(AugmentedImage::class.java)

    for (augmentedImage in augmentedImages) {
        if (augmentedImage.trackingState == TrackingState.TRACKING) {

            if (augmentedImage.name.contains("car") && !modelCarAdded) {
                renderView(arFragment!!,
                        augmentedImage.createAnchor(augmentedImage.centerPose))
                modelCarAdded = true
            }
        }
    }

}

The text_info layout is just a TextView component; video_youtube is a RelativeLayout with the VideoView inside.

   private fun renderView(fragment: ArFragment, anchor: Anchor) {
    //WORKING
    ViewRenderable.builder()
            .setView(this, R.layout.text_info)
            .build()
            .thenAccept { renderable ->
                (renderable.view as TextView).text = "Example"
                addNodeToScene(fragment, anchor, renderable, Vector3(0f, 0.2f, 0f))

            }
            .exceptionally { throwable ->
                val builder = AlertDialog.Builder(this)
                builder.setMessage(throwable.message)
                        .setTitle("Error!")
                val dialog = builder.create()
                dialog.show()
                null
            }
    //NOT WORKING
    ViewRenderable.builder()
            .setView(this, R.layout.video_youtube)
            .build()
            .thenAccept { renderable ->
                val view = renderable.view
                videoRenderable = renderable
                val path = "android.resource://" + packageName + "/" + R.raw.googlepixel
                view.video_player.setVideoURI(Uri.parse(path))
                renderable.material.setExternalTexture("videoTexture", texture)
                val videoNode = addNodeToScene(fragment, anchor, renderable, Vector3(0.2f, 0.5f, 0f))
                if (!view.video_player.isPlaying) {
                    view.video_player.start()
                    texture
                            .surfaceTexture
                            .setOnFrameAvailableListener {
                                videoNode.renderable = videoRenderable
                                texture.surfaceTexture.setOnFrameAvailableListener(null)
                            }
                } else {
                    videoNode.renderable = videoRenderable
                }

            }
            .exceptionally { throwable ->
                null
            }
}

private fun addNodeToScene(fragment: ArFragment, anchor: Anchor, renderable: Renderable, vector3: Vector3): Node {
        val anchorNode = AnchorNode(anchor)
        val node = TransformableNode(fragment.transformationSystem)
        node.renderable = renderable
        node.setParent(anchorNode)
        node.localPosition = vector3
        fragment.arSceneView.scene.addChild(anchorNode)
        return node
    }

I tried using the Chroma Key Video example, but I don't want the white parts of the video to be transparent. And I'm not sure whether I need a model (.sfb) to show a video.

Solution

I used the ChromaKey sample as a starting point.

First I changed the custom material used for the video by adding a flag to disable the chromakey filtering.

material {
    "name" : "Chroma Key Video Material",
    "defines" : [
        "baseColor"
    ],
    "parameters" : [
        {
           // The texture displaying the frames of the video.
           "type" : "samplerExternal",
           "name" : "videoTexture"
        },
        {
            // The color to filter out of the video.
            "type" : "float4",
            "name" : "keyColor"
        },
        {
            "type" : "bool",
            "name" : "disableChromaKey"
        }
    ],
    "requires" : [
        "position",
        "uv0"
    ],
    "shadingModel" : "unlit",
    // Blending is "masked" instead of "transparent" so that the shadows account for the
    // transparent regions of the video instead of just the shape of the mesh.
    "blending" : "masked",
    // Material is double sided so that the video is visible when walking behind it.
    "doubleSided" : true
}

fragment {
    vec3 desaturate(vec3 color, float amount) {
        // Convert color to grayscale using Luma formula:
        // https://en.wikipedia.org/wiki/Luma_%28video%29
        vec3 gray = vec3(dot(vec3(0.2126, 0.7152, 0.0722), color));

        return vec3(mix(color, gray, amount));
    }

    void material(inout MaterialInputs material) {
        prepareMaterial(material);

        vec2 uv = getUV0();

        if (!gl_FrontFacing) {
          uv.x = 1.0 - uv.x;
        }

        vec4 color = texture(materialParams_videoTexture, uv).rgba;

        if (!materialParams.disableChromaKey) {
            vec3 keyColor = materialParams.keyColor.rgb;

            float threshold = 0.675;
            float slope = 0.2;

            float distance = abs(length(abs(keyColor - color.rgb)));
            float edge0 = threshold * (1.0 - slope);
            float alpha = smoothstep(edge0, threshold, distance);
            color.rgb = desaturate(color.rgb, 1.0 - (alpha * alpha * alpha));

            material.baseColor.a = alpha;
            material.baseColor.rgb = inverseTonemapSRGB(color.rgb);
            material.baseColor.rgb *= material.baseColor.a;
        } else {
            material.baseColor = color;
        }
    }
}
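To make the shader's keying behavior concrete, here is the same chroma-key alpha math from the fragment block above, rewritten as plain Java (class and method names are mine, not part of the sample):

```java
// Plain-Java sketch of the fragment shader's chroma-key math,
// showing how the alpha mask behaves for a single pixel.
public class ChromaKeyMath {
    static final double THRESHOLD = 0.675;
    static final double SLOPE = 0.2;

    // GLSL-style smoothstep: 0 below edge0, 1 above edge1, smooth in between.
    static double smoothstep(double edge0, double edge1, double x) {
        double t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0.0), 1.0);
        return t * t * (3.0 - 2.0 * t);
    }

    // Alpha for one pixel: 0 = keyed out (transparent), 1 = fully opaque.
    static double keyAlpha(double[] keyColor, double[] rgb) {
        double dr = keyColor[0] - rgb[0];
        double dg = keyColor[1] - rgb[1];
        double db = keyColor[2] - rgb[2];
        double distance = Math.sqrt(dr * dr + dg * dg + db * db);
        double edge0 = THRESHOLD * (1.0 - SLOPE);
        return smoothstep(edge0, THRESHOLD, distance);
    }

    public static void main(String[] args) {
        double[] black = {0, 0, 0};
        // A pixel matching the key color is fully transparent...
        System.out.println(keyAlpha(black, new double[]{0, 0, 0})); // 0.0
        // ...while a pixel far from it (white) is fully opaque.
        System.out.println(keyAlpha(black, new double[]{1, 1, 1})); // 1.0
    }
}
```

This is why setting `disableChromaKey` matters when you don't want any keying: with the flag off, pixels near `keyColor` are faded out by this alpha ramp.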

Then set `disableChromaKey` to `true` in the .sfa file:

 materials: [
    {
      name: 'DefaultMaterial',
      parameters: [
        {
          videoTexture: {
            external_path: 'MISSING_PATH',
          },
        },
        {
          keyColor: [
            0,
            0,
            0,
            0,
          ],
        },
        {
            disableChromaKey : true,
        }
      ],
      source: 'sampledata/models/chroma_key_video_material.mat',
    },
  ],
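For completeness: the .sfa/.sfb pair comes from the Sceneform gradle plugin, which compiles the quad model and the custom material into the binary asset loaded at runtime. A typical entry looks like the following (the paths follow the ChromaKey sample's layout and are assumptions for your own project):

```groovy
sceneform.asset('sampledata/models/chroma_key_video.obj',          // source quad mesh
        'sampledata/models/chroma_key_video_material.mat',         // the material above
        'sampledata/models/chroma_key_video.sfa',                  // generated, editable
        'src/main/assets/models/chroma_key_video.sfb')             // binary loaded at runtime
```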

Then I placed the video node based on the anchor from a hittest, and placed a ViewRenderable node above it for the text.

   private Node createVideoDisplay(final AnchorNode parent, Vector3 localPosition, String title) {
        // Create a node to render the video and add it to the anchor.
        Node videoNode = new Node();
        videoNode.setParent(parent);
        videoNode.setLocalPosition(localPosition);

        // Set the scale of the node so that the aspect ratio of the video is correct.
        float videoWidth = mediaPlayer.getVideoWidth();
        float videoHeight = mediaPlayer.getVideoHeight();
        videoNode.setLocalScale(
                new Vector3(
                        VIDEO_HEIGHT_METERS * (videoWidth / videoHeight),
                        VIDEO_HEIGHT_METERS, 1.0f));

        // Place the text above the video
        final float videoNodeHeight = VIDEO_HEIGHT_METERS + localPosition.y;
        ViewRenderable.builder().setView(this,R.layout.video_title)
                .build().thenAccept(viewRenderable -> {
                   Node titleNode =  new Node();
                   titleNode.setLocalPosition(new Vector3(0,videoNodeHeight,0));
                   titleNode.setParent(parent);
                   titleNode.setRenderable(viewRenderable);
            ((TextView)viewRenderable.getView().findViewById(R.id.video_text))
                           .setText(title);
        });

        return videoNode;
    }

