SKVideoNode only on a small part of SCNSphere
I'm using an SKVideoNode as the material for a sphere to play 360-degree video, but the video only renders on the x/y-positive part of the sphere. I'm streaming the video from a URL; it's an .m3u8 stream.
For reference, see SKVideoNode as texture for SCNSphere — multiple people seem to be having the same issue as me.
func makeSphere() {
    // Assumes: import SceneKit, SpriteKit, AVFoundation;
    // sphereGeometry, sphereNode and constraint are properties of the view controller.
    let sceneView = SCNView(frame: self.view.frame)
    self.view.addSubview(sceneView)
    let screenSize: CGRect = UIScreen.mainScreen().bounds
    let screenWidth = screenSize.width
    let screenHeight = screenSize.height
    sceneView.frame.size.height = screenHeight
    sceneView.frame.size.width = screenWidth
    sceneView.center.x = screenWidth * 0.5
    let scene = SCNScene()
    sceneView.scene = scene
    sphereGeometry = SCNSphere(radius: 5)
    sphereNode = SCNNode(geometry: sphereGeometry)
    sphereNode.position = SCNVector3(x: 0, y: 0, z: 0)
    sphereGeometry.segmentCount = 55
    constraint = SCNLookAtConstraint(target: sphereNode)
    let camera = SCNCamera()
    let cameraNode = SCNNode()
    cameraNode.camera = camera
    cameraNode.position = SCNVector3(x: 0, y: 0, z: 0)
    let light = SCNLight()
    light.type = SCNLightTypeOmni
    let lightNode = SCNNode()
    lightNode.light = light
    lightNode.position = SCNVector3(x: 0, y: 0, z: 0)
    cameraNode.constraints = [constraint]
    scene.rootNode.addChildNode(cameraNode)
    scene.rootNode.addChildNode(sphereNode)
    scene.rootNode.addChildNode(lightNode) // note: the light node was created but never added to the scene
    let videoMaterial = SCNMaterial()
    let path = "http://video-url.m3u8"
    let url = NSURL(string: path)
    let asset = AVURLAsset(URL: url!, options: nil)
    let playerItem = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: playerItem)
    let videoNode = SKVideoNode(AVPlayer: player)
    let size = CGFloat(100.0)
    let spriteScene = SKScene(size: CGSizeMake(size, size))
    videoNode.size.width = size
    videoNode.size.height = size
    spriteScene.addChild(videoNode)
    videoMaterial.diffuse.contents = spriteScene
    videoMaterial.specular.contents = UIColor.redColor()
    videoMaterial.shininess = 1.0
    videoMaterial.doubleSided = true
    sphereGeometry.materials = [videoMaterial]
    videoNode.play()
}
You can use the code above to reproduce my problem. If it makes a difference: when I display a still image instead, it works just fine.
EDIT
Using videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(0, -1, 1)
and videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat
causes the video to be projected onto the lower half of the sphere, but instead of appearing correctly all I can see is stretched rings. Changing the wrap mode makes the screen show only one colour.
Using videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(1, 0, 1)
and videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat
renders the video on the left side of the sphere, but stretches the texture/video.
It's hard to say specifically what's going wrong, but I have a working solution here: https://github.com/alfiehanssen/ThreeSixtyPlayer
It uses an SKVideoNode for both monoscopic and stereoscopic spherical 360 video.
I do notice that you're not setting the position or anchorPoint of your SKScene, and I believe you must do this in order to get the SKVideoNode (material) positioned properly.
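As a sketch of that suggestion (the 2:1 scene size, the centred position, and the vertical flip are my assumptions, not values from the original post), centring the SKVideoNode in the SKScene could look roughly like this:

```swift
import SpriteKit
import AVFoundation

// Sketch only: `player` is assumed to be an already-configured AVPlayer.
func makeVideoScene(player: AVPlayer) -> SKScene {
    // 2:1 aspect ratio matches a typical equirectangular 360 frame (assumption).
    let size = CGSize(width: 1024, height: 512)
    let spriteScene = SKScene(size: size)

    let videoNode = SKVideoNode(AVPlayer: player)
    videoNode.size = size
    // Centre the node in the scene; by default it sits at (0, 0), so most of
    // the scene (and therefore most of the sphere's texture) stays empty.
    videoNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
    // SpriteKit content often appears vertically flipped when used as a
    // SceneKit material, so mirror the node around its centre.
    videoNode.yScale = -1
    spriteScene.addChild(videoNode)

    videoNode.play()
    return spriteScene
}
```

The returned scene would then be assigned to videoMaterial.diffuse.contents exactly as in the code above; alternatively, leaving the node at (0, 0) and setting spriteScene.anchorPoint = CGPoint(x: 0.5, y: 0.5) should have the same centring effect.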