Mapping a stereographic projection to the inside of a sphere in ThreeJS


Problem Description



When it comes to 3D animation, there are a lot of terms and concepts that I'm not familiar with (maybe a secondary question to append to this one: what are some good books to get familiar with the concepts?). I don't know what a "UV" is (in the context of 3D rendering) and I'm not familiar with what tools exist for mapping pixels on an image to points on a mesh.

I have the following image being produced by a 360-degree camera (it's actually the output of an HTML video element):

I want the center of this image to be the "top" of the sphere, and any radius of the circle in this image to be an arc along the sphere from top to bottom.

Here's my starting point (copying lines of code directly from the Three.JS documentation):

var video = document.getElementById( "texture-video" );

var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 );

var renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

var texture = new THREE.VideoTexture( video );
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.format = THREE.RGBFormat;

var material = new THREE.MeshBasicMaterial( { map: texture } );

var geometry = new THREE.SphereGeometry(0.5, 100, 100);
var mesh = new THREE.Mesh( geometry, material );

scene.add( mesh );

camera.position.z = 1;

function animate()
{
    mesh.rotation.y += 0.01;
    requestAnimationFrame( animate );
    renderer.render( scene, camera );
}
animate();

This produces the following:

There are a few problems:

  • The texture is rotated 90 degrees
  • The ground is distorted, although this may be fixed if the rotation is fixed?
  • Update: Upon further investigation of the sphere being produced, it's not actually rotated 90 degrees. Instead, the top of the image is the top of the sphere and the bottom of the image is the bottom of the sphere. This causes the left and right edges of the image to become the distorted "sideways ground" I saw
  • This is on the outside of the sphere. I want to project this to the inside of the sphere (and place the camera inside the sphere)

Currently if I place the camera inside the sphere I get solid black. I don't think it's a lighting issue because the Three.JS docs said that a MeshBasicMaterial didn't need lighting. I think the issue may be that the normals of all of the sphere faces point outward and I need to reverse them. I'm not sure how one would do this - but I'm pretty sure it's possible since I think this is how skyboxes work.

Doing some research I'm pretty sure I need to modify the "UV"s to fix this, I just don't know how or really what that even means...
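Update: I've since worked out the answer to my own terminology question, so for anyone else landing here: a "UV" is a pair of coordinates (u, v), each in the range [0, 1], stored per vertex of a mesh; it tells the renderer which point of the 2D texture that vertex should sample. As a rough sketch (this helper and its name are my own illustration, not part of Three.JS):

```javascript
// A UV coordinate maps a mesh vertex onto the 2D texture.
// u and v each run from 0 to 1; (0, 0) is one corner of the
// texture and (1, 1) the diagonally opposite corner.
// This illustrative helper converts a UV pair into pixel
// coordinates on a texture of the given dimensions.
function uvToPixel(u, v, textureWidth, textureHeight) {
    return {
        x: Math.round(u * (textureWidth - 1)),
        y: Math.round(v * (textureHeight - 1))
    };
}
```

So "modifying the UVs" just means reassigning each vertex's (u, v) pair so it samples a different part of the texture — which is what the solution below does.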

Solution

Working Example

I forked @manthrax's CodeSandbox.io solution and updated it with my own:

https://codesandbox.io/s/4w1njkrv9

The Solution

So after spending a day researching UV mapping to understand what it meant and how it worked, I was able to sit down and scratch out some trig to map points on a sphere to points on my stereographic image. It basically came down to the following:

  1. Use arccosine of the Y coordinate to determine the magnitude of a polar coordinate on the stereographic image
  2. Use the arctangent of the X and Z coordinates to determine the angle of the polar coordinate on the stereographic image
  3. Use x = Rcos(theta), y = Rsin(theta) to compute the rectangular coordinates on the stereographic image

If time permits I may draw a quick image in Illustrator or something to explain the math, but it's standard trigonometry.
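The three steps can be sketched as a standalone function. Note this is a simplified version that ignores the FOV clamping described next, and the function name is my own:

```javascript
// Map a point (x, y, z) on the unit sphere to (u, v) on a stereographic
// (fisheye) texture whose circle is centred at (0.5, 0.5) with radius 0.5.
// The top of the sphere (y = 1) maps to the image centre; the bottom
// (y = -1) maps to the rim.
function sphereToStereographicUV(x, y, z) {
    // Step 1: arccos of Y gives the distance from the image centre.
    // acos(y) runs from 0 (top) to PI (bottom); normalise to [0, 0.5].
    var radius = Math.acos(y) / Math.PI * 0.5;
    // Step 2: arctangent of X and Z gives the polar angle.
    var angle = Math.atan2(x, z);
    // Step 3: convert polar to rectangular coordinates on the texture.
    return {
        u: radius * Math.cos(angle) + 0.5,
        v: radius * Math.sin(angle) + 0.5
    };
}
```

Feeding it the vertex normals of a unit sphere (as the full code below does) assigns each vertex the matching point on the fisheye image.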

I went a step further after this, because the camera I was using only has a 240 degree vertical viewing angle - which caused the image to get slightly distorted (especially near the ground). By subtracting the vertical viewing angle from 360 and dividing by two, you get an angle from the vertical within which no mapping should occur. Because the sphere is oriented along the Y axis, this angle maps to a particular Y coordinate - above which there's data, and below which there isn't.

  1. Calculate this "minimum Y value"
  2. For all points on the sphere:
    • If the point is above the minimum Y value, scale it linearly so that the first such value is counted as "0" and the top of the sphere is still counted as "1" for mapping purposes
    • If the point is below the minimum Y value, return nothing
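Isolated from the full listing below, the cutoff and the linear rescaling look like this (the helper names are mine):

```javascript
// Half-angle of the cap with no image data, measured from the pole:
// (360 - vFov) / 2 degrees. On a unit sphere oriented along Y this
// corresponds to a cutoff at cos(capAngle). (In my code it ends up
// named maxY because of the 180-degree flip mentioned below.)
function fovCutoffY(vFovDegrees) {
    var capAngleRad = Math.PI * (360 - vFovDegrees) / 180 / 2;
    return Math.cos(capAngleRad);
}

// Linearly rescale y from [-1, cutoff] onto [-1, 1], so the cutoff
// itself maps to 1 and the opposite pole stays at -1 (this mirrors
// the scaledY line in the full code).
function rescaleY(y, cutoff) {
    return (((y + 1) / (cutoff + 1)) * 2) - 1;
}
```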

Weird Caveats

For some reason the code I wrote flipped the image upside down. I don't know if I messed up on my trigonometry or if I messed up on my understanding of UV maps. Whatever the case, this was trivially fixed by flipping the sphere 180 degrees after mapping.

As well, I don't know how to "return nothing" in the UV map, so instead I mapped all points below the minimum Y value to the corner of the image (which was black).

With a 240-degree viewing angle the space at the bottom of the sphere with no image data was sufficiently large (on my monitor) that I could see the black circle when looking directly ahead. I didn't like the visual appearance of this, so I plugged in 270 for the vertical FOV. This leads to minor distortion around the ground, but not as bad as when using 360.

The Code

Here's the code I wrote for updating the UV maps:

// Enter the vertical FOV for the camera here
var vFov = 270; // = 240;

var material = new THREE.MeshBasicMaterial( { map: texture, side: THREE.BackSide } );
var geometry = new THREE.SphereGeometry(0.5, 200, 200);

function updateUVs()
{
    var maxY = Math.cos(Math.PI * (360 - vFov) / 180 / 2);
    var faceVertexUvs = geometry.faceVertexUvs[0];
    // The sphere consists of many FACES
    for ( var i = 0; i < faceVertexUvs.length; i++ )
    {
        // For each face...
        var uvs = faceVertexUvs[i];
        var face = geometry.faces[i];
        // A face is a triangle (three vertices)
        for ( var j = 0; j < 3; j ++ )
        {
            // For each vertex...
            // x, y, and z refer to the point on the sphere in 3d space where this vertex resides
            var x = face.vertexNormals[j].x;
            var y = face.vertexNormals[j].y;
            var z = face.vertexNormals[j].z;

            // Because our stereograph goes from 0 to 1 but our vertical field of view cuts off our Y early
            var scaledY = (((y + 1) / (maxY + 1)) * 2) - 1;

            // uvs[j].x, uvs[j].y refer to a point on the 2d texture
            if (y < maxY)
            {
                var radius = Math.acos(1 - ((scaledY / 2) + 0.5)) / Math.PI;
                var angle = Math.atan2(x, z);

                uvs[j].x = (radius * Math.cos(angle)) + 0.5;
                uvs[j].y = (radius * Math.sin(angle)) + 0.5;
            } else {
                uvs[j].x = 0;
                uvs[j].y = 0;
            }
        }
    }
    // For whatever reason my UV mapping turned everything upside down
    // Rather than fix my math, I just replaced "minY" with "maxY" and
    // rotated the sphere 180 degrees
    geometry.rotateZ(Math.PI);
    geometry.uvsNeedUpdate = true;
}
updateUVs();

var mesh = new THREE.Mesh( geometry, material );

The Results

Now if you add this mesh to a scene everything looks perfect:

One Thing I Still Don't Understand

Right around the "hole" at the bottom of the sphere there's a multi-colored ring. It almost looks like a mirror of the sky. I don't know why this exists or how it got there. Could anyone shed light on this in the comments?
