Rajawali rotating camera with Sensor.TYPE_ROTATION_VECTOR strange behavior


Problem description


I'm creating a Panorama view which allows the user to look around in a spherical image by rotating his smartphone. I used Rajawali's Skybox for this together with the TYPE_ROTATION_VECTOR sensor.

I got it working, but only when I look forward; whether the mapping is correct literally depends on my rotation around the vertical axis (my yaw).

This is the behavior:

  • looking forward: yaw = yaw, pitch = pitch and roll = roll
  • looking to the left: yaw = yaw, pitch = roll and roll = pitch
  • looking backwards: yaw = yaw, pitch = pitch * -1 and roll = roll * -1.

Now I do have a hunch about what is going on. It seems the "camera object" keeps facing the same direction, even if it doesn't appear to. That means pitching looks the same as rolling, but it is actually still pitching, because the object itself hasn't rotated. I'm comparing it to looking around while sitting in an airplane.
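That hunch can be checked with plain linear algebra: expressing a world-frame pitch in the frame of a device that has already yawed 90 degrees turns it into a roll, which is exactly the axis swap listed above. A small, Android-free demonstration (the class and helper names are illustrative, not from the code in question):

```java
// Demonstration: conjugating a pitch (rotation about X) by a 90-degree yaw
// (rotation about Y) yields a roll (rotation about Z). This is why pitch and
// roll appear to swap once the device has turned left or right.
public class EulerSwapDemo {
    // Right-handed 3x3 rotations about Y (yaw), X (pitch) and Z (roll); radians.
    public static double[][] rotY(double a) {
        return new double[][]{{Math.cos(a), 0, Math.sin(a)}, {0, 1, 0}, {-Math.sin(a), 0, Math.cos(a)}};
    }
    public static double[][] rotX(double a) {
        return new double[][]{{1, 0, 0}, {0, Math.cos(a), -Math.sin(a)}, {0, Math.sin(a), Math.cos(a)}};
    }
    public static double[][] rotZ(double a) {
        return new double[][]{{Math.cos(a), -Math.sin(a), 0}, {Math.sin(a), Math.cos(a), 0}, {0, 0, 1}};
    }
    public static double[][] mul(double[][] a, double[][] b) {
        double[][] c = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++) c[i][j] += a[i][k] * b[k][j];
        return c;
    }
    public static boolean close(double[][] a, double[][] b) {
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                if (Math.abs(a[i][j] - b[i][j]) > 1e-9) return false;
        return true;
    }
    public static void main(String[] args) {
        double yaw = Math.PI / 2, pitch = 0.3;
        // A world-frame pitch, seen from a device that has yawed 90 degrees...
        double[][] conjugated = mul(mul(rotY(yaw), rotX(pitch)), rotY(-yaw));
        // ...is a roll about -Z by the same angle: the axes have swapped.
        System.out.println(close(conjugated, rotZ(-pitch))); // prints true
    }
}
```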

What do I need to do to solve this?

I have a feeling I'm going to have to rotate the camera with lookAt(), but I'm not sure how.

public class SkyboxFragment extends RajawaliFragment implements SensorEventListener {

    public static final String TAG = "SkyBoxFragment";
    private SensorManager mSensorManager;
    private float[] orientationVals = new float[3];
    private float[] mRotationMatrix = new float[16];
    private Sensor mRotVectSensor;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        super.onCreateView(inflater, container, savedInstanceState);

        LinearLayout ll = new LinearLayout(getActivity());
        ll.setOrientation(LinearLayout.VERTICAL);
        ll.setGravity(Gravity.CENTER_HORIZONTAL | Gravity.TOP);
        mSensorManager = (SensorManager) getActivity().getSystemService(
                Context.SENSOR_SERVICE);
        mRotVectSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        mLayout.addView(ll);
        mSensorManager.registerListener(this,
                mRotVectSensor,
                10000);
        return mLayout;
    }

    @Override
    public AExampleRenderer createRenderer() {
        mRenderer = new SkyboxRenderer(getActivity());
        return ((SkyboxRenderer) mRenderer);
    }

    @Override
    public void onClick(View v) {

    }

    @Override
    public void onSensorChanged(SensorEvent event) {

        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
            SensorManager.remapCoordinateSystem(mRotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, mRotationMatrix);
            SensorManager.getOrientation(mRotationMatrix, orientationVals);
            orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
            orientationVals[1] = (float) Math.toDegrees(orientationVals[1]) * -1;
            orientationVals[2] = (float) Math.toDegrees(orientationVals[2]) * -1;
            //Log.d(TAG,  "YAW:" + orientationVals[0] + " PITCH:" + orientationVals[1] + "ROLL:" + orientationVals[2]);
        }

    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {

    }

    private final class SkyboxRenderer extends AExampleRenderer implements View.OnClickListener {

        private final Vector3 mAccValues;
        boolean odd = true;

        public SkyboxRenderer(Context context) {
            super(context);
            mAccValues = new Vector3();
        }

        @Override
        protected void initScene() {
            getCurrentCamera().setFarPlane(1000);
            /**
             * Skybox images by Emil Persson, aka Humus. http://www.humus.name humus@comhem.se
             */
            try {
                getCurrentScene().setSkybox(R.drawable.posx, R.drawable.negx,
                        R.drawable.posy, R.drawable.negy, R.drawable.posz, R.drawable.negz);
            } catch (ATexture.TextureException e) {
                e.printStackTrace();
            }
        }

        @Override
        protected void onRender(long ellapsedRealtime, double deltaTime) {
            super.onRender(ellapsedRealtime, deltaTime);
            getCurrentCamera().setRotation(orientationVals[2], orientationVals[0], orientationVals[1]);
        }

        @Override
        public void onClick(View v) {
            try {
                if (odd) {
                    /**
                     * Skybox images by Emil Persson, aka Humus. http://www.humus.name humus@comhem.se
                     */
                    getCurrentScene().updateSkybox(R.drawable.posx2, R.drawable.negx2,
                            R.drawable.posy2, R.drawable.negy2, R.drawable.posz2, R.drawable.negz2);
                } else {
                    /**
                     * Skybox images by Emil Persson, aka Humus. http://www.humus.name humus@comhem.se
                     */
                    getCurrentScene().updateSkybox(R.drawable.posx, R.drawable.negx,
                            R.drawable.posy, R.drawable.negy, R.drawable.posz, R.drawable.negz);
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                odd = !odd;
            }
        }

        public void setAccelerometerValues(float x, float y, float z) {
            mAccValues.setAll(-x, -y, -z);
        }
    }

}

Solution

You have two problems. The first is the one you are describing, but the other is that TYPE_ROTATION_VECTOR fuses in the magnetometer, so it is affected by nearby magnets such as those found in phone cases.

Solving the magnet problem

A solution could be to use a combination of the accelerometer and the gyroscope. Luckily, the Google Cardboard SDK already abstracted this away.

You can track the current rotation by instantiating an instance of com.google.vrtoolkit.cardboard.sensors.HeadTracker using HeadTracker.createFromContext(this.getActivity()) and calling startTracking() on it.
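A minimal sketch of that wiring, assuming the Cardboard SDK's `HeadTracker` (its `createFromContext`, `startTracking` and `stopTracking` methods) and a `headTracker` field the renderer can read; tying it to `onResume`/`onPause` is a suggested placement, not taken from the original code:

```java
// Sketch: managing the HeadTracker with the fragment lifecycle.
// headTracker is assumed to be a field shared with SkyboxRenderer.
private HeadTracker headTracker;

@Override
public void onResume() {
    super.onResume();
    headTracker = HeadTracker.createFromContext(getActivity());
    headTracker.startTracking(); // begin fusing gyroscope + accelerometer
}

@Override
public void onPause() {
    super.onPause();
    headTracker.stopTracking(); // release the sensors while not visible
}
```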

Now you don't need onSensorChanged anymore. Instead, in your onRender, you can do this:

float[] R = new float[16];
headTracker.getLastHeadView(R, 0);

to get the rotation matrix. This solves the (unstated) problem of magnetic interference.

Getting the camera to look around properly

The easiest way to use this rotation matrix to point the camera in the right direction is to convert it to an org.rajawali3d.math.Quaternion and then call getCurrentCamera().setCameraOrientation(quaternion);

The conversion from float[16] to a quaternion can be difficult to get right, but once again, the Google Cardboard SDK did it for you. In this case, it's in the source code of an old class that is no longer used: HeadTransform.

You can easily adapt that code to return new Quaternion(w, x, y, z);.
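As a sketch, here is the standard matrix-to-quaternion conversion (not the Cardboard SDK's verbatim code) for a column-major OpenGL `float[16]` such as the one `getLastHeadView` fills in. It returns plain doubles so it has no Rajawali dependency; you would wrap the result as `new Quaternion(w, x, y, z)`:

```java
// Standard rotation-matrix -> quaternion conversion (branching on the largest
// diagonal term for numerical stability), for a column-major 4x4 OpenGL matrix.
// Returns {w, x, y, z}; wrap as new Quaternion(w, x, y, z) for Rajawali.
public class MatrixToQuat {
    public static double[] quatFromMatrix(float[] m) {
        // Column-major indexing: element (row, col) is m[col * 4 + row].
        double m00 = m[0], m10 = m[1], m20 = m[2];
        double m01 = m[4], m11 = m[5], m21 = m[6];
        double m02 = m[8], m12 = m[9], m22 = m[10];
        double w, x, y, z;
        double tr = m00 + m11 + m22;
        if (tr > 0) {
            double s = Math.sqrt(tr + 1.0) * 2; // s = 4w
            w = 0.25 * s;
            x = (m21 - m12) / s;
            y = (m02 - m20) / s;
            z = (m10 - m01) / s;
        } else if (m00 > m11 && m00 > m22) {
            double s = Math.sqrt(1.0 + m00 - m11 - m22) * 2; // s = 4x
            w = (m21 - m12) / s;
            x = 0.25 * s;
            y = (m01 + m10) / s;
            z = (m02 + m20) / s;
        } else if (m11 > m22) {
            double s = Math.sqrt(1.0 + m11 - m00 - m22) * 2; // s = 4y
            w = (m02 - m20) / s;
            x = (m01 + m10) / s;
            y = 0.25 * s;
            z = (m12 + m21) / s;
        } else {
            double s = Math.sqrt(1.0 + m22 - m00 - m11) * 2; // s = 4z
            w = (m10 - m01) / s;
            x = (m02 + m20) / s;
            y = (m12 + m21) / s;
            z = 0.25 * s;
        }
        return new double[]{w, x, y, z};
    }
}
```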

Now, on its own, this will exhibit the same issue your current code would have if you did not multiply orientationVals[1] and orientationVals[2] by -1.

That problem, however, is easily solved by inverting the rotation matrix. That would result in the following code in onRender (assuming getQuaternion(R) returns an org.rajawali3d.math.Quaternion):

@Override
protected void onRender(long ellapsedRealtime, double deltaTime) {
    super.onRender(ellapsedRealtime, deltaTime);

    float[] R = new float[16];
    headTracker.getLastHeadView(R, 0);

    android.opengl.Matrix.invertM(R, 0, R, 0);

    Quaternion q = getQuaternion(R);

    getCurrentCamera().setCameraOrientation(q);
}
