Sensor fusion using complementary filter

Problem description

I am trying to learn sensor fusion, and for that I have recorded raw data from the accelerometer, gyroscope and magnetometer via an Android app.

I came across Kalman filters, but they are too complex to understand, and I do not want to just take some code and implement it without properly understanding it.

I then found this link for the complementary filter, and it looks very promising as it is easy to understand. So I have the following doubts. (This is the first time I am dealing with these sensors, so I am going to ask all the questions I have.)


  1. The complementary filter takes signals from the sensors and outputs orientation in terms of pitch, roll and yaw. Does that mean it filters the time-domain signals and provides output in terms of angles? In that case, is it possible to obtain the filtered time-domain signal?
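(For reference: the single-axis form of such a filter is usually a one-line blend of the gyro-integrated angle and the accelerometer-derived angle. A minimal sketch, not taken from the linked article; `alpha` is an assumed tuning constant:)

```java
// Minimal one-axis complementary filter sketch (illustrative only).
// angleAcc: angle computed from the accelerometer (noisy but drift-free), radians
// gyroRate: angular rate from the gyroscope, rad/s
// dt: time since the last sample, seconds
public class ComplementaryFilter {
    private final double alpha;   // weight of the gyro path, e.g. 0.98
    private double angle = 0.0;   // fused angle estimate

    public ComplementaryFilter(double alpha) {
        this.alpha = alpha;
    }

    public double update(double angleAcc, double gyroRate, double dt) {
        // High-pass the integrated gyro, low-pass the accelerometer angle.
        angle = alpha * (angle + gyroRate * dt) + (1.0 - alpha) * angleAcc;
        return angle;
    }
}
```

With a constant accelerometer angle and a still gyro, the estimate converges to the accelerometer angle, which is exactly the drift-correction behaviour the filter is used for.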

I came across this famous Google Tech Talk video, and in it he mentions that to obtain linear acceleration you need to subtract gravity from the raw accelerometer data. How do I obtain the gravity vector?

Also, I am slightly confused about why the acceleration signal has to be converted to the Earth coordinate system. I have read some documents, but I am still confused. I can see why it is done, but not how the required rotation matrix is calculated.

Last (but surely not least), how do I estimate heading?

So basically, I have the sensor data and I want to track the orientation of the device and the direction in which the person is heading. The questions may sound very basic, but I need some clarification from experts on this topic, so that I can then go and work on some fancy algorithms.

I would really appreciate it if someone could point me in the right direction.

Regards,

Chintan

Recommended answer

From my experience there is no AHRS algorithm that can compete with an extended Kalman filter in terms of accuracy. And accuracy is very important if you want to calculate the user acceleration, because any inaccuracy in your rotation matrix will result in a drift in your user acceleration.

To question 1: I don't understand exactly what you mean by filtered time-domain signals. The measurement samples always come with a timestamp.

To questions 2 and 3:

To calculate the user acceleration you need to calculate the attitude (the rotation matrix) beforehand, because you need to rotate the incoming accelerometer data by the attitude calculated by your AHRS algorithm to get it from "phone space" to "world space". That way, an upward movement of the phone (no matter which orientation) will always result in an increased Y value in your user acceleration. I hope you get what I mean. Now we have the raw accelerometer data in world space and subtract gravity ( vector3(0, 9.81f, 0) ), so that our new user acceleration always shows (0, 0, 0) if there is no movement.
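The rotate-then-subtract step above can be sketched as follows. This is illustrative only; the row-major matrix layout is an assumption, and the Y-up gravity vector matches the `vector3(0, 9.81f, 0)` convention in the text:

```java
// Sketch: transform a raw accelerometer sample from phone-space to world-space
// and remove gravity (illustrative; row-major matrix layout is an assumption).
public class UserAcceleration {
    // Apply a 3x3 row-major rotation matrix (phone -> world) to a vector.
    public static double[] rotate(double[] m, double[] v) {
        return new double[] {
            m[0]*v[0] + m[1]*v[1] + m[2]*v[2],
            m[3]*v[0] + m[4]*v[1] + m[5]*v[2],
            m[6]*v[0] + m[7]*v[1] + m[8]*v[2]
        };
    }

    // World-space user acceleration = R * accRaw - gravity (Y-up convention).
    public static double[] userAcceleration(double[] rotMat, double[] accRaw) {
        double[] world = rotate(rotMat, accRaw);
        world[1] -= 9.81; // subtract gravity on the world Y axis
        return world;
    }
}
```

With the phone lying flat (identity attitude) and the accelerometer reading (0, 9.81, 0), the result is (0, 0, 0), as described above.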

This was the easy part. We now have the user acceleration in world space. But we want a positional offset (the path). You cannot just integrate acceleration to velocity and afterwards velocity to path, because the measurement samples of the accelerometer are never exact enough to integrate twice all the way to the path. You have to program constraints to control your derived velocity, so that it is set back to zero when the value and slope of the acceleration are zero. Otherwise there will always be a remaining amount of velocity, resulting in a huge drift of the calculated path over time. I think for the best inside-out positional tracking you will need to do some analysis on the (world-space) user acceleration and reconstruct a clean velocity graph, to get smooth movements that always return to zero when there is no acceleration. I programmed this myself, and it works, but it is not exact. One problem is that the recognized movement depends on the velocity/acceleration: the slower the movements are, the lower the accelerometer values, until they get lost in the sensor noise. Another problem is recognizing when a movement has ended, in order to remove all of its influence on the resulting velocity.
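The velocity constraint described above can be sketched as a simple zero-velocity reset. The thresholds below are made-up illustration values, not tuned ones:

```java
// Sketch: integrate acceleration to velocity, but reset the velocity to zero
// when the acceleration has been near zero for a while (a simple
// zero-velocity update; thresholds are illustrative assumptions).
public class VelocityTracker {
    private double velocity = 0.0;
    private int quietSamples = 0;
    private static final double ACC_THRESHOLD = 0.05; // m/s^2, assumed noise floor
    private static final int QUIET_LIMIT = 10;        // samples of stillness before reset

    public double update(double acc, double dt) {
        velocity += acc * dt;
        if (Math.abs(acc) < ACC_THRESHOLD) {
            quietSamples++;
            if (quietSamples >= QUIET_LIMIT) velocity = 0.0; // kill residual drift
        } else {
            quietSamples = 0;
        }
        return velocity;
    }
}
```

Without the reset, any small bias in the world-space acceleration would integrate into an ever-growing velocity, which is exactly the drift problem described above.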

The magnetometer sensor is not needed for the AHRS algorithm, because it is not reliable enough and will always introduce errors. The magnetometer can be affected too much by the environment. For example, look at the Google Cardboard magnet switch: open up a sensor-test app, watch the magnetometer sensor and pull the Google Cardboard trigger. It will produce a huge value on the magnetometer which does not represent the heading at all. The same thing may happen near microwaves etc. So to get a good north heading you constantly have to check whether the direction and magnitude of the magnetic field have stayed unchanged for a specific time and are reasonable values. Then you can use the magnetometer data as a reference for rotating the orientation rotation matrix you got from the AHRS algorithm, to correct the heading to north.
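The plausibility check described above could look roughly like this; the field-strength range and jump threshold are assumptions for illustration, not calibrated values:

```java
// Sketch: only trust the magnetometer when the field magnitude is stable
// between samples and within a plausible Earth-field range
// (all thresholds are illustrative assumptions).
public class MagnetometerGate {
    private static final double MIN_FIELD = 25.0; // microtesla, rough lower bound
    private static final double MAX_FIELD = 65.0; // microtesla, rough upper bound
    private static final double MAX_JUMP = 5.0;   // allowed change per sample, microtesla
    private double lastMagnitude = Double.NaN;

    public boolean isTrustworthy(double[] mag) {
        double magnitude = Math.sqrt(mag[0]*mag[0] + mag[1]*mag[1] + mag[2]*mag[2]);
        boolean stable = Double.isNaN(lastMagnitude)
                || Math.abs(magnitude - lastMagnitude) < MAX_JUMP;
        lastMagnitude = magnitude;
        return stable && magnitude >= MIN_FIELD && magnitude <= MAX_FIELD;
    }
}
```

A Cardboard-trigger pull or a nearby appliance would show up as a sudden magnitude jump and be rejected, while a slowly rotating but otherwise clean field would pass.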

Answer to 4: You get the heading from your rotation matrix.

vector3 headingDirection = new vector3(rotMat[8], rotMat[9], rotMat[10]);

Depending on the form of your rotation matrix (column-major or row-major), you may have to adjust the indices.
Take a look at John Schultz's answer here:
http://www.gamedev.net/topic/319213-direction-vector-from-rotation-matrix/
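If what you want is a compass angle rather than a vector, the heading vector above can be converted with `atan2`. A sketch, assuming a Y-up world frame with X and Z spanning the horizontal plane:

```java
// Sketch: convert the horizontal components of the heading vector into a
// compass-style angle in degrees (Y-up frame assumed; axis convention is
// an illustrative assumption, adjust to your rotation-matrix layout).
public class Heading {
    public static double headingDegrees(double x, double z) {
        double deg = Math.toDegrees(Math.atan2(x, z));
        return (deg + 360.0) % 360.0; // normalize to [0, 360)
    }
}
```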

The rotation matrix should be estimated by adding the current rotation speed (gyroscope), multiplied by the time elapsed between your last estimated rotation and now.
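That prediction step can be sketched with the first-order update R' = R · (I + [ω]× · dt). This is an illustration of the idea, not the Cardboard code:

```java
// Sketch: advance a row-major 3x3 rotation matrix by one small rotation
// (gyro rates in rad/s times dt) using the first-order approximation
// R' = R * (I + skew(w) * dt). Illustrative only; real code should
// re-orthogonalize the matrix periodically.
public class GyroIntegrator {
    public static double[] step(double[] r, double wx, double wy, double wz, double dt) {
        // Small-angle update matrix: I + skew(w) * dt
        double[] u = {
            1.0,    -wz*dt,  wy*dt,
            wz*dt,   1.0,   -wx*dt,
           -wy*dt,   wx*dt,  1.0
        };
        double[] out = new double[9];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                double s = 0.0;
                for (int k = 0; k < 3; k++) s += r[i*3 + k] * u[k*3 + j];
                out[i*3 + j] = s;
            }
        }
        return out;
    }
}
```

With zero rates the matrix is unchanged; a small rate about one axis nudges the corresponding off-diagonal terms, which is the behaviour the first-order approximation is meant to capture.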

Note:

I think if you want to play around with sensor fusion and user acceleration, it may be best to use the extended Kalman filter from the cardboard.jar as a starting point. You can compare it to your algorithm.

Look here:
https://github.com/Zomega/Cardboard/blob/master/src/com/google/vrtoolkit/cardboard/sensors/internal/OrientationEKF.java

Although it has a method for using the magnetometer (processMag), this method never gets called in the Cardboard API.

The method "getPredictedGLMatrix" in the linked file shows how Google estimates the "current" rotation matrix.

I hope this answers some of your questions.
