Fit plane to a set of points in 3D: scipy.optimize.minimize vs scipy.linalg.lstsq

Question

Given a set of points in 3D, the general problem is to find the a, b, c coefficients of a plane equation in the form:

z = a*x + b*y + c

such that the resulting plane is the best fit possible to that set of points.

  1. In this SO answer, the function scipy.optimize.minimize is used to solve this problem.

It relies on initial guesses for the coefficients, and minimizes an error function that sums the distances of each point to the surface of the plane.

  2. In this code (based on this other SO answer), the scipy.linalg.lstsq function is used to tackle the same problem (when restricted to a 1st-order polynomial).

It solves for C in the equation z = A*C, where A is the concatenation of the x,y coordinates of the set of points, z is the z coordinate of the set, and C holds the a, b, c coefficients.

Unlike the code in the method above, this one does not seem to require initial guesses for the plane coefficients.
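
To make the second setup concrete, here is a minimal sketch of the A matrix described above (the sample points, plane coefficients, and noise level are made up purely for illustration):

    import numpy as np
    from scipy.linalg import lstsq

    # Synthetic example: points scattered around the plane z = 2x - 3y + 5
    # (coefficients and noise level are arbitrary, for illustration only).
    rng = np.random.default_rng(0)
    xs = rng.uniform(-10, 10, 50)
    ys = rng.uniform(-10, 10, 50)
    zs = 2.0 * xs - 3.0 * ys + 5.0 + rng.normal(scale=0.1, size=50)

    # A concatenates the x and y coordinates plus a column of ones
    # (the ones column corresponds to the constant term c).
    A = np.column_stack([xs, ys, np.ones_like(xs)])

    # Solve z = A*C in the least-squares sense; C holds the a, b, c coefficients.
    C, residues, rank, sv = lstsq(A, zs)
    print(C)  # approximately [2, -3, 5]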

Since the minimize function requires an initial guess, it may or may not converge to the optimal solution (depending on how good the guess is). Does the second method have a similar caveat, or will it always return an exact solution?

Answer

Least squares (scipy.linalg.lstsq) is guaranteed to converge. In fact, there is a closed-form analytical solution, given by (A^T A)^-1 A^T b (where ^T is the matrix transpose and ^-1 is the matrix inverse).
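
As a sanity check, here is a small sketch (with made-up data points) showing that the normal-equations formula and lstsq agree:

    import numpy as np
    from scipy.linalg import lstsq

    # Four made-up points lying near the plane z = x + 2y + 3.
    A = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 1.0]])
    z = np.array([3.0, 4.0, 5.0, 6.1])

    # Closed-form solution via the normal equations: C = (A^T A)^-1 A^T z.
    C_closed = np.linalg.inv(A.T @ A) @ (A.T @ z)

    # lstsq reaches the same result without forming the inverse explicitly,
    # which is numerically safer (and copes with rank-deficient A).
    C_lstsq, *_ = lstsq(A, z)
    print(np.allclose(C_closed, C_lstsq))  # True, up to floating-point error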

The standard optimization problem, however, is not generally solvable - we are not guaranteed to find a minimizing value. But for the given problem - finding some a, b, c such that z = a*x + b*y + c - we have a linear optimization problem (the constraints and objective are linear in the variables we are trying to optimize for). Linear optimization problems are generally solvable, so scipy.optimize.minimize should converge to the optimal value.
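
For comparison, here is a minimal sketch of the minimize-based approach, assuming a sum-of-squared vertical residuals as the error function (the linked answer's exact error function may differ) and an arbitrary initial guess:

    import numpy as np
    from scipy.optimize import minimize

    # Made-up sample points near the plane z = 2x - 3y + 5 (illustration only).
    rng = np.random.default_rng(1)
    xs = rng.uniform(-5, 5, 30)
    ys = rng.uniform(-5, 5, 30)
    zs = 2.0 * xs - 3.0 * ys + 5.0 + rng.normal(scale=0.05, size=30)

    def plane_error(params, xs, ys, zs):
        # Sum of squared vertical distances between the points and the plane.
        a, b, c = params
        return np.sum((a * xs + b * ys + c - zs) ** 2)

    # The optimizer needs a starting point; a poor guess may slow it down,
    # but for this well-behaved objective it still reaches the optimum.
    initial_guess = np.zeros(3)
    result = minimize(plane_error, initial_guess, args=(xs, ys, zs))
    print(result.x)  # close to [2, -3, 5]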

Note: this is linear in our constraints even if we do z = a*x + b*y + d*x^2 + e*y^2 + f -- we don't have to restrict ourselves to a linear space of (x,y), since we will have these points (x, y, x^2, y^2) already. To our algorithm, these just look like points in the matrix A. So we can actually get a higher-order polynomial using least squares!
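
A quick sketch of that idea, adding x^2 and y^2 columns to the matrix A (the sample surface and noise are made up for illustration):

    import numpy as np
    from scipy.linalg import lstsq

    # Made-up points near the surface z = 1 + 2x - y + 0.5x^2 - 0.25y^2.
    rng = np.random.default_rng(2)
    x = rng.uniform(-3, 3, 100)
    y = rng.uniform(-3, 3, 100)
    z = 1 + 2*x - y + 0.5*x**2 - 0.25*y**2 + rng.normal(scale=0.01, size=100)

    # Same least-squares machinery, just with extra columns: to the solver,
    # x^2 and y^2 are simply more "coordinates" in A.
    A = np.column_stack([x, y, x**2, y**2, np.ones_like(x)])
    coeffs, *_ = lstsq(A, z)
    print(coeffs)  # approximately [2, -1, 0.5, -0.25, 1]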

A brief aside: In reality, all of the solvers which don't use an exact analytical solution generally stop within some acceptable range of the actual answer, so it is rarely the case that we get the exact solution, but it tends to be so close that we accept it as exact in practice. Additionally, even least-squares solvers rarely use the analytical solution and instead resort to something quicker like Newton's Method.

If you were to change the optimization problem, this would not be true. There are certain classes of problems for which we can generally find an optimal value (the largest class of these are called convex optimization problems -- although there are many non-convex problems which we can find an optimal value for under certain conditions).

If you're interested in learning more, take a look at Convex Optimization by Boyd and Vandenberghe. The first chapter doesn't require much mathematical background, and it gives an overview of the general optimization problem and how it relates to solvable optimization problems like linear and convex programming.
