Least square optimization (of matrices) in R


Problem description

Yesterday I asked a question about least square optimization in R, and it turned out that the lm function was what I was looking for.

Now I have another least square optimization question, and I am wondering whether lm can solve this problem too or, if not, how it can be handled in R.

I have fixed matrices B (of dimension n x m) and V (of dimension n x n), and I am looking for an m-long vector u such that

       sum( ( V - ( B %*% diag(u) %*% t(B)) )^2 )

is minimized.
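Note that the objective is linear in u: B %*% diag(u) %*% t(B) is a weighted sum of the outer products of the columns of B, which is why ordinary least squares applies. A quick sketch (the small B and u here are made up only for illustration):

```r
# B diag(u) B' = sum_j u_j * b_j b_j', where b_j is the j-th column of B
set.seed(1)
B <- matrix(rnorm(8), 4, 2)   # made-up 4 x 2 example
u <- c(2, -1)

lhs <- B %*% diag(u) %*% t(B)
rhs <- u[1] * tcrossprod(B[, 1]) + u[2] * tcrossprod(B[, 2])
all.equal(lhs, rhs)  # TRUE
```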

Answer

1) lm.fit Using the identity

vec(AXA') = (A ⊗ A) vec(X)

we get:

# (A here plays the role of B in the question)
k <- ncol(A)
# X = diag(u) is diagonal, so only the columns of A %x% A that multiply the
# diagonal entries of vec(X) are needed; c(diag(k)) == 1 picks them out
AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
lm.fit(AA1, c(V))
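The identity used above is the special case B = A' of the standard relation vec(AXB) = (B' ⊗ A) vec(X), and c(M) in R is exactly vec(M) (column-wise stacking). A quick numeric spot-check with arbitrary random matrices:

```r
set.seed(42)
A <- matrix(rnorm(12), 4, 3)   # arbitrary 4 x 3 example
X <- matrix(rnorm(9), 3, 3)    # arbitrary 3 x 3 example

# c(M) stacks the columns of M, i.e. c(M) is vec(M)
all.equal(c(A %*% X %*% t(A)), c(kronecker(A, A) %*% c(X)))  # TRUE
```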

Here is a self-contained example:

# test data: BOD is a built-in 6 x 2 data frame, so V is 6 x 6 (36 entries)
set.seed(123)
A <- as.matrix(BOD)
u <- 1:2
V <- A %*% diag(u) %*% t(A) + rnorm(36)

# solve
k <- ncol(A)
AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
fm1 <- lm.fit(AA1, c(V))

giving roughly the original coefficients 1:2:

> coef(fm1)
      x1       x2 
1.011206 1.999575 
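One way to sanity-check the fit is to rebuild the fitted matrix from the estimated coefficients and compare its residual sum of squares with the one at the true u (a self-contained sketch repeating the example setup above):

```r
set.seed(123)
A <- as.matrix(BOD)            # built-in data, as in the example above
u <- 1:2
V <- A %*% diag(u) %*% t(A) + rnorm(36)

k <- ncol(A)
AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
fm1 <- lm.fit(AA1, c(V))

Vhat <- A %*% diag(coef(fm1)) %*% t(A)   # fitted matrix
rss_fit  <- sum((V - Vhat)^2)
rss_true <- sum((V - A %*% diag(u) %*% t(A))^2)
rss_fit <= rss_true   # TRUE: the least-squares fit can only improve on the true u
```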

2) nls We can alternatively use nls like this:

fm2 <- nls(c(V) ~ c(A %*% diag(x) %*% t(A)), start = list(x = numeric(k)))

giving the following for the above example:

> fm2
Nonlinear regression model
  model: c(V) ~ c(A %*% diag(x) %*% t(A))
   data: parent.frame()
   x1    x2 
1.011 2.000 
 residual sum-of-squares: 30.52

Number of iterations to convergence: 1 
Achieved convergence tolerance: 1.741e-09
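Since the model is linear in x, nls's default Gauss-Newton iteration converges in a single step from the zero start, and its solution should agree with lm.fit up to numerical tolerance (a self-contained sketch of that check, repeating the setup above):

```r
set.seed(123)
A <- as.matrix(BOD)
V <- A %*% diag(1:2) %*% t(A) + rnorm(36)
k <- ncol(A)

# linear-least-squares solution via lm.fit
AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
fm1 <- lm.fit(AA1, c(V))

# same problem via nls; the model is linear in x
fm2 <- nls(c(V) ~ c(A %*% diag(x) %*% t(A)), start = list(x = numeric(k)))

all.equal(unname(coef(fm1)), unname(coef(fm2)), tolerance = 1e-4)  # TRUE
```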

Update: Corrections and second solution.
