What is the difference between Θ(n) and O(n)?


Question


Sometimes I see Θ(n) with the strange Θ symbol with something in the middle of it, and sometimes just O(n). Is it just laziness of typing because nobody knows how to type this symbol, or does it mean something different?

Solution

Short explanation:

If an algorithm is of Θ(g(n)), it means that the running time of the algorithm as n (input size) gets larger is proportional to g(n).

If an algorithm is of O(g(n)), it means that the running time of the algorithm as n gets larger is at most proportional to g(n).

Normally, even when people talk about O(g(n)), they actually mean Θ(g(n)), but technically there is a difference.
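As a concrete illustration of that distinction (my own sketch, not part of the original answer), compare two small Python functions: one that must touch every element, and one that may stop early. Both have running time O(n), but only the first is Θ(n) over all inputs:

def sum_all(items):
    # Touches every element, so the running time grows in proportion to
    # len(items) on every input: it is Theta(n).
    total = 0
    for x in items:
        total += x
    return total


def find_first(items, target):
    # Stops as soon as target is found. The running time is O(n), never
    # worse than proportional to len(items), but it is not Theta(n),
    # because on favorable inputs it finishes after a constant number of steps.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1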


More technically:

O(n) represents an upper bound. Θ(n) means a tight bound. Ω(n) represents a lower bound.

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
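For reference, these are the standard quantifier-based definitions (as given in, e.g., Cormen et al.); the constants c, c1, c2 and the threshold n0 are what the statements above implicitly quantify over:

\[
\begin{aligned}
f(n) = O(g(n)) &\iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0,\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : 0 \le c\,g(n) \le f(n) \ \text{for all } n \ge n_0,\\
f(n) = \Theta(g(n)) &\iff \exists\, c_1, c_2 > 0,\ n_0 : c_1\,g(n) \le f(n) \le c_2\,g(n) \ \text{for all } n \ge n_0.
\end{aligned}
\]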

Basically when we say an algorithm is of O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2).

In fact, since f(n) = Θ(g(n)) means that, for sufficiently large values of n, f(n) can be bounded between c1·g(n) and c2·g(n) for some constants c1 and c2, the growth rate of f is asymptotically equal to that of g: g can be both a lower bound and an upper bound of f. This directly implies that f can be a lower bound and an upper bound of g as well. Consequently,

f(x) = Θ(g(x)) iff g(x) = Θ(f(x))

Similarly, to show f(n) = Θ(g(n)), it is enough to show that g is an upper bound of f (i.e. f(n) = O(g(n))) and that g is a lower bound of f (i.e. f(n) = Ω(g(n)), which is exactly the same as g(n) = O(f(n))). Concisely,

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and g(x) = O(f(x))
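To make the constants concrete, here is a small worked example (my own, not from the original answer): take f(n) = 3n + 5 and g(n) = n. Then

\[
3n \;\le\; 3n + 5 \;\le\; 4n \qquad \text{for all } n \ge 5,
\]

so the Θ definition is satisfied with c1 = 3, c2 = 4 and n0 = 5, i.e. 3n + 5 = Θ(n); the two inequalities separately witness 3n + 5 = Ω(n) and 3n + 5 = O(n).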


There are also little-oh and little-omega (ω) notations representing loose upper and loose lower bounds of a function.

To summarize:

f(x) = O(g(x)) (big-oh) means that the growth rate of f(x) is asymptotically less than or equal to the growth rate of g(x).

f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is asymptotically greater than or equal to the growth rate of g(x).

f(x) = o(g(x)) (little-oh) means that the growth rate of f(x) is asymptotically less than the growth rate of g(x).

f(x) = ω(g(x)) (little-omega) means that the growth rate of f(x) is asymptotically greater than the growth rate of g(x).

f(x) = Θ(g(x)) (theta) means that the growth rate of f(x) is asymptotically equal to the growth rate of g(x).
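A convenient rule of thumb connecting these notations (a sufficient condition only, and it assumes the limit of f(x)/g(x) exists as x goes to infinity):

\[
\lim_{x\to\infty}\frac{f(x)}{g(x)} = 0 \;\Rightarrow\; f(x) = o(g(x)), \qquad
\lim_{x\to\infty}\frac{f(x)}{g(x)} = \infty \;\Rightarrow\; f(x) = \omega(g(x)), \qquad
0 < \lim_{x\to\infty}\frac{f(x)}{g(x)} < \infty \;\Rightarrow\; f(x) = \Theta(g(x)).
\]

The converses need not hold, because f(x)/g(x) may oscillate without converging even when one of the bound relations does hold.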

For a more detailed discussion, you can read the definition on Wikipedia or consult a classic textbook like Introduction to Algorithms by Cormen et al.
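If you prefer to build intuition numerically, here is a small illustrative Python sketch (my own, and of course not a proof) that prints the ratio f(n)/g(n) for growing n; a ratio that shrinks toward 0 suggests f = o(g), one that levels off at a positive constant suggests f = Θ(g), and one that keeps growing suggests f = ω(g):

def ratio_table(f, g, sizes):
    # Print f(n)/g(n) for a few sample input sizes n.
    for n in sizes:
        print(f"n={n:>8}: f(n)/g(n) = {f(n) / g(n):.6f}")

sizes = [10, 100, 1000, 10000, 100000]

# 3n + 5 versus n: the ratio approaches 3, consistent with 3n + 5 = Theta(n).
ratio_table(lambda n: 3 * n + 5, lambda n: n, sizes)

# n versus n^2: the ratio approaches 0, consistent with n = o(n^2)
# (so n = O(n^2) as well, but n is not Theta(n^2)).
ratio_table(lambda n: n, lambda n: n * n, sizes)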
