Big O vs Small omega


Problem description

Why is ω(n) smaller than O(n)?

I know what little omega is (for example, n = ω(log n)), but I can't understand why ω(n) is smaller than O(n).

Recommended answer

Algorithmic complexity has a mathematical definition.

If f and g are two functions, f = O(g) if you can find two constants c (> 0) and n such that f(x) < c * g(x) for every x > n.
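
For instance (a worked example added for illustration, not part of the original answer): take f(x) = 3x + 7 and g(x) = x. Choosing c = 4 and n = 7 works, because

\[ 3x + 7 < 4x \quad \text{for every } x > 7, \]

so 3x + 7 = O(x).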

For Ω, it is the opposite: you can find constants c (> 0) and n such that f(x) > c * g(x) for every x > n. Little omega (ω), which the question asks about, is the strict version of this lower bound: f = ω(g) means that for every constant c > 0, f(x) > c * g(x) holds for all sufficiently large x.
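
With the same illustrative functions: taking c = 3 and n = 0 gives

\[ 3x + 7 > 3x \quad \text{for every } x > 0, \]

hence 3x + 7 = Ω(x). It is not ω(x), however: for c = 4 the inequality 3x + 7 > 4x fails as soon as x ≥ 7, so the strict bound does not hold for every constant c.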

f = Θ(g) if there are three constants c, d and n such that c * g(x) < f(x) < d * g(x) for every x > n.
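
Combining the two previous checks for the same illustrative f: with c = 3, d = 4 and n = 7,

\[ 3x < 3x + 7 < 4x \quad \text{for every } x > 7, \]

so 3x + 7 = Θ(x).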

Then, O means your function is dominated, Θ means your function is equivalent to the other function, and Ω means your function has a lower limit. So when you use Θ, your approximation is better, because you are "wrapping" your function between two bounds; whereas O only sets a maximum. Ditto for Ω (a minimum).

To sum up:


  • O(n): in the worst case, your algorithm has a complexity of n (an upper bound)
  • Ω(n): in the best case, your algorithm has a complexity of n (a lower bound)
  • Θ(n): in every case, your algorithm has a complexity of n (a tight bound)
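
A mnemonic that may help (my addition, not part of the original answer) is to read the notations as comparisons between growth rates:

\[ o \;\leftrightarrow\; <, \qquad O \;\leftrightarrow\; \le, \qquad \Theta \;\leftrightarrow\; =, \qquad \Omega \;\leftrightarrow\; \ge, \qquad \omega \;\leftrightarrow\; > . \]

In that sense ω is not "smaller" than O; they point in opposite directions: O bounds from above, ω bounds strictly from below.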

To conclude, the example in the question fits these definitions: as you may know, n > log(n) when n is large, and in fact n eventually exceeds c * log(n) for every constant c > 0, so n = ω(log(n)) (and therefore also Ω(log(n))); it is not Θ(log(n)), because n is not bounded above by any constant multiple of log(n).
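
A quick check of that last claim (an illustrative verification, not from the original answer): the ratio n / log(n) grows without bound,

\[ \lim_{n \to \infty} \frac{n}{\log n} = \infty, \]

which is exactly the limit characterization of n = ω(log n); conversely, no constant c satisfies n < c * log(n) for all large n, so n is not O(log n) and hence not Θ(log n).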
