Big O is for worst-case running time and Ω is for the best case, but why is Ω sometimes used for the worst case?


Question



I'm confused, I thought that you use Big O for worst-case running time and Ω is for the best-case? Can someone please explain?

And isn't (lg n) the best-case? and (nlg n) is the worst case? Or am I misunderstanding something?

Show that the worst-case running time of Max-Heapify on a heap of size n is Ω(lg n). ( Hint: For a heap with n nodes, give node values that cause Max-Heapify to be called recursively at every node on a path from the root down to a leaf.)

Edit: no, this is not homework. I'm practicing, and this has an answer key, but I'm confused. http://www-scf.usc.edu/~csci303/cs303hw4solutions.pdf Problem 4 (6.2-6)

Edit 2: So did I misunderstand the question, rather than Big O and Ω themselves?

Solution

It is important to distinguish between the case and the bound.

Best, average, and worst are common cases of interest when analyzing algorithms.

Upper (O, o) and lower (Omega, omega), along with Theta, are common bounds on functions.

When we say "Algorithm X's worst-case time complexity is O(n)", we're saying that the function which represents Algorithm X's performance, when we restrict inputs to worst-case inputs, is asymptotically bounded from above by some linear function. You could speak of a lower bound on the worst-case input; or an upper or lower bound on the average, or best, case behavior.

Case != Bound. That said, "upper on the worst" and "lower on the best" are pretty sensible sorts of metrics... they provide absolute bounds on the performance of an algorithm. It doesn't mean we can't talk about other metrics.

Edit to respond to your updated question:

The question asks you to show that Omega(lg n) is a lower bound on the worst case behavior. In other words, when this algorithm does as much work as it can do for a class of inputs, the amount of work it does grows at least as fast as (lg n), asymptotically. So your steps are the following: (1) identify the worst case for the algorithm; (2) find a lower bound for the runtime of the algorithm on inputs belonging to the worst case.

Here's an illustration of the way this would look for linear search:

In the worst case of linear search, the target item is not in the list, and all items in the list must be examined to determine this. Therefore, a lower bound on the worst-case complexity of this algorithm is Ω(n).
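A minimal Python sketch of that illustration (the comparison counter is just for demonstration, not part of any standard API): when the target is absent, the loop cannot exit early, so the number of comparisons equals n, which is the Ω(n) lower bound on the worst case.

```python
def linear_search(items, target):
    """Scan items left to right; return (index, comparisons made)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    # Target absent: every one of the n items was examined.
    return -1, comparisons

# Worst case: the target is not in the list, so all n items are checked.
idx, cost = linear_search(list(range(100)), -1)
print(idx, cost)  # -1 100
```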

Important to note: for lots of algorithms, the complexity for most cases will be bounded from above and below by a common set of functions. It's very common for the Theta bound to apply. So it might very well be the case that you won't get a different answer for Omega than you do for O, in any event.
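Returning to the original Max-Heapify exercise, here is a sketch of how the hint plays out, assuming the standard array-encoded heap (children of index i at 2i+1 and 2i+2); the input construction and call counter are illustrative, not from the answer key. Putting the minimum value at the root of an otherwise valid max-heap forces the sift-down to recurse along an entire root-to-leaf path, so the number of calls grows as lg n:

```python
def max_heapify(a, i, calls):
    """Standard sift-down on the array-encoded max-heap a; calls[0] counts invocations."""
    calls[0] += 1
    n = len(a)
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, calls)

# Adversarial input per the hint: a[1:] satisfies the heap property
# (a[i] = n - i, so every parent exceeds its children), but the root
# value 0 is smaller than every descendant, forcing the recursion
# down a full root-to-leaf path.
n = 1023  # complete binary tree of height 9
a = [0] + [n - i for i in range(1, n)]
calls = [0]
max_heapify(a, 0, calls)
print(calls[0])  # 10 calls = height + 1, i.e. Θ(lg n) on this input
```

Since such an input exists for every heap size, the worst-case running time of Max-Heapify is bounded below by lg n, which is exactly the Ω(lg n) claim the exercise asks you to show.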
