Definition of Big-O notation


Question

I really want to know the real definition. I have tried to read a book, but couldn't understand it.

O: Big-O notation worst case.
Θ: Theta notation average case.
Ω: Omega notation best case.

Why does Wikipedia describe the speed of algorithms only in Big-O, including their average, best and worst cases? Why didn't they use those formal notations instead?

Answer

O, Θ and Ω do not represent worst, average and best case, although they have related meanings.

Big-O notation f(n) = O(g(n)) means that f grows no faster than g for large values of n (the "n > n0" in the formal definition means "for sufficiently large n"). This does not mean that g is the worst case: g can be far worse than the worst case (quick sort is also O(n!), for instance). For more complicated algorithms, there is ongoing research to determine the smallest Big-O that matches their actual complexity: the original authors mostly find a Big-O upper bound.
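Since the question asks for the real definition, here is the usual formal statement of the Big-O bound described above:

```latex
% f is bounded above by a constant multiple of g, for all sufficiently large n
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 : \forall n > n_0,\ 0 \le f(n) \le c \cdot g(n)
```

The constants c and n0 are what let the notation ignore constant factors and small inputs: only the long-run growth rate matters.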

Ω notation means the reverse (f grows at least as fast as g), which means the bound can be better than the best case (every algorithm is Ω(1), for instance).
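Formally, Ω is the mirror image of the Big-O definition, with the inequality reversed:

```latex
% f is bounded below by a constant multiple of g, for all sufficiently large n
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 : \forall n > n_0,\ f(n) \ge c \cdot g(n)
```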

For many algorithms there is no single function g such that the complexity is both O(g) and Ω(g). For instance, the tightest Big-O bound for insertion sort is O(n²) (meaning you cannot find any upper bound smaller than n²), while its tightest Ω bound is Ω(n).
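The insertion sort gap between Ω(n) and O(n²) is easy to observe empirically. Below is a minimal sketch that counts key comparisons: a sorted input of n = 100 elements needs only n − 1 comparisons, while a reverse-sorted input needs n(n − 1)/2.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` with insertion sort; return how many key comparisons were made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))            # already sorted: n - 1 comparisons
worst = insertion_sort_comparisons(range(n, 0, -1))    # reverse sorted: n(n-1)/2 comparisons
print(best, worst)  # 99 4950
```

The linear best case and quadratic worst case are exactly why no single g gives both a matching O(g) and Ω(g) for this algorithm.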

Other algorithms do have one: merge sort is both O(n log n) and Ω(n log n). When this happens, it is written Θ(n log n). This also means that not every algorithm's complexity can be stated in Θ notation (notably, algorithms whose best and worst cases grow at different rates do not have one).
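In other words, Θ is simply the conjunction of the two one-sided bounds:

```latex
% f and g have the same growth rate
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```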

To get rid of worst cases that occur with very low probability, it is fairly common to examine average-case complexity instead: the function f on the left-hand side is replaced by a different function f_avg that only takes the most probable outcomes into account. So for quick sort, f = O(n²) is the tightest bound you can get, but f_avg = O(n log n).
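The quick sort gap between worst case and average case can also be measured. This is a simplified sketch (first-element pivot, comparisons counted once per partitioned element, not a production sort): on an already-sorted input of n = 200 elements the pivot choice degenerates and we get n(n − 1)/2 comparisons, while a shuffled input stays close to the n log n average.

```python
import random

def quicksort_comparisons(a):
    """Count element-vs-pivot comparisons for a naive first-element-pivot quick sort."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # every element of `rest` is compared against the pivot once
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

n = 200
worst = quicksort_comparisons(list(range(n)))   # sorted input: degenerate pivots, n(n-1)/2
random.seed(0)
average = quicksort_comparisons(random.sample(range(n), n))  # shuffled input: ~2n ln n
print(worst, average)
```

Averaged over random inputs the count concentrates around 2n ln n ≈ 2120 here, far below the 19900 of the degenerate case, which is what f_avg = O(n log n) versus f = O(n²) captures.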
