A little confused with Big O

Question

So I have a quick question on how to verify the big-O of a function.

For example: a quicksort algorithm sorting an array of 500,000 elements yields a time interval of 0.008524 seconds, and running the same algorithm with 1,000,000 elements yields 0.017909 seconds. How would I check whether my quicksort is or is not O(n*log(n))?

What I think I understand: n doubled, thus the runtime should increase by 2*log(2)?

f(n) = 0.008524 → n*log(n)
f(2n) = 0.017909 → 2n*log(2n)
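
As a quick sanity check of that reasoning (the sizes and timings are taken from the question; the arithmetic itself is a sketch, not from the original post): if the runtime really scales as n*log(n), doubling n should multiply it by 2*log(2n)/log(n), not by 2*log(2). For n = 500,000 the predicted factor is about 2.11, and the observed factor 0.017909/0.008524 is about 2.10, so the two timings are in fact consistent with n*log(n).

```python
import math

n = 500_000
t_n, t_2n = 0.008524, 0.017909  # timings quoted in the question

# If f(n) = c * n * log(n), then
# f(2n)/f(n) = 2n*log(2n) / (n*log(n)) = 2*log(2n)/log(n).
# The base of the logarithm cancels in the ratio.
predicted = 2 * math.log(2 * n) / math.log(n)
observed = t_2n / t_n

print(f"predicted ratio: {predicted:.3f}")  # ~2.106
print(f"observed ratio:  {observed:.3f}")   # ~2.101
```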

Recommended answer

Big-O notation is asymptotic. That means it only applies in the limit as n gets large.

There are many reasons why sorting with 50 and 100 elements might not track O(n log n), caching effects being a likely candidate. If you try 100,000 versus 200,000 versus 1 million, you will probably find it tracks a little better.
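
A minimal sketch of that empirical check, assuming you just want to see whether t/(n*log n) stays roughly flat across large sizes. It uses Python's built-in sort as a stand-in; substitute your own quicksort, and expect the absolute numbers to vary by machine:

```python
import math
import random
import time

def measure(sort_fn, n, trials=3):
    """Best-of-trials wall time to sort n random floats."""
    best = float("inf")
    for _ in range(trials):
        data = [random.random() for _ in range(n)]
        start = time.perf_counter()
        sort_fn(data)
        best = min(best, time.perf_counter() - start)
    return best

for n in (100_000, 200_000, 400_000, 800_000):
    t = measure(list.sort, n)
    # For an O(n log n) algorithm this normalized value should be
    # roughly constant across sizes.
    print(f"n={n:>7,}  t={t:.4f}s  t/(n log n)={t / (n * math.log(n)):.3e}")
```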

Another possibility is that most quicksort implementations are only O(n log n) on average; some inputs will take longer. The odds of encountering such a pathological input are higher for 50 elements than for 100,000.
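
To make the pathological case concrete, here is a deliberately naive quicksort (first element as pivot, a common textbook choice; this is not the asker's implementation) that degrades to O(n^2) on already-sorted input while behaving like O(n log n) on random input:

```python
import random
import sys
import time

sys.setrecursionlimit(20_000)  # sorted input recurses ~n deep

def naive_quicksort(a):
    """Textbook quicksort using the first element as the pivot."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return naive_quicksort(left) + [pivot] + naive_quicksort(right)

n = 5_000
cases = {
    "random": [random.random() for _ in range(n)],
    "sorted": list(range(n)),  # worst case: every pivot is the minimum
}
for name, data in cases.items():
    start = time.perf_counter()
    naive_quicksort(data)
    print(f"{name}: {time.perf_counter() - start:.4f}s")
```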

Ultimately, though, you do not "verify" big-O run time; you prove it based on what the algorithm does.
