Asymptotic complexity for typical expressions

This article looks at how to compare the asymptotic complexity of some typical expressions; it should serve as a useful reference for anyone working through similar problems.

Problem description

The increasing order of the following functions (shown in the picture below) in terms of asymptotic complexity is:

(A) f1(n); f4(n); f2(n); f3(n)

(B) f1(n); f2(n); f3(n); f4(n)

(C) f2(n); f1(n); f4(n); f3(n)

(D) f1(n); f2(n); f4(n); f3(n)

a) The time-complexity ordering given for this (supposedly easy) question was (n^0.99)*(log n) < n. How? log may be a slowly growing function, but it still grows faster than a constant.

b) Consider f1 and suppose it were f1(n) = (n^1.0001)*(log n). What would the answer be then?

Whenever an expression involves a product of a logarithmic and a polynomial factor, does the logarithmic function outweigh the polynomial one?

c) How do we check in such cases? For example:

1) (n^2)*log n vs (n^1.5): which has the higher time complexity?
2) (n^1.5)*log n vs (n^2): which has the higher time complexity?

Answer

If we consider constants C_1 and C_2 such that C_1 < C_2, then we can say the following with certainty:

(n^C_2)*log(n) grows faster than (n^C_1)

This is because (n^C_1) grows more slowly than (n^C_2) (obviously).

Also, for values of n larger than 2 (taking log in base 2), log(n) is greater than 1.

in fact, log(n) is asymptotically greater than any constant C,
because log(n) -> inf as n -> inf

If (n^C_2) is asymptotically greater than (n^C_1), AND log(n) is asymptotically greater
than 1, then we can certainly say that
(n^2)*log(n) has greater complexity than (n^1.5).

We think of log(n) as a "slowly growing" function, but it still grows faster than 1, which is the key here.
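This is easy to see numerically. Here is a small sketch (my own illustration, not part of the original answer) that compares two functions by watching the ratio f(n)/g(n) as n grows: if the ratio keeps increasing, f dominates g asymptotically.

```python
import math

def ratio_trend(f, g, ns):
    """Return f(n)/g(n) for each n; a growing ratio means f dominates g."""
    return [f(n) / g(n) for n in ns]

ns = [10**k for k in range(2, 7)]  # n = 100 ... 1,000,000

# 1) (n^2)*log(n) vs n^1.5 -> ratio = sqrt(n)*log(n), grows without bound
r1 = ratio_trend(lambda n: n**2 * math.log(n), lambda n: n**1.5, ns)

# 2) (n^1.5)*log(n) vs n^2 -> ratio = log(n)/sqrt(n), shrinks toward 0
r2 = ratio_trend(lambda n: n**1.5 * math.log(n), lambda n: n**2, ns)

print(r1)  # strictly increasing: (n^2)*log n dominates n^1.5
print(r2)  # strictly decreasing: n^2 dominates (n^1.5)*log n
```

This answers both of the questioner's examples in c): the polynomial gap (n^0.5 here) always wins against the single log factor.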

coder101 asked an interesting question in the comments; essentially:

is n^e = Ω((n^c)*log_d(n))?
where e = c + ϵ for arbitrarily small ϵ

Let's do some algebra.

n^e = (n^c)*(n^ϵ)
so the question boils down to
is n^ϵ = Ω(log_d(n))
or is it the other way around, namely:
is log_d(n) = Ω(n^ϵ)

In order to do this, let us find the value of ϵ that satisfies n^ϵ > log_d(n).

n^ϵ > log_d(n)
ϵ*ln(n) > ln(log_d(n))
ϵ > ln(log_d(n)) / ln(n)

Because we know for a fact that

c * ln(n) > ln(ln(n))        (1)
for any constant c > 0, as n -> infinity

We can say that, for an arbitrarily small ϵ, there exists an n large enough to 
satisfy ϵ > ln(log_d(n)) / ln(n)

because, by (1), ln(log_d(n)) / ln(n) ---> 0 as n -> infinity.
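This limit is easy to check numerically. A quick sketch (my own check, not part of the original answer): the ratio ln(log_2(n)) / ln(n) shrinks toward 0 as n grows, which is why any fixed ϵ > 0 eventually satisfies ϵ > ln(log_d(n)) / ln(n).

```python
import math

# Compute ln(log_2(n)) / ln(n) for rapidly growing n; the base d of the
# inner log only shifts the numerator by a constant, so base 2 suffices.
ratios = [math.log(math.log2(n)) / math.log(n)
          for n in (10**3, 10**6, 10**12, 10**24, 10**48)]

print(ratios)  # strictly decreasing, heading toward 0
```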

With this knowledge, we can say that

n^ϵ = Ω(log_d(n))
for arbitrarily small ϵ

which means that
n^(c + ϵ) = Ω((n^c)*log_d(n))
for arbitrarily small ϵ.

In layman's terms:

n^1.1 > n * ln(n)
for some n

also

n^1.001 > n * ln(n)
for some much, much bigger n

and even
n^1.0000000000000001 > n * ln(n)
for some very, very big n.
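Just how much bigger that n must be can be found numerically. Here is a sketch (my own illustration, with a hypothetical helper `crossover`) that searches powers of two for the first n where n^(1+ϵ) overtakes n*ln(n), i.e. where n^ϵ overtakes ln(n). Note that n^ϵ briefly exceeds ln(n) near n = 2, so the search starts past that initial region.

```python
import math

def crossover(eps):
    """Smallest n = 2^k (k >= 4) with n**(1 + eps) > n * math.log(n)."""
    n = 16
    while n**eps <= math.log(n):  # same comparison with the factor n divided out
        n *= 2
    return n

print(crossover(0.1))    # a large but ordinary number (on the order of 10^15)
print(crossover(0.01))   # astronomically larger (on the order of 10^281)
```

Shrinking ϵ by a factor of 10 pushes the crossover point out by hundreds of orders of magnitude, yet it always exists: this is exactly the "for some much, much bigger n" above.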

That concludes this article on asymptotic complexity for typical expressions. We hope the recommended answer helps, and thanks for supporting IT屋!
