Time complexity of the program?
Problem description
a) 1 i ← 1
2 while i < n/4
3 do
4 j ← 2i
5 while j < n
6 do
7 j ← j + 1
8 i ← i + 1
b) 1 i ← n
2 while i > 1
3 do
4 j ← i
5 while j < n
6 do
7 j ← 2j
8 i ← i − 1
c) 1 i ← 1
2 while i < n
3 do
4 j ← 0
5 while j ≤ i
6 do
7 j ← j + 1
8 i ← 2i
Given these three programs, what would be the easiest approach in finding the time complexity for each one of these? I can tell that the first one is probably going to be O(n^2). But is there an easy approach to these to solve it consistently? I have an exam about it tomorrow.
(I'll ignore off-by-one errors, rounding etc. in the summation limits.)
A)
In the inner loop you do n − 2i − 1 operations, and the outer loop runs n/4 − 1 times. Summing the inner count over the outer loop gives Θ(n²), so your answer to the first one is correct.
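Written out (a reconstruction of the summation that appeared as an image in the original answer):

```latex
\sum_{i=1}^{n/4} (n - 2i - 1)
  = \frac{n}{4}\cdot n \;-\; 2\cdot\frac{\frac{n}{4}\left(\frac{n}{4}+1\right)}{2} \;-\; \frac{n}{4}
  = \frac{3n^2}{16} - O(n)
  = \Theta(n^2)
```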
B)
In the inner loop, j gets doubled every time, so it grows exponentially; therefore the inner loop takes logarithmic time, log(n/i) (base 2, but we can ignore the base WLOG). The division by i is there because j starts from i, which is equivalent to starting from 1 and growing up to n/i. The outer loop runs n − 1 times, and summing log(n/i) over it gives Θ(n). (The last step is from Stirling's approximation.)
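The summation behind this (a reconstruction; the original showed it as an image), using Stirling's approximation \log_2 n! \approx n\log_2 n - n\log_2 e:

```latex
\sum_{i=2}^{n} \log_2\frac{n}{i}
  = (n-1)\log_2 n - \log_2 n!
  \approx n\log_2 n - \left(n\log_2 n - n\log_2 e\right)
  = n\log_2 e
  = \Theta(n)
```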
C)
In the inner loop you do i + 1 operations. In the outer loop, i grows exponentially from 1 to n, so there are about log n iterations; the inner counts form a geometric series dominated by the last term, so the total is Θ(n).
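Concretely, writing i = 2^k for the outer loop's values (my notation, reconstructing the missing summation):

```latex
\sum_{k=0}^{\log_2 n - 1} \left(2^k + 1\right)
  = \left(2^{\log_2 n} - 1\right) + \log_2 n
  = (n - 1) + \log_2 n
  = \Theta(n)
```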
N.B. I notice a significant discrepancy with TypeKazt's answers; this illustrates that you can't simply "multiply" the inner loop's complexity by that of the outer loop, and that the inner loop's boundary conditions matter decisively.
EDIT: As added proof of my bold accusations against TypeKazt (hehe, sorry buddy), here is some test data I got from a C# implementation:
As you can see, the complexity of A (linear scale) is O(n^2), whereas those of B and C (log-log scales) are linear.
Code:
static int A(int n)
{
    int c = 0;
    for (int i = 0; i < n / 4; i++)
        for (int j = 2 * i; j < n; j++)
            c++;
    return c;
}

static int B(int n)
{
    int c = 0;
    for (int i = n; i > 1; i--)
        for (int j = i; j < n; j *= 2)
            c++;
    return c;
}

static int C(int n)
{
    int c = 0;
    for (int i = 1; i < n; i *= 2)
        for (int j = 0; j <= i; j++)
            c++;
    return c;
}
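For a quick cross-check without a C# toolchain, here is a direct Python port of the three counters above (the port is mine, not part of the original answer); the growth ratios when n doubles match the claimed complexities:

```python
def A(n):
    # Outer loop runs n/4 times; inner loop length shrinks by 2 each pass -> Theta(n^2).
    c = 0
    for i in range(0, n // 4):
        for j in range(2 * i, n):
            c += 1
    return c

def B(n):
    # Inner loop doubles j from i up to n, i.e. ~log2(n/i) steps; total Theta(n).
    c = 0
    i = n
    while i > 1:
        j = i
        while j < n:
            c += 1
            j *= 2
        i -= 1
    return c

def C(n):
    # Outer loop doubles i, so ~log2(n) passes; inner loop does i+1 steps; total Theta(n).
    c = 0
    i = 1
    while i < n:
        for j in range(0, i + 1):
            c += 1
        i *= 2
    return c

print(A(8), B(8), C(8))   # -> 14 8 10
print(A(4000) / A(2000))  # ~4: quadratic counter
print(B(4000) / B(2000))  # ~2: linear counter
```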
P.S. I know SO's policy is not to babysit people, but I for one learn well by examples, so I hope this will help you to understand a simple analytical approach to complexity analysis.