Determining the complexities given codes


Problem Description


Given a snippet of code, how would you determine its complexity in general? I find myself getting very confused with Big O questions. For example, a very simple question:

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        System.out.println("*");
    }
}

The TA explained this with something like combinations. Like this is n choose 2 = (n(n-1))/2 = 0.5n^2 - 0.5n, then remove the constants and lower-order terms so it becomes n^2. I can put in test values and try, but how does this combination thing come in?

What if there's an if statement? How is the complexity determined?

for (int i = 0; i < n; i++) {
    if (i % 2 ==0) {
        for (int j = i; j < n; j++) { ... }
    } else {
        for (int j = 0; j < i; j++) { ... }
    }
}

Then what about recursion ...

int fib(int a, int b, int n) {
    if (n == 3) {
        return a + b;
    } else {
        return fib(b, a+b, n-1);
    }
}

Solution

In general, there is no way to determine the complexity of a given function

Warning! Wall of text incoming!

1. There are very simple algorithms for which no one knows whether they even halt or not.

There is no algorithm that can decide whether a given program halts on a given input. Calculating the computational complexity is an even harder problem, since we not only need to prove that the algorithm halts but also how fast it does so.

//The Collatz conjecture states that the sequence generated by the following
// algorithm always reaches 1, for any initial positive integer. It has been
// an open problem for 70+ years now.
function col(n){
    if (n == 1){
        return 0;
    }else if (n % 2 == 0){ //even
        return 1 + col(n/2);
    }else{ //odd
        return 1 + col(3*n + 1);
    }
}
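Even for small, neighboring inputs the step counts jump around unpredictably, which hints at why bounding this "runtime" is so hard. A quick driver (the same function as above, with a loop added purely for illustration):

```javascript
// Collatz step counter, as above: number of steps to reach 1.
function col(n){
    if (n == 1){
        return 0;
    }else if (n % 2 == 0){ //even
        return 1 + col(n/2);
    }else{ //odd
        return 1 + col(3*n + 1);
    }
}

// Neighboring inputs take wildly different numbers of steps.
// For instance, 27 famously takes 111 steps to reach 1.
for (let n = 25; n <= 28; n++){
    console.log(n, col(n));
}
```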

2. Some algorithms have weird and off-beat complexities

A general "complexity determining scheme" would easily get too complicated because of these guys

//The Ackermann function. One of the first examples of a non-primitive-recursive algorithm.
function ack(m, n){
    if(m == 0){
        return n + 1;
    }else if( n == 0 ){
        return ack(m-1, 1);
    }else{
        return ack(m-1, ack(m, n-1));
    }
}

function f(n){ return ack(n, n); }

//f(1) = 3
//f(2) = 7
//f(3) = 61
//f(4) takes longer than your wildest dreams to terminate.
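The first three of those values are easy to check directly (f(4), of course, is not):

```javascript
// The Ackermann function, reproduced from above so this snippet is self-contained.
function ack(m, n){
    if (m == 0){
        return n + 1;
    }else if (n == 0){
        return ack(m-1, 1);
    }else{
        return ack(m-1, ack(m, n-1));
    }
}

function f(n){ return ack(n, n); }

console.log(f(1), f(2), f(3)); // 3 7 61
```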

3. Some functions are very simple but will confuse lots of kinds of static analysis attempts

//McCarthy's 91 function. Try guessing what it does without
// running it or reading the Wikipedia page ;)
function f91(n){
    if(n > 100){
        return n - 10;
    }else{
        return f91(f91(n + 11));
    }
}


That said, we still need a way to find the complexity of stuff, right? For loops are a simple and common pattern. Take your initial example:

for(i=0; i<N; i++){
   for(j=0; j<i; j++){
       print something
   }
}

Since each print something is O(1), the time complexity of the algorithm will be determined by how many times we run that line. Well, as your TA mentioned, we do this by looking at the combinations in this case. The inner loop will run (0 + 1 + ... + (N-1)) times, for a total of N*(N-1)/2.
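One way to convince yourself is to just count the executions for a few values of N, as you suggested with test values. A quick sketch (`countRuns` is an illustrative helper name, not from the original):

```javascript
// Count how many times the inner statement runs for a given N.
function countRuns(N){
    let count = 0;
    for (let i = 0; i < N; i++){
        for (let j = 0; j < i; j++){
            count++; // stands in for "print something"
        }
    }
    return count;
}

// The count matches the triangular-number formula N*(N-1)/2 for every N we try:
for (const N of [1, 5, 10, 100]){
    console.log(N, countRuns(N), N*(N-1)/2);
}
```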

Since we disregard constants and lower-order terms, we get O(N^2).

Now for the more tricky cases we can get more mathematical. Try to create a function whose value represents how long the algorithm takes to run, given the size N of the input. Often we can construct a recursive version of this function directly from the algorithm itself, so calculating the complexity becomes the problem of putting bounds on that function. We call this function a recurrence.

For example:

function fib_like(n){
    if(n <= 1){
        return 17;
    }else{
        return 42 + fib_like(n-1) + fib_like(n-2);
    }
 }

It is easy to see that the running time, in terms of N, will be given by

T(N) = 1 if (N <= 1)
T(N) = T(N-1) + T(N-2) otherwise

Well, T(N) is just the good-old Fibonacci function. We can use induction to put some bounds on that.

For example, let's prove, by induction, that T(n) <= 2^n for all n (i.e., T(n) is O(2^n))

  • base case: n = 0 or n = 1

    T(0) = 1 <= 1 = 2^0
    T(1) = 1 <= 2 = 2^1

  • inductive case (n > 1):

    T(n) = T(n-1) + T(n-2)
    applying the inductive hypothesis to T(n-1) and T(n-2)...
    T(n) <= 2^(n-1) + 2^(n-2)
    so..
    T(n) <= 2^(n-1) + 2^(n-1)
         <= 2^n

(we can try doing something similar to prove the lower bound too)
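The upper bound is also easy to sanity-check numerically, assuming the recurrence above (memoized so the check itself runs fast rather than in exponential time):

```javascript
// T(n) from the recurrence: T(n) = 1 if n <= 1, else T(n-1) + T(n-2).
const memo = new Map();
function T(n){
    if (n <= 1) return 1;
    if (memo.has(n)) return memo.get(n);
    const result = T(n-1) + T(n-2);
    memo.set(n, result);
    return result;
}

// The bound T(n) <= 2^n holds for every n we check:
for (let n = 0; n <= 30; n++){
    if (T(n) > 2**n) throw new Error("bound violated at n = " + n);
}
console.log("T(n) <= 2^n verified for n up to 30");
```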

In most cases, having a good guess on the final runtime of the function will allow you to easily solve recurrence problems with an induction proof. Of course, this requires you to be able to guess first - only lots of practice can help you here.

And as a final note, I would like to point out the Master theorem, the only commonly used rule for more difficult recurrence problems that I can think of right now. Use it when you have to deal with a tricky divide-and-conquer algorithm.
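For instance, merge sort's recurrence T(n) = 2T(n/2) + n falls under case 2 of the Master theorem, giving O(n log n). For powers of two the closed form T(n) = n*log2(n) + n is exact, which a small check confirms (assuming unit cost at n = 1 and cost n for the combine step):

```javascript
// Merge-sort-style recurrence: T(1) = 1, T(n) = 2*T(n/2) + n, for n a power of 2.
function T(n){
    if (n === 1) return 1;
    return 2 * T(n / 2) + n;
}

// Closed form: T(2^k) = 2^k * k + 2^k, i.e. n*log2(n) + n.
for (let k = 0; k <= 10; k++){
    const n = 2**k;
    if (T(n) !== n * k + n) throw new Error("mismatch at n = " + n);
}
console.log("T(n) = n*log2(n) + n for all powers of two checked");
```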


Also, in your "if case" example, I would solve that by cheating and splitting it into two separate loops that don't have an if inside.

for (int i = 0; i < n; i++) {
    if (i % 2 ==0) {
        for (int j = i; j < n; j++) { ... }
    } else {
        for (int j = 0; j < i; j++) { ... }
    }
}

Has the same runtime as

for (int i = 0; i < n; i += 2) {
    for (int j = i; j < n; j++) { ... }
}

for (int i = 1; i < n; i+=2) {
    for (int j = 0; j < i; j++) { ... }
}

And each of the two parts can be easily seen to be O(N^2) for a total that is also O(N^2).
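The splitting trick is easy to verify by counting iterations in both versions (`countWithIf` and `countSplit` are illustrative helper names; the loop bodies just increment a counter in place of `{ ... }`):

```javascript
// Iterations of the original loop with the "if" inside.
function countWithIf(n){
    let count = 0;
    for (let i = 0; i < n; i++){
        if (i % 2 == 0){
            for (let j = i; j < n; j++){ count++; }
        } else {
            for (let j = 0; j < i; j++){ count++; }
        }
    }
    return count;
}

// Iterations of the two split loops.
function countSplit(n){
    let count = 0;
    for (let i = 0; i < n; i += 2){
        for (let j = i; j < n; j++){ count++; }
    }
    for (let i = 1; i < n; i += 2){
        for (let j = 0; j < i; j++){ count++; }
    }
    return count;
}

// Identical totals for every n we try:
for (const n of [1, 2, 7, 50, 100]){
    if (countWithIf(n) !== countSplit(n)) throw new Error("mismatch at n = " + n);
}
console.log("both versions do the same number of inner iterations");
```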

Note that I used a trick to get rid of the "if" here. There is no general rule for doing so, as shown by the Collatz example.
