big o - Using big-O to prove N^2 is O(2^N)
I can see clearly, just by comparing the two functions, that n^2 is bounded above by c·2^n, but I was asked to prove it formally. How can I prove it using the big-O definition? I think I could also prove it by mathematical induction (M.I.).

Here's my attempt: by definition, I need to find a constant c > 0 and an n0 such that f(n) <= c·g(n) for all n > n0, where f(n) = n^2 and g(n) = 2^n.

Should I take the log of both sides and solve for c?
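Concretely, I think taking base-2 logs of both sides would turn n^2 <= c·2^n into (not sure this is the rigorous way to do it):

    2·log(n) <= log(c) + n

so it looks like I just need log(c) >= 2·log(n) - n, and since 2·log(n) - n <= 0 once n >= 4, maybe even c = 1 works with n0 = 4?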
And another question, about the Fibonacci sequence: I want to solve its recurrence relation.
```c
int fib(int n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}
```

The recurrence is T(n) = T(n-1) + T(n-2) + c, where c is a constant. Expanding once gives

    T(n) = T(n-2) + 2·T(n-3) + T(n-4) + 3c

and expanding once more gives

    T(n) = T(n-3) + 3·T(n-4) + 3·T(n-5) + T(n-6) + 7c

From here I'm lost trying to form the general equation. Is the pattern something like Pascal's triangle?
My guess at the general form is something like

    T(n) = T(n-i) + a·T(n-i-1) + b·T(n-i-2) + ... + k·T(n-2i) + (some multiple of c)

but I can't work out the coefficients.
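If the coefficients really do follow Pascal's triangle, then after i levels of expansion the pattern would presumably be (just a guess extrapolated from the three expansions above, writing C(i,k) for the binomial coefficient; I haven't proved it):

    T(n) = C(i,0)·T(n-i) + C(i,1)·T(n-i-1) + ... + C(i,i)·T(n-2i) + (2^i - 1)·c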
Answer:

To show that f(x) ∈ O(g(x)), you need to find:
- some constant c > 0, and
- some x0
such that f(x) < c·g(x) for all x > x0.

In this case you can pick c = 1 and x0 = 4. (Note that x0 = 2 is not enough: 3^2 = 9 > 2^3 = 8, and at x = 4 the two sides are equal.) What you need to prove, then, is that x^2 < 2^x for all x > 4.

At this point you can take the log of both sides (logarithms are monotone, so if log(x) < log(y), then x < y). Assuming you use a base-2 log, you get log(x^2) < log(2^x), and by the standard laws of logarithms this becomes 2·log(x) < x·log(2) = x, i.e. 2·log(x) < x.

If we set x = 4, we get 2·log(4) = 4, so the two sides are equal there. Since x grows faster than log(x) from that point on (for x >= 4, a unit increase in x adds at most 2/(4·ln 2) ≈ 0.72 to 2·log(x) but exactly 1 to x), we know that 2·log(x) < x for all x > 4, which is what we needed.
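For what it's worth, here is a small, throwaway C check of the inequality over a range of integers (not part of the proof, just a sanity check; the upper bound 30 is arbitrary):

```c
#include <stdio.h>

int main(void) {
    /* Check n^2 < 2^n for n = 5..30.
       (The inequality fails at n = 3 and holds with equality at n = 2 and n = 4,
        which is why x0 = 4 is the first point after which it always holds.) */
    for (unsigned n = 5; n <= 30; n++) {
        unsigned long long n_sq  = (unsigned long long)n * n;
        unsigned long long two_n = 1ULL << n;   /* exact value of 2^n for n <= 62 */
        printf("n=%2u  n^2=%4llu  2^n=%10llu  %s\n",
               n, n_sq, two_n, n_sq < two_n ? "OK" : "FAIL");
    }
    return 0;
}
```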