Introduction to Algorithms, Chapter 4: Study Notes


Recurrences

Substitution method

----- a damn clever method to prove the complexity of algorithms.

 

The substitution method for solving recurrences entails two steps:

1>     Guess the form of the solution.

2>     Use mathematical induction to find the constants and show that the solution works.

The substitution method can be used to establish either upper or lower bounds on a recurrence. As an example, let us determine an upper bound on the recurrence

Example: T(n) = 4T(n/2) + n

[Assume that T(1) = Θ(1).]

Guess O(n^3). (Prove O and Ω separately.)

Assume that T(k) ≤ ck^3 for k < n.

Prove T(n) ≤ cn^3 by induction.

We must also handle the initial conditions, that is, ground the induction with base cases.

Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant.

For 1 ≤ n < n0, we have “Θ(1)” ≤ cn^3 if we pick c big enough.
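One way to fill in the algebra of the inductive step (using the hypothesis T(k) ≤ ck^3 for all k < n):

\begin{align*}
T(n) &= 4T(n/2) + n \\
     &\le 4c(n/2)^3 + n \\
     &= \tfrac{c}{2} n^3 + n \\
     &= cn^3 - \bigl(\tfrac{c}{2} n^3 - n\bigr) \\
     &\le cn^3 \qquad \text{whenever } \tfrac{c}{2} n^3 \ge n,\ \text{e.g. for } c \ge 2 \text{ and } n \ge 1.
\end{align*}

The desired-form-minus-residual step matters: it is not enough to observe that (c/2)n^3 + n is O(n^3); the induction requires showing T(n) ≤ cn^3 with the same constant c.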

 

Typically it is easy to get a bound this way, but not a tight one, as shown above: the tight bound is actually T(n) = Θ(n^2). So the substitution method is more often used to verify a guessed bound than to discover the complexity of an algorithm in the first place. (A tighter substitution proof is sketched below.)
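The O(n^2) bound can itself be proved by substitution, but the naive hypothesis T(k) ≤ ck^2 does not close the induction: substituting gives cn^2 + n, which is not ≤ cn^2 with the same constant. The standard fix (a refinement not spelled out in these notes) is to subtract a lower-order term and assume T(k) ≤ c_1 k^2 − c_2 k for all k < n:

\begin{align*}
T(n) &= 4T(n/2) + n \\
     &\le 4\bigl(c_1 (n/2)^2 - c_2 (n/2)\bigr) + n \\
     &= c_1 n^2 - 2 c_2 n + n \\
     &= c_1 n^2 - c_2 n - (c_2 n - n) \\
     &\le c_1 n^2 - c_2 n \qquad \text{for } c_2 \ge 1,
\end{align*}

which completes the induction and hence gives T(n) = O(n^2); the base cases are handled by choosing c_1 large enough, as before.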

Recursion-tree method

----- an excellent method to guess the complexity of algorithms.

 

A recursion tree models the costs (time) of a recursive execution of an algorithm.

The recursion tree method is good for generating guesses for the substitution method.

The recursion-tree method can be unreliable, just like any method that uses ellipses (…).

The recursion-tree method promotes intuition, however.

In a recursion tree, each node represents the cost of a single subproblem somewhere in the set of recursive function invocations. We sum the costs within each level of the tree to obtain a set of per-level costs, and then we sum all the per-level costs to determine the total cost of all levels of the recursion. Recursion trees are particularly useful when the recurrence describes the running time of a divide-and-conquer algorithm.
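The guess generated from such a tree can also be sanity-checked numerically before attempting a substitution proof. The sketch below is not part of the original notes (the function name T and the base value 1 are illustrative choices); it tabulates the earlier recurrence T(n) = 4T(n/2) + n and compares it against the guess Θ(n^2):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """T(n) = 4*T(n//2) + n, with T(1) = 1 standing in for Theta(1)."""
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n

# If T(n) = Theta(n^2), the ratio T(n)/n^2 should settle near a constant
# as n grows; here it approaches 2, consistent with T(n) = 2n^2 - n.
for k in range(1, 13):
    n = 2 ** k
    print(f"n = {n:5d}   T(n) = {T(n):9d}   T(n)/n^2 = {T(n) / n**2:.4f}")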

Example: T(n) = 3T(n/4) + Θ(n^2).

- The cn^2 term at the root represents the cost at the top level of recursion, and the three subtrees of the root represent the costs incurred by the subproblems of size n/4.

- Since the per-level costs decrease geometrically going down the tree, the root's cn^2 term accounts for a constant fraction of the total cost. In other words, the total cost of the tree is dominated by the cost of the root (see the per-level sum below).
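The claim that the root dominates can be checked by summing the per-level costs: level i of the tree has 3^i nodes, each costing c(n/4^i)^2, and the 3^(log_4 n) = n^(log_4 3) leaves cost Θ(1) each, so (a short worked sum, following the same picture as above):

\begin{align*}
T(n) &= \sum_{i=0}^{\log_4 n - 1} 3^i \, c\Bigl(\frac{n}{4^i}\Bigr)^{\!2} + \Theta\bigl(n^{\log_4 3}\bigr)
      = cn^2 \sum_{i=0}^{\log_4 n - 1} \Bigl(\frac{3}{16}\Bigr)^{\!i} + \Theta\bigl(n^{\log_4 3}\bigr) \\
     &\le cn^2 \sum_{i=0}^{\infty} \Bigl(\frac{3}{16}\Bigr)^{\!i} + \Theta\bigl(n^{\log_4 3}\bigr)
      = \frac{16}{13}\, cn^2 + \Theta\bigl(n^{\log_4 3}\bigr) = \Theta(n^2).
\end{align*}

Because the ratio 3/16 is less than 1, the level costs form a decreasing geometric series, which is exactly why the cn^2 term at the root accounts for a constant fraction of the total.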

The master method

----- a great method to find the complexity of algorithms.

Three common cases

Compare f(n) with n^(log_b a):

1.  f(n) = O(n^(log_b a − ε)) for some constant ε > 0.

    f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor).

Solution: T(n) = Θ(n^(log_b a)).

2.  f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0.

    f(n) and n^(log_b a) grow at similar rates.

Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).

3.  f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.

    f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor), and f(n) satisfies the regularity condition that a·f(n/b) ≤ c·f(n) for some constant c < 1.

Solution: T(n) = Θ(f(n)).
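As a quick check, the three cases can be applied to the recurrences used earlier in these notes (together with merge sort's T(n) = 2T(n/2) + Θ(n), added here as an illustrative Case 2 instance):

- T(n) = 4T(n/2) + n: n^(log_2 4) = n^2 and f(n) = n = O(n^(2 − ε)) with ε = 1, so Case 1 gives T(n) = Θ(n^2), matching the tight bound noted for the substitution example.
- T(n) = 2T(n/2) + Θ(n) (merge sort): n^(log_2 2) = n and f(n) = Θ(n lg^0 n), so Case 2 with k = 0 gives T(n) = Θ(n lg n).
- T(n) = 3T(n/4) + n^2: n^(log_4 3) ≈ n^0.79 and f(n) = n^2 = Ω(n^(log_4 3 + ε)); the regularity condition holds since 3·f(n/4) = (3/16)·n^2 ≤ c·n^2 for c = 3/16 < 1, so Case 3 gives T(n) = Θ(n^2), matching the recursion-tree example.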

Idea of master theorem

Why Case 1? Going down the tree, the per-level costs grow geometrically, so the total cost is dominated by the n^(log_b a) leaves (see the per-level sum sketched below).

Why Case 2? The cost is spread roughly evenly over the Θ(lg n) levels, each costing about n^(log_b a) lg^k n, so summing the levels adds one more factor of lg n.

Why Case 3? Going down the tree, the per-level costs shrink geometrically (this is what the regularity condition guarantees), so the total cost is dominated by the root's cost f(n).
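Concretely, unrolling T(n) = aT(n/b) + f(n) as a recursion tree (the standard picture, stated here as a sketch for n a power of b rather than a full proof): level i has a^i nodes, each costing f(n/b^i), and there are n^(log_b a) leaves, so

\[
T(n) \;=\; \Theta\bigl(n^{\log_b a}\bigr) \;+\; \sum_{i=0}^{\log_b n - 1} a^i \, f\!\Bigl(\frac{n}{b^i}\Bigr).
\]

The three cases simply say which part of this sum dominates: the leaves (Case 1), no single level but all Θ(lg n) of them together (Case 2), or the root term f(n) (Case 3).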

 

 
