# Chapter 7: NP Completeness

### Goal of Chapter 7

• Goal of Chapters 1-5: Analyze what problems can be solved, in theory

• Goal of Chapter 7: Analyze what problems can be solved, in a reasonable amount of time

### Def. 7.1: Running Time or Time Complexity of a TM

• DTM M halts on all inputs

• Let f: N → N, where f(n) is the maximum number of steps M uses on any input w with |w| = n

• We say M is an f(n) time TM, i.e., M runs in time f(n)

### Def. 7.2: Big O Time Complexity (Informal)

• Many times we don't care about the exact function f

• Use Big O notation to represent dominant term
• Example: Time Complexity of TM M is O(n²) means

• n² is the dominant term of the time complexity function of M
• Examples of Big O notation:

• n³ + n² + n is O(n³)
• 2n⁶ + 5n³ + 4 is O(n⁶)
• Bubble sort is O(n²)
• Why do we care about Big O

• When n is large, smaller terms are relatively unimportant
• Ignoring others simplifies analysis
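Since bubble sort is cited above as an O(n²) example, here is a minimal Python sketch (an illustration, not from the text); the two nested loops over the input are what give the quadratic step count:

```python
def bubble_sort(a):
    """In-place bubble sort: up to n passes, each scanning up to
    n - 1 adjacent pairs, for O(n^2) steps overall."""
    n = len(a)
    for i in range(n):
        # After pass i, the last i elements are in their final position.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```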

### Polynomial and Exponential Time Complexity

• Assume input size of n

• Polynomial time complexity:

• Can be expressed as nᵏ, for some constant k
• Exponential time complexity:

• Can be expressed as kⁿ, for some constant k > 1
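The gap between the two can be illustrated numerically (a sketch; the exponent k = 3 and base 2 are illustrative choices, not from the text):

```python
# Polynomial vs exponential growth in the input size n.
def poly(n, k=3):
    """n^k: polynomial in n."""
    return n ** k

def expo(n, base=2):
    """base^n: exponential in n."""
    return base ** n

# For small n the exponential can even be smaller (poly(5) = 125,
# expo(5) = 32), but for large n it quickly dominates every polynomial.
```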

### Theorem 7.8: Time Complexity of Multitape Machines

• Theorem 7.8: Every t(n) time multitape TM has an equivalent O(t²(n)) time single-tape TM. (Assuming t(n) ≥ n.)

• This is only a polynomial increase in running time (a polynomial slowdown)

### Def. 7.9: Running time for a Nondeterministic Decider TM

• Def. 7.9: N is an NTM that is a decider. The running time of N is the function f(n), the maximum number of steps that N uses in any branch of its computation on any input of length n.

• See Figure 7.10

• Deterministic: the computation is a single branch that accepts or rejects
• Nondeterministic: the computation is a tree of branches; N accepts if any branch accepts

### Theorem 7.11: Nondeterministic and Deterministic TM Running Times

• Th. 7.11: Every t(n) time NTM has an equivalent 2^O(t(n)) time deterministic TM. In both cases, the machines are single-tape, and t(n) ≥ n.

• This is an exponential increase in running time (an exponential slowdown)

### Def. 7.12: The Class P

• P is the class of languages that are decidable in polynomial time on a deterministic single-tape TM.

P = ∪ TIME(nᵏ), for all k

• P is the same for all models of computation that are polynomially equivalent to the deterministic TM.

• P is roughly the same as the class of problems that can be realistically solved on a computer.

• Remember: Decidable in polynomial time

### Class P Example: PATH

• PATH = {<G, s, t>| Directed graph G has a path from s to t}

• Theorem 7.14: PATH ∈ P

• Proof of Theorem 7.14: M = On input <G, s, t>, where G is a directed graph with nodes s and t:

1. Mark node s
2. Repeat step 3 until no new nodes are marked
3. Scan all edges of G. For each edge (a, b) where a is marked and b is not marked, mark node b.
4. If t is marked, accept. Otherwise, reject.
• Use an adjacency matrix (i.e., a table with a row and column for each node) to represent the graph
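The marking algorithm above can be sketched in Python as follows (a minimal illustration; the adjacency-list encoding is an assumption for brevity, where the text suggests an adjacency matrix):

```python
def path(graph, s, t):
    """Decide PATH: is there a directed path from s to t?

    graph: dict mapping each node to a list of its successors.
    Mirrors the marking algorithm: repeatedly mark the head of any
    edge whose tail is already marked, until no new node is marked.
    """
    marked = {s}
    changed = True
    while changed:
        changed = False
        for a in list(marked):
            for b in graph.get(a, []):
                if b not in marked:
                    marked.add(b)
                    changed = True
    return t in marked
```

Each pass scans all edges, and each pass (except the last) marks at least one new node, so the number of passes is at most the number of nodes; the total work is polynomial in the size of G.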

### Proving that L(M) is in P

• We must decide a given input in polynomial time

• For input of size n:

1. The number of stages must be polynomial in n

2. The number of steps per stage must be polynomial in n

3. Assumption: Encoding of the input must be done in time polynomial with respect to the input size

• We describe high-level algorithms, but they could be implemented as Turing Machines.

### Class P Example 2: RELPRIME

• RELPRIME = {<x, y>| x and y are relatively prime}

• Theorem 7.15: RELPRIME ∈ P

• Proof of Theorem 7.15: E = On input <x, y>, where x and y are natural numbers in binary:

1. Repeat steps 2 and 3 until y = 0:
2. Assign x ← x mod y
3. Swap x and y
4. Output x
• R = On input <x, y>, where x and y are natural numbers in binary:

1. Run E on <x, y>
2. If E outputs 1, accept. Otherwise, reject.
• Each stage of E (except possibly the first) at least halves the value of x
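The two machines E and R above translate directly into Python (a minimal sketch of the Euclidean algorithm):

```python
def gcd(x, y):
    """Machine E: repeat until y = 0: x <- x mod y, then swap x
    and y. Each pass (except possibly the first) at least halves
    the value of x, so the number of passes is O(log x)."""
    while y != 0:
        x = x % y
        x, y = y, x
    return x

def relprime(x, y):
    """Decider R: run E on <x, y> and accept iff it outputs 1."""
    return gcd(x, y) == 1
```

Since the input is given in binary, O(log x) passes is polynomial in the input length, which is what puts RELPRIME in P.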

### Class P Example 3: Every CFL ∈ P

• Theorem 7.16: Every CFL is decidable in polynomial time.

• Is the decidability algorithm for CFLs from earlier chapters fast enough? No.

• That algorithm used a CNF grammar and tried all derivations of the needed length, of which there may be exponentially many
• Sketch of proof of Theorem 7.16: E = On input w = w1 ... wn:

• Assume grammar is in CNF

• Build a table of all derivable substrings of w

• Build the table from short substrings to long ones
• If A → BC is a rule of G, and B and C derive the two pieces of a substring, add A to the table
• If start symbol S is in the table, the string is derivable

• O(n3) algorithm

• Building from partial solutions is called dynamic programming

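The table-filling idea sketched above (the CYK dynamic-programming algorithm) can be written out as follows; the grammar encoding and the example rules are illustrative assumptions, not from the text:

```python
def cyk(w, rules, start='S'):
    """O(n^3) membership test for a grammar in Chomsky Normal Form.

    rules: list of (head, body) pairs, where body is either a
    single terminal character or a pair (B, C) of variables.
    table[i][j] holds the variables that derive substring w[i:j+1].
    """
    n = len(w)
    table = [[set() for _ in range(n)] for _ in range(n)]
    # Base case: rules A -> a for single characters.
    for i, c in enumerate(w):
        for head, body in rules:
            if body == c:
                table[i][i].add(head)
    # Build from short substrings to long ones.
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            for k in range(i, j):              # split point
                for head, body in rules:
                    if isinstance(body, tuple):
                        B, C = body
                        if B in table[i][k] and C in table[k + 1][j]:
                            table[i][j].add(head)
    return n > 0 and start in table[0][n - 1]
```

The three nested loops over length, start position, and split point give the O(n³) bound (times a constant factor for the grammar size).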