13 Randomized Algorithms

Randomization. An algorithmic design paradigm, alongside greedy, divide-and-conquer, dynamic programming, and network flow: allow the algorithm to make random choices.

In practice, randomized algorithms assume access to a pseudo-random number generator.
13.1 Contention Resolution

Contention resolution. Given n processes P1, …, Pn, each competing for access to a shared database. If two or more processes access the database simultaneously, all processes are locked out. Devise a protocol to ensure all processes get through on a regular basis.
Contention Resolution: Randomized Protocol

Protocol. Each process requests access to the database at time t with probability p = 1/n.

Claim. Let S[i, t] = event that process i succeeds in accessing the database at time t. Then 1/(e·n) ≤ Pr[S(i, t)] ≤ 1/(2n).

Pf. By independence, Pr[S(i, t)] = p (1-p)^(n-1): process i requests access, and none of the remaining n-1 processes requests access. Setting p = 1/n (the value that maximizes Pr[S(i, t)]), we have Pr[S(i, t)] = (1/n)(1 - 1/n)^(n-1). Since (1 - 1/n)^(n-1) lies between 1/e and 1/2 for n ≥ 2, the claim follows. ▪

Claim. The probability that process i fails to access the database in ⌈e·n⌉ rounds is at most 1/e. After ⌈e·n⌉(c ln n) rounds, the probability is at most n^(-c).

Pf. Let F[i, t] = event that process i fails to access the database in rounds 1 through t. By independence and the previous claim,

  Pr[F(i, t)] ≤ (1 - 1/(en))^t.

Choose t = ⌈e·n⌉:  Pr[F(i, t)] ≤ (1 - 1/(en))^⌈en⌉ ≤ (1 - 1/(en))^(en) ≤ 1/e.

Choose t = ⌈e·n⌉(c ln n):  Pr[F(i, t)] ≤ (1/e)^(c ln n) = n^(-c). ▪
Claim. The probability that all n processes succeed within 2⌈e·n⌉(c ln n) rounds is at least 1 - 1/n.

Pf. Let F[t] = event that at least one of the n processes fails to access the database in any of the rounds 1 through t. By the union bound and the previous claim,

  Pr[F[t]] = Pr[∪_{i=1}^{n} F[i, t]] ≤ Σ_{i=1}^{n} Pr[F[i, t]] ≤ n (1 - 1/(en))^t.

Choosing t = 2⌈en⌉ c ln n yields Pr[F[t]] ≤ n · n^(-2) = 1/n. ▪

Union bound. Given events E1, …, En,  Pr[∪_{i=1}^{n} Ei] ≤ Σ_{i=1}^{n} Pr[Ei].
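As a sanity check, the protocol can be simulated directly. This is an illustrative sketch; the process count, round budget, and seed are arbitrary choices, not values from the text.

```python
import random

def contention_resolution(n, rounds, rng):
    """Simulate the randomized protocol: in each round, every process
    requests access independently with probability p = 1/n; a process
    succeeds in a round iff it is the unique requester."""
    first_success = {}  # process -> first round in which it got through
    for t in range(1, rounds + 1):
        requesters = [i for i in range(n) if rng.random() < 1.0 / n]
        if len(requesters) == 1 and requesters[0] not in first_success:
            first_success[requesters[0]] = t
    return first_success

n = 10
# With t far above e*n*ln(n), all n processes succeed w.h.p.
wins = contention_resolution(n, rounds=5000, rng=random.Random(42))
```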
Global Minimum Cut

Global min cut. Given a connected, undirected graph G = (V, E), find a cut (A, B) of minimum cardinality.

Applications. Partitioning items in a database, identifying clusters of related documents, network reliability, network design, circuit design, TSP solvers.

Network flow solution. Replace every edge (u, v) with two antiparallel edges (u, v) and (v, u). Pick some vertex s and compute a min s-v cut separating s from each other vertex v ∈ V.

Contraction Algorithm

Contraction algorithm. [Karger 1995]
Pick an edge e = (u, v) uniformly at random.
Contract edge e:
– replace u and v by a single new super-node w
– preserve edges, updating endpoints of u and v to w
– keep parallel edges, but delete self-loops
Repeat until the graph has just two nodes v1 and v2.
Return the cut (all nodes that were contracted to form v1).
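The steps above translate directly into code. A minimal sketch using a union-find structure to track supernodes; the example graph and the trial count for amplification are illustrative assumptions.

```python
import random

def karger_min_cut(edges, n, rng):
    """One run of the contraction algorithm on vertices 0..n-1.
    Returns the number of edges crossing the final 2-supernode cut."""
    parent = list(range(n))                 # union-find over supernodes

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    pool, remaining = list(edges), n
    while remaining > 2:
        u, v = pool[rng.randrange(len(pool))]   # uniform random edge
        parent[find(u)] = find(v)               # contract it
        remaining -= 1
        pool = [(a, b) for (a, b) in pool       # delete new self-loops,
                if find(a) != find(b)]          # keep parallel edges
    return len(pool)

# Two triangles joined by a bridge; the global min cut has size 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
rng = random.Random(0)
# Amplification: repeat the algorithm and keep the best cut found.
best = min(karger_min_cut(edges, 6, rng) for _ in range(200))
```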
Contraction Algorithm: Analysis

Claim. The contraction algorithm returns a min cut with probability ≥ 2/n².

Pf. Consider a global min-cut (A*, B*) of G. Let F* be the edges with one endpoint in A* and the other in B*, and let k = |F*| = size of the min cut.

In the first step, the algorithm contracts an edge in F* with probability k / |E|. Every node has degree ≥ k, since otherwise (A*, B*) would not be a min cut; hence |E| ≥ ½kn. Thus, the algorithm contracts an edge in F* with probability ≤ 2/n.

Let G' be the graph after j iterations; it has n' = n - j supernodes. Suppose no edge in F* has been contracted. Then the min cut in G' is still k, so every supernode has degree ≥ k and |E'| ≥ ½kn'. Thus, the algorithm contracts an edge in F* with probability ≤ 2/n'.

Let Ej = event that no edge in F* is contracted in iteration j. Then

  Pr[E1 ∩ E2 ∩ … ∩ E_{n-2}] = Pr[E1] · Pr[E2 | E1] · … · Pr[E_{n-2} | E1 ∩ … ∩ E_{n-3}]
    ≥ (1 - 2/n)(1 - 2/(n-1)) ⋯ (1 - 2/4)(1 - 2/3)
    = ((n-2)/n)((n-3)/(n-1)) ⋯ (2/4)(1/3)
    = 2/(n(n-1))
    ≥ 2/n². ▪
Contraction Algorithm: Amplification

Amplification. To amplify the probability of success, run the contraction algorithm many times.

Claim. If we repeat the contraction algorithm n² ln n times with independent random choices, the probability of failing to find the global min cut is at most 1/n².

Pf. By independence, the probability of failure is at most

  (1 - 2/n²)^(n² ln n) = [(1 - 2/n²)^(½n²)]^(2 ln n) ≤ (e^(-1))^(2 ln n) = 1/n²,

using (1 - 1/x)^x ≤ 1/e. ▪

Global Min Cut: Context

Remark. The overall running time is slow, since we perform Θ(n² log n) iterations and each takes Ω(m) time.

Improvement. [Karger-Stein 1996] O(n² log³ n).
Early iterations are less risky than later ones: the probability of contracting an edge in the min cut hits 50% only when about n/√2 nodes remain.
Run the contraction algorithm until n/√2 nodes remain.
Run the contraction algorithm twice on the resulting graph, and return the best of the two cuts.

Extensions. Naturally generalizes to handle positive weights.

Best known. [Karger 2000] O(m log³ n), faster than the best known max-flow algorithm or deterministic global min cut algorithm.
13.3 Linearity of Expectation

Expectation

Expectation. Given a discrete random variable X, its expectation E[X] is defined by

  E[X] = Σ_{j=0}^{∞} j · Pr[X = j].

Waiting for a first success. A coin comes up heads with probability p and tails with probability 1-p. How many independent flips X until the first heads? The event X = j corresponds to j-1 tails followed by 1 head, so

  E[X] = Σ_{j=0}^{∞} j · Pr[X = j] = Σ_{j=0}^{∞} j (1-p)^(j-1) p = (p/(1-p)) Σ_{j=0}^{∞} j (1-p)^j = (p/(1-p)) · ((1-p)/p²) = 1/p.
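The closed form E[X] = 1/p can be checked numerically by truncating the infinite sum; a small sketch (the truncation length is an arbitrary choice):

```python
def expected_flips(p, terms=10_000):
    """Truncated sum  sum_j j (1-p)^(j-1) p,  approximating E[X] = 1/p."""
    return sum(j * (1 - p) ** (j - 1) * p for j in range(1, terms + 1))

# The truncated sum converges to 1/p for any 0 < p < 1.
assert abs(expected_flips(0.5) - 2.0) < 1e-9
assert abs(expected_flips(0.2) - 5.0) < 1e-6
```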
Expectation: Two Properties

Useful property. If X is a 0/1 random variable, E[X] = Pr[X = 1].

Pf. E[X] = Σ_{j=0}^{∞} j · Pr[X = j] = Σ_{j=0}^{1} j · Pr[X = j] = Pr[X = 1]. ▪

Linearity of expectation. Given two random variables X and Y defined over the same probability space, E[X + Y] = E[X] + E[Y]. The variables need not be independent. Linearity decouples a complex calculation into simpler pieces.

Guessing Cards

Game. Shuffle a deck of n cards; turn them over one at a time; try to guess each card.

Memoryless guessing. No psychic abilities; can't even remember what's been turned over already. Guess a card from the full deck uniformly at random.

Claim. The expected number of correct guesses is 1.

Pf. (Surprisingly effortless using linearity of expectation.)
Let Xi = 1 if the ith prediction is correct and 0 otherwise.
Let X = number of correct guesses = X1 + … + Xn.
E[Xi] = Pr[Xi = 1] = 1/n.
By linearity of expectation, E[X] = E[X1] + … + E[Xn] = n · (1/n) = 1. ▪
Guessing Cards

Game. Shuffle a deck of n cards; turn them over one at a time; try to guess each card.

Guessing with memory. Guess a card uniformly at random from the cards not yet seen.

Claim. The expected number of correct guesses is Θ(log n).

Pf. Let Xi = 1 if the ith prediction is correct and 0 otherwise.
Let X = number of correct guesses = X1 + … + Xn.
E[Xi] = Pr[Xi = 1] = 1/(n - i + 1), since n - i + 1 cards remain unseen.
By linearity of expectation, E[X] = E[X1] + … + E[Xn] = 1/n + … + 1/2 + 1/1 = H(n), where ln(n+1) < H(n) < 1 + ln n. ▪

Coupon Collector

Coupon collector. Each box of cereal contains a coupon. There are n different types of coupons. Assuming all boxes are equally likely to contain each coupon, how many boxes before you have ≥ 1 coupon of each type?

Claim. The expected number of steps is Θ(n log n).

Pf. Let phase j = time between having j and j+1 distinct coupons.
Let Xj = number of steps you spend in phase j.
Let X = total number of steps = X0 + X1 + … + X_{n-1}.
In phase j, the probability of success per step is (n-j)/n, so the expected waiting time is n/(n-j). By linearity of expectation,

  E[X] = Σ_{j=0}^{n-1} E[Xj] = Σ_{j=0}^{n-1} n/(n-j) = n Σ_{i=1}^{n} 1/i = n·H(n). ▪
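Both claims reduce to harmonic numbers, which are easy to evaluate exactly; a small sketch using exact rational arithmetic (the choice n = 4 is arbitrary):

```python
from fractions import Fraction

def harmonic(n):
    """Exact harmonic number H(n) = 1 + 1/2 + ... + 1/n."""
    return sum(Fraction(1, i) for i in range(1, n + 1))

# Guessing with memory: expected correct guesses over n cards is H(n).
# Coupon collector: expected boxes to collect all n types is n * H(n).
n = 4
assert harmonic(n) == Fraction(25, 12)
assert n * harmonic(n) == Fraction(25, 3)
```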
Maximum 3-Satisfiability

MAX-3SAT. Given a 3-SAT formula with exactly 3 distinct literals per clause, find a truth assignment that satisfies as many clauses as possible. Ex:

  C1 = x2 ∨ x3 ∨ x4
  C2 = x2 ∨ x3 ∨ x4
  C3 = x1 ∨ x2 ∨ x4
  C4 = x1 ∨ x2 ∨ x3
  C5 = x1 ∨ x2 ∨ x4

Simple idea. Flip a coin, and set each variable true with probability ½, independently for each variable.
Claim. Given a 3-SAT formula with k clauses, the expected number of clauses satisfied by a random assignment is 7k/8.

Pf. Let Zj = 1 if clause Cj is satisfied and 0 otherwise, and let Z = Z1 + … + Zk. A clause with three distinct literals is unsatisfied only when all three literals are false, which happens with probability 1/8. By linearity of expectation,

  E[Z] = Σ_{j=1}^{k} E[Zj] = Σ_{j=1}^{k} Pr[clause Cj is satisfied] = 7k/8. ▪

Corollary. For any instance of 3-SAT, there exists a truth assignment that satisfies at least a 7/8 fraction of all clauses.

Probabilistic method. We showed the existence of a non-obvious property of 3-SAT by showing that a random construction produces it with positive probability!
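The 7/8 expectation can be verified by brute force on a small instance: averaging the number of satisfied clauses over all assignments is exactly the expectation under a uniform random assignment. The clauses below are hypothetical, chosen for illustration, not the formula from the example.

```python
from itertools import product

# Clauses as lists of (variable_index, is_positive) literals; each clause
# has three distinct variables over x1..x4.
clauses = [
    [(2, True), (3, False), (4, True)],
    [(1, True), (2, False), (3, True)],
    [(1, False), (2, True), (4, False)],
]

def satisfied(clause, assignment):
    return any(assignment[v] == pos for v, pos in clause)

# Average number of satisfied clauses over all 2^4 assignments.
assignments = list(product([False, True], repeat=4))
total = 0
for bits in assignments:
    a = {i + 1: bits[i] for i in range(4)}
    total += sum(satisfied(c, a) for c in clauses)
avg = total / len(assignments)
assert avg == 7 * len(clauses) / 8   # E[Z] = 7k/8 with k = 3
```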
Maximum 3-Satisfiability: Analysis

Q. Can we turn this idea into a 7/8-approximation algorithm? In general, a random variable can almost always be below its mean.

Lemma. The probability that a random assignment satisfies ≥ 7k/8 clauses is at least 1/(8k).

Pf. Let pj be the probability that exactly j clauses are satisfied; let p be the probability that ≥ 7k/8 clauses are satisfied. Then

  7k/8 = E[Z] = Σ_{j≥0} j pj
       = Σ_{j < 7k/8} j pj + Σ_{j ≥ 7k/8} j pj
       ≤ (7k/8 - 1/8) Σ_{j < 7k/8} pj + k Σ_{j ≥ 7k/8} pj
       ≤ (7k/8 - 1/8) · 1 + k p.

Rearranging gives kp ≥ 1/8, so p ≥ 1/(8k). ▪

Johnson's algorithm. Repeatedly generate random truth assignments until one of them satisfies ≥ 7k/8 clauses.

Theorem. Johnson's algorithm is a 7/8-approximation algorithm.

Pf. By the previous lemma, each iteration succeeds with probability at least 1/(8k). By the waiting-time bound, the expected number of trials to find the satisfying assignment is at most 8k. ▪

Theorem. [Håstad 1997] Unless P = NP, there is no ρ-approximation algorithm for MAX-3SAT (and hence MAX-SAT) for any ρ > 7/8.

Remark. One can always convert a Las Vegas algorithm into a Monte Carlo algorithm, but no known method converts the other way.
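Johnson's algorithm as described above is a few lines of code. A sketch on a hypothetical 4-clause instance; the clauses and seed are illustrative assumptions.

```python
import math
import random

def johnson_max3sat(clauses, nvars, rng):
    """Draw uniformly random truth assignments until one satisfies
    at least 7k/8 of the k clauses (guaranteed to exist)."""
    k = len(clauses)
    target = math.ceil(7 * k / 8)
    while True:
        a = {v: rng.random() < 0.5 for v in range(1, nvars + 1)}
        sat = sum(any(a[v] == pos for v, pos in c) for c in clauses)
        if sat >= target:
            return a, sat

# hypothetical clauses: (variable, is_positive) literal pairs
clauses = [
    [(1, True), (2, False), (3, True)],
    [(1, False), (2, True), (4, True)],
    [(2, True), (3, False), (4, False)],
    [(1, True), (3, True), (4, False)],
]
assignment, sat = johnson_max3sat(clauses, 4, random.Random(7))
assert sat >= math.ceil(7 * len(clauses) / 8)
```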
RP and ZPP

RP. [Monte Carlo] Decision problems solvable with one-sided error in polynomial time: if the correct answer is no, always return no; if the correct answer is yes, return yes with probability ≥ ½.

ZPP. [Las Vegas] Decision problems solvable in expected polynomial time, always returning the correct answer.
Dictionary Data Type

Dictionary. Given a universe U of possible elements, maintain a subset S ⊆ U so that inserting, deleting, and searching in S is efficient.

Dictionary interface.
Create(): initialize a dictionary with S = ∅.
Insert(u): add element u ∈ U to S.
Delete(u): delete u from S, if u is currently in S.
Lookup(u): determine whether u is in S.

Applications. File systems, databases, Google, compilers, checksums.

Hashing

Hash function. h : U → { 0, 1, …, n-1 }.

Hashing. Create an array H of size n. When processing element u, access array element H[h(u)].

Collision. When h(u) = h(v) but u ≠ v. A collision is expected after Θ(√n) random insertions; this phenomenon is known as the "birthday paradox."

Separate chaining: H[i] stores a linked list of the elements u with h(u) = i. Ex: H[2] → null; H[3] → suburban → untravelled → considerating.
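A minimal separate-chaining dictionary, sketched in Python with the built-in hash standing in for the hash function h (chains are plain lists rather than linked lists):

```python
class ChainedHashTable:
    """Separate chaining: bucket i holds the elements u with h(u) = i."""

    def __init__(self, n):
        self.n = n
        self.buckets = [[] for _ in range(n)]

    def _chain(self, u):
        return self.buckets[hash(u) % self.n]

    def insert(self, u):
        chain = self._chain(u)
        if u not in chain:
            chain.append(u)

    def delete(self, u):
        chain = self._chain(u)
        if u in chain:
            chain.remove(u)

    def lookup(self, u):
        return u in self._chain(u)

d = ChainedHashTable(8)
for w in ["suburban", "untravelled", "considerating"]:
    d.insert(w)
assert d.lookup("suburban") and not d.lookup("urban")
d.delete("suburban")
assert not d.lookup("suburban")
```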
Ad Hoc Hash Function

Ad hoc hash function, à la the Java string library:

  int h(String s, int n) {
     int hash = 0;
     for (int i = 0; i < s.length(); i++)
        hash = (31 * hash) + s.charAt(i);
     return hash % n;
  }

Algorithmic Complexity Attacks

When can't we live with an ad hoc hash function?
Obvious situations: aircraft control, nuclear reactors.
Surprising situations: denial-of-service attacks. A malicious adversary learns your ad hoc hash function (e.g., by reading the Java API) and causes a big pile-up in a single slot that grinds performance to a halt.
Hashing Performance

Idealistic hash function. Maps m elements uniformly at random to n hash slots.
Running time depends on the length of the chains.
Average length of a chain = α = m/n.
Choosing n ≈ m gives on average O(1) per insert, lookup, or delete.

Challenge. Achieve the idealized randomized guarantees, but with a hash function where you can easily find items where you put them.

Approach. Use randomization in the choice of h: the adversary knows the randomized algorithm you're using, but doesn't know the random choices that the algorithm makes.

Universal Hashing

Universal class of hash functions. [Carter-Wegman 1980s]
For any pair of elements u, v ∈ U:  Pr_{h ∈ H} [h(u) = h(v)] ≤ 1/n, where h is chosen uniformly at random from H.
Can select a random h efficiently.
Can compute h(u) efficiently.

Ex. U = { a, b, c, d, e, f }, n = 2.

H = {h1, h2} is not universal:
           a  b  c  d  e  f
  h1(x)    0  1  0  1  0  1
  h2(x)    0  0  0  1  1  1

  Pr_{h ∈ H} [h(a) = h(b)] = 1/2
  Pr_{h ∈ H} [h(a) = h(c)] = 1   (violates the 1/n = 1/2 bound)
  Pr_{h ∈ H} [h(a) = h(d)] = 0
  …

H = {h1, h2, h3, h4} is universal:
           a  b  c  d  e  f
  h1(x)    0  1  0  1  0  1
  h2(x)    0  0  0  1  1  1
  h3(x)    0  0  1  0  1  1
  h4(x)    1  0  0  1  1  0

  Pr_{h ∈ H} [h(a) = h(b)] = 1/2
  Pr_{h ∈ H} [h(a) = h(c)] = 1/2
  Pr_{h ∈ H} [h(a) = h(d)] = 1/2
  Pr_{h ∈ H} [h(a) = h(e)] = 1/2
  Pr_{h ∈ H} [h(a) = h(f)] = 0
  …
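The universality condition for these two toy families can be checked exhaustively; a small sketch encoding each hash function as a row of values over U:

```python
from itertools import combinations

U = "abcdef"
# the two families from the example above, one row per function
H_bad  = [[0, 1, 0, 1, 0, 1],
          [0, 0, 0, 1, 1, 1]]
H_good = [[0, 1, 0, 1, 0, 1],
          [0, 0, 0, 1, 1, 1],
          [0, 0, 1, 0, 1, 1],
          [1, 0, 0, 1, 1, 0]]

def is_universal(H, n):
    """Check Pr_{h in H}[h(u) = h(v)] <= 1/n for every pair u != v."""
    for u, v in combinations(range(len(U)), 2):
        collisions = sum(1 for h in H if h[u] == h[v])
        if collisions / len(H) > 1 / n:
            return False
    return True

assert not is_universal(H_bad, 2)   # Pr[h(a) = h(c)] = 1 > 1/2
assert is_universal(H_good, 2)
```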
Universal Hashing

Universal hashing property. Let H be a universal class of hash functions; let h ∈ H be chosen uniformly at random from H; and let u ∈ U. For any subset S ⊆ U of size at most n, the expected number of items in S that collide with u is at most 1.

Designing a Universal Family of Hash Functions

Theorem. [Chebyshev 1850] There exists a prime between n and 2n.

Modulus. Choose a prime number p ≈ n (no need for randomness here).

Integer encoding. Identify each element u ∈ U with a base-p integer of r digits: x = (x1, x2, …, xr).

Hash function. Let A = set of all r-digit, base-p integers. For each a = (a1, a2, …, ar) ∈ A, define

  ha(x) = (Σ_{i=1}^{r} ai xi) mod p.
Theorem. H = { ha : a ∈ A } is a universal class of hash functions.

Pf. Let x = (x1, x2, …, xr) and y = (y1, y2, …, yr) be two distinct elements of U. We need to show that Pr[ha(x) = ha(y)] ≤ 1/n.
Since x ≠ y, there exists an integer j such that xj ≠ yj.
We have ha(x) = ha(y) iff

  aj (yj - xj) = Σ_{i≠j} ai (xi - yi)  mod p;

write z = yj - xj and m = Σ_{i≠j} ai (xi - yi) mod p.
We can assume a was chosen uniformly at random by first selecting all coordinates ai where i ≠ j, then selecting aj at random. Thus, we can assume ai is fixed for all coordinates i ≠ j.
Since p is prime, aj z = m mod p has at most one solution among the p possibilities for aj (see the fact below). Hence Pr[ha(x) = ha(y)] ≤ 1/p ≤ 1/n. ▪

Fact. Let p be prime, and let z ≠ 0 mod p. Then αz = m mod p has at most one solution 0 ≤ α < p.

Pf. Suppose α and β are two different solutions.
Then (α - β)z = 0 mod p; hence (α - β)z is divisible by p.
Since z ≠ 0 mod p, we know that z is not divisible by p; because p is prime, it follows that (α - β) is divisible by p.
Since 0 ≤ α, β < p, this implies α = β, contradicting that they are different. ▪

Bonus fact. Can replace "at most one" with "exactly one" in the fact above.
Pf idea. Euclid's algorithm.
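For small parameters, the family ha(x) = (Σ ai xi) mod p is small enough to verify universality by exhaustive enumeration; a sketch with the illustrative choice p = 5, r = 2:

```python
import random
from itertools import product

p, r = 5, 2   # prime modulus p and r base-p digits per key

def h(a, x):
    """h_a(x) = (a1*x1 + ... + ar*xr) mod p for a, x in {0..p-1}^r."""
    return sum(ai * xi for ai, xi in zip(a, x)) % p

# Verify: for every pair of distinct keys x != y, the fraction of a's
# with h_a(x) = h_a(y) is at most 1/p, as the theorem guarantees.
keys = list(product(range(p), repeat=r))
A = list(product(range(p), repeat=r))
for x in keys:
    for y in keys:
        if x != y:
            coll = sum(1 for a in A if h(a, x) == h(a, y))
            assert coll / len(A) <= 1 / p

# Selecting a random h in H = selecting a uniformly random a in A.
rng = random.Random(1)
a = tuple(rng.randrange(p) for _ in range(r))
```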
13.9 Chernoff Bounds

Chernoff Bounds (above mean)

Theorem. Suppose X1, …, Xn are independent 0-1 random variables. Let X = X1 + … + Xn. Then for any µ ≥ E[X] and for any δ > 0, we have

  Pr[X > (1+δ)µ] < [ e^δ / (1+δ)^(1+δ) ]^µ.

Pf. For any t > 0, applying the increasing function f(x) = e^(tx) to both sides and then Markov's inequality,

  Pr[X > (1+δ)µ] = Pr[e^(tX) > e^(t(1+δ)µ)] ≤ e^(-t(1+δ)µ) · E[e^(tX)].

By the definition of X and independence,

  E[e^(tX)] = E[e^(t Σi Xi)] = Πi E[e^(t Xi)].

Let pi = Pr[Xi = 1]. Using 1 + x ≤ e^x,

  E[e^(t Xi)] = pi e^t + (1 - pi) e^0 = 1 + pi (e^t - 1) ≤ e^(pi (e^t - 1)).

Combining everything, and using Σi pi = E[X] ≤ µ,

  Pr[X > (1+δ)µ] ≤ e^(-t(1+δ)µ) · Πi e^(pi (e^t - 1)) ≤ e^(-t(1+δ)µ) · e^(µ(e^t - 1)).

Finally, choose t = ln(1+δ). ▪

Chernoff Bounds (below mean)

Theorem. Suppose X1, …, Xn are independent 0-1 random variables. Let X = X1 + … + Xn. Then for any µ ≤ E[X] and for any 0 < δ < 1, we have

  Pr[X < (1-δ)µ] < e^(-δ²µ/2).
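The above-mean bound can be compared against the exact binomial tail for a concrete instance; a numeric sketch (the parameters n, p, δ are arbitrary illustrative choices):

```python
from math import comb, exp

def binom_tail_above(n, p, threshold):
    """Exact Pr[X > threshold] for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(threshold + 1, n + 1))

n, p, delta = 100, 0.3, 0.5
mu = n * p                       # E[X] = 30
# Chernoff bound: Pr[X > (1+delta)*mu] < (e^delta / (1+delta)^(1+delta))^mu
bound = (exp(delta) / (1 + delta) ** (1 + delta)) ** mu
exact = binom_tail_above(n, p, int((1 + delta) * mu))   # Pr[X > 45]
assert exact < bound
```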
13.10 Load Balancing

Load balancing. System in which m jobs arrive in a stream and need to be processed immediately on n identical processors. Find an assignment that balances the workload across processors.
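A minimal sketch of the natural randomized strategy for this setting, assuming each arriving job is simply sent to a processor chosen uniformly at random (the job and processor counts are illustrative choices):

```python
import random
from collections import Counter

def random_assignment(m, n, rng):
    """Assign each of m jobs to one of n processors chosen uniformly at
    random; return the resulting load (job count) on each processor."""
    loads = Counter()
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return loads

loads = random_assignment(1000, 10, random.Random(0))
assert sum(loads.values()) == 1000   # every job was assigned
```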
Extra Slides

13.5 Randomized Divide-and-Conquer

Quicksort
Quicksort: BST Representation of Splitters

BST representation. Draw the recursive BST of splitters: the first splitter, chosen uniformly at random, is the root; the splitters of the subproblems S- and S+ form its left and right subtrees.

Ex. With first splitter x10, the next splitters are x5 (on S-) and x13 (on S+), then x1, x6, x8, x14, and so on.

Observation. An element is only compared with its ancestors and descendants in this BST.
x2 and x7 are compared if their lca = x2 or x7.
x2 and x7 are not compared if their lca = x3 or x4 or x5 or x6.

Claim. For i < j, Pr[xi and xj are compared] = 2/(j - i + 1).
Thus the expected number of comparisons is

  Σ_{1 ≤ i < j ≤ n} 2/(j - i + 1) = 2 Σ_{i=1}^{n} Σ_{j=2}^{n-i+1} 1/j ≤ 2n Σ_{j=2}^{n} 1/j ≤ 2n ∫_{x=1}^{n} dx/x = 2n ln n.
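The pairwise comparison probabilities can be summed directly to check the 2n ln n bound; a small numeric sketch (the choice n = 200 is arbitrary):

```python
from math import log

def expected_comparisons(n):
    """Sum over pairs i < j of 2/(j - i + 1): the exact expected number
    of comparisons made by randomized quicksort on n distinct elements."""
    return sum(2.0 / (j - i + 1)
               for i in range(1, n + 1)
               for j in range(i + 1, n + 1))

n = 200
assert expected_comparisons(n) <= 2 * n * log(n)
```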