CS 3511: Algorithms Honors Homework 5



Instructions:
  • Upload your solutions as a single PDF on Gradescope. Please anonymize all your submissions (i.e., do not list your name in the PDF), but if you forget, it’s OK.
  • You may collaborate with classmates and consult textbooks or any resource on the internet (but do not search for solutions to the problems). Please upload a brief “collaboration statement” listing any collaborators as the last page of your solutions PDF on Gradescope.
  • After collaborating, always write up your solutions individually.
  • For each problem, you should aim to keep your writeup below one page. For some problems, this may be infeasible, and for some problems you may write significantly less than a page. This is not a hard constraint, but part of the assignment is figuring out how to easily convince the grader of correctness, and to do so concisely. “One page” is just a guideline: if your solution is longer because you chose to use figures (or large margins, display math, etc.) that’s fine.
  • Each problem is worth ten points (even those with multiple subparts).
Problems:

§1 (Game Theory and Convex Optimization).

(a) Calculate the Safety Strategy for the row player and the Safety Strategy for the column player in the following Zero-Sum game.

    (3, -3)   (0, 0)
    (1, -1)   (4, -4)

Show your calculations, and show that both safety values are equal (this common value is known as the value of the game).
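As a numerical sanity check: for a 2×2 zero-sum game with no pure-strategy saddle point (which one can verify holds for the matrix above), each player's safety strategy makes the opponent indifferent between their two options, and this gives closed-form expressions. A sketch, where `solve_2x2_zero_sum` is a hypothetical helper name:

```python
def solve_2x2_zero_sum(a11, a12, a21, a22):
    """Mixed safety strategies for a 2x2 zero-sum game given by the row
    player's payoffs, assuming no pure-strategy saddle point exists.
    Returns (p, q, v): row plays row 1 with probability p, column plays
    column 1 with probability q, and v is the value of the game."""
    d = a11 - a12 - a21 + a22
    p = (a22 - a21) / d   # makes row's payoff equal against both columns
    q = (a22 - a12) / d   # makes column's loss equal against both rows
    v = (a11 * a22 - a12 * a21) / d
    return p, q, v

# Row player's payoffs from the game above (column payoffs are the negations)
p, q, v = solve_2x2_zero_sum(3, 0, 1, 4)
```

Running this confirms that both players' equalizing strategies guarantee the same value, which is what the problem asks you to show by hand.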

(b) (Gradient Descent) Consider an α-strongly convex differentiable function f whose gradient always has 2-norm (the standard length of a vector) at most G. Suppose we want to minimize f, and let x* denote its minimizer. Show that the gradient descent algorithm with step size 1/(α(t+1)) satisfies

    f( (1/T) Σ_{t=1}^{T} x_t ) − f(x*) ≤ G²(1 + log T) / (2αT).

(Thus, strong convexity allows us to get a 1/T dependency instead of the 1/√T dependency for general convex functions.)

Hint: Change the potential function in the analysis from lecture to Φ(t) = (tα/2)‖x_t − x*‖². Also, use that Σ_{t=1}^{T} 1/t ≤ 1 + log T.
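To build intuition for the claimed bound, one can run the scheme on a simple 1-strongly convex quadratic and compare the suboptimality of the averaged iterate against G²(1 + log T)/(2αT). A sketch; the helper name `averaged_gd` and the exact iterate-indexing convention are illustrative, not necessarily the lecture's:

```python
import math

def averaged_gd(grad, x0, alpha, T):
    """Gradient descent with step size 1/(alpha*(t+1)); returns the
    average of the T iterates. (Indexing of the averaged iterates may
    differ slightly from the convention used in lecture.)"""
    x = x0
    total = 0.0
    for t in range(T):
        total += x
        x -= grad(x) / (alpha * (t + 1))
    return total / T

# f(x) = (x - 3)^2 / 2 is 1-strongly convex with minimizer x* = 3,
# and |f'(x)| = |x - 3| <= 3 on these iterates, so take G = 3.
T = 1000
avg = averaged_gd(lambda x: x - 3.0, x0=0.0, alpha=1.0, T=T)
gap = (avg - 3.0) ** 2 / 2                     # f(average) - f(x*)
bound = 3 ** 2 * (1 + math.log(T)) / (2 * 1 * T)
```

On this instance the observed gap sits well below the theoretical bound, as the theorem predicts.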

§2 (NP-Completeness). A clique in a graph G = (V, E) is defined as a subset of vertices S ⊆ V such that every two vertices u, v ∈ S have an edge (u, v) ∈ E. Prove that deciding whether the input graph has a clique of size at least k is NP-complete (where k is an input between 1 and |V|) by reducing 3-SAT to this problem.
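For the "clique is in NP" half of the proof, the key observation is that a candidate size-k subset is an efficiently checkable certificate; the exponential-time search below is only for illustration of that structure (`has_clique` is a hypothetical helper, not a suggested algorithm):

```python
from itertools import combinations

def has_clique(vertices, edges, k):
    """Exponential-time search for a k-clique. Note the certificate
    structure: verifying one candidate subset S takes only O(k^2)
    edge lookups, which is what places the problem in NP."""
    edge_set = {frozenset(e) for e in edges}
    return any(
        all(frozenset((u, v)) in edge_set for u, v in combinations(S, 2))
        for S in combinations(vertices, k)
    )
```

The reduction from 3-SAT that the problem asks for supplies the hardness half.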

§3 (Approximation Algorithms).

(a) In this exercise we will generalize the 2-approximation algorithm for the vertex cover problem from Lecture 22.

In the set cover problem we are given n elements and m subsets S_1, ..., S_m of these elements. The goal is to find a minimum-size collection T ⊆ [m] of subsets such that ∪_{i∈T} S_i contains all n elements. First write a linear program for this problem such that LP* ≤ OPT, and then use the optimal fractional solution of this LP to design an f-approximation algorithm for set cover, where f is the maximum number of sets S_i in which any element e appears.

Remark: This generalizes vertex cover because the elements correspond to the edges of the graph and the sets correspond to subsets of edges incident to the same vertex. There, f = 2 since each edge is incident to at most 2 vertices.
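The rounding step can be sketched as follows: given a feasible fractional solution x, every element e has total LP weight at least 1 spread over at most f sets containing it, so by pigeonhole some set containing e has x_i ≥ 1/f; picking all such sets therefore yields a cover, of cost at most f · LP*. A sketch assuming the fractional solution is handed in from an LP solver (the function name is hypothetical):

```python
def round_fractional_cover(x, sets, elements):
    """Pick every set whose LP value is at least 1/f. Feasibility: each
    element e has sum of x[i] over the <= f sets containing it >= 1, so
    some such x[i] >= 1/f and e is covered. Cost: each chosen x[i] is
    inflated by at most a factor f, so the cost is <= f * LP* <= f * OPT.
    (A sketch: x is assumed to be a feasible fractional LP solution.)"""
    f = max(sum(1 for S in sets if e in S) for e in elements)
    return [i for i, xi in enumerate(x) if xi >= 1.0 / f], f

# Toy instance: every element appears in exactly 2 sets (f = 2), and
# x = (1/2, 1/2, 1/2) is a feasible fractional cover.
sets = [{1, 2}, {2, 3}, {1, 3}]
chosen, f = round_fractional_cover([0.5, 0.5, 0.5], sets, {1, 2, 3})
```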

(b) Suppose we are given a graph G = (V, E) with edge weights w : E → R≥0. We are also given a subset of vertices T ⊆ V that need to be connected. Find a subset of edges F ⊆ E of minimum weight Σ_{e∈F} w_e such that all the vertices in T are connected by F (the vertices outside T may or may not be connected). This problem is known to be NP-complete. Prove that the MST of the vertices in T (i.e., a minimum spanning tree of the complete graph on T whose edge weights are shortest-path distances in G) gives a 2-approximate solution for this problem.
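The quantity being analyzed, the weight of an MST on the complete graph over T with shortest-path distances as edge weights, can be computed with Dijkstra from each terminal followed by Prim on the terminals. A sketch with illustrative helper names:

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src; adj[u] = list of (v, weight)."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def terminal_mst_weight(adj, terminals):
    """Weight of an MST on the metric closure of the terminal set T
    (Prim's algorithm over shortest-path distances). The problem asks
    you to prove this is at most twice the optimal Steiner weight."""
    terminals = list(terminals)
    dist = {t: dijkstra(adj, t) for t in terminals}
    in_tree = {terminals[0]}
    total = 0.0
    while len(in_tree) < len(terminals):
        w, t = min((dist[u][t], t) for u in in_tree
                   for t in terminals if t not in in_tree)
        total += w
        in_tree.add(t)
    return total

# Star graph: optimal Steiner tree uses the center and has weight 3,
# while the terminal MST has weight 4 <= 2 * 3.
adj = {'c': [('a', 1), ('b', 1), ('d', 1)],
       'a': [('c', 1)], 'b': [('c', 1)], 'd': [('c', 1)]}
```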

§4 (Concentration and Randomized Algorithms).

(a) Prove the following:
i. For any random variable X, we have E[(X − E[X])²] = E[X²] − (E[X])².
ii. For any independent random variables X and Y , we have Var(X + Y ) = Var(X) + Var(Y ).

iii. Suppose you are given a coin that shows heads with an unknown probability p ∈ [0, 1]. Given an ϵ ∈ (0, 1), prove that after O(1/ϵ²) coin tosses we can estimate p with probability ≥ 0.99, i.e., we can output p̂ such that |p̂ − p| ≤ ϵ with probability ≥ 0.99.
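For part iii, the standard approach is to average n = O(1/ϵ²) tosses and apply Chebyshev: Var(p̂) = p(1 − p)/n ≤ 1/(4n), so P(|p̂ − p| > ϵ) ≤ 1/(4nϵ²). A sketch using the hypothetical constant 25 (any n ≥ 25/ϵ² drives the failure probability to at most 1/100):

```python
import math
import random

def estimate_heads_prob(toss, eps):
    """Average n = ceil(25 / eps^2) tosses of the given coin. By
    Chebyshev, Var(average) <= 1/(4n), so the failure probability
    P(|p_hat - p| > eps) is at most 1/(4 * n * eps^2) <= 1/100.
    The constant 25 is one hypothetical choice; the problem only
    asks for O(1/eps^2) tosses."""
    n = math.ceil(25 / eps ** 2)
    return sum(toss() for _ in range(n)) / n

rng = random.Random(0)  # fixed seed so the run is reproducible
p_hat = estimate_heads_prob(lambda: rng.random() < 0.3, eps=0.05)
```

A Chernoff bound would give the same O(1/ϵ²) sample count with a smaller constant, but Chebyshev suffices for a 0.99 guarantee.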

(b) Suppose there are n coupons {1, ..., n}. Each step, you draw a uniformly random coupon independently with replacement (i.e., you may redraw coupons), and you repeat this until you have drawn all n coupons at least once. Prove that with probability at least 1 − 1/n, the process takes O(n log n) steps.
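A quick simulation of the process (the expected number of draws is n·H_n ≈ n ln n; the problem asks for a high-probability bound of the same order):

```python
import math
import random

def coupon_collector_steps(n, rng):
    """Draw uniformly random coupons with replacement until all n
    distinct coupons have been seen; return the number of draws."""
    seen = set()
    steps = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        steps += 1
    return steps

# With a fixed seed the run is reproducible; typical runs land
# near n * H_n, well within a small constant times n log n.
steps = coupon_collector_steps(100, random.Random(0))
```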
