EECS 598 Theoretical Foundations of Machine Learning - Homework #2


Homework Policy: Working in groups is fine, but every student must submit their own writeup. Please write the members of your group on your solutions. There is no strict limit to the size of the group but we may find it a bit suspicious if there are more than 4 to a team. Questions labelled with (Challenge) are not strictly required, but you’ll get some participation credit if you have something interesting to add, even if it’s only a partial answer. 

1) Efficient PAC Learning. Let $X := \{0, 1\}^n$ and let $C$ be the class of monotone disjunctions; that is, $C := \{h(x) = \bigvee_{i \in S} x_i : S \subseteq [n]\}$, where $\vee$ is the logical OR operator. Prove that $C$ is efficiently PAC-learnable. You need to construct an algorithm and prove that it accomplishes what you need.
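One standard candidate (a sketch, not the required writeup) is the classic elimination learner: start with the disjunction over all $n$ variables and delete every variable that some negative example sets to 1. The Python sketch below is only an illustration; the function name and the `(x, y)` sample format are choices made here, and the PAC analysis (bounding the probability mass contributed by surviving spurious variables) is the part the problem asks you to supply.

```python
def learn_monotone_disjunction(samples):
    """Elimination learner for monotone disjunctions over {0,1}^n.

    samples: list of (x, y) pairs, with x a 0/1 tuple of length n and
    y in {0, 1}. A negative example (y = 0) certifies that no variable
    it sets to 1 can belong to the target disjunction, so we delete
    those variables. The returned hypothesis is consistent with the
    sample and can only err by predicting 1 on negatives of the target.
    """
    n = len(samples[0][0])
    live = set(range(n))                      # candidate variables
    for x, y in samples:
        if y == 0:
            live -= {i for i in range(n) if x[i] == 1}
    return lambda x: int(any(x[i] == 1 for i in live))
```

Each update takes $O(n)$ time per example, so the learner runs in time polynomial in $n$ and the sample size, which is the "efficient" half of the claim.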

2) Learning Circles. (Chapter 2.6, exercise 2.3) Let $X := \mathbb{R}^2$ and consider the set of concepts $C$ of the form $c = \{(x, y) : x^2 + y^2 \le r^2\}$ for some real number $r$. Show that this class can be $(\epsilon, \delta)$-PAC-learned from training data of size $m \ge (1/\epsilon) \log(1/\delta)$.
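The natural consistent learner (again a sketch; the `(x, y, label)` sample format is an assumption made for illustration) returns the smallest origin-centered circle containing all positive training points. The analysis then reduces to bounding the probability mass of the annulus between the learned radius and the true radius $r$.

```python
import math

def learn_circle(samples):
    """Smallest origin-centered circle containing all positive examples.

    samples: list of (x, y, label) triples, label in {0, 1}.
    The learned radius is at most the true radius, so the hypothesis
    can err only on positives in the annulus between the two radii.
    """
    radii = [math.hypot(x, y) for (x, y, label) in samples if label == 1]
    return max(radii, default=0.0)
```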

3) Reducing to PAC Learning. (Chapter 2.6, exercise 2.9) We showed that for a finite hypothesis set $C$, a consistent learning algorithm $A$ is a PAC-learning algorithm. Here, we consider a converse question. Let $S$ be a finite set of $m$ labeled points. Suppose that you are given a PAC-learning algorithm $A$. Show that you can use $A$ (which takes in finitely many points) to find, in polynomial time and with high probability, a hypothesis $h \in C$ that is consistent with $S$. (Hint: select an appropriate distribution $D$ over $S$ and give a condition on $R(h)$ that guarantees $h$ is consistent.)
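A minimal sketch of the reduction in code (the calling convention `A(sample, epsilon, delta)` is a hypothetical interface, since the problem leaves $A$'s signature unspecified): put the uniform distribution on $S$, so that any hypothesis with error below $1/m$ under that distribution must be consistent with all of $S$.

```python
import random

def consistent_hypothesis(A, S, delta=0.05):
    """Find a hypothesis consistent with the labeled set S, w.h.p.

    A is a PAC learner with hypothetical interface A(sample, epsilon,
    delta), where sample(k) returns k i.i.d. labeled draws from the
    target distribution. Taking D uniform over S, any h with error
    R(h) < 1/|S| misclassifies strictly fewer than one point of S,
    i.e. none of them.
    """
    m = len(S)
    sample = lambda k: [random.choice(S) for _ in range(k)]
    return A(sample, epsilon=1.0 / (2 * m), delta=delta)
```

If $A$ is an efficient PAC learner, it needs only $\mathrm{poly}(m, \log(1/\delta))$ samples at accuracy $1/(2m)$, so the whole reduction runs in polynomial time.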

4) Union of Intervals. (Chapter 3.6, exercise 3.6) What is the VC dimension of the class of subsets of the real line formed by unions of $k$ intervals?

5) Ellipsoids. (Chapter 3.6, exercise 3.10) An $n$-dimensional ellipsoid $E$ is defined by a center point $x_0 \in \mathbb{R}^n$ and a symmetric positive semidefinite matrix $M \in \mathbb{R}^{n \times n}$, such that $x \in E \iff (x - x_0)^\top M (x - x_0) \le 1$.

(a) Give a quadratic upper bound on the VC dimension of the concept class of ellipsoids in $\mathbb{R}^n$; that is, show that the VC dimension is $O(n^2)$. (One possible route is sketched after part (b) below.)

(b) (Challenge) Give an exact characterization of the VC dimension of ellipsoids. 
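For part (a), one standard route (a suggestion, not the only proof) is to linearize the quadratic constraint:

$$(x - x_0)^\top M (x - x_0) \le 1 \iff x^\top M x - 2\, x_0^\top M x + \left(x_0^\top M x_0 - 1\right) \le 0,$$

where the left-hand side is an affine function of the $O(n^2)$ monomial features $\{x_i x_j\}_{i \le j}$ and $\{x_i\}_{i \in [n]}$. Ellipsoids are therefore a subclass of halfspaces over an $O(n^2)$-dimensional feature space, and the VC dimension of affine threshold functions in $\mathbb{R}^d$ is $d + 1$.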

6) More Chernoff. (Challenge) Let's say the random variables $X_1, \ldots, X_n$ are i.i.d. Bernoulli with $\Pr(X_i = 1) = \epsilon$. We want to bound the probability that $\sum_{i=1}^n X_i < \frac{n\epsilon}{2}$. The standard Hoeffding inequality gives us that

$$\Pr\left(\sum_{i=1}^n X_i < \frac{n\epsilon}{2}\right) \le \Pr\left(\left|\sum_{i=1}^n X_i - n\epsilon\right| > \frac{n\epsilon}{2}\right) \le 2\exp\left(-n\epsilon^2/2\right),$$

which suggests we need $n > 1/\epsilon^2$ before we can get any decay in probability. But you can get something tighter. Find a way to obtain the following bound, using the Chernoff technique or otherwise:

$$\Pr\left(\sum_{i=1}^n X_i < \frac{n\epsilon}{2}\right) \le \exp(-O(n\epsilon)).$$
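A quick numerical sanity check (illustration only, not a proof): in a regime where $n < 1/\epsilon^2$, so the Hoeffding bound above exceeds 1, the lower tail is nonetheless already small, consistent with $\exp(-O(n\epsilon))$ decay. The parameters below are arbitrary choices for the demonstration.

```python
import random

def lower_tail(n, eps, trials=10_000):
    """Monte Carlo estimate of Pr(sum of n Bernoulli(eps) < n*eps/2)."""
    hits = 0
    for _ in range(trials):
        s = sum(random.random() < eps for _ in range(n))
        if s < n * eps / 2:
            hits += 1
    return hits / trials

# n = 2000, eps = 0.01: Hoeffding gives 2*exp(-n*eps^2/2) ~ 1.8 (vacuous),
# yet the empirical tail probability comes out on the order of 10^-3.
print(lower_tail(2000, 0.01))
```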
