CS 189 Introduction to Machine Learning
Spring 2021 Jonathan Shewchuk HW3
Due: Wednesday, February 24 at 11:59 pm
This homework consists of coding assignments and math problems.
Begin early; you can submit models to Kaggle only twice a day!
DELIVERABLES:
1. Submit your predictions for the test sets to Kaggle as early as possible. Include your Kaggle
scores in your write-up. The Kaggle competitions for this assignment can be found at
• MNIST: https://www.kaggle.com/c/spring21-cs189-hw3-mnist
• SPAM: https://www.kaggle.com/c/spring21-cs189-hw3-spam
2. Write-up: Submit your solution in PDF format to “Homework 3 Write-Up” in Gradescope.
• On the first page of your write-up, please list students with whom you collaborated.
• Start each question on a new page. If there are graphs, include those graphs on the
same pages as the question write-up. DO NOT put them in an appendix. We need each
solution to be self-contained on pages of its own.
• Only PDF uploads to Gradescope will be accepted. You are encouraged to use LaTeX
or Word to typeset your solution. You may also scan a neatly handwritten solution to
produce the PDF.
• Replicate all your code in an appendix. Begin the code for each coding question on a fresh
page. Do not put code from multiple questions on the same page. When you upload this
PDF on Gradescope, make sure that you assign the relevant pages of your code from the
appendix to the correct questions.
• While collaboration is encouraged, everything in your solution must be your (and only
your) creation. Copying the answers or code of another student is strictly forbidden.
Furthermore, all external material (i.e., anything outside lectures and assigned readings,
including figures and pictures) should be cited properly. We wish to remind you that
consequences of academic misconduct are particularly severe!
3. Code: Submit your code as a .zip file to “Homework 3 Code”.
• Set a seed for all pseudo-random numbers generated in your code. This ensures
your results can be reproduced when readers run your code (see the seeding sketch at the
end of this deliverables list).
• Include a README with your name, student ID, the random seed value(s) you used
(see above), and any instructions for compilation.
• Do NOT provide any data files. Supply instructions on how to add data to your code.
• Code requiring exorbitant memory or execution time might not be considered.
• Code submitted here must match that in the PDF Write-up. The Kaggle score will not be
accepted if the code provided (a) does not compile or (b) compiles but does not produce
the file submitted to Kaggle.
4. The assignment covers concepts on Gaussian distributions and classifiers. Some of the ma-
terial may not have been covered in lecture; you are responsible for finding resources to
understand it.
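For example, here is a minimal seeding sketch in Python; the seed value and the NumPy calls are illustrative, not a required setup:

```python
import random

import numpy as np

SEED = 42  # illustrative value; record whatever seed you use in your README

random.seed(SEED)                  # seeds Python's built-in RNG
np.random.seed(SEED)               # seeds NumPy's legacy global RNG
rng = np.random.default_rng(SEED)  # seeds a local NumPy Generator (preferred)
```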
1 Honor Code
Declare and sign the following statement (Mac Preview and Foxit PDF Reader, among others, have
tools to let you sign a PDF file):
“I certify that all solutions are entirely my own and that I have not looked at anyone else’s solution.
I have given credit to all external sources I consulted.”
Signature:
2 Gaussian Classification
Let $f(x \mid C_i) \sim \mathcal{N}(\mu_i, \sigma^2)$ for a two-class, one-dimensional classification problem with classes $C_1$
and $C_2$, $P(C_1) = P(C_2) = 1/2$, and $\mu_2 > \mu_1$.
1. Find the Bayes optimal decision boundary and the corresponding Bayes decision rule by
finding the point(s) at which the posterior probabilities are equal.
2. Suppose the decision boundary for your classifier is $x = b$. The Bayes error is the probability
of misclassification, namely
$$P_e = P((C_1 \text{ misclassified as } C_2) \cup (C_2 \text{ misclassified as } C_1)).$$
Show that the Bayes error associated with this decision rule, in terms of $b$, is
$$P_e(b) = \frac{1}{2\sqrt{2\pi}\,\sigma} \left[ \int_{-\infty}^{b} \exp\!\left(-\frac{(x-\mu_2)^2}{2\sigma^2}\right) dx + \int_{b}^{\infty} \exp\!\left(-\frac{(x-\mu_1)^2}{2\sigma^2}\right) dx \right].$$
3. Using the expression above for the Bayes error, calculate the optimal decision boundary $b^*$
that minimizes $P_e(b)$. How does this value compare to that found in part 1? Hint: $P_e(b)$ is
convex for $\mu_1 < b < \mu_2$.
Solution:
1. The posteriors are equal exactly when
$$
\begin{aligned}
P(C_1 \mid x) = P(C_2 \mid x) &\iff \frac{f(x \mid C_1)\, P(C_1)}{f(x)} = \frac{f(x \mid C_2)\, P(C_2)}{f(x)} \\
&\iff f(x \mid C_1) = f(x \mid C_2) \qquad \text{(since } P(C_1) = P(C_2)\text{)} \\
&\iff e^{-(x-\mu_1)^2/(2\sigma^2)} = e^{-(x-\mu_2)^2/(2\sigma^2)} \\
&\iff (x - \mu_1)^2 = (x - \mu_2)^2.
\end{aligned}
$$
Thus, the decision boundary is the point equidistant from $\mu_1$ and $\mu_2$: $b = \frac{\mu_1 + \mu_2}{2}$.
The corresponding decision rule is: given a data point $x \in \mathbb{R}$,
• if $x < \frac{\mu_1 + \mu_2}{2}$, then classify $x$ in class 1;
• otherwise, classify $x$ in class 2.
Note that this is the centroid method.
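As a sanity check, this rule is one line of code. A minimal sketch in Python (the function name and the example values are illustrative):

```python
def classify(x, mu1, mu2):
    """Bayes decision rule for equal priors and equal variances:
    class 1 iff x falls on mu1's side of the midpoint (mu1 + mu2) / 2."""
    return 1 if x < (mu1 + mu2) / 2.0 else 2

# With mu1 = 0 and mu2 = 2, the boundary is b = 1.
print(classify(0.4, 0.0, 2.0))  # -> 1
print(classify(1.7, 0.0, 2.0))  # -> 2
```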
2. Applying basic rules of probability (the two misclassification events are disjoint, so the
probability of their union is the sum of their probabilities), we have
$$
\begin{aligned}
P_e &= P((C_1 \text{ misclassified as } C_2) \cup (C_2 \text{ misclassified as } C_1)) \\
&= P(C_1 \text{ misclassified as } C_2) + P(C_2 \text{ misclassified as } C_1) \\
&= P(\text{misclassified as } C_2 \mid C_1)\, P(C_1) + P(\text{misclassified as } C_1 \mid C_2)\, P(C_2).
\end{aligned}
$$
We know $P(C_1) = P(C_2) = \frac{1}{2}$. Thus, we use the Gaussian PDF to calculate
$$P(\text{misclassified as } C_1 \mid C_2) = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{b} e^{-(x-\mu_2)^2/(2\sigma^2)}\, dx,$$
$$P(\text{misclassified as } C_2 \mid C_1) = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{b}^{\infty} e^{-(x-\mu_1)^2/(2\sigma^2)}\, dx.$$
Therefore,
$$
\begin{aligned}
P_e(b) &= \frac{1}{2} \left[ P(\text{misclassified as } C_1 \mid C_2) + P(\text{misclassified as } C_2 \mid C_1) \right] \\
&= \frac{1}{2\sqrt{2\pi}\,\sigma} \left[ \int_{-\infty}^{b} e^{-\frac{(x-\mu_2)^2}{2\sigma^2}}\, dx + \int_{b}^{\infty} e^{-\frac{(x-\mu_1)^2}{2\sigma^2}}\, dx \right].
\end{aligned}
$$
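Each integral above is a Gaussian CDF (or its complement), so $P_e(b)$ can be evaluated without numerical integration. A minimal sketch assuming SciPy is available (the parameter values are illustrative):

```python
from scipy.stats import norm

def bayes_error(b, mu1, mu2, sigma):
    """P_e(b) = (1/2) [ P(N(mu2, sigma^2) < b) + P(N(mu1, sigma^2) > b) ]."""
    return 0.5 * (norm.cdf(b, loc=mu2, scale=sigma)
                  + norm.sf(b, loc=mu1, scale=sigma))  # sf(x) = 1 - cdf(x)

# At the midpoint b = 1 with mu1 = 0, mu2 = 2, sigma = 1:
print(bayes_error(1.0, 0.0, 2.0, 1.0))  # ~0.1587, i.e. Phi(-1)
```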
3. We know the optimal decision boundary must lie between $\mu_1$ and $\mu_2$ (convince yourself this
is the case!). Thus, we can minimize $P_e(b)$ by taking the derivative and setting it equal to
zero. Letting $C = \frac{1}{2\sqrt{2\pi}\,\sigma}$, $f_1(x) = e^{-(x-\mu_2)^2/(2\sigma^2)}$, and $f_2(x) = e^{-(x-\mu_1)^2/(2\sigma^2)}$, we have
$$
\frac{dP_e(b)}{db} = C\, \frac{d}{db} \left[ \int_{-\infty}^{b} f_1(x)\, dx + \int_{b}^{\infty} f_2(x)\, dx \right] = C \left[ \frac{d}{db} \int_{-\infty}^{b} f_1(x)\, dx + \frac{d}{db} \int_{b}^{\infty} f_2(x)\, dx \right].
$$
From the fundamental theorem of calculus, we know that, for continuous functions $f(x)$ such
that $|f| \to 0$ as $x \to -\infty$,
$$\frac{d}{dx} \int_{-\infty}^{x} f(t)\, dt = f(x).$$
Noting that $f_1$ and $f_2$ satisfy these properties, we apply this directly to the first integral; for
the second, writing $\int_{b}^{\infty} f_2(x)\, dx = \int_{-\infty}^{\infty} f_2(x)\, dx - \int_{-\infty}^{b} f_2(x)\, dx$ shows its derivative is $-f_2(b)$. Hence
$$\frac{dP_e(b)}{db} = C \left[ f_1(b) - f_2(b) \right].$$
Setting this to zero gives $f_1(b) = f_2(b)$, i.e. $(b - \mu_2)^2 = (b - \mu_1)^2$, whose unique solution is
$b^* = \frac{\mu_1 + \mu_2}{2}$. Since $P_e(b)$ is convex for $\mu_1 < b < \mu_2$ (see the hint), this critical point is the
minimizer, and it coincides with the Bayes decision boundary found in part 1.
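To double-check the calculus numerically, one can minimize $P_e(b)$ directly. A minimal sketch assuming SciPy, reusing the CDF form of $P_e(b)$ from part 2 (the parameter values are illustrative):

```python
from scipy.optimize import minimize_scalar
from scipy.stats import norm

mu1, mu2, sigma = 0.0, 2.0, 1.0

def bayes_error(b):
    # P_e(b) written with Gaussian CDFs, as in part 2
    return 0.5 * (norm.cdf(b, loc=mu2, scale=sigma)
                  + norm.sf(b, loc=mu1, scale=sigma))

# Search over (mu1, mu2), where the minimizer is known to lie.
res = minimize_scalar(bayes_error, bounds=(mu1, mu2), method="bounded")
print(res.x)  # ~1.0 = (mu1 + mu2) / 2, agreeing with part 1
```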