This course covers four ML algorithms: Decision Trees, the Perceptron, Logistic Regression and Neural
Networks. The first chapter is about machine learning in general. The fourth chapter covers the
optimization problem (decoupled from any specific model), which is applicable to all of the algorithms. The
fifth chapter discusses how data is best represented for a model.
1.1 MACHINE LEARNING
1.2 TYPES OF LEARNING PROBLEMS
1.3 EVALUATION: HOW WELL IS THE ALGORITHM LEARNING?
2.1 LEARNING RULES WHILE PLAYING A GAME
2.2 HOW TO BUILD A DECISION TREE
2.3 EFFICIENCY, SPEED AND DEPTH OF A DT
2.4 IMPURITY MEASURES
2.5 HOW CAN WE USE DECISION TREES FOR REGRESSION?
2.6 ADVANTAGES AND DISADVANTAGES OF DECISION TREES
3.1 WHAT IS A PERCEPTRON?
3.2 ALGORITHM
3.3 POSSIBLE STUMBLING BLOCKS
6.1 RECAP OF UPDATE RULE OF (S)GD
6.2 LOSS FUNCTION FOR CLASSIFICATION
6.3 COST FUNCTION OF LOGISTIC REGRESSION
6.4 SGD FOR THE LOSS FUNCTION
6.5 SUMMARY LINEAR REGRESSION VS. LOGISTIC REGRESSION
6.6 HOW TO CONTROL (OVER)FITTING?
7.1 RECAP
7.2 THE BRAIN
7.3 FEED-FORWARD NEURAL NETWORK (A.K.A. MULTI-LAYER PERCEPTRON)
7.4 NEURAL NETWORK PREDICTION
7.5 EX: REPRESENTING XOR
7.6 COST FUNCTIONS
7.7 TRAINING THE NEURAL NETWORK
7.8 SUMMARY OF (ARTIFICIAL) NEURAL NETWORKS
7.9 SPECIAL TYPES OF NEURAL NETWORKS
Lecture 1 Introduction
1.1 Machine Learning
Machine Learning (ML) is the study of computer algorithms that improve automatically through
experience. A system learns when it becomes better at a task (T), based on some experience (E), with
respect to some performance measure (P).
1.1.1 Learning process
1) Collect examples with labels (the experience).
2) Come up with a learning algorithm, which infers rules from the examples (the training set).
3) Apply the rules to new data (see the sketch after this list).
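A minimal sketch of these three steps, assuming scikit-learn is available (the library, the toy features and the labels are illustrative choices, not part of the course notes); the course itself develops such algorithms from scratch:

```python
# Sketch of the learning process with scikit-learn's DecisionTreeClassifier
# (an assumed library choice; features and labels below are toy values).
from sklearn.tree import DecisionTreeClassifier

# 1) Labeled examples (the experience E):
#    features = [contains_link, from_known_sender], label 1 = spam.
X_train = [[1, 0], [1, 1], [0, 1], [0, 0]]
y_train = [1, 0, 0, 0]

# 2) A learning algorithm infers rules from the training set.
model = DecisionTreeClassifier()
model.fit(X_train, y_train)

# 3) Apply the learned rules to new, unseen data.
X_new = [[1, 0], [0, 1]]
print(model.predict(X_new))  # expected: [1 0]
```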
1.1.2 Examples
- Filter email: if (A or B or C) and not D, then “spam” (a hand-coded version of such a rule is sketched after this list).
- Recognize handwritten numbers and letters.
- Recognize faces in photos.
- Determine whether text expresses positive, negative or no opinion.
- Guess a person’s age based on a sample of writing.
- Flag suspicious credit-card transactions.
- Recommend books and movies to users based on their own and others’ purchase history.
- Recognize and label mentions of person or organization names in text.
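The spam rule from the first example could be hand-coded as below; A, B, C and D are placeholder conditions (the concrete condition names are hypothetical). The point of ML is that such rules are learned from labeled examples instead of being written by hand.

```python
# Hand-coded rule "if (A or B or C) and not D, then spam".
# The four conditions are illustrative placeholders, not from the notes.
def is_spam(contains_link: bool, unknown_sender: bool,
            all_caps_subject: bool, from_contact_list: bool) -> bool:
    a, b, c, d = contains_link, unknown_sender, all_caps_subject, from_contact_list
    return (a or b or c) and not d

print(is_spam(contains_link=True, unknown_sender=False,
              all_caps_subject=False, from_contact_list=False))  # True
```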
ML is not meant for purely random outcomes, such as predicting the number that comes up when rolling
dice. It studies algorithms that learn from examples.
1.2 Types of learning problems
For each type of learning problem: the input (where relevant), the form of the response, and examples.

Regression
    Response: a (real) number
    Example: predict a person’s age, the price of a stock, a student’s score on an exam

Binary classification
    Response: a YES/NO answer (a condition is present or not)
    Example: detect spam; predict the polarity of a product review (positive vs. negative)

Multiclass classification
    Response: one of a finite set of options
    Example: detect a species based on a photo; classify a newspaper article as <politics>, <sports>, …

Multilabel classification
    Response: a finite set of YES/NO answers
    Example: assign songs to one or more genres (rock, pop, metal, hip-hop, rap)

Ranking
    Response: objects ordered according to relevance
    Example: rank web pages in response to a user query; predict a student’s preference for courses in a program

Sequence labeling
    Input: a sequence of elements (e.g. words)
    Response: a corresponding sequence of labels
    Example: label the words in a sentence with their syntactic category (noun, adverb, verb)

Sequence-to-sequence modeling
    Input: a sequence of elements (e.g. words)
    Response: a sequence of other elements (possibly of different length, possibly from a different set)
    Example: translation (“My name is Penelope” → “Me llamo Penélope”); computer-generated subtitles

Autonomous behavior
    Input: measurements from sensors (microphone, accelerometer, …)
    Response: instructions for actuators (steering, accelerator, brake, …)
    Example: a self-driving car
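As a rough sketch of what the response looks like in code for several of these problem types (the concrete values and label names below are invented for illustration; only the shapes matter):

```python
# Toy response values for different learning problem types.
regression_target = 21.5                      # a real number (e.g. an age)
binary_target = 1                             # YES/NO encoded as 1/0 (e.g. spam)
multiclass_target = 2                         # one of a finite set, e.g. {0: politics, 1: sports, 2: economy}
multilabel_target = [1, 0, 1, 0, 0]           # one YES/NO per genre (rock, pop, metal, hip-hop, rap)
ranking_target = ["page_3", "page_1", "page_2"]   # objects ordered by relevance
sequence_labels = ["noun", "verb", "noun"]    # one label per word, e.g. for "dogs chase cats"
```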