Linear Regression with One Variable (Week 1)
Two types of machine learning:
Supervised Learning
Unsupervised Learning
Linear Regression with One Variable
Hypothesis
$h_\theta(x) = \theta_0 + \theta_1 x$
Parameters
$\theta_0, \theta_1$
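The hypothesis is just a line parameterized by the intercept and slope. A minimal sketch in Python (the parameter and input values are illustrative, not from the notes):

```python
def h(theta0, theta1, x):
    """Hypothesis for univariate linear regression: h_theta(x) = theta0 + theta1 * x."""
    return theta0 + theta1 * x

# Example: intercept 1, slope 2, evaluated at x = 3.
print(h(1.0, 2.0, 3.0))  # 1 + 2*3 = 7.0
```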
Cost Function
$J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$
Goal
$\underset{\theta_0, \theta_1}{\text{minimize}} \; J(\theta_0, \theta_1)$
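The cost function averages squared errors over the $m$ training examples (the extra factor of $\frac{1}{2}$ simplifies the derivative later). A direct translation to Python; the sample data below is made up for illustration:

```python
def compute_cost(theta0, theta1, xs, ys):
    """J(theta0, theta1) = (1/2m) * sum of squared errors over the training set."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

# A perfect fit has zero cost: y = 2x fitted with theta0 = 0, theta1 = 2.
print(compute_cost(0.0, 2.0, [1, 2, 3], [2, 4, 6]))  # 0.0
```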
Gradient Descent
Problem Description
Given a function $J(\theta_0, \theta_1)$, we want $\min_{\theta_0, \theta_1} J(\theta_0, \theta_1)$
Algorithm
$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1)$ (for $j = 0$ and $j = 1$)
$\alpha$: learning rate
Repeat until convergence
Update $\theta_0$ and $\theta_1$ simultaneously (compute both right-hand sides before assigning either)
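For the squared-error cost above, the partial derivatives work out to the average error (for $\theta_0$) and the average error weighted by $x$ (for $\theta_1$). A sketch of the full loop in Python; the learning rate, iteration count, and data are illustrative choices, not values from the notes:

```python
def gradient_descent(xs, ys, alpha=0.1, iters=1000):
    """Batch gradient descent for univariate linear regression."""
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        # Prediction errors h_theta(x_i) - y_i for every training example.
        errs = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of J with respect to theta0 and theta1.
        grad0 = sum(errs) / m
        grad1 = sum(e * x for e, x in zip(errs, xs)) / m
        # Simultaneous update: both gradients use the OLD thetas,
        # then both parameters are updated together.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Data drawn from y = 2x + 1; the fit should approach theta0 = 1, theta1 = 2.
t0, t1 = gradient_descent([1, 2, 3, 4], [3, 5, 7, 9])
print(t0, t1)
```

Note that computing both gradients before assigning either parameter is exactly the "simultaneous update" requirement: updating $\theta_0$ first and then reusing the new value inside the $\theta_1$ gradient would be a different (incorrect) algorithm.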