02_Gradient Descent/Linear Regression/Linear Algebra
Carpe Tu Black Whistle

Parameter Learning

Gradient Descent

By taking the derivative (the slope of the tangent line to the function) of the cost function and repeatedly stepping downhill, we can reach a local minimum.

This way of iteratively updating the parameters is called Gradient Descent.

Gradient Descent Algorithm

repeat until convergence:

$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1) \qquad \text{(simultaneously for } j = 0 \text{ and } j = 1\text{)}$$

At each iteration, one should simultaneously update the parameters. Updating a specific parameter prior to calculating another one in the same iteration would yield a wrong implementation (see the sketch below).

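To make the simultaneous update concrete, here is a minimal Python sketch of one step; the helper names `dJ_dtheta0` and `dJ_dtheta1` are hypothetical stand-ins for the partial derivatives of the cost function:

```python
# One gradient-descent step with a SIMULTANEOUS parameter update.
# dJ_dtheta0 / dJ_dtheta1 are assumed callables returning the partial
# derivatives of the cost J at the current parameters.

def gradient_step(theta0, theta1, alpha, dJ_dtheta0, dJ_dtheta1):
    # Compute both updates from the OLD parameter values first ...
    temp0 = theta0 - alpha * dJ_dtheta0(theta0, theta1)
    temp1 = theta1 - alpha * dJ_dtheta1(theta0, theta1)
    # ... then assign, so neither update sees a half-updated state.
    return temp0, temp1
```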

Gradient Descent for Linear Regression

repeat until convergence: {

$$\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)$$

$$\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x^{(i)}$$

}
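These updates come from differentiating the squared-error cost function; a sketch of the derivation, assuming the usual definitions $h_\theta(x) = \theta_0 + \theta_1 x$ and training-set size $m$:

$$J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

$$\frac{\partial J}{\partial \theta_0} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right), \qquad \frac{\partial J}{\partial \theta_1} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x^{(i)}$$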

This variant of gradient descent is also called Batch Gradient Descent, because it looks at every example in the entire training set on every step.

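A runnable NumPy sketch of batch gradient descent for univariate linear regression; the names `x`, `y`, `alpha`, and `num_iters` are illustrative, not from the original notes:

```python
import numpy as np

def batch_gradient_descent(x, y, alpha=0.01, num_iters=1000):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent.

    Every iteration sums the error over the ENTIRE training set,
    which is what makes this the "batch" variant.
    """
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(num_iters):
        predictions = theta0 + theta1 * x   # h_theta(x) for all examples
        errors = predictions - y            # h_theta(x^(i)) - y^(i)
        # Simultaneous update: both gradients use the same old parameters.
        grad0 = errors.sum() / m
        grad1 = (errors * x).sum() / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Usage on a toy line y = 1 + 2x (recovers roughly theta0=1, theta1=2):
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x
print(batch_gradient_descent(x, y, alpha=0.1, num_iters=2000))
```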

Linear Algebra Review

It’s hard to write the LaTeX formulas for the matrix material here, so please review it yourself; the NumPy sketch below covers the basic operations.
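Since the notes skip the matrix formulas, here is a small NumPy sketch of the operations the review covers (matrix addition, scalar multiplication, and the matrix-vector product used to compute predictions in bulk); the data values are made up:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

print(A + B)   # element-wise matrix addition
print(3 * A)   # scalar multiplication
print(A @ B)   # matrix-matrix product (rows of A dot columns of B)

# Matrix-vector product: compute h(x) = theta0 + theta1 * x for all
# examples at once by stacking a column of ones next to x.
x = np.array([1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(x), x])   # design matrix, shape (3, 2)
theta = np.array([1.0, 2.0])                # [theta0, theta1]
print(X @ theta)                            # predictions: [3., 5., 7.]
```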