02_Gradient Descent/Linear Regression/Linear Algebra
Parameter Learning
Gradient Descent
By taking the derivative of the cost function (the slope of the tangent line to the function at a point), we can step toward a local minimum. Iteratively updating the parameters in this way is called gradient descent.
Gradient Descent Algorithm
repeat until convergence: {

$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1) \qquad \text{(simultaneously for } j = 0 \text{ and } j = 1\text{)}$$

}
At each iteration, one should update the parameters $\theta_0, \theta_1, \dots, \theta_n$ simultaneously. Updating a specific parameter before calculating the others within the same iteration would yield a wrong implementation.
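To make the simultaneous-update requirement concrete, here is a minimal Python sketch; `grad0` and `grad1` are hypothetical functions (not from the course) that return the two partial derivatives of the cost at given parameters:

```python
def step_correct(theta0, theta1, alpha, grad0, grad1):
    # Evaluate both partial derivatives at the OLD parameters,
    # then assign: this is the simultaneous update.
    temp0 = theta0 - alpha * grad0(theta0, theta1)
    temp1 = theta1 - alpha * grad1(theta0, theta1)
    return temp0, temp1

def step_wrong(theta0, theta1, alpha, grad0, grad1):
    # Bug: theta0 is overwritten first, so grad1 is evaluated at a
    # mixed state (new theta0, old theta1), a wrong implementation.
    theta0 = theta0 - alpha * grad0(theta0, theta1)
    theta1 = theta1 - alpha * grad1(theta0, theta1)
    return theta0, theta1
```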
Gradient Descent for Linear Regression
repeat until convergence: {

$$\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right)$$

$$\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right) x_i$$

}
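These updates come from differentiating the usual squared-error cost $J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right)^2$, where $m$ is the number of training examples and $h_\theta(x) = \theta_0 + \theta_1 x$; the derivation is standard, though not spelled out in the original notes:

$$\frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1) = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x_i) - y_i \right) \frac{\partial h_\theta(x_i)}{\partial \theta_j}$$

with $\frac{\partial h_\theta(x_i)}{\partial \theta_0} = 1$ and $\frac{\partial h_\theta(x_i)}{\partial \theta_1} = x_i$, which gives the two update rules above.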
This version of gradient descent is also called batch gradient descent, because it looks at every example in the entire training set on every step.
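As a concrete illustration of these updates, here is a minimal NumPy sketch of batch gradient descent for $h_\theta(x) = \theta_0 + \theta_1 x$; the toy data, learning rate, and iteration count are made-up values for demonstration, not from the course:

```python
import numpy as np

def batch_gradient_descent(x, y, alpha=0.1, num_iters=1000):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent."""
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(num_iters):
        h = theta0 + theta1 * x   # predictions for the whole batch
        error = h - y             # uses every training example
        # Simultaneous update: both gradients use the old parameters.
        grad0 = error.sum() / m
        grad1 = (error * x).sum() / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Toy data drawn from y = 1 + 2x (illustrative values only).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(batch_gradient_descent(x, y))  # approaches (1.0, 2.0)
```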
Linear Algebra Review
It’s hard to write the LaTeX formulas for the matrices, so help yourself; a small code sketch of the main operations follows.
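As a substitute for the missing formulas, here is a small NumPy sketch of the operations the review covers (matrix addition, scalar multiplication, matrix-vector and matrix-matrix multiplication, identity, transpose, inverse); the specific matrices are made up for illustration:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])        # a 2x2 matrix
B = np.array([[0, 1],
              [1, 0]])
v = np.array([5, 6])          # a vector

print(A + B)                  # matrix addition (elementwise)
print(3 * A)                  # scalar multiplication
print(A @ v)                  # matrix-vector multiplication
print(A @ B)                  # matrix-matrix multiplication (not commutative)
print(np.eye(2))              # identity matrix I, with A @ I == A
print(A.T)                    # transpose
print(np.linalg.inv(A))       # inverse, with A @ inv(A) == I
```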