Multivariate Linear Regression
Multiple Features
Linear regression with multiple variables is also known as "multivariate linear regression". Notation: $m$ is the number of training examples, $n$ is the number of features, $x^{(i)}$ is the input (features) of the $i$-th training example, and $x_j^{(i)}$ is the value of feature $j$ in the $i$-th training example.
Remark: Note that for convenience reasons in this course we assume $x_0^{(i)} = 1$ for $i \in \{1, \dots, m\}$, so that $x$ and $\theta$ are both $(n+1)$-dimensional vectors.
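As a minimal NumPy sketch of this convention (the feature values are made up for illustration), setting $x_0 = 1$ amounts to prepending a column of ones to the feature matrix:

```python
import numpy as np

# Hypothetical training set: m = 3 examples, n = 2 features (size, bedrooms).
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])

# Prepend x0 = 1 to every example so that theta_0 acts as the intercept.
m = X.shape[0]
X = np.hstack([np.ones((m, 1)), X])  # shape (m, n + 1)
```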
Gradient Descent for Multiple Features
Hypothesis:
$$h_\theta(x) = \theta^T x = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n$$
Parameters:
$$\theta_0, \theta_1, \dots, \theta_n$$
(an $(n+1)$-dimensional vector $\theta$)
Cost Function:
$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$
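A sketch of the cost function in vectorized form (assuming NumPy, and that `X` already carries the $x_0 = 1$ column as above):

```python
import numpy as np

def cost(theta, X, y):
    """Compute J(theta) = (1/2m) * sum((h(x) - y)^2) over all examples."""
    m = len(y)
    residuals = X @ theta - y  # h_theta(x^(i)) - y^(i) for every i at once
    return residuals @ residuals / (2 * m)
```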
Gradient Descent Algorithm
Repeat {
$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)$$
} (simultaneously update for every $j = 0, \dots, n$)
Plugging in the partial derivative, repeat until convergence: {
$$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} \qquad \text{for } j := 0, \dots, n$$
}
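As a sketch, this update vectorizes neatly (assuming NumPy; `alpha` and `num_iters` are illustrative choices, not values prescribed by the course):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent; X is (m, n+1) and includes the x0 = 1 column."""
    m, n_plus_1 = X.shape
    theta = np.zeros(n_plus_1)
    for _ in range(num_iters):
        gradient = X.T @ (X @ theta - y) / m  # (1/m) * sum of residual * x_j, for all j at once
        theta = theta - alpha * gradient      # simultaneous update of every theta_j
    return theta
```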
The following image compares gradient descent with one variable to gradient descent with multiple variables:
Practical Tricks
Feature Scaling
Main Idea: Make sure features are on a similar scale.
Method: Get every feature into approximately a $-1 \le x_i \le 1$ range.
Mean normalization
Replace $x_i$ with $x_i - \mu_i$ so that features have approximately zero mean (do not apply this to $x_0 = 1$).
E.g. $x_1 = \dfrac{\text{size} - 1000}{2000}$, $x_2 = \dfrac{\#\text{bedrooms} - 2}{5}$.
Mean normalization formula:
$$x_i := \frac{x_i - \mu_i}{s_i}$$
where $\mu_i$ is the average of feature $i$ in the training set and $s_i$ is the range of values ($\max - \min$) or the standard deviation.
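A minimal sketch of this formula (assuming NumPy, and that `X` holds only the real features, without the $x_0 = 1$ column):

```python
import numpy as np

def mean_normalize(X):
    """Column-wise x_i := (x_i - mu_i) / s_i; keep mu and s to scale new inputs later."""
    mu = X.mean(axis=0)                # per-feature average mu_i
    s = X.max(axis=0) - X.min(axis=0)  # per-feature range; the std is also common
    return (X - mu) / s, mu, s
```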
Gradient Descent and the Learning Rate
Debugging tip: plot $J(\theta)$ against the number of iterations; if gradient descent is working correctly, $J(\theta)$ should decrease after every iteration. To recap:
- If $\alpha$ is too small: slow convergence.
- If $\alpha$ is too large: $J(\theta)$ may not decrease on every iteration and thus may not converge.
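One practical way to choose $\alpha$ is to record $J(\theta)$ per iteration and compare a few candidate rates. A sketch (assuming NumPy; `X_scaled` and `y` stand for hypothetical, already-normalized data):

```python
import numpy as np

def descent_with_history(X, y, alpha, num_iters=400):
    """Run gradient descent while recording J(theta) at every iteration."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(num_iters):
        residuals = X @ theta - y
        history.append(residuals @ residuals / (2 * m))  # J(theta) this iteration
        theta -= alpha * (X.T @ residuals) / m
    return theta, history

# J should fall steadily for a good alpha, and rise or oscillate when it is too large:
# for alpha in (0.001, 0.01, 0.1, 1.0):
#     _, history = descent_with_history(X_scaled, y, alpha)  # X_scaled, y: hypothetical data
```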
Features and Polynomial Regression
The features and the form of the hypothesis function can be improved in several ways. We can also combine multiple features into one. For instance, we can combine $x_1$ and $x_2$ into a new feature $x_3 = x_1 \cdot x_2$.
Common Polynomial Regression
We can change the behavior or curve of the hypothesis function by making it a quadratic, cubic, or square root function (among others):
- Quadratic function: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2$
- Cubic function: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2 + \theta_3 x_1^3$
- Square root function: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 \sqrt{x_1}$
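A sketch of building these feature sets from a single raw feature (assuming NumPy; the size values are made up for illustration):

```python
import numpy as np

# Hypothetical single raw feature: house size.
size = np.array([2104.0, 1600.0, 2400.0])
ones = np.ones_like(size)  # the x0 = 1 column

X_quad = np.column_stack([ones, size, size**2])            # quadratic
X_cubic = np.column_stack([ones, size, size**2, size**3])  # cubic
X_sqrt = np.column_stack([ones, size, np.sqrt(size)])      # square root
```

Each of these matrices then feeds the same gradient descent machinery as before.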
Remark
One important thing to keep in mind is that if you choose your features this way, then feature scaling becomes very important.
E.g. if $x_1$ has range $1$–$1000$, then the range of $x_1^2$ becomes $1$–$1000000$ and that of $x_1^3$ becomes $1$–$10^9$.