The regression model predicts the output as a weighted sum of basis functions
$$ f(x) = w_1\phi_1(x) + w_2\phi_2(x) + \cdots + w_n\phi_n(x) $$
This equation can also be written compactly in vector form, where $\phi(x)$ collects the basis functions and $w$ the weights:
$$ f(x) = \phi(x)^Tw $$
The hope is that the predicted output $f(x)$ is as close as possible to the real value $y$.
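As a minimal sketch of this idea in Python (assuming, purely for illustration, a polynomial basis $\phi(x) = [1, x, x^2]$ and arbitrary example weights; neither is specified by the text), the prediction $f(x) = \phi(x)^T w$ is just a dot product:

```python
import numpy as np

def phi(x):
    # Hypothetical polynomial basis: phi(x) = [1, x, x^2]
    return np.array([1.0, x, x**2])

# Illustrative weight vector w (values chosen arbitrarily)
w = np.array([0.5, 2.0, -0.3])

def f(x):
    # f(x) = phi(x)^T w: the weighted sum of basis functions
    return phi(x) @ w

print(f(3.0))  # prediction for x = 3
```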
For illustration purposes, say we have the following results from two models:
Model 1 Results
| x | y_predicted | y_real | error (y_real - y_predicted) |
|---|---|---|---|
| 1 | 4 | 5 | 1 |
| 2 | 4 | 4 | 0 |
| 3 | 10 | 8 | -2 |
Model 2 Results
| x | y_predicted | y_real | error (y_real - y_predicted) |
|---|---|---|---|
| 1 | 20 | 5 | -15 |
| 2 | 40 | 4 | -36 |
| 3 | 50 | 8 | -42 |
In this case, we can see that the first model $f_1(x)$ gives better results than the second model $f_2(x)$ because its predicted values are much closer to the real values.
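One common way to summarize these errors in a single number is the mean squared error (using MSE here is an assumption; the text has not yet named a specific error measure). The short sketch below reproduces the error columns from the tables and compares the two models:

```python
import numpy as np

# Values taken from the tables above
y_real  = np.array([5, 4, 8])
model_1 = np.array([4, 4, 10])   # Model 1 predictions
model_2 = np.array([20, 40, 50]) # Model 2 predictions

for name, y_pred in [("Model 1", model_1), ("Model 2", model_2)]:
    error = y_real - y_pred        # signed error, as in the tables
    mse = np.mean(error ** 2)      # mean squared error
    print(f"{name}: errors = {error}, MSE = {mse:.2f}")

# Model 1 has a far smaller MSE, confirming its predictions are closer to y.
```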
Let’s break down the components of the regression formula for a deeper understanding.