Regression using a prior on the weights
We can write the regression cost function, with the term coming from the prior added, using explicit summations.
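As a sketch, writing $y_i$ for the targets, $x_{i j}$ for the features, $w_j$ for the regression weights, $\rho$ for the penalty contributed by the prior and $\lambda$ for its strength (notation assumed here for concreteness), this cost has the form

$$ E(w) \;=\; \sum_i \Big( y_i - \sum_j w_j x_{i j} \Big)^2 \;+\; \lambda \sum_j \rho(w_j) \,. $$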
If we restrict attention to one variable from among the weight components and regard the cost as a function of that variable alone, we get a quadratic in it plus the penalty term, where the quadratic coefficients don't depend on the variable. Denoting a suitable power of this variable by a new symbol and multiplying the stationarity condition through by it, the minimum (along this co-ordinate) is characterised by a polynomial equation.
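For concreteness, suppose the penalty is $\rho(w) = {|w|}^{1/2}$ (an assumption, chosen because it yields exactly the cubic equation described below). Writing the selected coordinate as $w$, the restricted cost is

$$ a w^2 + b w + c + \lambda {|w|}^{1/2} \,, $$

and setting its derivative to zero for $w \neq 0$, substituting $v = {|w|}^{1/2}$ (so $w = \pm v^2$) and multiplying through by $v$ gives

$$ \pm 2 a v^3 + b v \pm \tfrac{\lambda}{2} \;=\; 0 \,. $$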
Here the signs depend on whether the original variable is positive or negative, and the solutions in the new variable must additionally be checked for consistency with the original equation. Since this is a cubic equation we have a simple closed form for its solutions, and hence the minimisation along each co-ordinate, and thereby the original problem, can be solved efficiently.
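A minimal sketch of the resulting coordinate-wise solver, assuming squared-error loss and the ${|w|}^{1/2}$ penalty used above (both assumptions), could look like this: for each coordinate it tries both signs, solves the corresponding cubic, keeps only roots consistent with the substitution, and compares the candidates against $w = 0$.

```python
import numpy as np

def coordinate_descent_sqrt_penalty(X, y, lam, n_sweeps=50):
    """Coordinate descent for least-squares regression with an (assumed)
    penalty of lam * sum_j |w_j|**0.5.  Each one-dimensional subproblem
        a*w**2 + b*w + lam*|w|**0.5        (constants dropped)
    is solved exactly via the cubic obtained by substituting v = |w|**0.5."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_sweeps):
        for j in range(n_features):
            r = y - X @ w + X[:, j] * w[j]   # residual with coordinate j removed
            a = X[:, j] @ X[:, j]            # quadratic coefficient
            b = -2.0 * (X[:, j] @ r)         # linear coefficient

            def f(wj):                       # per-coordinate cost, up to a constant
                return a * wj * wj + b * wj + lam * abs(wj) ** 0.5

            best_w, best_f = 0.0, f(0.0)     # w = 0 is always a candidate
            for s in (1.0, -1.0):            # assumed sign of w
                # Stationarity 2*a*w + b + (lam/2)*s*|w|**-0.5 = 0 with w = s*v**2,
                # multiplied through by v:  2*a*s*v**3 + b*v + (lam/2)*s = 0.
                for v in np.roots([2.0 * a * s, 0.0, b, 0.5 * lam * s]):
                    if abs(v.imag) < 1e-10 and v.real > 0.0:  # need v = |w|**0.5 > 0
                        cand = s * v.real ** 2
                        if f(cand) < best_f:
                            best_w, best_f = cand, f(cand)
            w[j] = best_w
    return w
```

With `lam = 0` the cubic reduces to $2 a s v^3 + b v = 0$, which recovers the unpenalised coordinate-wise least-squares update $w_j = -b/(2 a)$.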
Revision from September 12, 2014 02:59:22 by davidtweed