## Regression using an $l_{1/2}$ prior ##

We can write the regression cost function, adding an $l_{1/2}$ prior, using explicit summations as

$$ \text{cost} = \sum_{e=1}^{E} \left( \sum_{i=1}^P M^{(e)}_{i} x_{i} \right)^2 + \lambda \sum_{i=1}^P \sqrt{|x_i|}. $$

If we restrict attention to a single component of the coefficient vector, holding the others fixed, and denote it by $x$, setting the derivative to zero gives

$$ \frac{\partial\, \text{cost}}{\partial x} = A x + B + \frac{\lambda \operatorname{sgn}(x)}{2\sqrt{|x|}} = 0, $$

where $A$ and $B$ do not depend on $x$: they collect the contributions of the fixed coordinates. If we denote $\lambda \operatorname{sgn}(x)/2$ by $C$ and multiply through by $y = \sqrt{|x|}$, the minimum along this coordinate satisfies

$$ \pm A y^3 + B y + C = 0, $$

where the $\pm$ is the sign of $x$ (since $x = \pm y^2$), and any root in $y$ must be real and nonnegative and must be consistent with the sign of $x$ assumed in choosing the $\pm$ branch. Since this is a cubic equation, its roots have a simple closed form, so we can solve the original stationarity condition efficiently, one coordinate at a time.
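To make the coordinate-wise solve concrete, here is a minimal sketch in Python with NumPy; the function name `coordinate_update` and the use of `np.roots` are illustrative choices, not from the original text. Along the chosen coordinate the objective is, up to a constant, $f(x) = \tfrac{A}{2}x^2 + Bx + \lambda\sqrt{|x|}$, whose derivative matches the stationarity condition above. Note that $x = 0$ must also be compared against the cubic's roots, since the penalty is non-differentiable at the origin and the stationarity condition does not cover it.

```python
import numpy as np

def coordinate_update(A, B, lam):
    """Minimise f(x) = (A/2) x^2 + B x + lam * sqrt(|x|) over one coordinate,
    using the substitution y = sqrt(|x|) from the derivation above."""
    def f(x):
        return 0.5 * A * x**2 + B * x + lam * np.sqrt(abs(x))

    # x = 0 is always a candidate: the sqrt penalty is non-differentiable there.
    candidates = [0.0]

    # For each sign branch s = sgn(x), solve  s*A*y^3 + B*y + C = 0
    # with C = lam*s/2, and keep real roots y > 0.
    for s in (+1.0, -1.0):
        C = lam * s / 2.0
        roots = np.roots([s * A, 0.0, B, C])  # coefficients of y^3, y^2, y, 1
        for y in roots:
            if abs(y.imag) < 1e-12 and y.real > 0:
                candidates.append(s * y.real**2)  # map back: x = s * y^2

    # Return the candidate with the smallest 1-D cost.
    return min(candidates, key=f)

if __name__ == "__main__":
    # Hypothetical values for A, B and the regularisation weight lambda.
    print(coordinate_update(A=2.0, B=-1.0, lam=0.5))
```

`np.roots` is used here only for brevity; since the cubic is depressed (no $y^2$ term), Cardano's formula gives the same roots in closed form and avoids the companion-matrix eigenvalue computation that `np.roots` performs internally.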