
Using the Subgradient Method to Solve the Lasso Problem

The problem is to solve:

\[\underset{\beta}{\operatorname{minimize}}\left\{\frac{1}{2 N} \sum_{i=1}^{N}\left(y_{i}-z_{i} \beta\right)^{2}+\lambda|\beta|\right\} \]
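For concreteness, the objective can be coded up directly. A minimal sketch in Python/NumPy; the names lasso_objective, z, y, beta, and lam are illustrative, not from the original post:

```python
import numpy as np

def lasso_objective(beta, z, y, lam):
    """One-predictor lasso objective: (1/(2N)) * sum((y_i - z_i*beta)^2) + lam*|beta|."""
    N = len(y)
    residual = y - z * beta
    return np.sum(residual ** 2) / (2 * N) + lam * abs(beta)
```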

Subgradient Optimality:

\[0 \in \partial\left\{\frac{1}{2 N} \sum_{i=1}^{N}\left(y_{i}-z_{i} \beta\right)^{2}+\lambda|\beta|\right\} \]

\[\Longleftrightarrow 0 \in-\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)+\lambda \partial|\beta| \]

Denote by \(v\) an element of \(\partial|\beta|\); according to the definition of the subgradient, we have

\[v \in\left\{\begin{array}{ll} \{1\} & \text{ if } \beta>0 \\ \{-1\} & \text{ if } \beta<0 \\ {[-1,1]} & \text{ if } \beta=0 \end{array}\right. \]

The subgradient optimality condition is

\[\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)=\lambda v \]

  • if \(\beta>0\), \(v=1\)

    \[\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)=\lambda \]

    solving gives \(\beta=\frac{\sum z_iy_i-\lambda N}{\sum z_i^2}\).

    Since \(z_i\) is standardized, \(\sum z_i^2=N\), so

    \[\beta=\frac{\sum z_iy_i-\lambda N}{N}=\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle-\lambda \]

  • if \(\beta<0\), \(v=-1\)

    \[\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)=-\lambda \]

    solving gives \(\beta=\frac{\sum z_iy_i+\lambda N}{\sum z_i^2}\).

    Since \(z_i\) is standardized, \(\sum z_i^2=N\), so

    \[\beta=\frac{\sum z_iy_i+\lambda N}{N}=\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle+\lambda \]

  • if \(\beta=0\), \(|v|\le1\)

    \[\left|\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)\right|\le\lambda \]

    Since \(\beta=0\), we have \(\frac{1}{N}|\langle\mathbf{z}, \mathbf{y}\rangle| \leq \lambda\); the three cases are collected in the code sketch after this list.
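Collecting the three cases gives a simple decision rule based on \(\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle\). A minimal sketch, assuming \(z\) is standardized so that \(\sum z_i^2=N\); the function and variable names are illustrative:

```python
import numpy as np

def lasso_coef_by_cases(z, y, lam):
    """One-predictor lasso coefficient from the three subgradient cases.
    Assumes z is standardized so that sum(z**2) == len(z)."""
    N = len(y)
    rho = np.dot(z, y) / N          # (1/N) <z, y>
    if rho > lam:                   # beta > 0 case
        return rho - lam
    elif rho < -lam:                # beta < 0 case
        return rho + lam
    else:                           # |rho| <= lam: beta = 0
        return 0.0
```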

In conclusion, we have:

\[\widehat{\beta}=\left\{\begin{array}{ll} \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle-\lambda & \text{ if } \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle>\lambda \\ 0 & \text{ if } \frac{1}{N}|\langle\mathbf{z}, \mathbf{y}\rangle| \leq \lambda \\ \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle+\lambda & \text{ if } \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle<-\lambda \end{array}\right. \]

i.e.

\[\widehat{\beta}=\mathcal{S}_{\lambda}\left(\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle\right) \]

where \(\mathcal{S}_{\lambda}(x)=\operatorname{sign}(x)(|x|-\lambda)_{+}\) is the soft-thresholding operator.
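As a check on the closed form, a plain subgradient iteration on the same objective should approach the soft-thresholded value when run with a diminishing step size. A minimal sketch; the simulated data, the step-size rule \(t_k = 0.1/\sqrt{k}\), and the iteration count are illustrative choices, not from the original post:

```python
import numpy as np

def soft_threshold(x, lam):
    """S_lam(x) = sign(x) * max(|x| - lam, 0)."""
    return np.sign(x) * max(abs(x) - lam, 0.0)

rng = np.random.default_rng(0)
N, lam = 200, 0.1
z = rng.normal(size=N)
z = (z - z.mean()) / z.std()                 # standardize so that sum(z**2) == N
y = 0.5 * z + rng.normal(scale=0.3, size=N)

# Closed-form solution from the derivation above
beta_closed = soft_threshold(np.dot(z, y) / N, lam)

# Subgradient method with diminishing step size t_k = 0.1 / sqrt(k)
beta = 0.0
for k in range(1, 5001):
    grad_smooth = -np.dot(z, y - z * beta) / N   # gradient of the quadratic part
    g = grad_smooth + lam * np.sign(beta)        # sign(0) = 0 is a valid subgradient choice
    beta -= (0.1 / np.sqrt(k)) * g

print(beta_closed, beta)   # the two values should agree to a few decimals
```

With a diminishing step size the subgradient iterates only hover near the minimizer rather than converging exactly, which is the usual behaviour of the subgradient method on a nondifferentiable objective; the closed-form soft-thresholding value is the exact answer.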


Original article: https://www.cnblogs.com/zzqingwenn/p/12864268.html
