Step 0 (Initialization). Given the matrix $A\in\mathbb{R}^{n\times m}$, the vector $b\in\mathbb{R}^{m}$, and parameters $\lambda,\gamma>0$, $\theta\in[-1,1]$, $t>0$, $s>0$ satisfying

$$ts>\frac{1}{4}(1+\theta)^{2}\lambda_{\max}\big(A^{\mathrm{T}}A\big).$$

Step 1 (Prediction and correction of $u$). Solve the variational inequality

$$\|y\|-\|\tilde{y}^{k}\|+\big(u-\tilde{u}^{k}\big)^{\mathrm{T}}\big\{F(\tilde{u}^{k})+M(\tilde{u}^{k}-u^{k})\big\}\ge 0 \qquad (7)$$

to obtain the predicted point $\tilde{u}^{k}$, and then correct $u^{k+1}$ by

$$u^{k+1}=u^{k}-\gamma\alpha_{k}^{\circ}\tilde{M}\big(u^{k}-\tilde{u}^{k}\big). \qquad (8)$$

Step 2 (Prediction of $y$ and $x$). Compute

$$\tilde{y}^{k}=\arg\min_{y\in\Delta_{m}}\ \lambda\|y\|+\frac{t}{2}\left\|y-\left(y^{k}+\frac{1}{t}A^{\mathrm{T}}y^{k}-2by^{k}\right)\right\|^{2}, \qquad (9)$$

$$\tilde{x}^{k}=x^{k}-\frac{1}{s+\lambda}\big(A(1+\theta)\tilde{y}^{k}-\theta x^{k}\big)+y^{k}. \qquad (10)$$

Step 3 (Correction). Update

$$\begin{pmatrix}y^{k+1}\\ x^{k+1}\end{pmatrix}=\begin{pmatrix}y^{k}\\ x^{k}\end{pmatrix}-\gamma\alpha_{k}^{\circ}\begin{pmatrix}I_{n}&0\\ (\theta-1)\frac{1}{s}A&I_{m}\end{pmatrix}\begin{pmatrix}y^{k}-\tilde{y}^{k}\\ x^{k}-\tilde{x}^{k}\end{pmatrix}, \qquad (11)$$

where the step size is

$$\alpha_{k}^{\circ}=\frac{\big(u^{k}-\tilde{u}^{k}\big)^{\mathrm{T}}M\big(u^{k}-\tilde{u}^{k}\big)}{\big\|\tilde{M}\big(u^{k}-\tilde{u}^{k}\big)\big\|_{H}^{2}}. \qquad (12)$$

Step 4 (Stopping criterion). If $\dfrac{\|Ax^{k}-b\|_{2}}{\|b\|_{2}}\le\epsilon$, stop; otherwise set $k:=k+1$ and return to Step 2.
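Two pieces of the algorithm admit simple closed forms: ignoring the simplex constraint $\Delta_m$, the subproblem in (9) is a proximal step for the Euclidean norm, solved by block soft-thresholding, and Step 4 is a relative-residual test. A minimal Python sketch of these two pieces (the function names and the test vector are illustrative assumptions, not from the paper):

```python
import numpy as np

def prox_l2_norm(v, tau):
    """Closed-form solution of  argmin_y  tau*||y||_2 + (1/2)*||y - v||_2^2,
    i.e. block soft-thresholding: shrink v toward the origin by tau in norm."""
    nrm = np.linalg.norm(v)
    if nrm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / nrm) * v

def step2_prediction(v, lam, t):
    """Unconstrained version of subproblem (9):
    argmin_y  lam*||y|| + (t/2)*||y - v||^2  =  prox with tau = lam/t."""
    return prox_l2_norm(v, lam / t)

def converged(A, x, b, eps):
    """Step 4 stopping test: relative residual ||A x - b||_2 / ||b||_2 <= eps."""
    return np.linalg.norm(A @ x - b) / np.linalg.norm(b) <= eps

# Illustrative use: shrink a point toward the origin.
v = np.array([3.0, 4.0])                       # ||v||_2 = 5
y_tilde = step2_prediction(v, lam=1.0, t=0.5)  # tau = 2, factor 1 - 2/5
# y_tilde = [1.8, 2.4]
```

Projecting the shrunken point back onto $\Delta_m$ would complete (9); that projection is a separate standard step and is omitted here.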