
APPROXIMATE SPARSITY REGULARIZED LOW-RANK TENSOR COMPLETION

Hu Wenyu1, Zheng Weidong1, Huang Jinhong1, Yu Gaohang2   

  1. School of Mathematics and Computer Sciences, Gannan Normal University, Ganzhou 341000, China;
    2. Department of Mathematics, School of Science, Hangzhou Dianzi University, Hangzhou 310018, China
  • Received: 2021-10-05  Published: 2023-03-16
  • Funding:
    Supported by the National Natural Science Foundation of China (62266002, 61863001, 82060328, 12071104, 61962003), the Natural Science Foundation of Zhejiang Province (LD19A010002), and the Natural Science Foundation of Jiangxi Province (20224BAB202004, 20192BAB205086, 20202BAB202017).


Hu Wenyu, Zheng Weidong, Huang Jinhong, Yu Gaohang. APPROXIMATE SPARSITY REGULARIZED LOW-RANK TENSOR COMPLETION[J]. Journal on Numerical Methods and Computer Applications, 2023, 44(1): 53-67.

Most current Low-Rank Tensor Completion (LRTC) models over-constrain sparsity, so that subtle features of the recovered data are lost. Based on low-rank matrix factorization and the framelet transform, in this paper we propose an Approximate Sparsity regularized Low-Rank Tensor Completion (AS-LRTC) model by introducing an $\ell_0$-norm regularization term on the soft-thresholding operator. To solve the resulting model effectively, we rewrite the $\ell_0$ norm as a weighted $\ell_1$ norm with a nonlinear discontinuous weight function, approximate that weight function by a continuous one, and on this basis design a Block Successive Upper-bound Minimization (BSUM) algorithm. Under certain conditions, we prove the convergence of the proposed algorithm. Extensive experiments show that the proposed algorithm recovers the local detail features of images better than several classic algorithms.
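The key reformulation described in the abstract — writing $\|S_\lambda(x)\|_0$ as a weighted $\ell_1$ norm whose discontinuous weight is then smoothed — can be sketched numerically. The following Python snippet is illustrative only: `smooth_weight` uses a sigmoid surrogate for the indicator function, a hypothetical stand-in, since the paper's actual continuous weight function is not specified in this abstract.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator: S_lam(x)_i = sign(x_i) * max(|x_i| - lam, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def l0_of_soft_threshold(x, lam):
    """||S_lam(x)||_0: the i-th entry of S_lam(x) is nonzero iff |x_i| > lam."""
    return int(np.count_nonzero(np.abs(x) > lam))

def discontinuous_weight(x, lam, eps=1e-12):
    """w(t) = 1_{|t| > lam} / |t|, so that sum_i w(x_i)|x_i| = ||S_lam(x)||_0."""
    return (np.abs(x) > lam).astype(float) / np.maximum(np.abs(x), eps)

def smooth_weight(x, lam, sigma=0.05):
    """Hypothetical continuous surrogate: the indicator 1_{|t| > lam} is replaced
    by a sigmoid transition of width sigma (the paper's exact choice may differ)."""
    indicator = 1.0 / (1.0 + np.exp(-(np.abs(x) - lam) / sigma))
    return indicator / np.maximum(np.abs(x), sigma)

def weighted_l1(x, w):
    """Weighted l1 norm: sum_i w_i |x_i|."""
    return float(np.sum(w * np.abs(x)))

if __name__ == "__main__":
    x = np.array([0.01, -0.3, 1.2, 0.05, -2.0])
    lam = 0.1
    print(l0_of_soft_threshold(x, lam))                  # 3
    # Matches the l0 count (up to floating-point rounding):
    print(weighted_l1(x, discontinuous_weight(x, lam)))
    # Continuous approximation of the same quantity:
    print(weighted_l1(x, smooth_weight(x, lam)))
```

With the discontinuous weight $w(t)=\mathbf{1}_{|t|>\lambda}/|t|$, each entry with $|x_i|>\lambda$ contributes exactly 1 to the weighted $\ell_1$ norm and every other entry contributes 0, which is why the weighted $\ell_1$ value equals the $\ell_0$ count. The smooth surrogate trades this exactness for continuity, the property the BSUM scheme described above relies on.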

[1] Wang Y, Meng D, Yuan M. Sparse recovery: from vectors to tensors[J]. Natl Sci Rev, 2018, 5:756-767.
[2] Yokota T, Lee N, Cichocki A. Robust multilinear tensor rank estimation using higher order singular value decomposition and information criteria[J]. IEEE Trans Signal Process, 2016, 65(5):1196-1206.
[3] Sobral A, Zahzah E. Matrix and tensor completion algorithms for background model initialization:A comparative evaluation[J]. Pattern Recogn Lett, 2017, 96:22-33.
[4] Varghees V N, Manikandan M S, Gini R. Adaptive MRI image denoising using total-variation and local noise estimation[C]. In:Proceedings of International Conference on Advances in Engineering, Science And Management (ICAESM-2012), 2012. 506-511.
[5] Zhao X L, Wang F, Huang T Z, et al. Deblurring and sparse unmixing for hyperspectral images[J]. IEEE Trans Geosci Remote Sens, 2013, 51(7):4045-4058.
[6] Chen Y, Huang T Z, Zhao X L, et al. Hyperspectral image restoration using framelet-regularized low-rank nonnegative matrix factorization[J]. Appl Math Model, 2018, 63:128-147.
[7] Jiang T X, Huang T Z, Zhao X L, et al. FastDeRain: A novel video rain streak removal method using directional gradient priors[J]. IEEE Trans Image Process, 2018, 28(4):2089-2102.
[8] Wang Y G, Huang T Z, Zhao X L, et al. Video deraining via nonlocal low-rank regularization[J]. Appl Math Model, 2020, 79:896-913.
[9] Sidiropoulos N D, De Lathauwer L, Fu X, et al. Tensor decomposition for signal processing and machine learning[J]. IEEE Trans Signal Process, 2017, 65(13):3551-3582.
[10] Liu J, Musialski P, Wonka P, et al. Tensor completion for estimating missing values in visual data[J]. IEEE Trans Pattern Anal Mach Intell, 2012, 35(1):208-220.
[11] Tan H, Cheng B, Wang W, et al. Tensor completion via a multi-linear low-n-rank factorization model[J]. Neurocomputing, 2014, 133:161-169.
[12] Hillar C J, Lim L H. Most tensor problems are NP-hard[J]. J ACM, 2013, 60(6):1-39.
[13] Gandy S, Recht B, Yamada I. Tensor completion and low-n-rank tensor recovery via convex optimization[J]. Inverse Prob, 2011, 27(2):025010.
[14] Romera-Paredes B, Pontil M. A new convex relaxation for tensor completion[C]. In:Proceedings of Advances in Neural Information Processing Systems, 2013. 2967-2975.
[15] Li Y F, Shang K, Huang Z H. Low Tucker rank tensor recovery via ADMM based on exact and inexact iteratively reweighted algorithms[J]. J Comput Appl Math, 2018, 331:64-81.
[16] Liu C S, Shan H, Chen C L. Tensor p-shrinkage nuclear norm for low-rank tensor completion[J]. Neurocomputing, 2020, 387:255-267.
[17] Ji T Y, Huang T Z, Zhao X L, et al. A non-convex tensor rank approximation for tensor completion[J]. Appl Math Model, 2017, 48:410-422.
[18] Xue J Z, Zhao Y Q, Liao W Z, et al. Nonconvex tensor rank minimization and its applications to tensor recovery[J]. Inform Sci, 2019, 53:109-128.
[19] Mu C, Huang B, Wright J, et al. Square deal: Lower bounds and improved relaxations for tensor recovery[C]. In: Proceedings of the International Conference on Machine Learning, 2014. 73-81.
[20] Bai M, Zhang X, Ni G, et al. An adaptive correction approach for tensor completion[J]. SIAM J Imaging Sci, 2016, 9(3):1298-1323.
[21] Xu Y, Hao R, Yin W, et al. Parallel matrix factorization for low-rank tensor completion[J]. Inverse Probl Imag, 2015, 9(2):601-624.
[22] Wen Z, Yin W, Zhang Y. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm[J]. Math Program Comput, 2012, 4(4):333-361.
[23] Zhao X L, Wang W, Zeng T Y, et al. Total variation structured total least squares method for image restoration[J]. SIAM J Sci Comput, 2013, 35(6):1304-1320.
[24] Chan T F, Vese L A. Active contours without edges[J]. IEEE Trans Image Process, 2001, 10(2):266-277.
[25] Chan S H, Khoshabeh R, Gibson K B, et al. An augmented Lagrangian method for total variation video restoration[J]. IEEE Trans Image Process, 2011, 20(11):3097-3111.
[26] Ji T Y, Huang T Z, Zhao X L, et al. Tensor completion using total variation and low-rank matrix factorization[J]. Inform Sci, 2016, 326:243-257.
[27] Jiang T X, Huang T Z, Zhao X L, et al. Matrix factorization for low-rank tensor completion using framelet prior[J]. Inform Sci, 2018, 436:403-417.
[28] Shen L, Xu Y, Zhang N. An approximate sparsity model for inpainting[J]. Appl Comput Harmon Anal, 2014, 37(1):171-184.
[29] Kolda T G, Bader B W. Tensor decompositions and applications[J]. SIAM Rev, 2009, 51(3):455-500.
[30] Razaviyayn M, Hong M, Luo Z Q. A unified convergence analysis of block successive minimization methods for nonsmooth optimization[J]. SIAM J Optimiz, 2013, 23(2):1126-1153.
[31] Li Q, Shen L, Xu Y, et al. Multi-step fixed-point proximity algorithms for solving a class of optimization problems arising from image processing[J]. Adv Comput Math, 2015, 41(2):387-422.