Low-rank tensor recovery using sparse prior and multi-modal tensor factorization
2024, Vol. 29, No. 4, Pages 922-938
Print publication date: 2024-04-16
DOI: 10.11834/jig.230490
Yang Xiuhong, Gou Tiankun, Xue Yi, Jin Haiyan, Shi Zhenghao. 2024. Low-rank tensor recovery using sparse prior and multi-modal tensor factorization. Journal of Image and Graphics, 29(04):0922-0938
Objective
Large volumes of data acquired by various terminal devices are often incomplete because of missing information, or are frequently degraded. Low-rank tensor completion has therefore attracted considerable attention as a way to recover damaged or degraded data. Tensor decomposition can effectively exploit the intrinsic structure of tensor data, but the tensor rank functions induced by traditional decomposition methods cannot explore the correlations among different modes of a tensor. In addition, traditional tensor completion methods usually impose the total variation constraint on the whole tensor and thus cannot fully exploit the smoothness prior of the low-dimensional tensor subspaces. To address these two problems, this paper proposes a low-rank tensor recovery method that combines a sparse prior with multi-modal tensor factorization.
Method
Building on the tensor rank minimization model, the proposed method incorporates multi-modal tensor factorization and the local sparsity of the decomposition factors. First, a nuclear norm constraint is imposed on the original tensor to capture its global low-rankness. Then, multi-modal tensor factorization decomposes the whole tensor along each mode into a set of low-dimensional tensors and a set of factor matrices, which explores the correlations among different modes. A factor gradient sparsity regularization constraint is further imposed on the factor matrices to exploit the local sparsity of the tensor subspaces and to improve recovery performance.
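To make the decomposition step concrete, the following is a minimal NumPy sketch (our own illustration, not the authors' code) of the mode-n unfolding and mode-n product on which multi-modal tensor factorization is built; the tensor shapes and the mode-0 rank of 4 are arbitrary toy choices.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-n fibers as the columns of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(matrix, mode, shape):
    """Inverse of unfold(): restore a matrix to a tensor of the given shape."""
    full_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(matrix.reshape(full_shape), 0, mode)

def mode_n_product(tensor, factor, mode):
    """Mode-n product X x_n A: left-multiply the mode-n unfolding of X by A."""
    new_shape = list(tensor.shape)
    new_shape[mode] = factor.shape[0]
    return fold(factor @ unfold(tensor, mode), mode, new_shape)

# Toy example: project a random 3rd-order tensor onto a rank-4 subspace of
# mode 0 and reconstruct, i.e. X ~= Z x_0 A with a low-dimensional tensor Z.
X = np.random.rand(10, 8, 6)
A = np.random.rand(10, 4)                    # factor matrix for mode 0
Z = mode_n_product(X, np.linalg.pinv(A), 0)  # low-dimensional tensor, shape (4, 8, 6)
X_hat = mode_n_product(Z, A, 0)              # approximation of X, shape (10, 8, 6)
print(X_hat.shape)
```

Performing this decomposition along every mode in turn is what lets the model treat the correlations of each mode separately rather than through a single unfolding.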
Result
The proposed method is compared quantitatively and qualitatively with eight other restoration methods at three missing-data rates on hyperspectral images, multispectral images, YUV (also known as YCbCr) videos, and medical imaging data. In recovering these four types of tensor data, the proposed method performs on par with the deep-learning GP-WLRR (global prior refined weighted low-rank representation) method: averaged over all missing rates and all tensor data, its MPSNR (mean peak signal-to-noise ratio) is 0.68 dB higher and its MSSIM (mean structural similarity) is 0.01 higher. Compared with the other six tensor modeling methods, the proposed method achieves the best MPSNR and MSSIM results.
Conclusion
The proposed low-rank tensor recovery method based on a sparse prior and multi-modal tensor factorization can exploit the global low-rankness and local sparsity of a tensor simultaneously and can effectively restore damaged multi-dimensional visual data.
Objective
Large volumes of data acquired by various terminal devices are often incomplete because of missing information or are frequently degraded. Low-rank tensor completion has received significant attention as a means of recovering such contaminated data. Tensor decomposition can effectively explore the essential features of tensors, but the tensor rank functions induced by traditional decomposition methods cannot explore the correlations between different modes of a tensor. In addition, traditional tensor completion methods typically impose the total variation constraint on the overall tensor data and therefore cannot fully utilize the smoothness prior of the low-dimensional tensor subspaces. To address these two problems, this study proposes a low-rank tensor recovery algorithm using a sparse prior and multimode tensor factorization. Traditional low-rank tensor completion models based on tensor rank minimization restore tensors by directly minimizing the tensor rank, where the tensor rank may be the Tucker rank or the tensor nuclear norm (TNN). However, extensive research has shown that correlations exist among the different modes of tensor data, and neither the Tucker rank induced by Tucker decomposition nor the TNN induced by tensor singular value decomposition can flexibly handle multimode correlations within a tensor. Therefore, we introduce multimode tensor decomposition via the mode-n product into the tensor rank minimization model. As the overall tensor is completed through successive iterations, the model can effectively explore the correlations between different modes, which addresses the limitation of the traditional TNN in capturing intermode correlations. Each factor matrix obtained from the multimode tensor decomposition framework encapsulates the latent information of its corresponding mode and reveals valuable correlated auxiliary information within and across modes, such as the local sparsity exhibited by natural tensor data. The factor gradient histogram shows that the majority of factor gradients are zero or close to zero, which demonstrates that the factors in multimode tensor decomposition exhibit local sparsity. Therefore, under the tensor subspace assumption, we introduce a local sparsity prior to preserve the similarity within local segments.
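As a hedged illustration of the local-sparsity observation above (a toy numerical check of our own, not the paper's experiment), the sketch below builds a smooth synthetic tensor, takes the leading left singular vectors of its mode-0 unfolding as a stand-in factor matrix, and compares the l1 norm of its first-order factor gradients with that of a random orthonormal matrix of the same size; for smooth data the factor gradients are markedly smaller, i.e., mostly near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 64)

# Smooth synthetic tensor: two low-frequency rank-1 slices plus mild noise.
X = (np.outer(np.sin(2 * np.pi * x), np.cos(np.pi * x))[:, :, None]
     + np.outer(x, 1 - x)[:, :, None]
     + 0.01 * rng.standard_normal((64, 64, 3)))

# Stand-in mode-0 factor matrix: leading left singular vectors of the
# mode-0 unfolding of X.
U, _, _ = np.linalg.svd(X.reshape(64, -1), full_matrices=False)
A = U[:, :2]

grad = np.diff(A, axis=0)                  # first-order factor gradients
print("smooth-data factor gradient l1 norm:", np.abs(grad).sum())

# Reference: a random orthonormal matrix of the same size is far less smooth.
R, _ = np.linalg.qr(rng.standard_normal((64, 2)))
print("random matrix gradient l1 norm:     ", np.abs(np.diff(R, axis=0)).sum())
```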
Method
Building on the tensor rank minimization model, the proposed method incorporates multimode tensor factorization and the local sparsity of the decomposition factors. First, a nuclear norm constraint is imposed on the original tensor to capture its global low-rankness, which makes the model robust in tensor completion tasks. Second, multimode tensor factorization decomposes the tensor along each mode into a series of low-dimensional tensors and a series of factor matrices, which explores the correlations between different modes. A factor gradient sparsity regularization constraint is then imposed on the factor matrices to exploit the local sparsity of the tensor subspace and further improve recovery performance. Specifically, after the tensor is decomposed, first-order differencing is applied to the factor matrices, and a norm-based smoothness constraint is imposed on the resulting factor gradients. Combining multimode tensor decomposition with tensor subspace sparsity yields a robust tensor completion model. The model is optimized within the alternating direction method of multipliers (ADMM) framework by iteratively updating the variables, so that tensor completion and tensor decomposition are accomplished simultaneously.
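The full ADMM derivation is given in the paper; as a hedged sketch under simplified notation (the variable names below are our own choices, not the paper's), the per-iteration subproblems of such models typically reduce to two proximal operators: singular value thresholding (SVT) for the nuclear-norm term that enforces global low-rankness, and elementwise soft thresholding of the first-order factor differences for the factor gradient sparsity term.

```python
import numpy as np

def svt(M, tau):
    """Proximal operator of tau * ||.||_* : shrink the singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(M, tau):
    """Proximal operator of tau * ||.||_1 : elementwise shrinkage toward zero."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

# Toy usage on a stand-in mode-n unfolding and a stand-in factor matrix.
X_unf = np.random.rand(50, 120)      # plays the role of a mode-n unfolding
A = np.random.rand(50, 8)            # plays the role of a factor matrix

X_low = svt(X_unf, tau=1.0)                           # global low-rank update
G_sparse = soft_threshold(np.diff(A, axis=0), 0.05)   # sparsified factor gradients
print(np.linalg.matrix_rank(X_low), np.count_nonzero(G_sparse) / G_sparse.size)
```

Within an ADMM loop, these two updates would alternate with a least-squares step that keeps the recovered tensor consistent with the observed entries.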
Result
The proposed method is quantitatively and qualitatively compared with eight other restoration methods at three missing-data rates on hyperspectral images, multispectral images, YUV (also known as YCbCr) videos, and medical imaging data. Its restoration performance is essentially on par with that of the deep-learning GP-WLRR (global prior refined weighted low-rank representation) method, but without the computational burden of deep learning. Compared with the six other tensor modeling methods, the proposed method achieves the best results in terms of mean peak signal-to-noise ratio (MPSNR) and mean structural similarity (MSSIM). It maintains superior recovery performance even at missing rates as high as 95%, which demonstrates the effectiveness of the proposed model for tensor data recovery.
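For reference, MPSNR and MSSIM are the band-wise PSNR and SSIM averaged over all bands (or frames) of the recovered tensor. The sketch below shows one common way to compute them, assuming intensities scaled to [0, 1] and using scikit-image's structural_similarity; it is not necessarily the exact evaluation code used in the paper.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

def mpsnr(ref, rec):
    """Mean PSNR over the bands (last axis) of two tensors with values in [0, 1]."""
    psnrs = []
    for b in range(ref.shape[-1]):
        mse = np.mean((ref[..., b] - rec[..., b]) ** 2)
        psnrs.append(10 * np.log10(1.0 / mse))
    return float(np.mean(psnrs))

def mssim(ref, rec):
    """Mean SSIM over the bands (last axis) of two tensors with values in [0, 1]."""
    return float(np.mean([ssim(ref[..., b], rec[..., b], data_range=1.0)
                          for b in range(ref.shape[-1])]))

# Toy usage with a random "ground truth" and a noisy "recovery".
ref = np.random.rand(64, 64, 10)
rec = np.clip(ref + 0.05 * np.random.randn(*ref.shape), 0, 1)
print(mpsnr(ref, rec), mssim(ref, rec))
```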
Conclusion
The low-rank tensor completion algorithm proposed in this paper, based on a sparse prior and multimode tensor decomposition, can simultaneously exploit the global low-rankness and local sparsity of a tensor and effectively recover contaminated multichannel visual data.
Keywords: multi-modal tensor factorization; sparse prior; factor gradient sparsity; low-rank tensor completion
Boyd S, Parikh N, Chu E, Peleato B and Eckstein J. 2011. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends® in Machine Learning, 3(1): 1-122 [DOI: 10.1561/2200000016]
Chen C, Wu Z B, Chen Z T, Zheng Z B and Zhang X J. 2021. Auto-weighted robust low-rank tensor completion via tensor-train. Information Sciences, 567: 100-115 [DOI: 10.1016/j.ins.2021.03.025]
Chen Y L, Hsu C T and Liao H Y M. 2014. Simultaneous tensor decomposition and completion using factor priors. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36(3): 577-591 [DOI: 10.1109/TPAMI.2013.164]
Cichocki A, Lee N, Oseledets I V, Phan A H, Zhao Q and Mandic D. 2017. Low-rank tensor networks for dimensionality reduction and large-scale optimization problems: perspectives and challenges PART 1 [EB/OL]. [2023-07-03]. https://arxiv.org/pdf/1609.00893.pdf
Fu Y F, Ruan Q Q, Luo Z Y, Jin Y, An G Y and Wan J. 2019. FERLrTc: 2D+3D facial expression recognition via low-rank tensor completion. Signal Processing, 161: 74-88 [DOI: 10.1016/j.sigpro.2019.03.015]
Guo X, Yao Q and Kwok J. 2017. Efficient sparse low-rank tensor completion using the Frank-Wolfe algorithm//Proceedings of the 31st AAAI Conference on Artificial Intelligence. San Francisco, USA: AAAI: 1948-1954
Han Z F, Leung C S, Huang L T and So H C. 2017. Sparse and truncated nuclear norm based tensor completion. Neural Processing Letters, 45(3): 729-743 [DOI: 10.1007/s11063-016-9503-4]
He W, Yokoya N, Yuan L H and Zhao Q B. 2019. Remote sensing image reconstruction using tensor ring completion and total variation. IEEE Transactions on Geoscience and Remote Sensing, 57(11): 8998-9009 [DOI: 10.1109/TGRS.2019.2924017]
Hou B, Wang Y H and Liu Q J. 2017. Change detection based on deep features and low rank. IEEE Geoscience and Remote Sensing Letters, 14(12): 2418-2422 [DOI: 10.1109/LGRS.2017.2766840]
Ji T Y, Huang T Z, Zhao X L, Ma T H and Liu G. 2016. Tensor completion using total variation and low-rank matrix factorization. Information Sciences, 326: 243-257 [DOI: 10.1016/j.ins.2015.07.049]
Jiang T X, Huang T Z, Zhao X L and Deng L J. 2020. Multi-dimensional imaging data recovery via minimizing the partial sum of tubal nuclear norm. Journal of Computational and Applied Mathematics, 372: #112680 [DOI: 10.1016/j.cam.2019.112680]
Jiang T X, Ng M K, Pan J J and Song G J. 2023. Nonnegative low rank tensor approximations with multidimensional image applications. Numerische Mathematik, 153(1): 141-170 [DOI: 10.1007/s00211-022-01328-6]
Ko C Y, Batselier K, Daniel L, Yu W J and Wong N. 2020. Fast and accurate tensor completion with total variation regularized tensor trains. IEEE Transactions on Image Processing, 29: 6918-6931 [DOI: 10.1109/TIP.2020.2995061]
Li X T, Ye Y M and Xu X F. 2017. Low-rank tensor completion with total variation for visual data inpainting//Proceedings of the 31st AAAI Conference on Artificial Intelligence. San Francisco, USA: AAAI: 2210-2216 [DOI: 10.1609/aaai.v31i1.10776]
Liao S H, Liu X Y, Han R Y, Fu S J, Zhou Y F and Li Y L. 2024. Image inpainting exploiting global prior refined weighted low-rank representation. Optics and Laser Technology, 169: #110061 [DOI: 10.1016/j.optlastec.2023.110061]
Liu J, Musialski P, Wonka P and Ye J P. 2013. Tensor completion for estimating missing values in visual data. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1): 208-220 [DOI: 10.1109/TPAMI.2012.39]
Liu S, Zeng H J, Kong W F and Zhang P D. 2021. Hyperspectral image restoration based on frequency-weighted tensor nuclear norm. Journal of Image and Graphics, 26(8): 1910-1925 [DOI: 10.11834/jig.210021]
Long Z, Zhu C, Liu J N and Liu Y P. 2021. Bayesian low rank tensor ring for image recovery. IEEE Transactions on Image Processing, 30: 3568-3580 [DOI: 10.1109/TIP.2021.3062195]
Lu C Y, Feng J S, Chen Y D, Liu W, Lin Z C and Yan S C. 2020. Tensor robust principal component analysis with a new tensor nuclear norm. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42(4): 925-938 [DOI: 10.1109/TPAMI.2019.2891760]
Luo Y S, Zhao X L, Jiang T X, Chang Y, Ng M K and Li C. 2022. Self-supervised nonlinear transform-based tensor nuclear norm for multi-dimensional image recovery. IEEE Transactions on Image Processing, 31: 3793-3808 [DOI: 10.1109/TIP.2022.3176220]
Mu C, Huang B, Wright J and Goldfarb D. 2014. Square deal: lower bounds and improved relaxations for tensor recovery//Proceedings of the 31st International Conference on Machine Learning. Beijing, China: JMLR.org: II-73-II-81
Oseledets I V. 2011. Tensor-train decomposition. SIAM Journal on Scientific Computing, 33(5): 2295-2317 [DOI: 10.1137/090752286]
Qin W J, Wang H L, Zhang F, Wang J J, Luo X and Huang T W. 2022. Low-rank high-order tensor completion with applications in visual data. IEEE Transactions on Image Processing, 31: 2433-2448 [DOI: 10.1109/TIP.2022.3155949]
Quan W Z, Zhang R S, Zhang Y, Li Z F, Wang J and Yan D M. 2022. Image inpainting with local and global refinement. IEEE Transactions on Image Processing, 31: 2405-2420 [DOI: 10.1109/TIP.2022.3152624]
Song G J, Ng M K and Zhang X J. 2020. Robust tensor completion using transformed tensor singular value decomposition. Numerical Linear Algebra with Applications, 27(3): #e2299 [DOI: 10.1002/nla.2299]
Tucker L R. 1966. Some mathematical notes on three-mode factor analysis. Psychometrika, 31(3): 279-311 [DOI: 10.1007/BF02289464]
Wang J L, Huang T Z, Zhao X L, Luo Y S and Jiang T X. 2022. CoNoT: coupled nonlinear transform-based low-rank tensor representation for multidimensional image completion. IEEE Transactions on Neural Networks and Learning Systems (early access) [DOI: 10.1109/TNNLS.2022.3217198]
Wang Y, Peng J J, Zhao Q, Leung Y, Zhao X L and Meng D Y. 2018. Hyperspectral image restoration via total variation regularized low-rank tensor decomposition. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 11(4): 1227-1243 [DOI: 10.1109/JSTARS.2017.2779539]
Xue J Z, Zhao Y Q, Huang S G, Liao W Z, Chan J C W and Kong S G. 2022. Multilayer sparsity-based tensor decomposition for low-rank tensor completion. IEEE Transactions on Neural Networks and Learning Systems, 33(11): 6916-6930 [DOI: 10.1109/TNNLS.2021.3083931]
Yaman B, Weingärtner S, Kargas N, Sidiropoulos N D and Akçakaya M. 2020. Low-rank tensor models for improved multidimensional MRI: application to dynamic cardiac T1 mapping. IEEE Transactions on Computational Imaging, 6: 194-207 [DOI: 10.1109/TCI.2019.2940916]
Yang R Y, Jia Y X, Xu P and Xie X Z. 2019. Hyperspectral image restoration with truncated nuclear norm minimization and total variation regularization. Journal of Image and Graphics, 24(10): 1801-1812 [DOI: 10.11834/jig.180433]
Yang X H, Xue Y, Lv Z Y and Jin H Y. 2022. An effective LRTC model integrated with total α-order variation and boundary adjustment for multichannel visual data inpainting. IET Image Processing, 16(13): 3684-3699 [DOI: 10.1049/ipr2.12585]
Yu Q and Yang M. 2023. Low-rank tensor recovery via non-convex regularization, structured factorization and spatio-temporal characteristics. Pattern Recognition, 137: #109343 [DOI: 10.1016/j.patcog.2023.109343]
Yuan L H, Li C, Mandic D, Cao J T and Zhao Q B. 2019a. Tensor ring decomposition with rank minimization on latent space: an efficient approach for tensor completion//Proceedings of the 33rd AAAI Conference on Artificial Intelligence. Honolulu, USA: AAAI: 9151-9158 [DOI: 10.1609/aaai.v33i01.33019151]
Yuan L H, Zhao Q B, Gui L H and Cao J T. 2019b. High-order tensor completion via gradient-based optimization under tensor train format. Signal Processing: Image Communication, 73: 53-61 [DOI: 10.1016/j.image.2018.11.012]
Zeng H J. 2021. Multi-mode core tensor factorization based low-rankness and its applications to tensor completion [EB/OL]. [2023-07-03]. https://arxiv.org/pdf/2012.01918.pdf
Zhang H Y, Liu L, He W and Zhang L P. 2020a. Hyperspectral image denoising with total variation regularization and nonlocal low-rank tensor decomposition. IEEE Transactions on Geoscience and Remote Sensing, 58(5): 3071-3084 [DOI: 10.1109/TGRS.2019.2947333]
Zhang L, Wei W, Shi Q F, Shen C H, van den Hengel A and Zhang Y N. 2019. Accurate tensor completion via adaptive low-rank representation. IEEE Transactions on Neural Networks and Learning Systems, 31(10): 4170-4184 [DOI: 10.1109/TNNLS.2019.2952427]
Zhang Y K, Peng J J, Zeng D, Xie Q, Li S, Bian Z Y, Wang Y B, Zhang Y, Zhao Q, Zhang H, Liang Z R, Lu H B, Meng D Y and Ma J H. 2020b. Contrast-medium anisotropy-aware tensor total variation model for robust cerebral perfusion CT reconstruction with low-dose scans. IEEE Transactions on Computational Imaging, 6: 1375-1388 [DOI: 10.1109/TCI.2020.3023598]
Zhang Z M and Aeron S. 2017. Exact tensor completion using t-SVD. IEEE Transactions on Signal Processing, 65(6): 1511-1526 [DOI: 10.1109/TSP.2016.2639466]
Zheng Y B, Huang T Z, Zhao X L, Jiang T X, Ji T Y and Ma T H. 2020. Tensor N-tubal rank and its convex relaxation for low-rank tensor recovery. Information Sciences, 532: 170-189 [DOI: 10.1016/j.ins.2020.05.005]
Zheng Y B, Huang T Z, Zhao X L, Zhao Q B and Jiang T X. 2021. Fully-connected tensor network decomposition and its application to higher-order tensor completion//Proceedings of the 35th AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI: 11071-11078 [DOI: 10.1609/aaai.v35i12.17321]