Abstract: Compressive sensing (CS) is an emerging methodology in computational signal processing that has recently attracted intensive research activity. At present, the basic CS theory includes recoverability and stability: the former quantifies the central fact that a sparse signal of length n can be exactly recovered from far fewer than n measurements via l1-minimization or other recovery techniques, while the latter specifies the stability of a recovery technique in the presence of measurement errors and inexact sparsity. So far, most analyses in CS rely heavily on the Restricted Isometry Property (RIP) for matrices. In this paper, we present an alternative, non-RIP analysis for CS via l1-minimization. Our purpose is three-fold: (a) to introduce an elementary and RIP-free treatment of the basic CS theory; (b) to extend the current recoverability and stability results so that prior knowledge can be utilized to enhance recovery via l1-minimization; and (c) to substantiate a property called uniform recoverability of l1-minimization, that is, for almost all random measurement matrices recoverability is asymptotically identical. With the aid of two classic results, the non-RIP approach enables us to quickly derive from scratch all basic results for the extended theory.
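For concreteness, the following is a minimal sketch of sparse recovery via l1-minimization (basis pursuit), posed as a linear program by splitting the signal into nonnegative parts; the problem sizes, the Gaussian measurement matrix, and the use of scipy.optimize.linprog are illustrative assumptions, not part of the paper.

```python
# Basis pursuit: min ||x||_1  s.t.  Ax = b, solved as an LP by writing
# x = u - v with u, v >= 0, so the objective becomes sum(u) + sum(v).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10          # signal length, measurements, sparsity (illustrative)

x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
b = A @ x_true

# LP in the stacked variable z = [u; v]:  minimize 1^T z  s.t.  [A, -A] z = b, z >= 0
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With the number of measurements m well above the sparsity level k, the recovered x_hat typically matches x_true up to solver tolerance, which is the recoverability phenomenon the abstract refers to.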
Abstract: The generalized l1 greedy algorithm was recently introduced and used to reconstruct medical images in computerized tomography in the compressed sensing framework via total variation minimization. Experimental results showed that this algorithm is superior to the reweighted l1-minimization and l1 greedy algorithms in reconstructing these medical images. In this paper, the effectiveness of the generalized l1 greedy algorithm in recovering random sparse signals from underdetermined linear systems is investigated. A series of numerical experiments demonstrates that the generalized l1 greedy algorithm outperforms the reweighted l1-minimization and l1 greedy algorithms in the successful recovery of randomly generated Gaussian sparse signals from data generated by Gaussian random matrices. In particular, the generalized l1 greedy algorithm performs extraordinarily well in recovering random sparse signals with small nonzero entries. The stability of the generalized l1 greedy algorithm with respect to its parameters and the impact of noise on the recovery of Gaussian sparse signals are also studied.
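As a point of reference for the baselines named above, here is a minimal sketch of reweighted l1-minimization in the spirit of the standard iteratively reweighted scheme (solve a weighted basis-pursuit LP, then set each weight to 1/(|x_i| + eps)); the LP solver, eps, and iteration count are illustrative assumptions, and the generalized l1 greedy algorithm itself is not reproduced here.

```python
# Reweighted l1-minimization: repeatedly solve a weighted basis-pursuit LP and
# re-derive the weights from the previous solution, w_i = 1 / (|x_i| + eps),
# so that small entries are penalized more heavily on the next pass.
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(A, b, n_iters=4, eps=1e-1):
    m, n = A.shape
    w = np.ones(n)
    x = np.zeros(n)
    for _ in range(n_iters):
        # minimize sum_i w_i |x_i|  s.t.  Ax = b, via the split x = u - v, u, v >= 0
        c = np.concatenate([w, w])
        res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
                      bounds=(0, None), method="highs")
        x = res.x[:n] - res.x[n:]
        w = 1.0 / (np.abs(x) + eps)   # small entries get large weights next round
    return x
```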
Funding: The research has been supported by the Natural Science Foundation of China under grant number 61872422 and the Natural Science Foundation of Zhejiang Province, China under grant number LY19F020028.
Abstract: Given that the concurrent L1-minimization (L1-min) problem is often required in real applications, we investigate how to solve it in parallel on GPUs in this paper. First, we propose a novel self-adaptive warp implementation of the matrix-vector multiplication (Ax) and a novel self-adaptive thread implementation of the matrix-vector multiplication (A^T x), respectively, on the GPU. Vector-operation and inner-product decision trees are adopted to choose the optimal vector-operation and inner-product kernels for vectors of any size. Second, based on the proposed kernels, the iterative shrinkage-thresholding algorithm is used to build two concurrent L1-min solvers, organized around streams and thread blocks on a GPU, and their performance is optimized using newer GPU features such as the shuffle instruction and the read-only data cache. Finally, we design a concurrent L1-min solver on multiple GPUs. The experimental results validate the effectiveness and performance of our proposed methods.
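For readers unfamiliar with the underlying iteration, below is a minimal single-threaded NumPy sketch of the iterative shrinkage-thresholding algorithm (ISTA) that the GPU solvers build on, applied to the unconstrained form min_x 0.5*||Ax - b||_2^2 + lam*||x||_1; the step size, regularization parameter, and iteration count are illustrative choices, and none of the paper's GPU kernels, stream/block organization, or multi-GPU scheduling is reproduced.

```python
# ISTA for  min_x 0.5*||Ax - b||_2^2 + lam*||x||_1 :
# a gradient step on the smooth term followed by soft-thresholding.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam=0.05, n_iters=500):
    # Step size 1/L, where L = ||A||_2^2 bounds the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)          # uses both the Ax and A^T x products
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

In the GPU setting described above, the Ax and A^T x products inside this loop are exactly the kernels the paper accelerates, and many independent instances of the loop are run concurrently across streams, thread blocks, and devices.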