Journal Articles
3 articles found
On the l_1-Norm Invariant Convex k-Sparse Decomposition of Signals (Cited: 3)
1
Authors: Guangwu Xu, Zhiqiang Xu. Journal of the Operations Research Society of China, EI, 2013, No. 4, pp. 537-541 (5 pages)
Inspired by an interesting idea of Cai and Zhang, we formulate and prove a convex k-sparse decomposition of vectors that is invariant with respect to the l_1 norm. This result fits well in discussing compressed sensing problems under the restricted isometry property, but we believe it also has independent interest. As an application, a simple derivation of the RIP recovery condition δ_k + θ_{k,k} < 1 is presented.
Keywords: Convex k-sparse decomposition; l_1 minimization; Restricted isometry property; Sparse recovery
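The l_1-minimization problem named in the keywords (basis pursuit: minimize ||x||_1 subject to Ax = b) can be illustrated with a small sketch. The split x = u - v with u, v >= 0 turns it into a linear program; the dimensions, seed, and solver choice below are illustrative and not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 40, 20, 3  # ambient dimension, measurements, sparsity (illustrative)

# Ground-truth k-sparse signal and a Gaussian measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Basis pursuit: min ||x||_1  s.t.  Ax = b,
# rewritten as an LP via x = u - v with u, v >= 0.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

Because x_true is itself feasible, the LP optimum never exceeds ||x_true||_1; under RIP-type conditions such as the one in the paper, the minimizer coincides with x_true.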
GRAPH SPARSIFICATION BY UNIVERSAL GREEDY ALGORITHMS
2
Authors: Ming-Jun Lai, Jiaxin Xie, Zhiqiang Xu. Journal of Computational Mathematics, SCIE CSCD, 2023, No. 4, pp. 741-770 (30 pages)
Graph sparsification approximates an arbitrary graph by a sparse graph, and is useful in many applications, such as simplification of social networks, least squares problems, and numerical solution of symmetric positive definite linear systems. In this paper, inspired by the well-known sparse signal recovery algorithm called orthogonal matching pursuit (OMP), we introduce a deterministic, greedy edge selection algorithm, called the universal greedy approach (UGA), for the graph sparsification problem. For a general spectral sparsification problem, e.g., the positive subset selection problem from a set of m vectors in R^n, we propose a nonnegative UGA algorithm which needs O(mn^2 + n^3/ε^2) time to find a (1+ε/β)/(1-ε/β)-spectral sparsifier with positive coefficients and sparsity at most ⌈n/ε^2⌉, where β is the ratio between the smallest and largest lengths of the vectors. The convergence of the nonnegative UGA algorithm is established. For the graph sparsification problem, another UGA algorithm is proposed which can output a (1+O(ε))/(1-O(ε))-spectral sparsifier with ⌈n/ε^2⌉ edges in O(m + n^2/ε^2) time from a graph with m edges and n vertices, under some mild assumptions. This is a linear-time algorithm, in terms of the number of edges, of the kind the graph sparsification community has been looking for. The best result in the literature, to the knowledge of the authors, is the existence of a deterministic algorithm that is almost linear, i.e. O(m^{1+o(1)}) for some o(1) = O((log log m)^{2/3} / log^{1/3} m). Finally, extensive experimental results, including applications to graph clustering and least squares regression, show the effectiveness of the proposed approaches.
Keywords: Spectral sparsification; Subset selection; Greedy algorithms; Graph clustering; Linear sketching
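The abstract's point of departure, orthogonal matching pursuit, is a standard greedy selection algorithm. A minimal sketch of classical OMP (not the paper's UGA, and with illustrative dimensions) is:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: at each step, add the column of A most
    correlated with the current residual, then re-fit by least squares."""
    n = A.shape[1]
    support, residual = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(1)
m, n, k = 30, 60, 4                      # illustrative sizes
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)           # unit-norm columns
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = omp(A, y, k)
print("nonzeros in estimate:", np.count_nonzero(x_hat))
```

The UGA of the paper adapts this greedy pick-and-refit pattern from columns of a measurement matrix to edges of a graph, selecting edges so that the sparsifier's Laplacian spectrally approximates the original.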
SOLVING SYSTEMS OF QUADRATIC EQUATIONS VIA EXPONENTIAL-TYPE GRADIENT DESCENT ALGORITHM
3
Authors: Meng Huang, Zhiqiang Xu. Journal of Computational Mathematics, SCIE CSCD, 2020, No. 4, pp. 638-660 (23 pages)
We consider the rank minimization problem from quadratic measurements, i.e., recovering a rank-r matrix X ∈ R^{n×r} from m scalar measurements y_i = a_i^⊤ XX^⊤ a_i, a_i ∈ R^n, i = 1, ..., m. Such a problem arises in a variety of applications, such as quadratic regression and quantum state tomography. We present a novel algorithm, termed the exponential-type gradient descent algorithm, to minimize the non-convex objective function f(U) = (1/4m) Σ_{i=1}^m (y_i − a_i^⊤ UU^⊤ a_i)^2. This algorithm starts with a careful initialization, and then refines the initial guess by iteratively applying exponential-type gradient descent. In particular, we can obtain a good initial guess of X as long as the number of Gaussian random measurements is O(nr), and our iterative algorithm converges linearly to the true X (up to an orthogonal matrix) with m = O(nr log(cr)) Gaussian random measurements.
Keywords: Low-rank matrix recovery; Non-convex optimization; Phase retrieval
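The objective f(U) = (1/4m) Σ_i (y_i − a_i^⊤ UU^⊤ a_i)^2 and its gradient, ∇f(U) = (1/m) Σ_i (a_i^⊤ UU^⊤ a_i − y_i) a_i a_i^⊤ U, can be written down directly. The sketch below uses plain gradient descent started near the truth (not the paper's exponential-type update or its careful initialization), with illustrative sizes, and checks the analytic gradient against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, m = 10, 2, 200                       # illustrative sizes
X = rng.standard_normal((n, r))            # ground-truth factor
A = rng.standard_normal((m, n))            # measurement vectors a_i as rows
y = np.sum((A @ X) ** 2, axis=1)           # y_i = a_i^T X X^T a_i = ||X^T a_i||^2

def f(U):
    e = np.sum((A @ U) ** 2, axis=1) - y
    return 0.25 / m * np.sum(e ** 2)

def grad(U):
    # (1/m) sum_i e_i * a_i (a_i^T U), vectorized over i
    e = np.sum((A @ U) ** 2, axis=1) - y
    return A.T @ (e[:, None] * (A @ U)) / m

# Sanity-check one gradient entry by central finite differences
U0 = X + 0.1 * rng.standard_normal((n, r))
h, (i, j) = 1e-5, (3, 1)
E = np.zeros((n, r)); E[i, j] = 1.0
fd = (f(U0 + h * E) - f(U0 - h * E)) / (2 * h)

# Plain gradient descent from a point near the truth
U, loss0 = U0.copy(), f(U0)
for _ in range(200):
    U -= 5e-4 * grad(U)
print("loss:", loss0, "->", f(U))
```

Note that f is invariant under U → UQ for orthogonal Q, which is why the abstract's recovery guarantee is stated only up to an orthogonal matrix.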