Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LR20A010001) and the National Natural Science Foundation of China (12271473 and U21A20426).
Abstract: In the realm of large-scale machine learning, it is crucial to explore methods for reducing computational complexity and memory demands while maintaining generalization performance. Additionally, since the collected data may contain sensitive information, it is also of great significance to study privacy-preserving machine learning algorithms. This paper focuses on the performance of a differentially private stochastic gradient descent (SGD) algorithm based on random features. First, the algorithm maps the original data into a low-dimensional space, thereby avoiding the large-scale data storage requirement of traditional kernel methods. Next, the algorithm iteratively optimizes its parameters by stochastic gradient descent. Finally, an output perturbation mechanism is employed to introduce random noise, ensuring the algorithm's privacy. We prove that the proposed algorithm satisfies differential privacy while achieving fast convergence rates under mild conditions.
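The three stages described above (random-feature mapping, SGD, output perturbation) can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's algorithm: the toy data, feature count, step size, and noise scale are all assumptions, and the Laplace noise is not calibrated to any specific privacy budget.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical; the paper's setting is more general).
n, d, D = 200, 5, 50          # samples, input dimension, random features
X = rng.normal(size=(n, d))
y = np.sin(X.sum(axis=1)) + 0.1 * rng.normal(size=n)

# Stage 1: random Fourier features approximating a Gaussian kernel,
# mapping the data into a low-dimensional space.
W = rng.normal(size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # (n, D) feature matrix

# Stage 2: one pass of SGD on the least-squares loss in feature space.
theta = np.zeros(D)
eta = 0.1                                   # step size (illustrative)
for i in rng.permutation(n):
    grad = (Z[i] @ theta - y[i]) * Z[i]
    theta -= eta * grad

# Stage 3: output perturbation -- add random noise to the learned
# parameters before release (scale NOT calibrated to (epsilon, delta)).
theta_private = theta + rng.laplace(scale=0.01, size=D)

mse = np.mean((Z @ theta_private - y) ** 2)
```

Releasing only the perturbed `theta_private` (never the raw iterates) is what makes this an output-perturbation scheme rather than noisy-gradient DP-SGD.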
Abstract: A new algorithm for linear instantaneous independent component analysis is proposed, based on maximizing a log-likelihood contrast function that can be recast as a gradient equation. An iterative method is introduced to solve this equation efficiently. The unknown probability density functions, together with their first and second derivatives appearing in the gradient equation, are estimated by the kernel density method. Computer simulations on artificially generated signals and gray-scale natural scene images confirm the efficiency and accuracy of the proposed algorithm.
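The kernel density step above estimates a density and its first two derivatives from samples. A minimal numpy sketch with a Gaussian kernel (the bandwidth is assumed given; bandwidth selection and the surrounding ICA iteration are omitted):

```python
import numpy as np

def kde_and_derivatives(x, samples, h):
    """Gaussian-kernel estimates of a density and its first two
    derivatives at the points x, from i.i.d. samples (bandwidth h given)."""
    u = (x[:, None] - samples[None, :]) / h        # (m, n) scaled differences
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # Gaussian kernel values
    f = k.mean(axis=1) / h                         # density estimate
    f1 = (-u * k).mean(axis=1) / h**2              # first-derivative estimate
    f2 = ((u**2 - 1) * k).mean(axis=1) / h**3      # second-derivative estimate
    return f, f1, f2

# Sanity check on standard normal samples: peak height is ~ 1/sqrt(2*pi),
# the first derivative vanishes at 0, and the second derivative is negative.
samples = np.random.default_rng(0).normal(size=5000)
f0, f1_0, f2_0 = kde_and_derivatives(np.array([0.0]), samples, h=0.3)
```

The derivative formulas follow from differentiating the Gaussian kernel itself: phi'(u) = -u phi(u) and phi''(u) = (u^2 - 1) phi(u).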
Funding: Supported in part by the National Natural Science Foundation of China (Grant No. 11871438), in part by HKRGC GRF Nos. 12300218, 12300519, 17201020, 17300021, C1013-21GF, and C7004-21GF, and by Joint NSFC-RGC N-HKU76921.
Abstract: We consider a gradient iteration algorithm for prediction in functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iteration from overfitting. Under mild conditions, we obtain upper bounds for the excess prediction risk that essentially match the known minimax lower bounds. Almost sure convergence is also established for the proposed algorithm.
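The idea of a gradient iteration in an RKHS with early stopping as the regularizer can be illustrated on a toy scalar problem (a deliberate simplification: the paper's functional-data inputs are replaced by scalar inputs, and the kernel bandwidth and stopping time below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-dimensional surrogate for the regression problem.
n = 100
x = rng.uniform(-1, 1, size=n)
y = np.cos(np.pi * x) + 0.2 * rng.normal(size=n)

# Gram matrix of a Gaussian reproducing kernel (bandwidth assumed).
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.1)

# Gradient iteration on the empirical risk in the RKHS:
# alpha_{t+1} = alpha_t - (eta / n) * (K alpha_t - y).
eta = 0.5 / np.linalg.norm(K / n, 2)    # step size below 1 / ||K/n||
alpha = np.zeros(n)
errors = []
for t in range(200):
    alpha -= (eta / n) * (K @ alpha - y)
    errors.append(np.mean((K @ alpha - y) ** 2))

# Early stopping: keep the iterate at a stopping time t* (chosen by the
# theory or by validation) instead of iterating to convergence, which
# would interpolate the noise.
t_star = 50
```

Each iteration shrinks the training residual, so the stopping time plays exactly the role of the inverse regularization parameter in Tikhonov regularization.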
Funding: Supported by the National Natural Science Foundation of China (No. 11471292).
Abstract: This paper deals with Hermite learning, which aims at recovering the target function from samples of both function values and gradient values. Error analysis is conducted for these algorithms by means of approaches from convex analysis in the framework of multi-task vector-valued learning, and improved learning rates are derived.
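The Hermite data model (paired function values and gradient values) can be made concrete with a toy least-squares fit; this sketch uses a polynomial model in one dimension purely for illustration, not the paper's kernel-based estimator:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hermite-type data: noisy samples of a target function AND its gradient.
x = np.linspace(-1.0, 1.0, 20)
y = np.sin(2 * x) + 0.01 * rng.normal(size=x.size)       # function values
dy = 2 * np.cos(2 * x) + 0.01 * rng.normal(size=x.size)  # gradient values

# Degree-5 polynomial least squares using BOTH kinds of samples:
# stack rows enforcing p(x_i) ~ y_i and rows enforcing p'(x_i) ~ dy_i.
deg = 5
powers = np.arange(deg, -1, -1)         # exponents [5, 4, ..., 0]
V = x[:, None] ** powers                # design matrix for p(x)
dV = np.zeros_like(V)
dV[:, :-1] = V[:, 1:] * powers[:-1]     # design matrix for p'(x)

A = np.vstack([V, dV])
b = np.concatenate([y, dy])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
fit = V @ coef                          # fitted function values
```

Using the gradient rows in addition to the value rows doubles the number of constraints per sample point, which is the source of the improved rates the abstract refers to.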
Abstract: Pedestrian detection is widely used in robotics, driver-assistance systems, and video surveillance. This paper proposes a fast pedestrian detection method based on saliency detection and Histogram of Oriented Gradient-Non-negative Matrix Factorization (HOG-NMF) features. Frequency-tuned saliency detection is used to extract a saliency map, and regions of interest are then extracted with an entropy threshold; HOG-NMF features are generated by combining non-negative matrix factorization with histograms of oriented gradients; and an additive intersection-kernel support vector machine (IKSVM) is adopted as the classifier. The algorithm significantly reduces the feature dimension and, at the same computational complexity, clearly improves the detection rate of the linear SVM. Experiments on the INRIA database show that, compared with HOG/linear SVM and HOG/RBF-SVM, the method markedly reduces detection time while achieving a satisfactory detection rate.
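The dimensionality-reduction step of the pipeline, compressing HOG descriptors with NMF and comparing the results with an intersection kernel, can be sketched in numpy. The descriptor matrix below is hypothetical random data standing in for real HOG features; the NMF uses standard Lee-Seung multiplicative updates, and the SVM training itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in data: one nonnegative HOG descriptor per window
# (real HOG extraction from INRIA images is omitted here).
n_windows, hog_dim, r = 60, 128, 16
V = rng.random((n_windows, hog_dim))

# NMF by multiplicative updates: V ~ W @ H with W, H >= 0,
# compressing each descriptor from hog_dim to r dimensions.
W = rng.random((n_windows, r))
H = rng.random((r, hog_dim))
eps = 1e-9
err0 = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)

def intersection_kernel(u, v):
    """Histogram intersection kernel used by the IKSVM classifier."""
    return np.minimum(u, v).sum()

# The r-dimensional rows of W are the compressed HOG-NMF features that
# would be fed to the intersection-kernel SVM.
k01 = intersection_kernel(W[0], W[1])
```

Because the updates only ever multiply by nonnegative ratios, `W` and `H` stay nonnegative throughout, which is what makes the compressed features valid inputs for the histogram intersection kernel.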