Abstract: The training process of the Back Propagation Neural Network (BPNN) easily converges to a local minimum, which sharply slows training. In this paper, the chief cause of local minima is analyzed, and an improved Genetic Algorithm (GA) is introduced to overcome them. Most BPNNs use the Sigmoid function as the transfer function of network nodes; this paper shows that the flat characteristic of the Sigmoid function leads to the formation of local minima. In the improved GA, pertinent modifications are made to the evaluation function and the mutation model. The evaluation of a solution depends on both the value of the error function and the gradient magnitude at that solution, so that solutions away from local minima are rated highly. The sensitivity of the error function to the network parameters is used to build a self-adapting mutation model that is effective at reducing the error function. Both modifications help drive solutions out of local minima. A case study of a real industrial process demonstrates the advantage of the improved GA in overcoming local minima and accelerating the training process.
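The modified evaluation function described above can be sketched as follows. This is a minimal illustration, not the paper's actual formula: the function names (`error_fn`, `grad_fn`) and the weighting scheme (`alpha`, the `1/(1+err)` transform) are assumptions chosen only to show the idea of rewarding both low error and a non-vanishing gradient.

```python
import numpy as np

def evaluate(weights, error_fn, grad_fn, alpha=0.5):
    """Hypothetical GA fitness in the spirit of the abstract: candidates
    with low training error score well, and candidates whose gradient has
    not vanished (i.e. that are not sitting on a flat Sigmoid plateau
    near a local minimum) are rewarded as well."""
    err = error_fn(weights)
    grad_norm = np.linalg.norm(grad_fn(weights))
    # Lower error raises the first term; a larger gradient norm means the
    # candidate is not stuck on a plateau, so it raises the second term.
    return 1.0 / (1.0 + err) + alpha * grad_norm
```

With equal error, a candidate whose gradient is still informative outranks one stranded on a flat region, which is the selection pressure the abstract describes.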
Abstract: For the multiple traveling salesman problem (mTSP) with workload balancing, an improved genetic algorithm that combines the reproduction mechanism of the invasive weed optimization algorithm with a local-optimization mutation operator (Reproductive mechanism and Local optimization mutation operator based Genetic Algorithm, RLGA) is proposed. The algorithm uses the fitness-based reproduction mechanism of invasive weed optimization to generate the population before applying the genetic operators, which improves search efficiency; at the same time, a hybrid local-optimization operator is proposed as the mutation operator to strengthen local search and thereby improve convergence accuracy. Experimental results show that RLGA converges quickly to a good solution on the workload-balanced mTSP and greatly improves solution accuracy.
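The fitness-based reproduction mechanism borrowed from invasive weed optimization can be sketched as below. This is an assumed, generic IWO-style scheme (the bounds `n_min`/`n_max` and the linear scaling are illustrative choices, not parameters from the paper): each individual produces a number of offspring that grows linearly with its fitness.

```python
import numpy as np

def offspring_counts(fitness, n_min=1, n_max=5):
    """IWO-style reproduction sketch: map each individual's fitness
    linearly onto [n_min, n_max] offspring, so fitter individuals
    contribute more copies to the population before the genetic
    operators are applied."""
    f = np.asarray(fitness, dtype=float)
    lo, hi = f.min(), f.max()
    if hi == lo:
        # All individuals equally fit: give each the minimum count.
        return np.full(len(f), n_min, dtype=int)
    return np.floor(n_min + (f - lo) / (hi - lo) * (n_max - n_min)).astype(int)
```

Biasing reproduction toward fitter tours concentrates the search without discarding weaker individuals outright, which is the efficiency gain the abstract attributes to this mechanism.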
Abstract: A local minimum is frequently encountered in the training of back propagation neural networks (BPNN), which sharply slows the training process. In this paper, an analysis of the formation of local minima is presented, and an improved genetic algorithm (GA) is introduced to overcome local minima. The Sigmoid function is generally used as the activation function of BPNN nodes, and it is the flat characteristic of this function that results in the formation of local minima. In the improved GA, pertinent modifications are made to the evaluation function and the mutation model. The evaluation of a solution is associated with both the training error and the gradient. The sensitivity of the error function to the network parameters is used to form a self-adapting mutation model. An industrial application example shows the advantage of the improved GA in overcoming local minima.
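The self-adapting mutation model can be sketched as follows. This is a hypothetical reading of "sensitivity of the error function to the network parameters": the step form (a downhill bias plus noise whose per-parameter scale grows with |dE/dw|) and the constant `base_sigma` are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def self_adapting_mutation(weights, grad, rng, base_sigma=0.1):
    """Mutation sketch whose per-parameter step size scales with the
    sensitivity of the error to that parameter (|dE/dw|): sensitive
    weights receive larger perturbations, and the step is biased
    downhill so mutation tends to reduce the error rather than
    drifting at random."""
    sensitivity = np.abs(grad)
    sigma = base_sigma * (1.0 + sensitivity)  # per-parameter noise scale
    return weights - base_sigma * grad + rng.normal(0.0, sigma)
```

Used inside a GA loop with `rng = np.random.default_rng()`, the noise term preserves exploration while the gradient term supplies the error-reducing pull the abstract describes.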