Pakistan Science Abstracts
Article details & metrics
A smoothing algorithm for training max-min neural networks.
Author(s):
1. Long Li: Department of Mathematics and Computational Science, Hengyang Normal University, Hengyang, 421008, China
2. Tian Xu: Department of Mathematics and Computational Science, Hengyang Normal University, Hengyang, 421008, China
3. Yan Liu: Department of Applied Mathematics, Dalian Polytechnic University, Dalian, 116034, China
4. Jie Yang: School of Mathematical Sciences, Dalian University of Technology, Dalian, 116024, China
Abstract:
In this paper, a smoothing algorithm for training max-min neural networks is proposed. Specifically, we apply a smooth function to approximate the max and min functions, using this smoothing technique twice: once to eliminate the inner min operator and once to eliminate the outer max operator. Replacing the actual network output by its approximation function, we use the partial derivatives of the approximation function with respect to the weights in place of those of the actual network output. The smoothing algorithm is then constructed via the gradient descent method. This algorithm can also be used to solve fuzzy relational equations. Finally, two numerical examples are provided to show the effectiveness of our smoothing algorithm for training max-min neural networks.
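The idea in the abstract can be sketched as follows. The paper's specific smoothing function is not given here, so this sketch assumes the standard log-sum-exp smooth approximation of max (with min obtained as a negated max); the function and variable names are illustrative only.

```python
import numpy as np

def smooth_max(x, beta=50.0):
    """Log-sum-exp approximation of max(x); as beta -> inf it recovers max."""
    return np.log(np.sum(np.exp(beta * np.asarray(x)))) / beta

def smooth_min(x, beta=50.0):
    """Smooth min via the identity min(x) = -max(-x)."""
    return -smooth_max(-np.asarray(x), beta)

def maxmin_output(w, x):
    """Actual max-min unit output: y = max_j min(w_j, x_j)."""
    return np.max(np.minimum(w, x))

def smooth_output(w, x, beta=50.0):
    """Smoothed output: smoothing applied twice, once for the inner min
    and once for the outer max, giving a differentiable surrogate."""
    inner = np.array([smooth_min([wj, xj], beta) for wj, xj in zip(w, x)])
    return smooth_max(inner, beta)
```

Because the surrogate is differentiable in the weights `w`, its partial derivatives can stand in for those of the non-smooth network output inside a gradient descent loop, which is the substitution the abstract describes.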
Page(s): 114-119
DOI: not available
Published in: Journal of Theoretical and Applied Information Technology, Volume 46, Issue 1, 2012
Keywords:
Keywords are not available for this article.
References:
References are not available for this document.
Citations:
Citations are not available for this document.
Metrics: 0 citations, 0 downloads, 3 views