Modular network architecture and distance based training algorithm for pattern association

A two-dimensional modular architecture for the Hopfield neural network that improves storage capacity and reduces structural complexity is presented. The approach divides a 2-D network into sub-modules, each functioning independently as a sub-network while receiving inputs from its neighboring modules. A divide-and-conquer strategy is employed: a complex computational task is split into simpler subtasks whose individual solutions are then combined. The performance of the proposed technique is evaluated on a variety of character patterns. The network is observed to exhibit faster convergence and to successfully reproduce learned patterns from noisy data.
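The abstract does not give the paper's distance-based training rule, so the following is only a minimal sketch of the kind of sub-network each module would contain, using the standard Hebbian outer-product rule as a stand-in. The class name `HopfieldModule` and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class HopfieldModule:
    """One sub-network of a modular Hopfield architecture (illustrative).

    Uses Hebbian storage and synchronous sign-threshold updates; the
    paper's actual distance-based training algorithm is not reproduced.
    """

    def __init__(self, n_units):
        self.n = n_units
        self.W = np.zeros((n_units, n_units))

    def store(self, patterns):
        # Hebbian outer-product rule; zero the diagonal (no self-connections).
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0)
        self.W /= self.n

    def recall(self, state, max_iters=20):
        # Iterate updates until the state stops changing (a fixed point).
        state = state.copy()
        for _ in range(max_iters):
            new = np.sign(self.W @ state)
            new[new == 0] = 1          # break ties toward +1
            if np.array_equal(new, state):
                break
            state = new
        return state

# Store one 16-unit bipolar pattern and recover it from a noisy copy.
pattern = np.array([1, -1] * 8)
module = HopfieldModule(16)
module.store([pattern])
noisy = pattern.copy()
noisy[:3] *= -1                        # corrupt three units
restored = module.recall(noisy)        # converges back to the stored pattern
```

In the full architecture described above, a grid of such modules would run in parallel, with each module's update also conditioned on the states of its neighbors; that coupling is what lets the divide-and-conquer decomposition recover global patterns.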
VLSI Systems Laboratory