A hierarchical neural network architecture for classification
Document Type
Conference Proceeding
Date of Original Version
8-23-2012
Abstract
In this paper, a hierarchical neural network with a cascading architecture is proposed and its application to classification is analyzed. This cascading architecture consists of multiple levels of neural network structure, in which the outputs of the hidden neurons at the higher hierarchical level are treated as equivalent input data for the input neurons at the lower hierarchical level. The final predictive result is obtained through a modified weighted majority vote scheme. In this way, it is hoped that new patterns can be learned from the hidden layers at each level and that the combined result can significantly improve the learning performance of the whole system. In simulation, a comparison experiment is carried out between our approach and two popular ensemble learning approaches, bagging and AdaBoost. Various simulation results based on synthetic and real data demonstrate that this approach can improve classification performance. © 2012 Springer-Verlag.
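The cascading idea described in the abstract can be sketched roughly as follows. This is an illustrative forward pass only, not the paper's actual method: the layer sizes, sigmoid activations, and per-level vote weights `alpha` are all hypothetical assumptions, and the paper's "modified weighted majority vote" is stood in for by a simple weighted combination of class scores.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    """One fully connected layer with sigmoid activations."""
    return 1.0 / (1.0 + np.exp(-(x @ W + b)))

# Level 1: raw features -> hidden neurons -> class scores
X = rng.normal(size=(5, 4))            # 5 samples, 4 features
W1h, b1h = rng.normal(size=(4, 6)), np.zeros(6)
W1o, b1o = rng.normal(size=(6, 2)), np.zeros(2)
H1 = layer(X, W1h, b1h)                # hidden activations at level 1
S1 = layer(H1, W1o, b1o)               # level-1 class scores (2 classes)

# Level 2: the level-1 hidden outputs are treated as the input data
W2h, b2h = rng.normal(size=(6, 6)), np.zeros(6)
W2o, b2o = rng.normal(size=(6, 2)), np.zeros(2)
H2 = layer(H1, W2h, b2h)               # hidden activations at level 2
S2 = layer(H2, W2o, b2o)               # level-2 class scores

# Combine the per-level predictions with (hypothetical) vote weights
alpha = np.array([0.6, 0.4])           # illustrative, not from the paper
scores = alpha[0] * S1 + alpha[1] * S2
pred = scores.argmax(axis=1)           # one final class label per sample
```

In a trained system, each level's weights would be learned and the vote weights would reflect each level's accuracy; the point of the sketch is only the data flow, where `H1` serves double duty as a hidden representation and as the next level's input.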
Publication Title, e.g., Journal
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
7367 LNCS
Issue
PART 1
Citation/Publisher Attribution
Wang, Jing, Haibo He, Yuan Cao, Jin Xu, and Dongbin Zhao. "A hierarchical neural network architecture for classification." Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 7367 LNCS, PART 1 (2012): 37-46. doi: 10.1007/978-3-642-31346-2_5.