EDOS: Entropy Difference-based Oversampling Approach for Imbalanced Learning
A large number of datasets in various applications are imbalanced, with majority-class samples dominating minority-class samples. This skewed distribution poses a difficulty for existing learning approaches. Oversampling techniques address the problem by replicating original minority-class samples or adding new synthetic ones, but even when successful they raise the problems of over-generation and class overlapping. In this paper, we propose an entropy difference-based oversampling approach (EDOS) for imbalanced learning built on a novel metric, termed entropy difference (ED). First, given a dataset, EDOS measures the degree of imbalance between the majority and minority classes with ED. Second, EDOS creates synthetic minority samples; for each synthetic sample, it evaluates the sample's retention capability and keeps only the informative ones. Third, the original and qualified synthetic samples are combined to train the classifiers. Experiments on several UCI datasets demonstrate the effectiveness of the proposed EDOS method.
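The three-step pipeline described above can be sketched in Python. This is a minimal illustration, not the paper's implementation: the `entropy_difference` function below is a hypothetical proxy (the gap between the Shannon entropy of a balanced two-class distribution, 1 bit, and the observed one), the synthesis step is plain SMOTE-style interpolation, and the "retention capability" test is replaced by an assumed nearest-neighbour filter that keeps a candidate only if it lies closer to the minority class than to the majority class.

```python
import numpy as np

def class_entropy(labels):
    """Shannon entropy (in bits) of the observed label distribution."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_difference(labels):
    """Hypothetical stand-in for the paper's ED metric: how far the
    label distribution is from a perfectly balanced two-class set."""
    return 1.0 - class_entropy(labels)

def smote_like_oversample(X_min, n_new, rng):
    """SMOTE-style synthesis: interpolate between random minority pairs."""
    idx_a = rng.integers(0, len(X_min), n_new)
    idx_b = rng.integers(0, len(X_min), n_new)
    gap = rng.random((n_new, 1))
    return X_min[idx_a] + gap * (X_min[idx_b] - X_min[idx_a])

def edos_sketch(X, y, minority_label=1, seed=0):
    """Step 1: measure imbalance; step 2: synthesize and filter;
    step 3: return the combined training set."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    X_maj = X[y != minority_label]
    n_new = len(X_maj) - len(X_min)   # synthesize up to balance
    cand = smote_like_oversample(X_min, n_new, rng)
    # Assumed retention filter: keep candidates whose nearest original
    # neighbour belongs to the minority class (avoids overlap regions).
    keep = [s for s in cand
            if np.min(np.linalg.norm(X_min - s, axis=1))
               <= np.min(np.linalg.norm(X_maj - s, axis=1))]
    keep = np.array(keep).reshape(-1, X.shape[1])
    X_aug = np.vstack([X, keep])
    y_aug = np.concatenate([y, np.full(len(keep), minority_label)])
    return X_aug, y_aug
```

After augmentation the ED of the label vector should shrink toward zero, since the retained synthetic samples move the class distribution closer to balance.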
Proceedings of the International Joint Conference on Neural Networks
Li, Lusi, Haibo He, Jie Li, and Weijun Li. "EDOS: Entropy Difference-based Oversampling Approach for Imbalanced Learning." In Proceedings of the International Joint Conference on Neural Networks (IJCNN), 2018. doi:10.1109/IJCNN.2018.8489729.