Oversampling the minority class in the feature space
- Submitting institution
- The University of Birmingham
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 43101367
- Type
- D - Journal article
- DOI
- 10.1109/TNNLS.2015.2461436
- Title of journal
- IEEE Transactions on Neural Networks and Learning Systems
- Article number
- -
- First page
- 1947
- Volume
- 27
- Issue
- 9
- ISSN
- 2162-237X
- Open access status
- Out of scope for open access requirements
- Month of publication
- August
- Year of publication
- 2015
- URL
- -
- Supplementary information
- -
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
- 3
- Research group(s)
- -
- Citation count
- 22
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- A popular approach to dealing with imbalanced class distributions is up-sampling the minority class (i.e. creating synthetic examples). This works well if the distribution of the minority class is "well behaved" (e.g. has convex support), which cannot be assumed in many real-world problems. The paper proposes up-sampling in the feature space of a kernel machine, where the classes are more likely to be linearly separable, eliminating the need for sophisticated and often ad hoc up-sampling methods. The idea of up-sampling in a feature space has since been followed by others, e.g. Wang (Engineering Applications of AI, 2020) and Zhu (Applied Soft Computing, 2019).
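- The core idea described above can be illustrated with a minimal sketch. This is not the paper's method: here the kernel feature space is approximated explicitly with random Fourier features (an assumption made for illustration, since the paper works with a kernel machine's implicit feature space), and synthetic minority points are created by SMOTE-style linear interpolation between feature-space images. The function names (`rff_map`, `oversample_in_feature_space`) are hypothetical.

```python
import numpy as np

def rff_map(X, n_features=100, gamma=1.0, seed=0):
    """Random Fourier features approximating an RBF kernel feature map."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the RBF kernel
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def oversample_in_feature_space(X_min, n_new, n_features=100, gamma=1.0, seed=0):
    """Create synthetic minority examples by interpolating in the
    (approximate) kernel feature space rather than the input space."""
    Z = rff_map(X_min, n_features=n_features, gamma=gamma, seed=seed)
    rng = np.random.default_rng(seed)
    # Pick random pairs of minority points and interpolate between them
    i = rng.integers(0, len(Z), size=n_new)
    j = rng.integers(0, len(Z), size=n_new)
    lam = rng.uniform(size=(n_new, 1))
    return Z[i] + lam * (Z[j] - Z[i])

# Usage: 10 minority samples in 3-D input space, 25 synthetic
# feature-space samples in a 64-D approximate feature space
X_min = np.random.default_rng(1).normal(size=(10, 3))
Z_new = oversample_in_feature_space(X_min, n_new=25, n_features=64)
print(Z_new.shape)
```

  The appeal of interpolating in feature space, as the paper argues, is that a non-convex minority-class support in input space may map to a region where linear interpolation stays class-consistent, so no ad hoc input-space heuristics are needed.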
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -