Methods for constructing predictor ensembles based on convex combinations


Abstract

Constructing convex combinations of predictors is an effective way to build ensembles for regression problems. Moreover, the final quality of the algorithm can be improved if the initial set of predictors is constructed in a special way. In this paper, we study two techniques that achieve such an improvement: bagging combined with the random subspace method, and optimization of the divergence of the predictors. The effectiveness of the resulting methods is verified on applied problems.
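The pipeline outlined in the abstract can be illustrated with a minimal sketch: train base regressors on bootstrap samples restricted to random feature subsets (bagging plus the random subspace method), then combine their predictions with nonnegative weights summing to one. This is a generic illustration, not the authors' algorithm; all function names, the least-squares base learner, and the projected-gradient weight fit are assumptions made for the example.

```python
import numpy as np

def bagged_subspace_predictors(X, y, n_predictors=10, subspace_frac=0.5, rng=None):
    """Fit simple least-squares predictors on bootstrap samples drawn
    over random feature subsets (bagging + random subspace method)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    k = max(1, int(subspace_frac * d))
    models = []
    for _ in range(n_predictors):
        rows = rng.integers(0, n, size=n)            # bootstrap sample
        cols = rng.choice(d, size=k, replace=False)  # random subspace
        Xb = np.c_[X[np.ix_(rows, cols)], np.ones(n)]
        w, *_ = np.linalg.lstsq(Xb, y[rows], rcond=None)
        models.append((cols, w))
    return models

def predict_matrix(models, X):
    """Stack base-predictor outputs; one column per predictor."""
    preds = []
    for cols, w in models:
        Xs = np.c_[X[:, cols], np.ones(X.shape[0])]
        preds.append(Xs @ w)
    return np.column_stack(preds)

def convex_weights(P, y, n_iter=500, lr=0.01):
    """One simple way to form a convex combination: minimize squared
    error by projected gradient, clipping to nonnegative weights and
    renormalizing onto the simplex after each step."""
    m = P.shape[1]
    a = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        grad = P.T @ (P @ a - y) / len(y)
        a = np.clip(a - lr * grad, 0.0, None)
        s = a.sum()
        a = a / s if s > 0 else np.full(m, 1.0 / m)
    return a
```

The ensemble prediction is then `predict_matrix(models, X) @ a`; diversifying the base predictors via the random subspaces is what gives the convex combination room to improve on any single member.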


About the authors

I. M. Borisov

Lomonosov Moscow State University

Author for correspondence.
Email: s02210331@gse.cs.msu.ru
Russian Federation, Moscow

A. A. Dokukin

Computer Science and Control Federal Research Center of the Russian Academy of Sciences

Email: dalex@ccas.ru
Russian Federation, Moscow

O. V. Senko

Computer Science and Control Federal Research Center of the Russian Academy of Sciences

Email: senkoov@mail.ru
Russian Federation, Moscow


Supplementary files

Fig. 1. Histogram of the effectiveness of various methods in terms of the top-1 criterion.
Fig. 2. Histogram of the effectiveness of various methods in terms of the top-3 criterion.

Copyright (c) 2025 Russian Academy of Sciences