Comparative Performance Analysis of Boosting Ensemble Learning Models for Optimizing Marketing Promotion Strategy Classification

ensemble learning, boosting, marketing promotion, classification, machine learning

Authors

  • Imam Husni Al Amin, Faculty of Information Technology and Industry, Universitas Stikubank Semarang, Indonesia
  • Fatkhul Amin, Faculty of Economics and Business, Universitas PGRI Semarang, Indonesia
  • Setyawan Wibisono, Faculty of Information Technology and Industry, Universitas Stikubank Semarang, Indonesia
May 5, 2025
May 7, 2025

This study evaluates the performance of four boosting algorithms in ensemble learning, namely AdaBoost, Gradient Boosting, XGBoost, and CatBoost, for optimizing the classification of marketing promotion strategies. The rise of digitalization has driven the use of machine learning to better understand consumer behavior and enhance the effectiveness of promotional campaigns. Using the Marketing Promotion Campaign Uplift Modeling dataset from Kaggle, this study examines the capability of each algorithm to handle complex and imbalanced customer data. The evaluation metrics include accuracy, precision, recall, F1-score, and Area Under the Curve (AUC). Results indicate that XGBoost excels in precision, while Gradient Boosting achieves the highest AUC, demonstrating a superior ability to distinguish positive from negative classes. CatBoost performs stably on categorical data, whereas AdaBoost shows strength in recall but is prone to false-positive predictions. Although all four algorithms perform well, the main challenge lies in addressing class imbalance. This study offers insights for marketing practitioners in selecting the most suitable algorithm and highlights the importance of data-balancing strategies for improving predictive accuracy in data-driven marketing.
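As a rough illustration of the evaluation protocol described above, the following sketch compares two of the four boosting algorithms (AdaBoost and Gradient Boosting, via their scikit-learn implementations) on the same five metrics. It uses a synthetic imbalanced binary dataset as a stand-in for the Kaggle uplift-modeling data; the sample sizes, class weights, and hyperparameters are illustrative assumptions, not the study's actual configuration.

```python
# Hypothetical sketch: metric-based comparison of boosting classifiers on
# imbalanced data, mirroring the study's evaluation setup. The dataset here
# is synthetic (roughly 90% negative / 10% positive), not the Kaggle data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)
from sklearn.model_selection import train_test_split

# Synthetic imbalanced binary classification data.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y,
                                          test_size=0.25, random_state=42)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=42),
    "GradientBoosting": GradientBoostingClassifier(n_estimators=200,
                                                   random_state=42),
}

results = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    proba = model.predict_proba(X_te)[:, 1]  # scores for the positive class
    results[name] = {
        "accuracy": accuracy_score(y_te, pred),
        "precision": precision_score(y_te, pred, zero_division=0),
        "recall": recall_score(y_te, pred),
        "f1": f1_score(y_te, pred),
        "auc": roc_auc_score(y_te, proba),  # AUC uses scores, not labels
    }

for name, metrics in results.items():
    print(name, {k: round(v, 3) for k, v in metrics.items()})
```

The same loop extends naturally to XGBoost's `XGBClassifier` and CatBoost's `CatBoostClassifier`, since both expose the scikit-learn `fit`/`predict`/`predict_proba` interface. Note that AUC is computed from predicted probabilities rather than hard labels, which is what lets it capture ranking quality on imbalanced data.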