Optimizing the trade-off between accuracy and speed in stock price forecasting: An online adaptive ensemble learning framework based on differential evolution

Document Type : Original Article

Authors

1 Department of Financial Management, SR.C., Islamic Azad University, Tehran, Iran

2 Full Professor of Applied Mathematics, Department of Applied Mathematics, Faculty of Mathematical Sciences, Yazd University, Yazd, Iran

3 Associate Professor, Department of Financial Management, SR.C., Islamic Azad University, Tehran, Iran

4 Associate Professor of Computer Engineering, Artificial Intelligence and Robotics Group, Faculty of Computer Engineering, Yazd University, Yazd, Iran

Abstract


Purpose: The primary objective of this research is to resolve the fundamental trade-off between prediction accuracy and computational efficiency in stock price forecasting models. Although advanced deep learning models such as LSTM and Transformer achieve high accuracy, their heavy processing costs and high latency pose serious challenges for practical deployment in online, time-sensitive financial ecosystems. This issue highlights a significant research gap: the absence of an integrated framework capable of systematically and intelligently optimizing these two conflicting objectives (accuracy and speed) simultaneously. In response, this study introduces a hybrid, adaptive, self-optimizing framework named DE-Optimized AT-M(OS-ELM), specifically designed to find an optimal balance between the two metrics. The ultimate goal is a practical, realistic solution that maintains competitive statistical accuracy while adapting to streaming data at very high speed, paving the way for operationalizing artificial intelligence in real-time algorithmic trading systems.



Methodology: The proposed framework, DE-Optimized AT-M(OS-ELM), rests on a multi-layered, intelligent architecture that integrates three key components. (1) Base learner (OS-ELM): the Online Sequential Extreme Learning Machine is used for fast learning and instantaneous adaptation to new data without complete retraining. (2) Adaptive ensemble structure (AT-M): to enhance stability and manage noise and concept drift, multiple OS-ELM models are combined in an ensemble; the weight of each model is adjusted dynamically, via an "Adaptive Trust-weighted" mechanism, according to its recent performance within a sliding time window. (3) Optimization engine (DE): the Differential Evolution algorithm automatically optimizes the model's key hyperparameters. The core innovation of this research is a dual-objective function for the DE algorithm that simultaneously minimizes prediction error (RMSE) and computational cost (training time). For a comprehensive evaluation, a 14-year historical dataset (2010–2023) of five key assets from the US stock market was used. The proposed model was benchmarked against a diverse set of models, including ARIMA, Random Forest, SVR, LSTM, and Transformer. Performance was assessed with multi-dimensional metrics for both accuracy (RMSE, MAE, R²) and efficiency (training and prediction time), and the statistical significance of the results was confirmed with the non-parametric Wilcoxon test.
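The three components above can be sketched in Python. The abstract does not give the paper's exact formulas, so the exponential form of the trust weights, the convex combination in `dual_objective_fitness`, and all parameter names (`temperature`, `alpha`) are illustrative assumptions; only the recursive output-weight update in `partial_fit` follows the standard OS-ELM formulation of Liang et al. (2006).

```python
import numpy as np

class OSELM:
    """Minimal Online Sequential Extreme Learning Machine (Liang et al., 2006).

    The random hidden layer is fixed after construction; only the output
    weights `beta` are updated by recursive least squares, so each new
    data chunk is absorbed without retraining on past data."""

    def __init__(self, n_inputs, n_hidden, seed=None):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_inputs, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.P = None      # running inverse of (H^T H + ridge term)
        self.beta = None   # output weights

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit_initial(self, X, y, reg=1e-3):
        """Batch-train on the first chunk (ridge-regularized least squares)."""
        H = self._hidden(X)
        self.P = np.linalg.inv(H.T @ H + reg * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ y

    def partial_fit(self, X, y):
        """Absorb a new chunk with the standard OS-ELM recursive update."""
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (y - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta


def trust_weights(recent_errors, temperature=1.0):
    """Adaptive trust weighting (assumed form): each member's weight decays
    exponentially with its mean error over the sliding window, then the
    weights are normalized to sum to one."""
    e = np.asarray(recent_errors, dtype=float)
    w = np.exp(-e / temperature)
    return w / w.sum()


def dual_objective_fitness(rmse, train_time_s, alpha=0.7):
    """Scalarized dual objective (assumed weighting `alpha`): the DE engine
    minimizes a convex combination of error and training cost."""
    return alpha * rmse + (1.0 - alpha) * train_time_s
```

In practice the fitness function would wrap a full train-and-validate run of the ensemble for a candidate hyperparameter vector, and a ready-made optimizer such as `scipy.optimize.differential_evolution` can then minimize it over the hyperparameter bounds.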


Findings: The quantitative and qualitative evaluations show that the proposed framework achieved its objectives. In terms of accuracy, the proposed model delivered highly competitive performance, proving statistically superior to, or on par with, state-of-the-art deep learning models (LSTM and Transformer) (p < 0.05). Its advantage was particularly evident in Directional Accuracy (DA), a metric critical for algorithmic trading, where it averaged 66.1%. The most prominent finding concerns computational efficiency: with an average training time of under one second, the proposed model trained hundreds of times faster than the advanced deep learning models. This dramatic reduction in computational cost is a decisive advantage for practical applications. Visual analyses corroborated these findings: the trade-off plot placed the proposed model uniquely in the "sweet spot" (high accuracy, low cost), and the rolling error analysis showed that the model maintains more stable performance, especially during periods of high market volatility.
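As a concrete reference for the directional-accuracy figure quoted above, DA is commonly computed as the share of steps where the predicted move agrees in sign with the realized move. The abstract does not spell out the paper's exact variant, so this sketch is illustrative only:

```python
import numpy as np

def directional_accuracy(y_true, y_pred):
    """Fraction of time steps where the predicted price move points the
    same way as the realized move (a common DA definition)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    realized = np.sign(np.diff(y_true))            # actual up/down moves
    predicted = np.sign(y_pred[1:] - y_true[:-1])  # predicted move from last price
    return float(np.mean(realized == predicted))
```

For prices [1, 2, 3, 2] and forecasts [1, 2.5, 3.5, 2.5], all three predicted moves match the realized direction, so DA is 1.0.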


Conclusion: This research demonstrates that the solution to the accuracy-speed challenge in financial forecasting does not necessarily lie in greater architectural complexity, but in smart, targeted systems engineering. By intelligently integrating online learning, adaptive ensembling, and dual-objective optimization, the DE-Optimized AT-M(OS-ELM) framework establishes a systematic and effective balance between accuracy and efficiency. The model not only matched or exceeded the accuracy of state-of-the-art models but did so at a computational cost orders of magnitude lower. This result challenges the paradigm of focusing exclusively on complex models and underscores the importance of designing practical, efficient solutions. The proposed framework holds significant potential for deployment in algorithmic trading systems and real-time financial analytics, marking an important step toward the practical operationalization of artificial intelligence in finance.

Keywords

References
  1. Al-qaness, M. A. A., Ewees, A. A., Fan, H., & Abualigah, L. (2020). Optimization method for forecasting crude oil price using online sequential extreme learning machine. Applied Soft Computing, 95, 106518. https://doi.org/10.1016/j.asoc.2020.106518
  2. Ansari, H., & Mashayekhi, B. (2021). Stock price forecasting using a hybrid model based on GRU neural networks and whale optimization algorithm. Journal of Experimental Accounting Research, 11(42), 257–281. (In Persian)
  3. Azimi, R., Rezaei, F., & Karimi, M. (2023). Presenting a hybrid deep learning model and ant colony optimization algorithm for predicting the trend of Tehran Stock Exchange index. New Research in Decision Making, 8(4), 67–89. (In Persian)
  4. Box, G. E. P., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time series analysis: Forecasting and control. John Wiley & Sons.
  5. Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5–32. https://doi.org/10.1023/A:1010933404324
  6. Brest, J., Boskovic, B., Zamuda, A., & Fister, I. (2021). Self-adaptive differential evolution algorithm for single-objective constrained real-parameter optimization. Swarm and Evolutionary Computation, 60, 100802. https://doi.org/10.1016/j.swevo.2020.100802
  7. Brzezinski, D., Stefanowski, J., & Wilk, S. (2021). On the challenges of learning from streaming financial data with high-frequency updates. Expert Systems with Applications, 169, 114498. https://doi.org/10.1016/j.eswa.2020.114498
  8. Coello Coello, C. A. (2006). Evolutionary multi-objective optimization: A historical view of the field. IEEE Computational Intelligence Magazine, 1(1), 28–36. https://doi.org/10.1109/MCI.2006.1597059
  9. Das, S., & Suganthan, P. N. (2020). Real-world applications of differential evolution. In J. Kacprzyk & W. Pedrycz (Eds.), Handbook of computational intelligence (pp. 819–835). Springer. https://doi.org/10.1007/978-3-319-07173-2_27
  10. Demšar, J. (2006). Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research, 7, 1–30.
  11. Dhahri, H., Al-qaness, M. A. A., Ewees, A. A., & Said, S. (2021). A novel online sequential learning approach for stock price forecasting. IEEE Access, 9, 75317–75329. https://doi.org/10.1109/ACCESS.2021.3080320
  12. Drucker, H., Burges, C. J., Kaufman, L., Smola, A. J., & Vapnik, V. (1997). Support vector regression machines. In Advances in neural information processing systems (pp. 155–161). MIT Press.
  13. Fama, E. F. (1970). Efficient capital markets: A review of theory and empirical work. The Journal of Finance, 25(2), 383–417. https://doi.org/10.2307/2325486
  14. Fischer, T., & Krauss, C. (2018). Deep learning with long short-term memory networks for financial market predictions. European Journal of Operational Research, 270(2), 654–669. https://doi.org/10.1016/j.ejor.2017.11.054
  15. Gama, J., & Castillo, G. (2023). Adaptive learning from data streams. In Data streams (pp. 25–45). Chapman and Hall/CRC.
  16. Gama, J., Žliobaitė, I., Bifet, A., Pechenizkiy, M., & Bouchachia, A. (2014). A survey on concept drift adaptation. ACM Computing Surveys (CSUR), 46(4), 1–37. https://doi.org/10.1145/2523813
  17. Gogas, P., & Papadimitriou, T. (2021). Machine learning in asset pricing. Journal of Economic Surveys, 35(4), 1018–1051. https://doi.org/10.1111/joes.12394
  18. Goldstein, M. A., Kumar, P., & Graves, F. C. (2014). Computerized and high-frequency trading. The Financial Review, 49(2), 273–281. https://doi.org/10.1111/fire.12035
  19. Gonçalves, I., Vale, Z., & Corchado, J. M. (2022). Adaptive ensemble learning for financial market forecasting with concept drift. Neurocomputing, 489, 324–338. https://doi.org/10.1016/j.neucom.2022.03.031
  20. Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning: Data mining, inference, and prediction. Springer.
  21. Huang, G.-B., Zhu, Q.-Y., & Siew, C.-K. (2006). Extreme learning machine: Theory and applications. Neurocomputing, 70(1–3), 489–501. https://doi.org/10.1016/j.neucom.2005.12.126
  22. Huang, S.-C., Chuang, C.-C., Wu, C.-H., & Chen, C.-T. (2019). An online sequential extreme learning machine-based stock price prediction system. Applied Soft Computing, 77, 656–667. https://doi.org/10.1016/j.asoc.2019.01.042
  23. Kallam, S., Kumar, M. J. N. V. P., & Sadad, T. (2022). A DE-optimized hybrid CNN-LSTM for stock price forecasting. Neural Computing and Applications, 34, 20857–20874. https://doi.org/10.1007/s00521-022-07523-7
  24. Ko, T., Ryu, S., & Kim, H. (2019). Adaptive ensemble online sequential extreme learning machine for time series prediction. Applied Sciences, 9(12), 2465. https://doi.org/10.3390/app9122465
  25. Li, H., & Zhao, C. (2022). A novel hybrid model for stock price prediction based on optimization algorithm and deep learning. Physica A: Statistical Mechanics and its Applications, 592, 126839. https://doi.org/10.1016/j.physa.2022.126839
  26. Liang, N. Y., Huang, G. B., Saratchandran, P., & Sundararajan, N. (2006). A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Transactions on Neural Networks, 17(6), 1411–1423. https://doi.org/10.1109/TNN.2006.880583
  27. Liu, Y., Zeng, Q., & Zhang, Y. (2024). TFD-Former: A temporal-frequency dependent Transformer for stock price forecasting. Expert Systems with Applications, 248, 123287. https://doi.org/10.1016/j.eswa.2023.123287
  28. Sezer, O. B., Gudelek, M. U., & Ozbayoglu, A. M. (2020). Financial time series forecasting with deep learning: A systematic literature review: 2005–2019. Applied Soft Computing, 90, 106181. https://doi.org/10.1016/j.asoc.2020.106181
  29. Shahi, T. B., Shrestha, A., Neupane, A., & Guo, W. (2022). Stock price forecasting with deep learning: A comparative study. Mathematics, 10(9), 1441. https://doi.org/10.3390/math10091441
  30. Storn, R., & Price, K. (1997). Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4), 341–359. https://doi.org/10.1023/A:1008202821328
  31. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. In Advances in neural information processing systems (pp. 5998–6008). Curran Associates.
  32. Wang, H., Zhang, J., Wu, L., & Wang, J. (2022). Adaptive and dynamic ensemble learning for evolving data streams. ACM Transactions on Knowledge Discovery from Data (TKDD), 16(2), 1–24. https://doi.org/10.1145/3467891
  33. Wang, Z., & Cao, J. (2022). A robust extreme learning machine for regression in the presence of outliers. IEEE Transactions on Cybernetics, 52(9), 9205–9217. https://doi.org/10.1109/TCYB.2021.3051597
  34. Zhou, Z.-H. (2021). Ensemble methods: Foundations and algorithms (2nd ed.). Chapman and Hall/CRC.