ANALYSIS OF LONG SHORT-TERM MEMORY (LSTM) PARAMETERS IN PREDICTING IHSG

Authors

  • Daud Padut Aritiran Remetwa
  • Lina Cahyadi, Universitas Pelita Harapan
  • Ferry Vincentius Ferdinand
  • Kie Van Ivanky Saputra
  • Kathleen Teja

DOI:

https://doi.org/10.19166/johme.v9i2.10220

Keywords:

IHSG, LSTM, prediction, parameter, stock

Abstract

Stock investment is a popular choice for investors seeking to grow the value of their financial assets. In this study, a Long Short-Term Memory (LSTM) model is used to forecast the movement of the Indonesia Composite Index (IHSG) in the domestic capital market. The research focuses on key parameters of the LSTM model: the sliding window size, the number of epochs, the learning rate, and the type of optimizer. Four experimental configurations were tested: first, the sliding window size was varied while the other parameters were held constant; second, the number of epochs was modified; third, the learning rate was adjusted; and lastly, different optimizers were compared, in each case keeping the remaining parameters fixed. The dataset is divided into two periods, pre-pandemic and during the pandemic, and each period is split into training and testing sets. For the pre-pandemic period, the best-performing configuration was a sliding window size of 20, 40 training epochs, a learning rate of 0.001, and the Adam optimizer, yielding a Root Mean Squared Error (RMSE) of 7.2218. For the pandemic period, the best results were obtained with a sliding window size of 5, 10 epochs, a learning rate of 0.001, and the Adam optimizer, yielding an RMSE of 1.727. These parameter combinations gave the highest predictive performance for the IHSG.
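
As an illustration of the setup described above, the sketch below builds a single-layer LSTM in Keras with the reported pre-pandemic parameters (a sliding window of 20 observations, 40 epochs, a learning rate of 0.001, the Adam optimizer) and evaluates it with RMSE. The network size (50 LSTM units), the 80/20 train-test split, the synthetic price series, and the omission of price scaling are assumptions made only for illustration; they are not taken from the paper.

import numpy as np
import tensorflow as tf

def make_windows(series: np.ndarray, window: int):
    """Turn a 1-D price series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

# Synthetic stand-in for the IHSG closing-price series (the real data is not included here).
prices = np.cumsum(np.random.default_rng(0).normal(size=500)) + 6000.0

window_size, n_epochs, lr = 20, 40, 0.001          # pre-pandemic best parameters from the abstract
X, y = make_windows(prices, window_size)
split = int(0.8 * len(X))                           # assumed 80/20 train/test split
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window_size, 1)),
    tf.keras.layers.LSTM(50),                       # assumed number of units
    tf.keras.layers.Dense(1),                       # one-step-ahead price prediction
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr), loss="mse")
model.fit(X_train, y_train, epochs=n_epochs, verbose=0)

# RMSE on the held-out period, matching the evaluation metric reported in the abstract.
rmse = np.sqrt(np.mean((model.predict(X_test, verbose=0).ravel() - y_test) ** 2))
print(f"Test RMSE: {rmse:.4f}")

The pandemic-period configuration from the abstract would be explored the same way by setting window_size to 5 and n_epochs to 10 while keeping the learning rate and optimizer unchanged.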

Published

2025-12-03

How to Cite

Remetwa, D. P. A., Cahyadi, L., Ferdinand, F. V., Saputra, K. V. I., & Teja, K. (2025). ANALYSIS OF LONG SHORT-TERM MEMORY (LSTM) PARAMETERS IN PREDICTING IHSG. JOHME: Journal of Holistic Mathematics Education, 9(2), 213–224. https://doi.org/10.19166/johme.v9i2.10220

Section

Research in Mathematics Education