Part of the book series: Studies in Big Data (SBD, volume 77)

Abstract

Deep learning, one of the most remarkable techniques of machine learning, has achieved major successes in many fields, including image processing, speech recognition, and text understanding. Deep learning models are powerful engines capable of learning arbitrary mapping functions; they do not require a scaled or stationary time series as input, and they support multivariate inputs and multi-step outputs. Together, these features make deep learning a useful tool for complex time series prediction problems that involve large amounts of data and multiple variables with intricate relationships. This chapter provides an overview of the most common deep learning architectures for time series forecasting and explains how deep learning models relate to classical forecasting approaches. A brief background on the particular challenges present in time-series data and on the deep learning techniques most often used for forecasting is given, and previous studies that applied deep learning to time series are reviewed.
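
To make the capabilities listed above concrete, the short sketch below shows the kind of model this survey covers: a recurrent network that reads a window of a multivariate series and emits a multi-step forecast in one shot. It is an illustrative sketch only, not code from the chapter; the synthetic data, the window lengths (48 steps in, 12 steps out), the layer width, and the choice of a Keras LSTM are all assumptions made for the example.

import numpy as np
from tensorflow import keras

def make_windows(series, n_in, n_out):
    """Slice a (time, features) array into input windows and multi-step
    targets taken from the first feature."""
    X, y = [], []
    for t in range(len(series) - n_in - n_out + 1):
        X.append(series[t:t + n_in])
        y.append(series[t + n_in:t + n_in + n_out, 0])
    return np.array(X), np.array(y)

# Synthetic two-variable series standing in for real data.
t = np.arange(1000)
series = np.stack([np.sin(0.02 * t), np.cos(0.015 * t)], axis=1)
series += 0.05 * np.random.randn(*series.shape)

n_in, n_out = 48, 12                        # 48 past steps in, 12 future steps out
X, y = make_windows(series, n_in, n_out)

model = keras.Sequential([
    keras.layers.Input(shape=(n_in, series.shape[1])),
    keras.layers.LSTM(32),                  # learns the temporal mapping
    keras.layers.Dense(n_out),              # all forecast horizons at once
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

forecast = model.predict(X[-1:])            # shape (1, 12): a 12-step-ahead forecast

Note that nothing above rescales or differences the raw series first, which is the contrast with classical models that the abstract draws.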

Abbreviations

ANN: Artificial Neural Network
AR: Autoregression
ARMA: Autoregressive Moving Average
ARIMA: Autoregressive Integrated Moving Average
CNN: Convolutional Neural Network
DBN: Deep Belief Network
DL: Deep Learning
GRU: Gated Recurrent Unit
LSTM: Long Short-Term Memory
MA: Moving Average
MAE: Mean Absolute Error
MFE: Mean Forecast Error
ML: Machine Learning
MLP: Multi-Layer Perceptron
MMSE: Minimum Mean Square Error
MPE: Mean Percentage Error
RBM: Restricted Boltzmann Machine
RMSE: Root Mean Squared Error
SAE: Stacked Autoencoders
SARIMA: Seasonal Autoregressive Integrated Moving Average
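
For reference, the forecast error measures abbreviated above (MFE, MAE, RMSE, MPE) are commonly computed as in the short illustration below; the numbers are made up and the snippet is not taken from the chapter.

import numpy as np

actual   = np.array([10.0, 12.0, 15.0, 14.0])   # observed values (made up)
forecast = np.array([11.0, 11.5, 14.0, 15.5])   # model forecasts (made up)
error = actual - forecast

mfe  = error.mean()                   # Mean Forecast Error (signed bias)
mae  = np.abs(error).mean()           # Mean Absolute Error
rmse = np.sqrt((error ** 2).mean())   # Root Mean Squared Error
mpe  = (error / actual).mean() * 100  # Mean Percentage Error, in percent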

Author information

Corresponding author

Correspondence to Amal Mahmoud.

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Mahmoud, A., Mohammed, A. (2021). A Survey on Deep Learning for Time-Series Forecasting. In: Hassanien, A.E., Darwish, A. (eds) Machine Learning and Big Data Analytics Paradigms: Analysis, Applications and Challenges. Studies in Big Data, vol 77. Springer, Cham. https://doi.org/10.1007/978-3-030-59338-4_19

  • DOI: https://doi.org/10.1007/978-3-030-59338-4_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-59337-7

  • Online ISBN: 978-3-030-59338-4

  • eBook Packages: Computer Science (R0)
