Improved prediction of daily pan evaporation using Deep-LSTM model

  • Hybrid Artificial Intelligence and Machine Learning Technologies
  • Published in: Neural Computing and Applications

Abstract

Precise measurement or estimation of evaporation losses is extremely important for developing water resource management strategies and implementing them effectively, particularly for increasing agricultural productivity in drought-prone areas. Evaporation can either be measured directly using evaporimeters or estimated by means of empirical models driven by the climatic factors that influence the evaporation process. In general, variations in climatic factors such as temperature, humidity, wind speed, sunshine and solar radiation influence and control the evaporation process to a great extent. Because evaporation is a highly nonlinear phenomenon, it is very difficult to model the evaporation process from climatic factors, especially in diverse agro-climatic situations. The present investigation examines the potential of a deep neural network architecture with long short-term memory cells (Deep-LSTM) to estimate daily pan evaporation with a minimum of input features. Depending on the availability of climatic data, Deep-LSTM models with different input combinations are proposed to model daily evaporation losses in three agro-climatic zones of Chhattisgarh state in east-central India. The performance of the proposed Deep-LSTM models is compared with that of a commonly used multilayer artificial neural network and with empirical methods (Hargreaves and Blaney–Criddle). The results, in terms of various performance evaluation criteria, reveal that the proposed Deep-LSTM structure models daily evaporation losses with improved accuracy compared to the other models considered in this study.
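The abstract states that the competing models are ranked by several performance evaluation criteria; the abbreviations below name RMSE, R, R² and EF (the Nash–Sutcliffe efficiency factor) among them. As an illustration only, here is a minimal NumPy sketch of three of those criteria; the function names and the toy data are our own, not taken from the paper:

```python
import numpy as np

def rmse(obs, pred):
    """Root-mean-square error between observed and predicted series."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def efficiency_factor(obs, pred):
    """Nash-Sutcliffe efficiency (EF): 1 = perfect fit,
    <= 0 = no better than predicting the observed mean."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(1.0 - np.sum((obs - pred) ** 2)
                 / np.sum((obs - obs.mean()) ** 2))

def correlation(obs, pred):
    """Pearson correlation coefficient R."""
    return float(np.corrcoef(np.asarray(obs, float),
                             np.asarray(pred, float))[0, 1])

# Hypothetical daily pan-evaporation values (mm/day), for illustration
observed  = [2.1, 3.4, 4.0, 5.2, 4.6]
predicted = [2.0, 3.5, 4.1, 5.0, 4.8]
print(rmse(observed, predicted), efficiency_factor(observed, predicted))
```

A model with EF close to 1 and RMSE close to 0 tracks the evaporimeter record closely; this is the sense in which the study reports Deep-LSTM as the better model.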



Abbreviations

EP: Pan evaporation
Deep-LSTM: Deep neural network architecture with long short-term memory cell
ACZs: Agro-climatic zones
MLANN: Multilayer artificial neural network
T_max: Maximum temperature
T_min: Minimum temperature
RHI: Morning relative humidity
RHII: Afternoon relative humidity
WS: Wind speed
BSS: Bright sunshine hours
SD: Standard deviation
ET0: Reference evapotranspiration
CV: Coefficient of variation
R: Correlation coefficient
RMSE: Root-mean-square error
R²: Coefficient of determination
EF: Efficiency factor
AIC: Akaike information criterion
g_t: Input node at time t
tanh: Hyperbolic tangent function
x_t: Input to the memory cell at time t
W_gx: Weight matrix between the input layer of the network and the input node of the memory cell
h_{t−1}: Hidden-state input at time t − 1
W_gh: Weight matrix between hidden states at different time steps
bias_{input node}: Bias to the input node
i_t: Input gate at time t
σ: Sigmoidal activation function
s_t: Internal state at time t
s_{t−1}: Internal state at time t − 1
bias_{input gate}: Bias to the input gate
f_t: Forget gate at time t
⊙: Point-wise (element-wise) multiplication operator
W_fx: Weight matrix between the forget gate and the input layer
W_fh: Weight matrix between the forget gate and the hidden states
h_t: Final output of the memory cell at time t
bias_{forget}: Bias for the forget gate
O_t: Output gate at time t
W_ox: Weight matrix between the output gate and the input layer
W_oh: Weight matrix between the output gate and the hidden states
bias_{output gate}: Bias for the output gate
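The gate and state symbols defined above describe the standard LSTM memory-cell update of Hochreiter and Schmidhuber. As a hedged illustration, a single forward step in NumPy using the listed notation; the dictionary keys, dimensions (six climatic inputs, four hidden units) and initialisation are assumptions made for this sketch, not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    """Sigmoidal activation function (the sigma of the notation list)."""
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, s_prev, W, b):
    """One LSTM memory-cell step.

    x_t    : input to the memory cell at time t
    h_prev : hidden-state input at time t - 1
    s_prev : internal state at time t - 1
    W, b   : dicts of weight matrices and biases, keyed to mirror
             W_gx, W_gh, W_ix, ..., bias_{input node}, etc.
    """
    g_t = np.tanh(W["gx"] @ x_t + W["gh"] @ h_prev + b["g"])  # input node
    i_t = sigmoid(W["ix"] @ x_t + W["ih"] @ h_prev + b["i"])  # input gate
    f_t = sigmoid(W["fx"] @ x_t + W["fh"] @ h_prev + b["f"])  # forget gate
    o_t = sigmoid(W["ox"] @ x_t + W["oh"] @ h_prev + b["o"])  # output gate
    s_t = f_t * s_prev + i_t * g_t   # internal state; * is the point-wise product
    h_t = o_t * np.tanh(s_t)         # final output of the memory cell at time t
    return h_t, s_t
```

In a Deep-LSTM, this step is unrolled over the daily climatic sequence and the hidden states of one LSTM layer feed the next, with a final dense layer producing the pan-evaporation estimate.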


Author information

Corresponding author

Correspondence to Babita Majhi.

Ethics declarations

Conflict of interest

We declare that we have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Majhi, B., Naidu, D., Mishra, A.P. et al. Improved prediction of daily pan evaporation using Deep-LSTM model. Neural Comput & Applic 32, 7823–7838 (2020). https://doi.org/10.1007/s00521-019-04127-7


  • DOI: https://doi.org/10.1007/s00521-019-04127-7
