
WEATHER PREDICTION USING ANN

FINAL REVIEW

Submitted for the course: Artificial Intelligence

(ITE2010)

By

GROUP MEMBERS REGISTRATION NUMBERS

SHIKHAR RATHORE 17BIT0225

GARVIT KATARIA 17BIT0101

AYUSHI SHARMA 17BIT0381

Slot: A2+TA2

(School of Information Technology & Engineering)


ABSTRACT

Weather forecasting has become an important field of research over the last few decades. Daily
weather forecasts serve many purposes in areas such as agriculture, energy supply and
transportation. In this project, a neural-network-based algorithm for predicting temperature is
implemented. Neural network packages support different types of training or learning algorithms;
one such algorithm is the Back Propagation Neural Network (BPN) technique. The main
advantage of the BPN method is that it can fairly approximate a large class of functions. The
proposed idea is tested on a real-time dataset.
Backpropagation is a method used in artificial neural networks to calculate the gradient needed
to update the weights of the network. The name is shorthand for "the backward propagation of
errors", since an error is computed at the output and distributed backwards through the
network's layers. It is commonly used to train deep neural networks, a term referring to neural
networks with more than one hidden layer.

OBJECTIVE
To classify weather as hot, cold, rainy, windy, sunny, cloudy or humid based on various
parameters such as temperature, humidity, precipitation, dew point and wind speed.

INTRODUCTION
The chaotic nature of the atmosphere demands massive computational power to solve the
equations that describe atmospheric conditions. This, combined with an incomplete
understanding of atmospheric processes, means that forecasts become less accurate as the gap
grows between the present moment and the time for which the forecast is made.
Weather is a continuous, data-intensive, multidimensional, dynamic and chaotic process, and
these properties make weather prediction [6] a big challenge. Generally, two methods are used
for weather forecasting:
(a) the empirical approach, and
(b) the dynamical approach.
The first approach is based on the occurrence of analogs and is often referred to by
meteorologists as analog forecasting. It is useful for predicting local-scale weather when
recorded data are plentiful. The second approach is based on equations and forward simulations
of the atmosphere and is often referred to as computer modeling. The dynamical approach is
useful only for modeling large-scale weather phenomena and may not forecast short-term
weather efficiently. Most weather prediction systems use a combination of empirical and
dynamical techniques.
Artificial Neural Networks (ANNs) provide a methodology for solving many types of nonlinear
problems that are difficult to solve with traditional techniques.
LITERATURE REVIEW

Similar research has been done previously; one example is "ANN Approach for Weather
Prediction using Back Propagation" by Ch. Jyosthna Devi, B. Syam Prasad Reddy, K. Vagdhan
Kumar, B. Musala Reddy and N. Raja Nayak.
This paper describes how neural networks are useful in forecasting the weather and explains the
working of the back propagation algorithm. A 3-layered neural network is designed and trained
on an existing dataset, obtaining a relationship between the existing non-linear weather
parameters. The trained neural network can then predict future temperatures with low error.
Another research paper, "Classification and Prediction of Future Weather by using Back
Propagation Algorithm - An Approach" by Sanjay D. Sawaitul, Prof. K. P. Wagh and
Dr. P. N. Chatur, concludes that the new technology of the wireless medium can be used in the
weather forecasting process. The system increases the reliability, accuracy and consistency of
identification and interpretation of weather images. It also concludes that the Back Propagation
Algorithm can be applied to weather forecasting data. Neural networks are capable of modeling a
weather forecast system, and the neural-network signal-processing approach to weather
forecasting is capable of yielding good results and can be considered an alternative to traditional
meteorological approaches.
Another research paper, "An Effective Weather Forecasting Using Neural Network" by Pooja
Malik, Prof. Saranjeet Singh and Binni Arora, proposes a new technique of weather forecasting
using a feed-forward ANN. The data is taken from the Rice Research Center, Kaul (Haryana),
and is trained with the Levenberg-Marquardt (LM) algorithm, the fastest among the compared
weather forecasting methods: there are many back-propagation variants, but among them
Levenberg-Marquardt has the better learning rate.

An article on Artificial Neural Network based Weather Prediction using Back Propagation
Technique was published in the International Journal of Advanced Computer Science and
Applications (IJACSA), Volume 9, Issue 8, 2018. Eleven weather features were used to classify
weather into four types. Furthermore, twenty training examples from 1997-2015 were used to
predict the eleven weather features. The prediction covered basic forecast factors such as
humidity and wind speed. A multi-layered neural network was designed and trained on the
existing dataset, obtaining a relationship between the existing non-linear weather parameters.
The overall conclusion for the model is that by increasing the number of hidden layers, the
trained neural network can classify and predict the weather variables with less error.
METHODOLOGY

BACK PROPAGATION:
Backpropagation is a method used in artificial neural networks to calculate the gradient needed
to update the weights of the network.
The name is shorthand for "the backward propagation of errors," since an error is computed at
the output and distributed backwards through the network's layers.
It is commonly used to train deep neural networks, a term referring to neural networks with more
than one hidden layer.
Backpropagation is a special case of a more general technique called automatic differentiation.
In the context of learning, backpropagation is commonly used by the gradient descent
optimization algorithm to adjust the weights of neurons by calculating the gradient of the loss
function.
Backpropagation requires the derivative of the loss function with respect to the network output
to be known, which typically (but not necessarily) means that a desired target value is known.
For this reason, it is considered a supervised learning method, although it is also used in some
unsupervised networks such as autoencoders.
Backpropagation is also a generalization of the delta rule to multi-layered feedforward networks,
made possible by using the chain rule to iteratively compute gradients for each layer. It is closely
related to the Gauss-Newton algorithm, and is part of continuing research in neural
backpropagation. Backpropagation can be used with any gradient-based optimizer.
Equations for Gradient Computation in the Back Propagation Algorithm
BACK PROPAGATION STEPS
Phase 1: propagation
Each propagation involves the following steps:
1. Propagation forward through the network to generate the output value(s)
2. Calculation of the cost (error term)
3. Propagation of the output activations back through the network using the training pattern
target to generate the deltas (the difference between the targeted and actual output values)
of all output and hidden neurons.
Phase 2: weight update
For each weight, the following steps must be followed:
1. The weight's output delta and input activation are multiplied to find the gradient of the
weight.
2. A ratio (percentage) of the weight's gradient is subtracted from the weight. This ratio
(percentage) influences the speed and quality of learning; it is called the learning rate.
The greater the ratio, the faster the neuron trains, but the lower the ratio, the more
accurate the training is. The sign of the gradient of a weight indicates whether the error
varies directly with, or inversely to, the weight. Therefore, the weight must be updated in
the opposite direction, "descending" the gradient.
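The two phases above can be sketched as a minimal NumPy implementation of a network with
one hidden layer (the toy data, layer sizes and learning rate below are illustrative assumptions,
not the project's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data (illustrative, not the project's weather dataset):
# 4 samples with 3 input features and 1 target value each
X = rng.random((4, 3))
y = rng.random((4, 1))

W1 = rng.standard_normal((3, 5))  # input -> hidden weights
W2 = rng.standard_normal((5, 1))  # hidden -> output weights
lr = 0.5                          # the learning rate ("ratio") above

def loss():
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))

loss_before = loss()
for epoch in range(1000):
    # Phase 1, step 1: forward propagation through the network
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Phase 1, steps 2-3: error term and deltas, propagated backwards
    # (using sigmoid'(z) = s * (1 - s), written via the activations)
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)
    # Phase 2: gradient = input activation times output delta; subtract
    # a learning-rate fraction of it, "descending" the gradient
    W2 -= lr * h.T @ delta_out
    W1 -= lr * X.T @ delta_h
loss_after = loss()
```

Running the loop drives the mean squared error down, which is exactly the "descending the
gradient" behavior described in the weight-update phase.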
Disadvantages of Back Propagation Algorithm
● The actual performance of backpropagation on particular data depends entirely
on its inputs.
● Backpropagation can be sensitive to noisy data and outliers.
● Training can be slow unless a fully matrix-based approach to backpropagation
over a mini-batch is used.

Fletcher-Reeves Model (Updation to Backpropagation Algorithm)


Despite the general success of the back-propagation method in the learning process, several
major deficiencies still need to be solved. The convergence rate of back-propagation is very low,
which makes it unsuitable for large problems. Furthermore, the convergence behavior of the
back-propagation algorithm depends on the choice of initial connection weights and on other
parameters used in the algorithm, such as the learning rate and the momentum term.
Improving the training efficiency of neural-network-based algorithms is an active area of
research, and numerous approaches have been proposed in the literature. Among these, the
Fletcher-Reeves algorithm turned out to be more efficient than the backpropagation algorithm.
The following iterative algorithm changes the gradient-based search direction using a gain
value. The gradient-based search direction is a function of the gradient of the error with respect
to the weights.
Algorithm:

Step 1: Initialize the weight vector with random values and the vector of gain values with ones.
Step 2: Calculate the gradient of the error with respect to the weights, and the gradient of the
error with respect to the gain.
Step 3: Use the gradient of the weights and the gradient of the gain calculated in Step 2 to
compute the new weight vector and the vector of new gain values for use in the next epoch.
Step 4: Repeat Steps 2 and 3 on an epoch-by-epoch basis until the selected error-minimization
criterion is satisfied.
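The Fletcher-Reeves direction update can be sketched on a simple quadratic error surface, where
the ratio beta = ||g_new||^2 / ||g_old||^2 turns the raw gradient into a conjugate search direction
(the gain-vector part of the proposed variant is omitted; this shows only the classical
Fletcher-Reeves update, with an exact line search that is valid for a quadratic):

```python
import numpy as np

# Quadratic error surface E(w) = 0.5 * w.A.w - b.w (illustrative values);
# in the neural-network case, grad() is the back-propagated gradient of
# the error with respect to the weights.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(w):
    return A @ w - b

w = np.zeros(2)            # Step 1: initialize the weight vector
g = grad(w)
d = -g                     # first direction = steepest descent
for epoch in range(50):    # Step 4: repeat epoch by epoch
    alpha = (g @ g) / (d @ A @ d)      # exact line search (quadratic case)
    w = w + alpha * d                  # Step 3: new weight vector
    g_new = grad(w)                    # Step 2: new gradient
    beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves ratio
    d = -g_new + beta * d              # conjugate search direction
    g = g_new
    if np.linalg.norm(g) < 1e-10:      # error-minimization criterion
        break
```

On a 2-dimensional quadratic this converges in two iterations, which illustrates why conjugate
directions are more efficient than plain gradient descent.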

Artificial Neural Network (ANN) Approach

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by
the way biological nervous systems, such as the brain, process information. The key element of
this paradigm is the novel structure of the information processing system: it is composed of a
large number of highly interconnected processing elements (neurons) working together to solve
specific problems.
ANNs, like people, learn by example. An ANN is configured for a particular application, such as
pattern recognition or data classification, through a learning process. Learning in biological
systems involves adjustments to the synaptic connections that exist between the neurons.
An ANN has the capability to extract the relationship between the inputs and outputs of a
process without the underlying physics being explicitly provided. These properties make ANNs
well suited to the problem of weather forecasting. The main purpose is to develop the most
suitable ANN architecture and its associated training technique for weather prediction. This
development is based on two different neural network architectures, to determine which is
suitable for this application: a Back Propagation (BPN) feed-forward network, and a radial basis
function network trained by a differential evolution algorithm.
The basic architectures of both the Radial Basis Function (RBF) neural network and the
multilayer feed-forward neural network are given. Components of a modern weather forecasting
system include the following modules: data collection, data assimilation and numerical weather
prediction.
Basic Diagram for ANN
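As a minimal illustration of the two architectures named above, the forward pass of a Gaussian
RBF layer versus a sigmoid feed-forward layer can be compared (all centers, widths and weights
below are illustrative values, not trained parameters):

```python
import numpy as np

x = np.array([0.5, 1.0])  # an example input vector

# RBF network: each hidden unit responds to the distance between the
# input and a stored center, via a Gaussian basis function
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
width = 1.0
phi = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * width ** 2))
rbf_out = phi @ np.array([0.7, 0.3])  # weighted sum of basis responses

# Feed-forward network: each hidden unit responds to a weighted sum of
# the inputs, squashed through a sigmoid
W = np.array([[0.2, -0.4], [0.6, 0.1]])
mlp_out = 1.0 / (1.0 + np.exp(-(W @ x)))
```

The key design difference is local versus global response: an RBF unit activates only near its
center, while a sigmoid unit responds across the whole input space.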
Sigmoid Function and Its Usage in Neural Networks
The sigmoid is used in neural networks to give logistic neurons a real-valued output that is a
smooth and bounded function of their total input. It also has the added benefit of having nice
derivatives, which make learning the weights of a neural network easier. A neural network
element computes a linear combination of its input signals and applies a sigmoid function to the
result. One reason for its popularity in neural networks is that the sigmoid satisfies a relationship
between its derivative and itself that is computationally cheap to evaluate.
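A minimal sketch of the sigmoid and the derivative property mentioned above, namely
s'(x) = s(x) * (1 - s(x)) (the example weights and inputs are illustrative):

```python
import numpy as np

def sigmoid(x):
    # Logistic function: smooth, bounded output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    # The derivative is expressible in terms of the function itself,
    # s'(x) = s(x) * (1 - s(x)), which makes weight learning cheap
    s = sigmoid(x)
    return s * (1.0 - s)

# A neural network element: linear combination of inputs, then sigmoid
weights = np.array([0.5, -0.3])
inputs = np.array([1.0, 2.0])
bias = 0.1
activation = sigmoid(weights @ inputs + bias)
```

Because the derivative reuses the already-computed activation, backpropagation needs no extra
function evaluations to obtain the gradients.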

Steps involved in ANN approach

A. Data collection:
Observations of atmospheric pressure, temperature, wind speed, wind direction, humidity
and precipitation are made near the earth's surface by trained observers and automatic
weather stations.
The World Meteorological Organization acts to standardize the instrumentation,
observing practices and timing of these observations worldwide.
B. Data assimilation: During the data assimilation process, information gained from the
observations is used in conjunction with a numerical model's most recent forecast for the
time the observations were made to produce the meteorological analysis. This is the best
estimate of the current state of the atmosphere: a three-dimensional representation of the
distribution of temperature, moisture and wind.
The features considered in this study are bar temperature, bar reading, sea level pressure,
mean sea level pressure, dry bulb temperature, wet bulb temperature, dew point
temperature, vapor pressure, wind speed, humidity, cloudiness, precipitation and wind
direction, used for the prediction of rain. The approach is easy to implement and produces
a desirable forecasting result by training on the given dataset.
C. Numerical weather prediction:
Numerical Weather Prediction (NWP) uses the power of computers to make a forecast.
Complex computer programs, also known as forecast models, run on supercomputers and
provide predictions on many atmospheric variables such as temperature, pressure, wind
and rainfall. A forecaster examines how the features predicted by the computer will interact to
produce the day’s weather.

Implementation
TEST CASE

The test cases determine whether the weather is rainy, foggy, thunderstorm or sunshine.

1) 0 22 10 13 9 101 47 1021 1017 3 1 15


We take the following values as in the dataset, and the model predicts that the weather for the
above row of attributes is fog.

2) 1 37 25 13 6 44 15 1009 1003 8 1 13 1000


We take the following values as in the dataset, and the model predicts that the weather for the
above row of attributes is rain.
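Before rows like those above are fed to the network, the raw features would typically be scaled
to a common range; a minimal min-max normalization sketch (the per-feature minima and
maxima here are illustrative assumptions, not values from the project's dataset):

```python
import numpy as np

# Hypothetical example: scale the first five features of a raw test row
# to [0, 1] before feeding them to the trained network. The min/max
# bounds are illustrative, not taken from the project's dataset.
row = np.array([22.0, 10.0, 13.0, 9.0, 101.0])
feat_min = np.array([-10.0, 0.0, 0.0, 0.0, 0.0])
feat_max = np.array([50.0, 100.0, 50.0, 50.0, 200.0])
scaled = (row - feat_min) / (feat_max - feat_min)
```

Scaling keeps every input inside the sensitive region of the sigmoid, so no single large-valued
feature (such as pressure) dominates the weighted sums.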

Result and Discussion


Using an Artificial Neural Network, the Back Propagation Algorithm is implemented and the
variations in the parameters are observed. Based on these variations, the logic in Back
Propagation is developed and the change in the other parameters with respect to one parameter is
predicted. All the information about the changed and predicted parameters is collected together
and a new classification of the weather is drawn: whether the future day will be a rainy, sunny or
windy day, and whether on that particular day the probability of experiencing a thunderstorm is
high.
CONCLUSION
This work shows that the future weather can be predicted with the help of the back propagation
training algorithm. It was found that the network learns very fast with the back propagation
algorithm, and the results are more accurate for predicting the future weather. Back propagation
is a gradient descent algorithm which learns by minimizing the error in the output through
adjusting the weights in the network. Our model has the potential to capture the complex
relationships between the many factors that contribute to a given weather condition.
FUTURE SCOPE
The accuracy of the weather forecasting model using an ANN with the Back Propagation
Algorithm is higher than that of other statistical models. This work can be extended by using
other techniques in place of the present data-mining approach and algorithm.
Furthermore, to improve the efficiency of the neural network algorithms, other statistically
based feature-selection techniques and statistical indicators can be integrated. From another
perspective, fuzzy techniques can be incorporated: an inferential, probability-based approach to
data comparison that allows one to infer, based on probabilities, the strength of the relationships
between attributes in the datasets and to achieve a better predictability rate.

REFERENCES

1. Hernández, E.; Sanchez-Anguix, V.; Julian, V.; Palanca, J.; Duque, N. Rainfall prediction:
A deep learning approach. In International Conference on Hybrid Artificial Intelligence
Systems; Springer: Cham, Switzerland, 2016; pp. 151–162.
2. Goswami, B.N. The challenge of weather prediction. Resonance 1996, 1, 8–17. [CrossRef]
3. Nayak, D.R.; Mahapatra, A.; Mishra, P. A survey on rainfall prediction using artificial
neural network. Int. J. Comput. Appl. 2013, 72, 16.
4. Kashiwao, T.; Nakayama, K.; Ando, S.; Ikeda, K.; Lee, M.; Bahadori, A. A neural network-
based local rainfall prediction system using meteorological data on the internet: A case
study using data from the Japan meteorological agency. Appl. Soft Comput. 2017, 56, 317–
330. [CrossRef]
5. Mislan, H.; Hardwinarto, S.; Sumaryono, M.A. Rainfall monthly prediction based on
artificial neural network: A case study in Tenggarong Station, East Kalimantan, Indonesia.
Procedia Comput. Sci. 2015, 59, 142–151. [CrossRef]
6. Muka, Z.; Maraj, E.; Kuka, S. Rainfall prediction using fuzzy logic. Int. J. Innov. Sci. Eng.
Technol. 2017, 4, 1–5.
7. Jimoh, R.G.; Olagunju, M.; Folorunso, I.O.; Asiribo, M.A. Modeling rainfall prediction
using fuzzy logic.
Int. J. Innov. Res. Comput. Commun. Eng. 2013, 1, 929–936.
8. Wu, J.; Liu, H.; Wei, G.; Song, T.; Zhang, C.; Zhou, H. Flash flood forecasting using
support vector regression model in a small mountainous catchment. Water 2019, 11, 1327.
[CrossRef]
9. Poornima, S.; Pushpalatha, M.; Sujit Shankar, J. Analysis of weather data using forecasting
algorithms. In Computational Intelligence: Theories, Applications and Future Directions—
Volume I. Advances in Intelligent Systems and Computing; Verma, N., Ghosh, A., Eds.;
Springer: Singapore, 2019; Volume 798.
10. Manideep, K.; Sekar, K.R. Rainfall prediction using different methods of Holt winters
algorithm: A big data
approach. Int. J. Pure Appl. Math. 2018, 119, 379–386.
11. Gundalia, M.J.; Dholakia, M.B. Prediction of maximum/minimum temperatures using Holt
winters method with excel spread sheet for Junagadh region. Int. J. Eng. Res. Technol.
2012, 1, 1–8.
12. Puah, Y.J.; Huang, Y.F.; Chua, K.C.; Lee, T.S. River catchment rainfall series analysis
using additive Holt–Winters method. J. Earth Syst. Sci. 2016, 125, 269–283. [CrossRef]
13. Patel, D.P.; Patel, M.M.; Patel, D.R. Implementation of ARIMA model to predict Rain
Attenuation for KU-band 12 Ghz Frequency. IOSR J. Electron. Commun. Eng. (IOSR-
JECE) 2014, 9, 83–87. [CrossRef]
14. Graham, A.; Mishra, E.P. Time series analysis model to forecast rainfall for Allahabad
region. J. Pharmacogn. Phytochem. 2017, 6, 1418–1421.
15. Salas, J.D.; Obeysekera, J.T.B. ARMA model identification of hydrologic time series. Water
Resour. Res. 1982,
18, 1011–1021. [CrossRef]
16. Chen, W.; Xie, X.; Wang, J.; Pradhan, B.; Hong, H.; Bui, D.T.; Duan, Z.; Ma, J. A
comparative study of logistic model tree, random forest, and classification and regression
tree models for spatial prediction of landslide susceptibility. Catena 2017, 151, 147–160.
[CrossRef]
17. Kusiak, A.; Wei, X.; Verma, A.P.; Roz, E. Modeling and prediction of rainfall using radar
reflectivity data: A data-mining approach. IEEE Trans. Geosci. Remote Sens. 2013, 51,
2337–2342. [CrossRef]
18. Kannan, M.; Prabhakaran, S.; Ramachandran, P. Rainfall forecasting using data mining
technique. Int. J. Eng. Technol. 2010, 2, 397–401.
19. Mehrotra, R.; Sharma, A. A nonparametric nonhomogeneous hidden Markov model for
downscaling of multisite daily rainfall occurrences. J. Geophys. Res. Atmos. 2005, 110.
[CrossRef]

20. Niu, J.; Zhang, W. Comparative analysis of statistical models in rainfall prediction. In
Proceedings of the 2015 IEEE International Conference on Information and Automation,
Lijiang, China, 8–10 August 2015; pp. 2187–2190.
21. Dash, Y.; Mishra, S.K.; Panigrahi, B.K. Rainfall prediction of a maritime state (Kerala),
India using SLFN and ELM techniques. In Proceedings of the 2017 International
Conference on Intelligent Computing, Instrumentation and Control Technologies
(ICICICT), Kannur, India, 6–7 July 2017; pp. 1714–1718.
22. Poornima, S.; Pushpalatha, M.; Aglawe, U. Predictive analytics using extreme learning
machine. J. Adv. Res. Dyn. Control Syst. 2018, 10, 1959–1966.
23. Choi, E.; Schuetz, A.; Stewart, W.F.; Sun, J. Using recurrent neural network models for
early detection of heart failure onset. J. Am. Med Inform. Assoc. 2016, 24, 361–370.
[CrossRef]
24. Tran, N.; Nguyen, T.; Nguyen, B.M.; Nguyen, G. A multivariate fuzzy time series resource
forecast model for clouds using LSTM and data correlation analysis. Procedia Comput. Sci.
2018, 126, 636–645. [CrossRef]
25. Zhuang, N.; Kieu, T.D.; Qi, G.J.; Hua, K.A. Deep differential recurrent neural networks.
arXiv 2018,
arXiv:1804.04192.
26. Schuster, M.; Paliwal, K.K. Bidirectional recurrent neural networks. IEEE Trans. Signal
Process. 1997,
45, 2673–2681. [CrossRef]
27. Graves, A.; Mohamed, A.R.; Hinton, G. Speech recognition with deep recurrent
neural networks. In Proceedings of the 2013 IEEE International Conference on Acoustics,
Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013; pp. 6645–6649.
28. Kalchbrenner, N.; Danihelka, I.; Graves, A. Grid long short-term memory. arXiv 2015,
arXiv:1507.01526.
29. Salman, A.G.; Heryadi, Y.; Abdurahman, E.; Suparta, W. Single layer & multi-layer long
short-term memory (LSTM) model with intermediate variables for weather forecasting.
Procedia Comput. Sci. 2018.
