

ABSTRACT

The present study aims to predict potential floods that might arise from the rainfall occurring at a particular place in India. Floods may seem unrelated to rainfall at first sight, but they are strongly influenced by the amount of rainfall a place receives. Rainfall, measured in millimetres, has many attributes associated with it: for instance, it may be recorded as daily, monthly, seasonal or annual rainfall, each representing and affecting different natural events.

The goal of flash-flood disaster prediction is to provide information that people and organizations can use to reduce weather-related losses and enhance societal benefits, including protection of life and property, public health and safety, and support of economic prosperity and quality of life. Floods are an important and devastating calamity that affects a large share of the population every year.

In this situation, a system that analyses the rainfall of the region a person lives in could raise an alarm whenever a flood situation might arise, so that people can take useful measures to minimise damage to themselves and their property. The India Meteorological Department could likewise use such a system to plan for these events and ensure that fewer people are affected by the calamity.

INTRODUCTION

1.1 BACKGROUND AND OBJECTIVES


In this chapter we discuss the background and objectives. Floods are among the most cataclysmic events on the face of the Earth; they are exceptionally difficult to model and to predict ahead of time. Research on improved flood-forecast designs has contributed to risk reduction, policy suggestions, and the minimisation of the loss of lives and property damage caused by floods.

Over the last two decades, neural network blueprints have helped to re-enact the complex numerical interpretations of the physical actions of floods, improving and advancing flood prediction structures with better performance and cost-effective solutions. This study predicts the occurrence of a flood from a rainfall dataset using machine learning techniques, to help mitigate the problem of floods.

1.2 SCOPE OF THIS STUDY


This study developed a smart disaster prediction application using flood risk analytics towards sustainable climate action. Specifically, it aims to monitor meteorological datasets, design a notification management system for community flood warnings, apply statistical modelling algorithms for flood prediction and flood risk analytics, and deploy the flood monitoring in a web-based system.

This will support accurate disaster forecasts, based on the parameters used in the study, to predict possible floods that would affect the community.


1.3 OUTLINE OF THE STUDY REPORT
Floods are among the most destructive natural disasters and are highly complex to model. Research on the advancement of flood prediction models has contributed to risk reduction, policy suggestions, minimisation of the loss of human life, and reduction of the property damage associated with floods. We look at weather data and forecasts of future weather to plan our days accordingly, and visualisations help us understand that data better. Developing this study with the Streamlit library (a Python framework), we can create a responsive front end, which gives us more time to work on the actual back end and the services we aim to provide. The Android app will use the Android geolocation API to detect the user's location and fetch the rainfall data from the Meteorological Department; on the back end we will run the algorithm to predict the chances of any calamity in the area and notify the user in real time.

Front End : Python

Back End : MySQL Server
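
As a rough illustration of this front end, the sketch below shows how a Streamlit page could accept a rainfall CSV and flag readings above an alert threshold. The column name rainfall_mm and the threshold rule are illustrative assumptions standing in for the trained model.

import pandas as pd
import streamlit as st

st.title("Flood Risk Monitor")

uploaded = st.file_uploader("Upload rainfall dataset (CSV)", type="csv")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    st.dataframe(df.head())
    threshold_mm = st.slider("Alert threshold (mm of rainfall)", 0, 500, 200)
    # Placeholder rule standing in for the trained model: flag any reading
    # whose rainfall exceeds the chosen threshold.
    if "rainfall_mm" in df.columns:
        at_risk = df[df["rainfall_mm"] > threshold_mm]
        if len(at_risk) > 0:
            st.error(f"Possible flood risk: {len(at_risk)} readings above threshold")
        else:
            st.success("No readings above the alert threshold")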

SOFTWARE REQUIREMENTS ANALYSIS

2.1 COMPARISON OF EXISTING SYSTEMS


In this chapter we discuss the comparison of existing systems. Weather forecasting is mainly concerned with the prediction of weather conditions at a given future time. Weather forecasts provide critical information about future weather. Various approaches are available, from relatively simple observation of the sky to highly complex computerized mathematical models. The prediction of weather conditions is essential for many applications, among them climate monitoring, drought detection, severe weather prediction, agriculture and production, planning in the energy industry, aviation, communication, and pollution dispersal. In military operations, there is a considerable historical record of instances when weather conditions altered the course of battles. Accurate prediction of weather conditions is a difficult task due to the dynamic nature of the atmosphere. One existing system proposed a generic methodology of incremental mean clustering for weather forecasting; that research was done on a limited dataset.


Among natural disasters, floods are the most destructive, causing massive damage to human life, infrastructure, agriculture, and the socioeconomic system. Governments are therefore under pressure to develop reliable and accurate maps of flood risk areas and to plan for sustainable flood risk management focusing on prevention, protection, and preparedness. Flood prediction models are of significant importance for hazard assessment and extreme event management. Robust and accurate predictions contribute greatly to water resource management strategies, policy suggestions and analysis, and evacuation modelling. The importance of advanced systems for short-term and long-term prediction of floods and other hydrological events is therefore strongly emphasized as a way to alleviate damage. However, predicting flood lead time and occurrence location is fundamentally complex due to the dynamic nature of climate conditions, so today's major flood prediction models are mainly data-specific and involve various simplified assumptions. The proposed system implements the framework as a web application that accepts datasets such as temperature, humidity, wind speed and lighting values. Based on these values, it verifies the flood level using a machine learning algorithm and sends alerts to the corresponding users.

2.2 STUDY REQUIREMENTS SPECIFICATION

FUNCTIONAL REQUIREMENTS

Requirement 1: Admin Module

The admin can add, modify and delete the datasets of meteorological information. The admin is responsible for maintaining all details on the server and can upload datasets from various places.

Requirement 2: Pre-processing

In this module, the admin converts unstructured datasets into structured datasets through irrelevant-data removal and missing-value estimation.

Requirement 3: Similarity measurements

In this module, the admin verifies the details against multiple weather types using machine learning algorithms.

Requirement 4: Alert system

This module describes the notification process. When the admin checks the dataset and identifies a flood, an alert is sent automatically to users in the appropriate locations.

NON-FUNCTIONAL REQUIREMENTS

HARDWARE SPECIFICATION:

• Processor : Intel Core processor, 2.6 GHz

• RAM : 4 GB

• Hard disk : 160 GB

• Compact Disk : 650 MB

• Keyboard : Standard keyboard

• Monitor : 15-inch color monitor

SOFTWARE SPECIFICATION:

• Operating system : Windows OS

• Front End : Python

• Back end : MySQL Server

• Tool : PyCharm

SOFTWARE MODELING

3.1 ER DIAGRAM

In this chapter we discuss the ER diagram. An entity-relationship model (ER model) describes interrelated things of interest in a specific domain of knowledge. A basic ER model is composed of entity types (which classify the things of interest) and specifies the relationships that can exist between entities (instances of those entity types). In software engineering, an ER model is commonly used to represent the things a business needs to remember in order to perform its business processes. The ER model thus becomes an abstract data model that defines a data or information structure, which can be implemented in a database, typically a relational database. A rectangle represents an entity, a circle represents an attribute of an entity, and a diamond represents a relationship between entities.

3.2 DATA FLOW DIAGRAM

A data flow diagram is a two-dimensional diagram that explains how data is processed and transferred in a system. The graphical depiction identifies each source of data and how it interacts with other data sources to reach a common output. Individuals drafting a data flow diagram must identify the external inputs and outputs, determine how the inputs and outputs relate to each other, and explain with graphics how these connections relate and what they result in. This type of diagram helps business development and design teams visualize how data is processed and identify or improve certain aspects.

Data flow symbols:

Symbol      Description
Entity      A source of data or a destination for data.
Process     A process or task that is performed by the system.
Data store  A place where data is held between processes.
Data flow   The flow of data between entities, processes and data stores.
Level 0, Level 1, Level 2 and Level 3 data flow diagrams (figures not reproduced).

SOFTWARE DESIGN

4.1 TABLE DESIGN

Register Table

Field      Type         Null  Default
FirstName  varchar(50)  Yes   NULL
LastName   varchar(50)  Yes   NULL
Gender     varchar(50)  Yes   NULL
Age        varchar(50)  Yes   NULL
Mobile     varchar(50)  Yes   NULL
Email      varchar(50)  Yes   NULL
City       varchar(50)  Yes   NULL
Address    varchar(50)  Yes   NULL
UserName   varchar(50)  Yes   NULL
Password   varchar(50)  Yes   NULL
WEATHER table

Field       Type         Null  Default
id          varchar(50)  Yes   NULL
DATE        varchar(50)  Yes   NULL
TIME        varchar(50)  Yes   NULL
TEMPRATURE  varchar(50)  Yes   NULL
HUMIDITY    varchar(50)  Yes   NULL
LIGHTING    varchar(50)  Yes   NULL
WIND_SPEED  varchar(50)  Yes   NULL
type        varchar(50)  Yes   NULL

RESULT table

Field       Type         Null  Default
id          varchar(50)  Yes   NULL
DATE        varchar(50)  Yes   NULL
TIME        varchar(50)  Yes   NULL
TEMPRATURE  varchar(50)  Yes   NULL
HUMIDITY    varchar(50)  Yes   NULL
LIGHTING    varchar(50)  Yes   NULL
WIND_SPEED  varchar(50)  Yes   NULL
type        varchar(50)  Yes   NULL
Result      varchar(50)  Yes   NULL

4.2 PROCESS DESIGN



MODULES
• Admin module
• Pre-processing
• Rules construction
• Classification
• Notification

MODULE DESCRIPTIONS

Admin Module
Data mining is the process of collecting, searching through, and analysing an immense amount of data in a database so as to discover patterns or relationships. Data mining is a term from computer science; it is sometimes also called knowledge discovery in databases (KDD). In this module we upload the weather datasets, which include temperature, humidity, wind speed and lighting values. These datasets are collected from a meteorological repository.

Pre-processing
Data pre-processing is a data mining technique that transforms raw data into an understandable format. It applies missing-value estimation and irrelevant-data removal approaches.
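
A minimal pre-processing sketch is shown below, assuming pandas and a CSV file named weather.csv with the attribute columns used in this study; both names are illustrative.

import pandas as pd

def preprocess(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    # Irrelevant-data removal: drop columns the model does not use.
    df = df.drop(columns=["id", "DATE", "TIME"], errors="ignore")
    # Missing-value estimation: fill numeric gaps with the column mean.
    numeric = df.select_dtypes(include="number").columns
    df[numeric] = df[numeric].fillna(df[numeric].mean())
    return df

df = preprocess("weather.csv")  # hypothetical file name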

Rules construction
In this module the datasets are clustered using similarity measures. A machine learning algorithm finds the similarity by grouping the attributes, and machine learning is used to calculate the similarity metric between subsets. Based on similar attributes, the data is organized into various weather types.
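
One way to realize this grouping, sketched below under the assumption that scikit-learn is available and that df comes from the pre-processing sketch above, is k-means clustering on the standardized attributes; four clusters is an illustrative stand-in for the number of weather types.

from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# df is assumed to come from the pre-processing sketch above.
features = df[["TEMPRATURE", "HUMIDITY", "LIGHTING", "WIND_SPEED"]]
scaled = StandardScaler().fit_transform(features)
# Cluster the standardized readings; each cluster plays the role of a weather type.
df["weather_type"] = KMeans(n_clusters=4, n_init=10).fit_predict(scaled)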

Classification
In this module we construct a decision tree based on temperature, humidity, wind speed and lighting values. The threshold values for each attribute are derived from the splits constructed by the decision tree algorithm.
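
A sketch of this step is given below, assuming scikit-learn and a labelled column Result marking the flood outcome; the names carry over from the earlier sketches and are assumptions, not the study's exact schema.

from sklearn.tree import DecisionTreeClassifier, export_text

X = df[["TEMPRATURE", "HUMIDITY", "LIGHTING", "WIND_SPEED"]]
y = df["Result"]  # hypothetical flood / no-flood label column
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
# The learned per-attribute split thresholds can be read off directly:
print(export_text(tree, feature_names=list(X.columns)))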

Notification module
In this module we analyse the flood details based on location, and finally send the alert to users in the appropriate locations. This model will help in recognizing and controlling flash floods in an urban area and in setting risk management standards, based on the rainfall received in that region of the nation on a yearly basis. It focuses on saving costs by being proactive instead of reactive.
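
As a rough sketch of the alert step, the snippet below emails registered users via the standard library's smtplib; the SMTP host, sender address and recipient list are placeholders, not values from this study.

import smtplib
from email.message import EmailMessage

def send_flood_alert(recipients, location):
    msg = EmailMessage()
    msg["Subject"] = f"Flood alert for {location}"
    msg["From"] = "alerts@example.org"  # placeholder sender
    msg["To"] = ", ".join(recipients)
    msg.set_content(
        f"Heavy rainfall detected near {location}. "
        "A flood situation may arise; please take precautions."
    )
    with smtplib.SMTP("smtp.example.org") as server:  # placeholder host
        server.send_message(msg)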

CHAPTER 5

SOFTWARE IMPLEMENTATION

5.1. IMPLEMENTATION OF TABLES USING RELATIONAL DATABASE

CREATE TABLE `regtb` (

`id` bigint(250) NOT NULL auto_increment,

`TEMPRATURE` varchar(250) NOT NULL,

`HUMIDITY` varchar(250) NOT NULL,

`LIGHTING` varchar(250) NOT NULL,

`WINDSPEED` varchar(250) NOT NULL,

`Location` varchar(250) NOT NULL,

`Mobile` varchar(250) NOT NULL,

`Date` varchar(250) NOT NULL,

`Result` varchar(500) NOT NULL,

PRIMARY KEY (`id`)

) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=3 ;
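
From the Python back end, a prediction row could be written into regtb along the lines of the sketch below, which assumes the PyMySQL driver; the connection credentials, database name and sample values are placeholders.

import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="", database="flooddb")  # placeholder credentials
with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO regtb (TEMPRATURE, HUMIDITY, LIGHTING, WINDSPEED, "
        "Location, Mobile, Date, Result) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)",
        ("31", "78", "120", "14", "Chennai", "9999999999",
         "2023-07-01", "Flood risk"),  # illustrative sample row
    )
conn.commit()
conn.close()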



5.2. IMPLEMENTATION OF PROCESSES USING PROGRAMMING LANGUAGES

Front End: Python


Python is an interpreted high-level programming language for general-purpose programming. Created by Guido van Rossum and first released in 1991, Python has a design philosophy that emphasizes code readability, notably using significant whitespace. It provides constructs that enable clear programming on both small and large scales. In July 2018, Van Rossum stepped down as the leader of the language community. Python features a dynamic type system and automatic memory management. It supports multiple programming paradigms, including object-oriented, imperative, functional and procedural, and has a large and comprehensive standard library. Python interpreters are available for many operating systems. CPython, the reference implementation of Python, is open source software and has a community-based development model, as do nearly all of Python's other implementations. Python and CPython are managed by the non-profit Python Software Foundation.

Rather than having all of its functionality built into its core, Python was designed to be highly extensible. This compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and easily extensible interpreter stemmed from his frustrations with ABC, which espoused the opposite approach. While offering choice in coding methodology, the Python philosophy rejects exuberant syntax (such as that of Perl) in favor of a simpler, less-cluttered grammar. As Alex Martelli put it: "To describe something as 'clever' is not considered a compliment in the Python culture." Python's philosophy rejects the Perl "there is more than one way to do it" approach to language design in favour of "there should be one—and preferably only one—obvious way to do it".

Python's developers strive to avoid premature optimization, and reject patches to non-critical parts of CPython that would offer marginal increases in speed at the cost of clarity. When speed is important, a Python programmer can move time-critical functions to extension modules written in languages such as C, or use PyPy, a just-in-time compiler. Cython is also available; it translates a Python script into C and makes direct C-level API calls into the Python interpreter. An important goal of Python's developers is keeping it fun to use. This is reflected in the language's name, a tribute to the British comedy group Monty Python, and in occasionally playful approaches to tutorials and reference materials, such as examples that refer to spam and eggs (from a famous Monty Python sketch) instead of the standard foo and bar.

A common neologism in the Python community is pythonic, which can have a wide range of meanings related to program style. To say that code is pythonic is to say that it uses Python idioms well, that it is natural or shows fluency in the language, and that it conforms with Python's minimalist philosophy and emphasis on readability. In contrast, code that is difficult to understand or reads like a rough transcription from another programming language is called unpythonic. Users and admirers of Python, especially those considered knowledgeable or experienced, are often referred to as Pythonists, Pythonistas, and Pythoneers.

Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its high-level built-in data structures, combined with dynamic typing and dynamic binding, make it very attractive for Rapid Application Development, as well as for use as a scripting or glue language to connect existing components together. Python's simple, easy-to-learn syntax emphasizes readability and therefore reduces the cost of program maintenance. Python supports modules and packages, which encourages program modularity and code reuse. The Python interpreter and the extensive standard library are available in source or binary form without charge for all major platforms, and can be freely distributed. Often, programmers fall in love with Python because of the increased productivity it provides. Since there is no compilation step, the edit-test-debug cycle is incredibly fast. Debugging Python programs is easy: a bug or bad input will never cause a segmentation fault. Instead, when the interpreter discovers an error, it raises an exception. When the program doesn't catch the exception, the interpreter prints a stack trace. A source-level debugger allows inspection of local and global variables, evaluation of arbitrary expressions, setting breakpoints, stepping through the code a line at a time, and so on. The debugger is written in Python itself, testifying to Python's introspective power. On the other hand, often the quickest way to debug a program is to add a few print statements to the source: the fast edit-test-debug cycle makes this simple approach very effective.

Python's initial development was spearheaded by Guido van Rossum in the late 1980s. Today, it is developed by the Python Software Foundation. Because Python is a multiparadigm language, Python programmers can accomplish their tasks using different styles of programming: object-oriented, imperative, functional or reflective. Python can be used in web development, numeric programming, game development, serial port access and more.

There are two attributes that make development time in Python faster than in other programming languages:

1. Python is an interpreted language, which precludes the need to compile code before executing a program because Python does the compilation in the background. Because Python is a high-level programming language, it abstracts many sophisticated details from the programming code. Python focuses so much on this abstraction that its code can be understood by most novice programmers.
2. Python code tends to be shorter than comparable code. Although Python offers fast development times, it lags slightly in terms of execution time. Compared to fully compiled languages like C and C++, Python programs execute more slowly. Of course, with the processing speeds of computers these days, the speed differences are usually only observed in benchmarking tests, not in real-world operations. In most cases, Python is already included in Linux distributions and Mac OS X machines.

Back End: MySQL

MySQL is the world's most used open source relational database management system (RDBMS) as of 2008; it runs as a server providing multi-user access to a number of databases. The MySQL development project has made its source code available under the terms of the GNU General Public License, as well as under a variety of proprietary agreements. MySQL was owned and sponsored by a single for-profit firm, the Swedish company MySQL AB, now owned by Oracle Corporation.

MySQL is a popular choice of database for use in web applications, and is a central component of the widely used LAMP open source web application software stack—LAMP is an acronym for "Linux, Apache, MySQL, Perl/PHP/Python". Free-software open source projects that require a full-featured database management system often use MySQL. For commercial use, several paid editions are available that offer additional functionality. Applications which use MySQL databases include TYPO3, Joomla, WordPress, phpBB, MyBB, Drupal and other software built on the LAMP software stack. MySQL is also used in many high-profile, large-scale World Wide Web products, including Wikipedia, Google (though not for searches), Facebook, Twitter, Flickr, Nokia.com, and YouTube.

Interfaces

MySQL is primarily an RDBMS and ships with no GUI tools to administer MySQL databases or manage data contained within the databases. Users may use the included command line tools, or use MySQL "front-ends": desktop software and web applications that create and manage MySQL databases, build database structures, back up data, inspect status, and work with data records. The official set of MySQL front-end tools, MySQL Workbench, is actively developed by Oracle and is freely available for use.

Graphical

The official MySQL Workbench is a free integrated environment developed by MySQL AB that enables users to graphically administer MySQL databases and visually design database structures. MySQL Workbench replaces the previous package of software, MySQL GUI Tools. Similar to other third-party packages, but still considered the authoritative MySQL front end, MySQL Workbench lets users manage database design and modeling, SQL development (replacing MySQL Query Browser) and database administration (replacing MySQL Administrator). MySQL Workbench is available in two editions: the regular free and open source Community Edition, which may be downloaded from the MySQL website, and the proprietary Standard Edition, which extends and improves the feature set of the Community Edition.

SOFTWARE TESTING AND EVALUATION

6.1. TEST CASES DESIGN

A test case has components that describe an input, an action and an expected response, in order to determine whether a feature of an application is working correctly. A test case is a set of instructions on "how" to validate a particular test objective/target; when followed, it tells us whether the expected behaviour of the system is satisfied.

Characteristics of a good test case:

• Accurate: exact in purpose.
• Economical: no unnecessary steps or words.
• Traceable: capable of being traced to requirements.
• Repeatable: can be used to perform the test over and over.
• Reusable: can be reused if necessary.
S.NO  Scenario            Input                                   Expected output      Actual output
1     Dataset upload      CSV file                                Upload on home page  Upload on home page
2     Weather prediction  Predict the weather types               Weather values       Weather values
3     User form           Temperature, humidity, lighting speed,  Stored in database   Stored in database
                          wind speed, location
4     Prediction          Matched with database                   Flood prediction     Flood prediction
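
Test case 4 could be automated along the lines below; the predict_flood helper and its rule are hypothetical stand-ins for the real trained classifier on the back end.

def predict_flood(temperature, humidity, lighting, wind_speed):
    # Hypothetical stand-in for the trained decision tree.
    return "Flood prediction" if humidity > 85 and wind_speed > 30 else "No flood"

def test_prediction_matches_database_record():
    # Values mirror a stored WEATHER row; expected output per test case 4.
    assert predict_flood(31, 90, 120, 40) == "Flood prediction"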

CONCLUSION

7.1. CONCLUSION

In this study, we tested machine learning techniques used in the process of flood detection. The results proved that machine learning algorithms can be used for detecting flooded areas with high accuracy. To achieve accurate results, both good-quality data and an effective machine learning algorithm are essential. First, the attributes of the training points were collected; this step is extremely important, because the success of the learning phase is highly dependent on it. High temporal and spatial data resolution are the key factors when dealing with flood monitoring. In particular, datasets are uploaded as CSV files, and a classification algorithm is implemented to predict the flood types. Finally, the system provides the alert about the flood.

7.2. LIMITATIONS OF THE PROPOSED SYSTEM

• Limited datasets are used in this study.

7.3. FUTURE SCOPE

In the future, various machine learning and deep learning algorithms can be developed to classify the weather types.
