Extend Wireless Sensor Networks Lifetime Which Use Cluster-Based Routing Protocol, Namely LEACH
Extending the lifetime of a WSN that uses the LEACH protocol is a complicated and intriguing task. It requires a thorough understanding of the needs of the IPRC Huye Campus WSN, of how quick energy drainage caused by threats such as sinkhole attacks gradually degrades the performance and lifetime of the network, and of the complexity of the algorithm needed to solve the problem. In this study, a comparative technique was followed to analyze the connection between the predictor variable and the criterion (dependent) variable. Several scores were obtained from each unit in the sample, one score for every variable. This approach was appropriate for the study because the researcher had to gather data on the current state of the campus WSN in order to extend its lifetime.
II. BACKGROUND OF STUDY

The research background of WSNs can be traced back to the late 1990s and early 2000s, when the proliferation of microsensor technology and advances in wireless communication paved the way for their exploration and development (Hossam Mahmoud Ahmad Fahmy, 2016). Early research in WSNs primarily focused on fundamental issues such as energy efficiency, routing protocols, and data aggregation techniques. Energy efficiency was a critical concern due to the limited power resources of sensor nodes, which are often powered by batteries or energy harvesting mechanisms. Researchers explored techniques to minimize energy consumption at different layers of the network protocol stack, including the physical, MAC, routing, and application layers (Waltenegus Dargie, 2010). Routing protocols played a crucial role in WSNs in enabling efficient data delivery from sensor nodes to sink nodes or base stations.

Traditional routing protocols such as LEACH (Low-Energy Adaptive Clustering Hierarchy) and SPIN (Sensor Protocols for Information via Negotiation) were developed to address the unique characteristics and constraints of WSNs, such as node mobility, network topology changes, and energy conservation (Holger Karl, 2007). Today, WSN research continues to evolve with a focus on emerging challenges such as security and privacy concerns, scalability issues, interoperability among heterogeneous sensor nodes, integration with emerging technologies like the Internet of Things (IoT) and edge computing, and the development of self-organizing, self-healing WSN architectures capable of adapting to dynamic environmental conditions (Abdulrahman Yarali, 2020). However, the decentralized nature and limited security features of WSNs make them vulnerable to a variety of attacks, including sinkhole attacks. A sinkhole attack is a network-layer attack in which a malicious node, or "sinkhole," attempts to attract network traffic by falsely advertising itself as an optimal route to the base station. This attack is particularly harmful because, once the malicious node successfully attracts traffic, it can disrupt the network by dropping packets, altering or delaying packets, draining energy, and so on. IPRC-Huye has introduced this technology in different domains, such as monitoring water tank levels on the campus, greenhouse management, and other workshops that use different sensors to provide real-time data for the campus (Nsabiyumva Willy, 2015).

III. RESEARCH GAPS

The LEACH (Low-Energy Adaptive Clustering Hierarchy) protocol is a well-known protocol for wireless sensor networks (WSNs) that uses clustering to improve energy efficiency. However, as research on LEACH has evolved, various gaps and challenges have emerged, suggesting areas for further research.

Energy Efficiency in Large-Scale Networks:
LEACH struggles with energy efficiency and scalability when applied to large-scale networks with many nodes. In LEACH, cluster heads are chosen randomly without considering node residual energy or proximity to other cluster heads, which can lead to unbalanced energy consumption (a sketch of this election rule is given after these gaps).

Security Vulnerabilities:
LEACH was not designed with security as a primary concern, making it susceptible to attacks such as eavesdropping, replay attacks, and node compromise, which can undermine data integrity and network resilience.

Mobility Support:
LEACH assumes static sensor nodes and does not support mobile nodes, which limits its applicability in scenarios such as vehicular ad hoc networks or animal tracking.

Load Balancing and Cluster Head Longevity:
The random selection of cluster heads can lead to uneven load distribution, causing some nodes to deplete their energy faster than others. This imbalance can reduce network lifetime and reliability.

Energy Overhead in Cluster Formation:
The frequent re-clustering in LEACH introduces significant energy overhead due to control message exchange, which can be counterproductive for network lifetime.
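To make the randomness of this election concrete: the standard LEACH rule elects a node as cluster head in round r when a uniform random draw falls below the threshold T(n) = p / (1 - p (r mod 1/p)), for nodes that have not yet served as cluster head in the current epoch. A minimal MATLAB sketch of this rule is given below; the variable names are illustrative and not taken from this study's code.

    p = 0.1;                        % desired fraction of cluster heads (10%)
    N = 100;                        % number of nodes
    G = true(1, N);                 % nodes still eligible in the current epoch
    r = 0;                          % current round index
    T = p / (1 - p * mod(r, round(1/p)));   % LEACH election threshold for eligible nodes
    isCH = G & (rand(1, N) < T);    % eligible nodes whose draw falls below T become CHs
    G(isCH) = false;                % elected heads sit out until the epoch ends

Because the draw ignores residual energy and node position, a node with nearly depleted energy is as likely to be elected as a fresh one, which is the imbalance noted above.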
IV. METHODOLOGY

Methods of Collecting Data: Tools/Instruments
Data collection is the process of gathering and measuring information on variables of interest in a systematic way, allowing the researcher to answer research questions, test hypotheses, and evaluate outcomes. In general, data collection aims to ensure the information is accurate, reliable, and relevant to the subject of study. In this study, the researcher used a questionnaire as the research instrument and analyzed secondary data. Approaching people with a questionnaire is an effective way to collect both qualitative and quantitative data from respondents.

Data Analysis
Data analysis is the process of examining, cleaning, transforming, and interpreting data to extract meaningful insights, answer research questions, or support decision-making. It is a crucial step in turning raw data into actionable information. Data analysis is expected to offer clarity on the topic of the study and the respondents' perceptions, as well as to increase readers' understanding of the topic and maintain their interest in this portion of the research. SPSS was used to analyze the data and present the results using data analysis tools common in scientific analysis (Burns, 2022).
Data Processing
Data processing is the act of converting raw data into a usable form through a series of systematic operations. The goal is to transform data into information that can be easily interpreted and used for decision-making. Data processing was mainly done through MATLAB as a simulation tool.

MATLAB
All the network topologies were simulated with MATLAB, and the pre-available protocols were adapted to facilitate changes in the network. MATLAB is short for MATrix LABoratory, and it revolves around vectors and matrices. MATLAB also addresses algebraic and differential equations, which are quite relevant in linear algebra. It is a powerful tool for creating 2D and 3D images and offers visually appealing graphics capabilities. It is one of the easiest languages in which to write mathematical programs, and it is a full programming language as well. Additional capabilities such as signal processing, image processing, and optimization can be downloaded from MATLAB via its toolboxes (Chidiebere, 2017).
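As a brief illustration of this vector-and-matrix style (assuming a recent MATLAB release with implicit expansion; the coordinates are invented for the example), the distances from a set of nodes to a sink can be computed without explicit loops:

    nodes = [10 20; 35 80; 60 45];          % example node coordinates (illustrative only)
    sink  = [50 50];                        % example base-station position
    d = sqrt(sum((nodes - sink).^2, 2));    % Euclidean distance of each node to the sink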
Research Design
Research design is the structured plan for conducting a research study. It outlines the methods and procedures for collecting, analyzing, and interpreting data, ensuring that the study effectively addresses the research question or hypothesis. A well-defined research design maximizes the validity and reliability of the results, making the findings more credible and actionable. The experiments of this research were carried out on the basis of the algorithm.

The output from the experiments was compared with control samples. Sample input data were used to test changes in the outputs. The research correlated the variables with the new results obtained at the end. Since continuous data were not available, random data samples were assumed. The study was conducted at Integrated Polytechnic Regional College (IPRC) Huye Campus as a case study because it was the place with sufficient and appropriate infrastructure.

Ethical Considerations
Ethical considerations were applied to safeguard the data collected within this research. Research ethics are a set of requirements imposed on researchers to be truthful and respectful towards everyone affected by their study or by the reporting of its results, and such codes are typically formalized for researchers. The study respected the values of the people it involved. The collected data were used only to complete this academic work and to support its contribution to the quality of education. Personal data were kept confidential, and no respondent was accused on the basis of his or her statements.

Conceptual Framework
A conceptual framework (or theoretical framework) is a structure that can hold or support many things in order to interpret and explain them. It is a theoretical base that enables researchers, scholars, or professionals to conceptualize and organize their research so that it can be presented within a specific field or discipline.
Random Nodes Distribution
During the creation of the field, the deployment is composed of 30 nodes, as shown in figure (5). The normal nodes and dead nodes are distributed randomly, and the algorithm then forms the clusters and elects the cluster heads.
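A minimal MATLAB sketch of such a random deployment is shown below; the field dimensions and the initial energy value are assumptions carried over from the simulation parameters, not the study's exact script.

    n  = 30;                    % number of sensor nodes
    xm = 100; ym = 100;         % assumed field dimensions (m)
    for i = 1:n
        S(i).x = rand * xm;     % random x coordinate
        S(i).y = rand * ym;     % random y coordinate
        S(i).E = 0.5;           % assumed initial energy (J)
        S(i).type = 'N';        % 'N' = normal node; nodes are marked dead as they deplete
    end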
Cluster Formation
Clustering involves managing a limited number of logical groups composed of physical network nodes while the network operates; the clusters represent these logical groups. During the initial formation of a cluster, compromised nodes can be identified and removed: the primary defense for secure clustering is the elimination of at-risk nodes during cluster setup. The clustering process aims to minimize energy consumption for all sensors. Clusters are established based on the physical proximity of the nodes. Data transmission from a cluster to the base station is carried out by a Cluster Head (CH). In this configuration, energy consumption is relatively minimal, and clustering facilitates the rapid identification of routes, since only the cluster heads communicate with the base station. The diagram below depicts how communication occurs within a cluster: it involves single-hop communication from node to cluster head. The simulated wireless sensor network consists of 30 sensor nodes connected to a single base station, organized into three distinct clusters. This design enables the sensors to use less energy while sending data within the network.
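As an illustration of proximity-based cluster membership, the simplified MATLAB sketch below assigns each node to the nearest of three fixed cluster heads; in the actual LEACH simulation the heads are elected per round rather than fixed, so this only sketches the assignment step (implicit expansion assumed).

    nodes = rand(30, 2) * 100;              % 30 node positions in a 100 m x 100 m field
    ch    = nodes([3 12 25], :);            % three assumed cluster-head positions
    cluster = zeros(30, 1);                 % cluster index per node
    for i = 1:30
        d = sqrt(sum((ch - nodes(i, :)).^2, 2));   % distance to each cluster head
        [~, cluster(i)] = min(d);                  % join the nearest cluster
    end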
Algorithm 2. Sinkhole attack and energy lifetime extension
This algorithm focuses on energy efficiency to increase the lifetime of a cluster-based WSN that runs the LEACH protocol. It starts from the initial parameters: energy of normal nodes E0 = 0.5 J, environment size 100 m x 100 m, number of nodes 100, maximum number of rounds RMAX = 4000, data aggregation energy EDA = 5*0.000 000 001 (5 nJ), active and sleeping cluster heads elected randomly, optimal election probability of a node becoming cluster head p = 10%, and energy model for transmission and reception ETX = ERX = 50*0.000 000 001 (50 nJ).

The first step is to set the node locations randomly, then choose the strongest node to act as CH; after that, the active CHs are chosen and the remaining nodes are put in dead mode. For every round, the algorithm computes the energy dissipated by each node, first detecting the energy dissipated by its active cluster head in that round, and then updates the energy of all nodes. All of the preceding steps iterate until most of the CHs have died.
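For illustration, a condensed MATLAB sketch of the parameter setup and the per-round energy bookkeeping described above is given below. The radio-dissipation expressions follow the commonly used first-order LEACH energy model; the amplifier coefficient Efs, the packet size k, and the stopping fraction are assumptions added for the example and are not taken from the study.

    n    = 100;                     % number of nodes
    E0   = 0.5;                     % initial node energy (J)
    rmax = 4000;                    % maximum number of rounds
    p    = 0.1;                     % cluster-head election probability
    ETX  = 50e-9;  ERX = 50e-9;     % TX/RX electronics energy (J per bit, assumed per-bit)
    EDA  = 5e-9;                    % data aggregation energy (J per bit, assumed per-bit)
    Efs  = 10e-12;                  % free-space amplifier coefficient (assumed, standard model)
    k    = 4000;                    % packet size in bits (assumed)
    E    = E0 * ones(1, n);         % residual energy of every node
    dead = false(1, n);             % dead-node flags

    for r = 0:rmax
        % ... elect cluster heads with the LEACH threshold and form clusters here ...
        d = 30 * ones(1, n);                          % example node-to-CH distances (m)
        spend = ETX * k + Efs * k * d.^2;             % member cost: transmit k bits to its CH
        % a cluster head would additionally pay ERX*k and EDA*k per packet it receives
        E(~dead) = E(~dead) - spend(~dead);           % update residual energy
        dead = E <= 0;                                % mark depleted nodes
        if nnz(dead) >= 0.9 * n                       % stop once most nodes/CHs have died
            break;
        end
    end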