
Normalization

Normalization is a database design technique aimed at reducing data redundancy and eliminating anomalies such as update, deletion, and insertion anomalies. It involves dividing larger tables into smaller ones and establishing relationships between them, as proposed by Edgar Codd and further developed into various normal forms. While normalization offers advantages like improved performance and reduced database size, it also has disadvantages such as increased complexity and potential performance slowdowns.

Normalization

Normalization is a database design technique that reduces data redundancy and eliminates undesirable characteristics like insertion, update, and deletion anomalies. Normalization rules divide larger tables into smaller tables and link them using relationships. The purpose of normalization in SQL is to eliminate redundant (repetitive) data and ensure data is stored logically.

The inventor of the relational model, Edgar Codd, proposed the theory of normalization of data with the introduction of the First Normal Form, and he continued to extend the theory with the Second and Third Normal Forms. Later he joined Raymond F. Boyce to develop the theory of Boyce-Codd Normal Form.

Normalization is the methodology of arranging a data model so that data is stored efficiently in a database. The end result is that redundant data is eliminated, and only data related to each attribute is kept within its table. Normalization typically involves separating a database into two or more tables and defining relationships between those tables. The objective is to isolate data so that additions, deletions, and modifications of a field can be made in just one table and then propagated through the rest of the database by means of the defined relationships.
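As a concrete illustration, here is a minimal sketch in SQL using hypothetical table and column names that are not taken from this article. A single wide table, in which customer details are repeated on every order row, is split into two related tables linked by a foreign key:

    -- Unnormalized: customer details are repeated on every order row.
    CREATE TABLE orders_flat (
        order_id      INT PRIMARY KEY,
        customer_name VARCHAR(100),
        customer_city VARCHAR(100),
        product       VARCHAR(100),
        quantity      INT
    );

    -- Normalized: customer data is stored once and referenced by a foreign key.
    CREATE TABLE customers (
        customer_id   INT PRIMARY KEY,
        customer_name VARCHAR(100),
        customer_city VARCHAR(100)
    );

    CREATE TABLE orders (
        order_id    INT PRIMARY KEY,
        customer_id INT REFERENCES customers(customer_id),
        product     VARCHAR(100),
        quantity    INT
    );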


At a basic level, normalization is the simplification of any bulk quantity to an optimum value. In the digital world, normalization usually refers to database normalization, which is the process of organizing the columns (attributes) and tables (relations) of a relational database to minimize data repetition. During database creation, normalization involves organizing data into well-designed tables in such a way that the results obtained are always unambiguous and clear in concept.

Although normalization may spread data across more tables, it removes data redundancy. The process can be considered a refinement step after the initial identification of the data objects to be included in the database. It involves identifying the relationships between the data objects, defining the tables that are required, and deciding which columns to add within each table.

This article covers normalization in DBMS: its anomalies, advantages, and disadvantages.

Normalization in DBMS: Anomalies, Advantages, Disadvantages

If a database design is not done properly, it may cause several anomalies to occur. Normalization is essential for removing anomalies such as:

Anomalies in Database

1) Update Anomalies: When several instances of the same data are scattered across the database without a proper relationship or link, some of the instances may be updated with new values while others are not. This leaves the database in an inconsistent state.

2) Deletion Anomalies: Incomplete deletion of a particular piece of data leaves residual instances behind. The database designer may remain unaware of such unwanted data because it is stored at a different location.

3) Insertion Anomalies: These occur when an attempt is made to insert data into a record that does not exist.

Paying attention to these anomalies can help to maintain a consistent database.
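To make these anomalies concrete, the hedged sketch below reuses the hypothetical orders_flat table from the earlier example, in which customer details are repeated on every order row:

    -- Update anomaly: correcting the city on one order row leaves stale
    -- copies of the old city on the customer's other order rows.
    UPDATE orders_flat SET customer_city = 'Berlin' WHERE order_id = 1;

    -- Deletion anomaly: deleting a customer's last order also deletes the
    -- only record that the customer exists at all.
    DELETE FROM orders_flat WHERE customer_name = 'Alice';

    -- Insertion anomaly: a new customer with no orders cannot be recorded
    -- without inventing order values; NULL here violates the primary key.
    INSERT INTO orders_flat (order_id, customer_name, customer_city, product, quantity)
    VALUES (NULL, 'Bob', 'Paris', NULL, NULL);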

ADVANTAGES OF NORMALIZATION

1) A smaller database can be maintained, as normalization eliminates duplicate data. The overall size of the database is reduced as a result.
2) Better performance is ensured, which is linked to the point above. As the database becomes smaller, passes through the data become faster and shorter, thereby improving response time and speed.
3) Narrower tables are possible, as normalized tables will be fine-tuned and will have fewer columns, which allows more data records per page.
4) Fewer indexes per table ensure faster maintenance tasks (index rebuilds).
5) It also gives the option of joining only the tables that are actually required.
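As a hedged illustration of points 3 and 5, again using the hypothetical customers and orders tables sketched earlier, a query that needs only customer data never touches the order rows, and a report joins just the two tables it actually requires:

    -- Only the narrow customers table is scanned here.
    SELECT customer_name, customer_city
    FROM customers
    WHERE customer_city = 'Berlin';

    -- Only the tables that are actually required are joined.
    SELECT c.customer_name, o.product, o.quantity
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.customer_id;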

Disadvantages of Normalization:

1) More tables to join: by spreading data out across more tables, the need to join tables increases and the task becomes more tedious. The database also becomes harder to comprehend.
2) Tables will contain codes rather than real data: repeated data is stored as code values that point to lookup tables rather than as the meaningful data itself, so there is always a need to go to the lookup table.
3) The data model becomes hard to query against, as the model is optimized for applications, not for ad-hoc querying. (An ad-hoc query is a query that cannot be determined before it is issued; it consists of SQL that is constructed dynamically, usually by desktop-friendly query tools.) Hence it is hard to model the database without knowing what the user wants.
4) As the normal form progresses to higher forms, performance becomes slower and slower.
5) Proper knowledge of the various normal forms is required to execute the normalization process efficiently. Careless use may lead to a terrible design filled with major anomalies and data inconsistency.
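As a sketch of disadvantages 1 and 2, suppose (purely as an assumption, not something stated in this article) that the hypothetical customers table stored a city_id code instead of the city name, with the names kept in a separate lookup table. Even a simple report then needs several joins to turn the stored codes back into readable data:

    -- Hypothetical lookup table: city names stored once, referenced by code.
    CREATE TABLE cities (
        city_id   INT PRIMARY KEY,
        city_name VARCHAR(100)
    );

    -- A simple "who ordered what, and where" report now spans three tables.
    SELECT c.customer_name, ci.city_name, o.product
    FROM orders AS o
    JOIN customers AS c ON c.customer_id = o.customer_id
    JOIN cities AS ci ON ci.city_id = c.city_id;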
