
Database Systems and Security

Database Design and Modeling

Evrad KAMTCHOUM

CENTER FOR CYBERSECURITY AND MATHEMATICAL CRYPTOLOGY


THE UNIVERSITY OF BAMENDA

November 14, 2024

Evrad KAMTCHOUM (CCMC (UBa)) Database Systems November 14, 2024 1 / 48


Contents

1 Introduction

2 Entity-Relationship Modeling
Functional Dependencies
Database Normalization
Database Denormalization
Integrity Rules and Constraints

3 Logical Data Model

4 Conclusion



Introduction



Introduction to Database Design

Definition
Database design is the process of creating a detailed data model of a
database. It involves identifying the data requirements, organizing the
data into tables, and defining the relationships between tables.

A well-designed database is crucial for efficient data storage, retrieval, and manipulation.
In this lecture, we will explore the principles and best practices of
database design, including entity-relationship modeling,
normalization, and denormalization.



Importance of Database Design

Good database design is crucial for the success of any application or organization. A well-designed database offers the following benefits:
Data Integrity: Ensures accuracy and consistency of data.
Efficient Data Retrieval: Optimizes query performance for faster
data retrieval.
Scalability: Allows for easy expansion and scalability as the data
grows.
Data Security: Ensures data security and access control to protect
sensitive information.
Maintainability: Facilitates easier maintenance and updates to the
database schema.
By investing time and effort into database design, organizations can build
a solid foundation for their data management needs.



Key Concepts in Database Design

There are several key concepts in database design that form the
foundation of a well-structured database:
Entity: Represents a real-world object or concept, such as a customer
or product.
Attribute: Describes the characteristics or properties of an entity.
Relationship: Defines how entities are related to each other.
Cardinality: Describes the number of instances of one entity that
can be associated with a single instance of another entity.
Normalization: Process of organizing data to minimize redundancy
and dependency.
Denormalization: Technique used to optimize query performance by
introducing redundancy.
Understanding these concepts is essential for designing effective database
schemas.
Best Practices in Database Design

To ensure the success of database design projects, it’s important to follow best practices, such as:
Requirements Analysis: Understand the data requirements and
business needs before designing the database.
Normalization: Normalize the database schema to reduce
redundancy and improve data integrity.
Naming Conventions: Use clear and consistent naming conventions
for tables, columns, and other database objects.
Documentation: Document the database schema, relationships, and
business rules for future reference.
Testing: Test the database design thoroughly to ensure it meets the
requirements and performs as expected.
Following these best practices can help create a robust and efficient
database system.



Types of Data Models
Data models are conceptual representations used to describe the structure and
organization of data within a database. There are several types of data models,
each suited for different purposes and levels of abstraction.
1 Hierarchical Data Model: This model organizes data in a tree-like
structure with a single root, where each parent can have multiple children
but each child has only one parent. It was widely used in early database
management systems but is less common today due to its limitations in
representing complex relationships.
2 Network Data Model: Similar to the hierarchical model, the network data
model allows each record to have multiple parent and child records, enabling
more complex relationships. It introduced the concept of sets to represent
data relationships and was an improvement over the hierarchical model.
3 Relational Data Model: The relational model organizes data into tables
(relations) consisting of rows (tuples) and columns (attributes). It defines
relationships between tables using keys and provides a powerful framework
for querying and manipulating data. The relational model is the basis for
most modern database management systems.
Types of Data Models (2)

4 Entity-Relationship Model: The entity-relationship (ER) model describes a domain in terms of entities, their attributes, and the relationships between them. It provides a graphical representation of the database structure and is commonly used for database design and modeling.
5 Object-Oriented Data Model: In the object-oriented data model, data is
represented as objects with attributes and methods. It extends the concepts
of the relational model to include inheritance, encapsulation, and
polymorphism, making it suitable for representing complex real-world
scenarios.
6 Document Data Model: The document data model stores data in flexible,
semi-structured documents such as JSON or XML. It is suitable for storing
unstructured or hierarchical data and is commonly used in
document-oriented databases like MongoDB.
Each type of data model has its advantages and limitations, and the choice of
model depends on factors such as the nature of the data, the requirements of the
application, and the capabilities of the database management system.
Database Modeling
Definition
Database modeling is the process of creating a conceptual representation
of the structure and content of a database.

Conceptual Modeling: Creating a high-level description of the data, independent of implementation details.
Logical Modeling: Designing the database schema and relationships
between entities.
Physical Modeling: Defining how the database will be implemented,
including storage structures and indexing strategies.
Normalization: Organizing data to minimize redundancy and
dependency.
Modeling Tools: Various tools such as Entity-Relationship Diagrams
(ERDs) or UML diagrams can be used for modeling.
Iterative Process: Database modeling is often an iterative process,
refining the model based on feedback and changing requirements.
Conceptual Modeling

Definition
Conceptual modeling involves creating a high-level description of the data
to be stored in the database, independent of any specific implementation
details.

Abstraction: Representing real-world entities and relationships in a simplified manner.
Entities and Relationships: Identifying the main entities and their
relationships in the domain.
Attributes: Defining the properties or characteristics of each entity.
Constraints: Specifying any business rules or constraints that apply
to the data.
Modeling Techniques: Various techniques such as
Entity-Relationship (ER) diagrams or Unified Modeling Language
(UML) diagrams can be used.
Logical Modeling
Definition
Logical modeling involves designing the structure and relationships of a
database at a conceptual level, without considering implementation details
such as storage mechanisms.

Entity-Relationship (ER) Diagrams: Graphical representations of entities, attributes, and relationships.
Entities: Objects or concepts in the domain being modeled.
Attributes: Properties or characteristics of entities.
Relationships: Connections between entities, describing how they are
related to each other.
Cardinality: Specifies the number of instances of one entity that can
be associated with another entity.
Normalization: Organizing data to reduce redundancy and
dependency, typically through techniques like First Normal Form
(1NF), Second Normal Form (2NF), and Third Normal Form (3NF).
Physical Modeling

Definition
Physical modeling involves translating the logical database design into the actual
implementation details, including data storage structures, indexing strategies, and performance
optimization techniques.

Storage Structures: Determining how data will be stored on disk, such as tables, indexes,
and partitions.
Indexes: Creating indexes to speed up data retrieval operations by providing efficient
access paths to data.
Data Types and Constraints: Specifying the data types and constraints for each attribute
in the database schema.
Normalization: Ensuring the database schema is in an optimal normalized form to
minimize redundancy and dependency.
Performance Optimization: Implementing strategies to improve query performance and
overall system efficiency, such as query optimization and caching.
Data Integrity: Enforcing data integrity constraints, such as foreign key relationships and
unique constraints, at the database level.



Database Design Process

Definition
The database design process involves several stages aimed at creating an efficient, secure, and
scalable database system that meets the requirements of the organization or application.

1 Requirements Analysis: Gathering and analyzing requirements from stakeholders to understand the data needs and business processes.
2 Conceptual Design: Creating a high-level conceptual model of the database, including
entities, attributes, and relationships.
3 Logical Design: Translating the conceptual model into a logical database schema,
including tables, keys, and relationships.
4 Physical Design: Implementing the logical database schema into a physical database,
including storage structures, indexing, and performance optimization.
5 Implementation: Building and deploying the database system based on the physical
design.
6 Testing and Evaluation: Testing the database system to ensure it meets the requirements
and performs as expected, and gathering feedback for improvement.



Entity-Relationship Modeling



Entity-Relationship Modeling

Entity-Relationship (ER) modeling is a technique used to visualize and design the structure of a database.
It represents the entities (objects or concepts), attributes (properties of entities), and relationships between entities.
It allows designers to create a conceptual schema of the database before committing to any particular implementation.
Key components of ER modeling include entities, attributes,
relationships, and cardinality constraints.
By creating an ER diagram, you can visually represent the structure of
the database and identify the relationships between different entities.



Notation in Entity-Relationship Modeling

There are several symbols and conventions used in entity-relationship modeling to represent entities, attributes, and relationships:
Entity: Represented by a rectangle with the entity name written
inside.
Attribute: Represented by an oval connected to the corresponding
entity.
Relationship: Represented by a diamond shape connected to the
related entities. Cardinality and participation constraints can be
indicated using lines or labels.
Using standardized notation helps ensure consistency and clarity in ER
diagrams.



Example of Entity-Relationship Diagram
Let’s consider a simple example of an entity-relationship diagram for a university database:

Figure: Entity-Relationship Diagram for a University Database

In this diagram, we have entities such as Student, Course, and College, with attributes and
relationships defined between them.
Introduction to Functional Dependencies

Definition
Functional dependencies are a key concept in database theory, describing
the relationship between attributes in a relation. They help ensure data
integrity and guide the process of normalization.

Importance
Functional dependencies aid in the process of data modeling, where
designers define the structure and relationships of data entities in a
database. By identifying and modeling functional dependencies, designers
can create accurate and efficient database schemas.



Concepts of Functional Dependencies

The key concepts of functional dependencies include:


Definition: A functional dependency is a constraint between two sets
of attributes in a relation, where the value of one set determines the
value of the other set.
Notation: Functional dependencies are typically denoted as X → Y ,
where X is the determinant set of attributes and Y is the dependent
set of attributes.
Closure: The closure of a set of attributes X, denoted X+, is the
set of all attributes functionally determined by X.
Keys: A superkey is a set of attributes that functionally determines
all other attributes in the relation. A candidate key is a minimal
superkey.
Understanding these concepts is essential for designing and analyzing
relational database schemas.
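The closure X+ can be computed with a simple fixed-point algorithm: repeatedly add the right-hand side of any dependency whose left-hand side is already contained in the result. A minimal sketch in Python (the relation and its dependencies are illustrative assumptions, not taken from the slides):

```python
def closure(attrs, fds):
    """Compute the closure X+ of a set of attributes under functional dependencies.

    attrs: iterable of attribute names (the set X)
    fds:   list of (lhs, rhs) pairs of attribute sets, each meaning lhs -> rhs
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If the closure already contains the determinant, add the dependents.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Hypothetical relation R(EmployeeID, FirstName, LastName, DeptID, DeptName)
fds = [
    ({"EmployeeID"}, {"FirstName", "LastName", "DeptID"}),
    ({"DeptID"}, {"DeptName"}),
]

print(closure({"EmployeeID"}, fds))
# EmployeeID determines every attribute of R, so it is a superkey.
```

Because `closure({"EmployeeID"}, fds)` returns all five attributes, EmployeeID is a superkey of the hypothetical relation; a candidate key is a minimal such set.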



Example of Functional Dependencies

Let’s consider a simple example of a database table representing employees:

EmployeeID | FirstName | LastName
1          | John      | Smith
2          | Alice     | Johnson

In this table, we observe the following functional dependencies:

EmployeeID → FirstName, LastName
FirstName ↛ EmployeeID (a first name does not determine the employee, since two employees may share it)
These functional dependencies help maintain data integrity and guide the
normalization process.



Functional Dependencies and Normalization

Functional dependencies play a crucial role in the process of normalization. By identifying and analyzing functional dependencies, we can decompose relations into smaller, well-structured relations.

The normalization process involves transforming a relation into various normal forms, such as First Normal Form (1NF), Second Normal Form (2NF), Third Normal Form (3NF), and so on, to eliminate redundancy and dependency.

Functional dependencies help ensure that each relation satisfies the requirements of the desired normal form, contributing to data integrity and efficiency in the database.



Introduction to Database Normalization

Definition
Database normalization is a process used to organize a database schema in
such a way that reduces redundancy and dependency of data.

It helps in improving data integrity and minimizing anomalies during data manipulation.
It involves breaking down large tables into smaller, related tables and
defining relationships between them.
Normalization is typically carried out through a series of normalization
forms, such as First Normal Form (1NF), Second Normal Form
(2NF), Third Normal Form (3NF), and so on.
Each normalization form removes certain types of data redundancy
and dependency, leading to a more efficient and maintainable
database design.



Objectives of Database Normalization

The primary objectives of database normalization are:


Eliminate Redundancy: Minimize duplication of data within the
database to save storage space and ensure data consistency.
Minimize Dependency: Reduce the dependency of non-key
attributes on the primary key to prevent update anomalies.
Ensure Data Integrity: Improve data integrity by enforcing
constraints and reducing the risk of data inconsistencies.
By achieving these objectives, database normalization helps in creating a
well-structured and efficient database schema.



Normal Forms in Database Normalization

There are several normal forms defined in the process of database normalization,
each building upon the previous one:
1 First Normal Form (1NF): Ensures that each column contains atomic
values, eliminating repeating groups.
2 Second Normal Form (2NF): Meets the requirements of 1NF and ensures
that non-key attributes are fully dependent on the primary key.
3 Third Normal Form (3NF): Meets the requirements of 2NF and eliminates
transitive dependencies between non-key attributes.
4 Boyce-Codd Normal Form (BCNF): A stricter version of 3NF, ensuring
that every determinant is a candidate key.
5 Fourth Normal Form (4NF): Addresses multi-valued dependencies and
further reduces redundancy.
6 Fifth Normal Form (5NF): Eliminates join dependencies by decomposing
relation schemas.



Example of Database Normalization

Let’s consider a simple example of a denormalized database schema for storing customer orders:

OrderID | CustomerName | Item   | Price
1       | John         | Laptop | $1000
2       | John         | Phone  | $500
3       | Alice        | Tablet | $700

This schema repeats the customer name for every order, and if several items were recorded in a single Item cell it would also violate 1NF. By normalizing the schema and separating customers and items into their own tables, we can eliminate redundancy and achieve higher levels of normalization.
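One way to normalize this schema can be sketched with SQLite (the CustomerID surrogate key is an assumption introduced for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Customers are stored once; orders reference them by key.
cur.executescript("""
CREATE TABLE Customers (
    CustomerID INTEGER PRIMARY KEY,
    Name       TEXT NOT NULL
);
CREATE TABLE Orders (
    OrderID    INTEGER PRIMARY KEY,
    CustomerID INTEGER NOT NULL REFERENCES Customers(CustomerID),
    Item       TEXT NOT NULL,
    Price      INTEGER NOT NULL
);
INSERT INTO Customers VALUES (1, 'John'), (2, 'Alice');
INSERT INTO Orders VALUES (1, 1, 'Laptop', 1000),
                          (2, 1, 'Phone', 500),
                          (3, 2, 'Tablet', 700);
""")

# Each customer's name now lives in exactly one row; a join recovers the original view.
rows = cur.execute("""
    SELECT c.Name, o.Item, o.Price
    FROM Orders o JOIN Customers c ON o.CustomerID = c.CustomerID
    ORDER BY o.OrderID
""").fetchall()
print(rows)  # [('John', 'Laptop', 1000), ('John', 'Phone', 500), ('Alice', 'Tablet', 700)]
```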



Example of Second Normal Form (2NF)
Consider a simplified example of a database table representing orders:

OrderID | CustomerID | Product
1       | 101        | Laptop
2       | 101        | Phone
3       | 102        | Tablet

In this table, suppose an order can list several products, so the key is the composite (OrderID, Product):
CustomerID is a non-key attribute.
This table violates the Second Normal Form (2NF) because:
CustomerID is functionally dependent on part of the key (OrderID), not on the entire key (OrderID, Product) — a partial dependency.
To bring this table into 2NF, we split it into two separate tables — Orders (OrderID, CustomerID) and OrderProducts (OrderID, Product) — linked by a foreign key.
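The split can be sketched with SQLite (the table names Orders and OrderProducts, and the foreign key between them, are one possible realisation):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# After the 2NF split: Orders holds what depends on OrderID alone,
# OrderProducts holds the order-product pairs.
cur.executescript("""
CREATE TABLE Orders (
    OrderID    INTEGER PRIMARY KEY,
    CustomerID INTEGER NOT NULL
);
CREATE TABLE OrderProducts (
    OrderID INTEGER NOT NULL REFERENCES Orders(OrderID),
    Product TEXT    NOT NULL,
    PRIMARY KEY (OrderID, Product)
);
INSERT INTO Orders VALUES (1, 101), (2, 101), (3, 102);
INSERT INTO OrderProducts VALUES (1, 'Laptop'), (2, 'Phone'), (3, 'Tablet');
""")

# A join reconstructs the original denormalized view without loss.
rows = cur.execute("""
    SELECT o.OrderID, o.CustomerID, p.Product
    FROM Orders o JOIN OrderProducts p ON o.OrderID = p.OrderID
    ORDER BY o.OrderID
""").fetchall()
print(rows)  # [(1, 101, 'Laptop'), (2, 101, 'Phone'), (3, 102, 'Tablet')]
```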
Example of Third Normal Form (3NF)

Let’s continue with our previous example of a database table representing orders. After applying
Second Normal Form (2NF), we have two separate tables: Orders and Products.

OrderID | CustomerID
1       | 101
2       | 101
3       | 102

OrderID | Product
1       | Laptop
2       | Phone
3       | Tablet

However, suppose the Orders table also stored customer attributes such as CustomerName. CustomerName would depend on CustomerID, which in turn depends on OrderID — a transitive dependency on the primary key.
To achieve Third Normal Form (3NF), we decompose the Orders table further into Orders and Customers, ensuring that each table represents a single entity and that transitive dependencies are eliminated.



Example of Boyce-Codd Normal Form (BCNF)

Let’s continue with our example of a database table representing orders. After applying Third
Normal Form (3NF), we have three separate tables: Orders, Customers, and Products.

OrderID | ProductID
1       | 101
2       | 102
3       | 103

ProductID | ProductName
101       | Laptop
102       | Phone
103       | Tablet

The tables are now in Third Normal Form (3NF). Boyce-Codd Normal Form (BCNF) additionally requires that every determinant be a candidate key. If an order may contain several products, OrderID alone no longer determines ProductID, so the pairing cannot remain in the Orders table.
We can create a new table, OrderDetails, to represent the relationship between orders and products, with (OrderID, ProductID) as its primary key. This ensures that every non-trivial functional dependency has a candidate key as its determinant.



Example of Fourth Normal Form (4NF)

Let’s consider a database table representing orders and products, which is already in
Boyce-Codd Normal Form (BCNF):

OrderID | ProductID
1       | 101
2       | 102
3       | 103

ProductID | ProductName
101       | Laptop
102       | Phone
103       | Tablet

However, there might still be multi-valued dependencies present. For example, a single order can
contain multiple products, and a single product can appear in multiple orders.
To achieve Fourth Normal Form (4NF), we need to further decompose the table to remove
multi-valued dependencies. We can create a new table, OrderItems, to represent the relationship
between orders and products, with OrderID and ProductID as its primary key.



Example of Fifth Normal Form (5NF)
Consider a database schema representing a university’s course enrollment
system. The database includes the following tables:
Students (StudentID, StudentName)
Courses (CourseID, CourseName)
Enrollments (StudentID, CourseID, Grade)
In this schema, the Enrollments table represents a many-to-many
relationship between students and courses, with an additional attribute,
Grade, indicating the grade achieved by the student in the course.
However, a student may take multiple courses and receive a different grade in each. If these independent facts were stored redundantly in one table, it would exhibit a join dependency among StudentID, CourseID, and Grade.
To achieve Fifth Normal Form (5NF), we can decompose the Enrollments
table further. We can create a new table, Grades, to represent the
relationship between students, courses, and grades, with StudentID,
CourseID, and Grade as its primary key.
Introduction to Database Denormalization

Definition
Database denormalization is a process used to improve the performance of
a database by adding redundancy to the data model.

While normalization focuses on minimizing redundancy, denormalization aims to optimize query performance by reducing the need for joins and aggregations.
Denormalization is often used in data warehouses, reporting systems,
and high-performance databases where query performance is critical.
However, it can also introduce some degree of data redundancy and
increase the complexity of data maintenance.



Concepts of Database Denormalization

The key concepts of database denormalization include:


Redundancy: Introducing redundancy into the database schema by
duplicating data across tables or adding redundant columns.
Performance Optimization: Improving query performance by
reducing the need for joins and simplifying query execution.
Data Integrity: Maintaining data integrity despite redundancy by
carefully managing updates, inserts, and deletes.
Trade-offs: Balancing the benefits of improved performance against
the drawbacks of increased storage requirements and potential data
inconsistencies.
By understanding these concepts, database administrators can make
informed decisions about when and how to denormalize a database.



Advantages of Database Denormalization

There are several advantages to database denormalization:


Improved Read Performance: Reducing the need for joins and
simplifying query execution can lead to faster read performance,
especially for complex queries.
Simplified Queries: Denormalized schemas often result in simpler
and more intuitive queries, making it easier for developers to write
and maintain code.
Reduced Overhead: By precomputing and storing aggregated data,
denormalization can reduce the computational overhead of generating
reports and analytics.
Better User Experience: Faster response times and simplified
queries can improve the overall user experience of applications.
These advantages make denormalization a valuable technique for
optimizing database performance in certain scenarios.



Disadvantages of Database Denormalization

Despite its advantages, database denormalization also has some disadvantages:
Increased Storage Requirements: Introducing redundancy can lead
to increased storage requirements, especially for large datasets.
Data Inconsistencies: Redundant data increases the risk of data
inconsistencies if updates, inserts, or deletes are not carefully
managed.
Complexity: Denormalized schemas can be more complex to manage
and maintain, especially as the database evolves over time.
Maintenance Overhead: Maintaining denormalized data requires
additional effort to ensure data integrity and consistency.
These disadvantages highlight the importance of carefully considering the
trade-offs before denormalizing a database.



Best Practices for Database Denormalization

To maximize the benefits of database denormalization while minimizing its drawbacks, consider the following best practices:
Identify Performance Bottlenecks: Analyze query performance and
identify areas where denormalization could improve performance.
Use Denormalization Sparingly: Only denormalize when necessary
and avoid overdenormalizing the database.
Monitor Data Consistency: Implement mechanisms to monitor and
maintain data consistency, such as triggers or scheduled jobs.
Document Changes: Keep detailed documentation of
denormalization decisions and changes to the database schema.
Following these best practices can help ensure that database
denormalization is implemented effectively and responsibly.



Example of Database Denormalization

Let’s consider a simplified example of a database schema representing a library’s catalog:

BookID | Title               | Author
1      | War and Peace       | Leo Tolstoy
2      | Pride and Prejudice | Jane Austen

BookID | Genre
1      | Fiction
2      | Fiction

In this schema, the books are stored in a Books table with BookID, Title, and
Author columns. Additionally, there is a Genres table with BookID and Genre
columns.
To simplify queries and improve performance, we may choose to denormalize the
schema by adding a Genre column directly to the Books table. This introduces
redundancy but can optimize read performance by eliminating the need for joins.
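A minimal SQLite sketch of this denormalization step (the correlated UPDATE used to backfill the new column is one possible approach):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized form: the genre lives in its own table, reached through a join.
cur.executescript("""
CREATE TABLE Books  (BookID INTEGER PRIMARY KEY, Title TEXT, Author TEXT);
CREATE TABLE Genres (BookID INTEGER REFERENCES Books(BookID), Genre TEXT);
INSERT INTO Books  VALUES (1, 'War and Peace', 'Leo Tolstoy'),
                          (2, 'Pride and Prejudice', 'Jane Austen');
INSERT INTO Genres VALUES (1, 'Fiction'), (2, 'Fiction');
""")

# Denormalized form: copy Genre into Books so reads need no join.
cur.executescript("""
ALTER TABLE Books ADD COLUMN Genre TEXT;
UPDATE Books
SET Genre = (SELECT Genre FROM Genres WHERE Genres.BookID = Books.BookID);
""")

# The query now touches a single table.
rows = cur.execute("SELECT Title, Genre FROM Books ORDER BY BookID").fetchall()
print(rows)  # [('War and Peace', 'Fiction'), ('Pride and Prejudice', 'Fiction')]
```

The redundant Genre column must now be kept in sync with the Genres table on every update — exactly the maintenance trade-off discussed above.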



Introduction to Integrity Rules and Constraints

Integrity rules and constraints are mechanisms used to enforce data integrity in a database.
They define rules and conditions that data must adhere to, ensuring
accuracy, consistency, and reliability of the data.
Integrity rules also contribute to data security by preventing
unauthorized modifications or deletions of data. For example,
constraints can restrict access to sensitive data or prevent users from
inadvertently altering critical information.
The various types of integrity rules and constraints commonly used in
database management systems are explored hereafter.



Types of Integrity Rules and Constraints

1 Entity Integrity: Ensures that each row in a table has a unique identifier, typically enforced by primary key constraints.
2 Referential Integrity: Ensures the consistency of relationships
between tables by enforcing foreign key constraints. It ensures that
references between tables are valid and that data remains
synchronized.
3 Domain Integrity: Ensures that data values in a column adhere to
specific data types, formats, or ranges. This is enforced by data type
constraints, check constraints, or domain constraints.
4 User-Defined Integrity: Allows users to define custom integrity rules
and constraints based on specific business requirements. These rules
are enforced using triggers, stored procedures, or application logic.
By enforcing these integrity rules and constraints, database systems can
maintain the quality and reliability of the data stored within them.



Example of Integrity Rules and Constraints

Entity Integrity: Each employee record in the Employees table must have a unique EmployeeID, enforced by the primary key constraint on the EmployeeID column.
Referential Integrity: The DepartmentID column in the Employees
table must have a corresponding DepartmentID in the Departments
table, enforced by the foreign key constraint.
Domain Integrity: The Birthdate column in the Employees table
must contain valid date values, enforced by the date data type
constraint.
User-Defined Integrity: A custom constraint could be defined to
ensure that the salary of an employee cannot be negative, enforced
using a trigger or stored procedure.
These integrity rules and constraints help maintain the consistency and
accuracy of the database.
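The four kinds of constraint can be demonstrated with SQLite (schema and values are illustrative; note that SQLite enforces foreign keys only after `PRAGMA foreign_keys = ON`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite checks FKs only when enabled
cur = conn.cursor()

cur.executescript("""
CREATE TABLE Departments (
    DepartmentID INTEGER PRIMARY KEY               -- entity integrity
);
CREATE TABLE Employees (
    EmployeeID   INTEGER PRIMARY KEY,              -- entity integrity
    DepartmentID INTEGER NOT NULL
                 REFERENCES Departments(DepartmentID),  -- referential integrity
    Birthdate    TEXT NOT NULL,                    -- domain integrity (ISO dates)
    Salary       REAL CHECK (Salary >= 0)          -- user-defined rule
);
INSERT INTO Departments VALUES (10);
INSERT INTO Employees VALUES (1, 10, '1990-05-01', 50000);
""")

# Violations are rejected by the database itself:
try:
    cur.execute("INSERT INTO Employees VALUES (2, 99, '1985-01-01', 40000)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # unknown department violates the foreign key

try:
    cur.execute("INSERT INTO Employees VALUES (3, 10, '1985-01-01', -1)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # negative salary violates the CHECK constraint
```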



Logical Data Model



Key Components of Logical Data Model

Definition
A Logical Data Model (LDM) is a representation of the data elements and
their relationships in a database, independent of any specific database
management system or physical implementation.

The key components of a LDM are:


1 Entities
2 Attributes
3 Relationships
4 Constraints
5 Keys
6 Data Types



From CDM to LDM

For entities: Any entity becomes a table, the properties of the entity
are the attributes of the table, the identifier of the entity is the
primary key of the table.
For relationships: That depends on the cardinalities. Two cases are
possible:
one-to-one or one-to-many: The relationship is materialised by the
addition of a foreign key.
many-to-many: The relationship is transformed into a new table.
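Both mapping cases can be sketched in SQL, run here through Python's sqlite3 (entity and attribute names are illustrative, echoing the university example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- one-to-many: a Student belongs to one College -> foreign key on Student
CREATE TABLE College (
    CollegeID INTEGER PRIMARY KEY,
    Name      TEXT
);
CREATE TABLE Student (
    StudentID INTEGER PRIMARY KEY,
    Name      TEXT,
    CollegeID INTEGER REFERENCES College(CollegeID)
);

-- many-to-many: Students enroll in Courses -> the relationship becomes a table
CREATE TABLE Course (
    CourseID INTEGER PRIMARY KEY,
    Title    TEXT
);
CREATE TABLE Enrollment (
    StudentID INTEGER REFERENCES Student(StudentID),
    CourseID  INTEGER REFERENCES Course(CourseID),
    PRIMARY KEY (StudentID, CourseID)
);
""")

tables = [r[0] for r in cur.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['College', 'Course', 'Enrollment', 'Student']
```

The Enrollment table carries the identifiers of both related entities as its composite primary key, which is exactly how a many-to-many relationship materialises in the logical model.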



Constraints and Keys
Constraints enforce rules or conditions on data.
Types of Constraints:
Primary Key or relation constraint: Every relation in the relational
model is identified by a primary key which must be unique and not null.
Therefore, each tuple (or record in the database) is also identified by a
primary key.
Foreign Key or referential integrity constraint: Foreign keys are
used to connect tables (or relations) between them. The values of
these foreign keys are included in the field of the primary key from
which they arise.
Unique Constraint: It states that each value in a field must be unique.
Check Constraint: It is a type of constraint in a database
management system that defines a rule that must be satisfied for data
to be valid or acceptable.
ALTER TABLE Person
ADD CONSTRAINT CHK_Age CHECK (age BETWEEN 18 AND 100);

Keys uniquely identify records within a table.


Benefits of Logical Data Model

Provides a clear and consistent representation of data.

Facilitates communication between stakeholders.

Serves as a blueprint for database implementation.

Helps ensure data integrity and consistency.

Supports data analysis and decision-making processes.



Conclusion



Conclusion

Key Takeaways
Database design is a critical aspect of building a robust and efficient
database system. By following principles such as entity-relationship
modeling, normalization, and denormalization, you can create a
well-structured database that meets the data requirements of your
application while optimizing performance and ensuring data integrity.
Entity-Relationship (ER) modeling is a powerful technique for designing and
visualizing the structure of a database. By defining entities, attributes, and
relationships, designers can create clear and concise representations of the
database schema.
Integrity rules and constraints are essential components of database
management systems, ensuring the accuracy, consistency, and reliability of
the data stored within them. By enforcing entity integrity, referential
integrity, domain integrity, and user-defined integrity, databases can
maintain data quality and prevent data corruption.



Conclusion (2)

Key Takeaways
Database normalization is a critical process in database design, aimed at organizing the
database schema to reduce redundancy and dependency of data. By achieving higher
levels of normalization, databases can improve data integrity, minimize anomalies, and
optimize performance.
Functional dependencies are a fundamental concept in database theory, describing the
relationship between attributes in a relation. They help ensure data integrity, guide the
process of normalization, and facilitate efficient database design.
Database denormalization is a powerful technique for improving the performance of a
database by introducing redundancy into the data model. By carefully balancing the
advantages and disadvantages of denormalization and following best practices, database
administrators can optimize query performance and enhance the user experience of
applications.
The logical data model and database schema are fundamental components of database
design and management, providing a structured representation of the data and its
relationships.
