Database Design and Development
Student’s name
List which assessment criteria Pass Merit Distinction
the Assessor has awarded.
1
Assessor signature Date
Internal Verifier
signature Date
Programme Leader
signature (if required) Date
LO2 Develop a fully functional relational database system, based on an existing system design
* Please note that grade decisions are provisional. They are only confirmed once internal and external moderation has taken place and grade decisions have been agreed at the assessment board.
Assignment Feedback
Formative Feedback: Assessor to Student
Action Plan
Summative feedback
Assessor Date
signature
Student Date
signature
General Guidelines
1. A cover page or title page – You should always attach a title page to your assignment. Use the previous page as your cover sheet and make sure all the details are accurately filled in.
2. Attach this brief as the first section of your assignment.
3. All the assignments should be prepared using word processing software.
4. All the assignments should be printed on A4-sized paper. Use single-sided printing.
5. Allow 1” for top, bottom, right margins and 1.25” for the left margin of each page.
6. The font size should be 12 points, and the font should be Times New Roman.
7. Use 1.5 line spacing. Left justify all paragraphs.
8. Ensure that all the headings are consistent in terms of font size and font style.
9. Use the footer function in the word processor to insert your name, subject, assignment number and page number on each page. This is useful if individual sheets become detached for any reason.
10. Use the word processor's spell check and grammar check functions to help edit your assignment.
Important Points:
1. It is strictly prohibited to use text boxes to add text to the assignment, except for compulsory information such as figures, comparison tables, etc. Adding text boxes to the body other than for the aforementioned compulsory information will result in rejection of your work.
2. Carefully check the hand in date and the instructions given in the assignment. Late submissions
will not be accepted.
3. Ensure that you give yourself enough time to complete the assignment by the due date.
4. Excuses of any nature will not be accepted for failure to hand in the work on time.
5. You must take responsibility for managing your own time effectively.
6. If you are unable to hand in your assignment on time and have valid reasons such as illness,
you may apply (in writing) for an extension.
7. Failure to achieve at least PASS criteria will result in a REFERRAL grade.
8. Non-submission of work without valid reasons will lead to an automatic REFERRAL. You will then be asked to complete an alternative assignment.
9. If you use other people’s work or ideas in your assignment, reference them properly using
HARVARD referencing system to avoid plagiarism. You have to provide both in-text citation
and a reference list.
10. If you are proven to be guilty of plagiarism or any academic misconduct, your grade could be reduced to a REFERRAL or, at worst, you could be expelled from the course.
Student Declaration
I hereby, declare that I know what plagiarism entails, namely to use another’s work and to present it
as my own without attributing the sources in the correct form. I further understand what it means to
copy another’s work.
5. I acknowledge that the attachment of this document signed or not, constitutes a binding
agreement between myself and Pearson, UK.
6. I understand that my assignment will not be considered as submitted if this document is not
attached to the assignment.
Academic Year
Unit Tutor
Issue Date
Submission Date
Submission format
Part 1: The submission should be in the form of an individual written report written in a concise,
formal business style using single spacing and font size 12. You are required to make use of
headings, paragraphs and subsections as appropriate, and all work must be supported with
research and referenced using Harvard referencing system. Please also provide in-text citation
and bibliography using Harvard referencing system. The recommended word limit is 3,000–
3,500 words, although you will not be penalised for exceeding the total word limit.
Part 2: The submission should be in the form of a fully functional relational database system
demonstrated to the Tutor; and an individual written report (please see details in Part 1 above).
Part 3: The submission should be in the form of a witness statement of the testing completed
by the Tutor; technical documentation; and a written report (please see details in Part 1 above).
LO1 Use an appropriate design tool to design a relational database system for a substantial
problem.
LO2 Develop a fully functional relational database system, based on an existing system
design.
LO3 Test the system against user and system requirements.
LO4 Produce technical and user documentation.
Assignment Brief and Guidance:
Assignment brief
Polly Pipe is a water sports provider and installer based in Braintree, England. They need you to design and implement a database that meets their data requirements. These requirements are defined in this scenario, and samples of the paper records that Polly Pipe keeps are shown below.
Polly Pipe specialises in installing aquariums for business customers. Customers can request several installations, but each installation is tailor-made for a specific customer. Facilities are
classified by type. One or more employees are assigned to each facility. Because these
facilities are often very large, they can include carpenters and masons as well as water
installers. The facilities use equipment such as aquariums, air pumps and thermostats. There
can be multiple computers in a facility.
Below are examples of paper records that Polly Pipe currently maintains.
(Note: you may make your own assumptions and add related attributes within the scope of the given case study.)
1.2. Design a set of simple input and output interfaces for the above scenario using wireframes or any interface-design tool. Evaluate the effectiveness of the given design (ERD and logical design) in terms of the identified user and system requirements.
Activity 2
Activity 2.1
a. Develop a relational database system according to the ER diagram you have created
(Use SQL DDL statements). Provide evidence of the use of a suitable IDE to create a
simple interface to insert, update and delete data in the database. Implement proper
security mechanisms in the developed database.
Evaluate the developed database solution and its effectiveness with reference to the identified user and system requirements, the system security mechanisms (e.g. user groups, access permissions) and the maintenance of the database.
Activity 2.2
a. Explain the usage of DML with the queries mentioned below, giving at least one example of each from the developed database. Assess the usage of the following SQL statements, with examples from the developed database, to prove that the data extracted through them is meaningful and relevant to the given scenario.
Select/ Where / Update / Between / In / Group by / Order by / Having
Activity 3
Activity 3.1
Provide a suitable test plan to test the system against the user and system requirements, and provide relevant test cases for the database you have implemented. Assess how the selected test data can be used to improve the effectiveness of testing.
Note: the learner needs to give the expected results in a tabular format, together with screenshots of the actual results and a conclusion.
Activity 3.2
Get independent feedback on your database solution from non-technical users and some developers (use surveys, questionnaires, interviews or any other feedback-collection method) and make recommendations and suggestions for improvement in a separate conclusion/recommendations section.
Activity 4
Produce technical documentation and a user guide for the developed database system.
Suitable diagrams (Use case diagram, class diagram, flow charts, DFD level 0 and 1) should
be included in the technical documentation to show data movement in the system. Assess
the developed database by suggesting future enhancements to ensure the effectiveness of
the system.
Grading Criteria Achieved Feedback
M2 Implement a fully functional database system that
includes system security and database maintenance.
Database Solution for
Pollypipe
Acknowledgement
In addition to the learner's own efforts, this assignment was made possible by the support of several people. First and foremost, I would like to thank our assessor, Mrs. Nalini Basnayake, for the guidance and knowledge provided throughout the completion of this assignment. Finally, I would like to thank my friends and family for all of their help and encouragement in making this endeavour a success.
Table of Contents
(2.0) The usage of SQL DDL statements to produce a functional database ... 37
2.1.0 What is SQL? ....................................................................................................... 37
2.2.0 What are SQL queries? ....................................................................................... 38
2.3.0 The usage of Insert, Update and Delete queries................................................. 45
2.4.0 Implementation of security mechanisms for data protection ........................... 48
2.5.0 Evaluating effectiveness, security, maintenance and user-system requirements .......... 53
2.6 Evidence of data extractions from the database for a given set of keywords..... 53
(4.0) Technical information and user guide for the system ........................... 79
4.4 Future improvements to the system ....................................................................... 82
(5.0) References ............................................................................................. 83
(6.0) Annexures .............................................................................................. 84
List of tables
Introduction
What is data?
“Data refers to distinct pieces of information, usually formatted and stored in a way
that is concordant with a specific purpose.” (Beal, 2022)
Why move from paper records to computerised software? Paper databases work well when the data being gathered is largely static, there is not much data being collected, and there are few reporting obligations. When the data has to be used for various purposes, changes regularly, or needs to be printed, a database application is the preferable option. A database application is also preferable if you need to perform calculations on the data, select subsets, organise the entries using various criteria, or create reports and summaries.
For the given assignment the learner is expected to design a fully functional relational database system using Microsoft SQL Server Management Studio, test it, and produce a technical report on the system.
Designing a relational database system for a substantial problem.
• User requirements: sometimes referred to as "user needs", these specify what the user accomplishes with the system, such as the tasks that users must be able to carry out. User requirements are typically described in narrative form in a User Needs Document (UND). The user typically approves the user requirements, which are the main source of information for formulating system requirements.
Determining what the user genuinely wants a database to perform is a crucial and
challenging phase in database design. This is due to the fact that the user
frequently struggles to fully express their requirements and desires, and that the
information they do supply may also be unreliable, erroneous, or contradictory.
The developer is accountable for fully comprehending what the consumer desires.
Because of this, user needs and system requirements are typically treated
differently. (Parker, 2012)
For the given scenario the learner identifies the client as "Polly Pipe", a water sports provider and installer situated in Braintree, England. The client has previously maintained a pen-and-paper data-storage method, which has many bottlenecks. There are also several constraints when it comes to designing the database.
01. Customers can request several installations, but each installation is tailor-made for
a specific customer.
02. Facilities are classified by type. One or more employees are assigned to each
facility.
03. The facilities use equipment such as aquariums, air pumps and thermostats and if
they are larger in size can include carpenters and masons as well as water
installers.
04. Some facilities may or may not contain multiple computers.
System requirements: system requirements are the specifications that developers use to construct the system. These are the standard "shall" clauses that specify what the system must do.
Assumed non-functional requirements of the database system for the scenario
The non-functional requirements are the answers to the questions given below.
I. Security: Does your product send or retain sensitive data? Does your IT department have any rules that must be followed? What security standards are followed in your sector? (IEEE standards)
II. Capacity: What are the current and foreseeable storage needs for your system in
terms of capacity? How will your system expand to meet the software
requirements? (365MB for the setup and at least 1TB of storage)
IV. Reliability and Availability: What is the critical failure time under regular usage
for reliability and availability? Does the user require 24-hour access to this?
(Yes)
V. Usability: How simple is the product to use? What characterizes the product's
user experience? (Simple User interface with a tutorial demo)
Figure 2 : Relational diagram for the scenario
7. Facility_staff Table Facility ID
8. Installation_customer Table Installation ID
9. Installation_equipment Table Installation ID, Equipment Name
10. Installation_staff Table Installation ID
Normalization
Normalization is the process of structuring the data in the database. It is used to minimize
the redundancy from a relation or set of relations. It is also used to eliminate undesirable
characteristics like Insertion, Update, and Deletion Anomalies.
I. Insertion anomaly: the circumstance where certain data cannot be inserted into the database without the presence of other, unrelated data.
II. Deletion anomaly: the circumstance where data deletion causes the inadvertent loss of some other crucial data.
III. Update anomaly: occurs when updating numerous rows of data is necessary to
update a single data value.
Advantages of normalization
• Data redundancy is reduced. (The process of maintaining data in two or more
locations inside a database is known as redundancy.)
• Improved database structure overall.
• Maintains better data consistency.
• Much more adaptable database architecture.
• The idea of relationship integrity is enforced.
Normalization is carried out in a step-like order; each normal form is distinct, yet builds on the one before it.
Disadvantages of normalization
• More tables to join: when data is dispersed over more tables, the number of table joins required rises, making operations more time-consuming. It also becomes more difficult to realise the database.
• Repeated values are often replaced with codes, so tables contain codes rather than the actual data. As a result, a lookup table is always needed to interpret them.
• Since the data model is designed for applications rather than ad hoc querying, it becomes quite challenging to query against it.
• As the normal form level increases, performance becomes slower and slower. (Anon., n.d.)
Form: Description
1NF: A relation will be in 1NF if every attribute holds only atomic (single) values.
2NF: A relation will be in 2NF if it is in 1NF and all non-key attributes are fully functionally dependent on the primary key (no partial dependencies).
3NF: A relation will be in 3NF if it is in 2NF and no transitive functional dependency exists.
4NF: A relation will be in 4NF if it is in Boyce-Codd Normal Form and has no multi-valued dependency.
5NF: A relation will be in 5NF if it is in 4NF and does not contain any join dependency; joining should be lossless.
For the given scenario, the learner is required to produce a database system that is normalized up to the Third Normal Form (3NF). Using the relational diagram, the learner has created tables that fit the user requirements by applying the normal forms.
A table is in First Normal Form when every value it stores is atomic. According to the principle of atomicity, no single cell may store more than one value; each cell can hold only one value of one attribute. Multi-valued attributes, composite attributes and combinations of the two are forbidden by the First Normal Form.
Staff Table
Staff ID Staff Name Staff Type
123 John Lennon Plumber
For a table to be in Second Normal Form it must first be in First Normal Form, and it must contain no partial dependencies; a partial dependency exists when a non-prime attribute depends on only a proper subset of a candidate key.
E.g.: since the scenario itself provides no such case, a sample table is used.
Book ID and Reader ID together form a composite key, which is a combination of two or more columns in a table that can be used to uniquely identify a row. The non-key "Book name" column depends on "Book ID" alone, which is known as a partial dependency: it occurs when a non-prime attribute is functionally dependent on only part of a candidate key. So, the table is split into two to keep it in 2NF.
For a table to be in the Third Normal Form it must first be in the Second Normal Form. In addition, there should be no transitive dependency for non-prime attributes: non-prime attributes (those that are not part of any candidate key) should not depend on other non-prime attributes.
E.g.: since the scenario itself provides no such case, a sample table is used.
Employee ID Emp_name Address Work Work ID
In the table above, "Employee ID" determines "Work ID", which in turn relates to "Work". This three-column relationship is known as a transitive dependency: a relationship between one non-prime attribute and another non-prime attribute, both of which are ultimately dependent on the candidate key.
• The learner has changed the equipment table and has given a unique ID, "Equipment ID", to uniquely identify each piece of equipment and to eliminate non-atomic values for 1NF.
• For both the installation and facility tables the learner has identified the existence of multi-valued attributes in the Equipment and Staff columns. So, the learner has split the tables to produce atomic values and eliminate the multi-valued attributes.
• The learner has also split the "Assigned date" column in the installation table into two separate columns in the installation table to maintain 1NF.
• In order to maintain 2NF, all the partial dependencies must be removed, so no non-key column should depend on only part of a composite key. A composite key is a primary key made of two or more columns.
• For the customer table, it is clearly visible that Email and Address are both unique per customer, forming a composite key, so the learner has divided the table into two: a Customer order table and a Customer information table.
• For the tables to be in 3NF, they should not have any functional transitive
dependencies.
• The learner has noticed that there are no transitive dependencies present in the
tables therefore all the tables are in 3NF.
Staff Table
Staff ID Staff Name Staff Type
Equipment Table
Equipment ID Equipment Name Equipment Type
Facility Table
Facility ID  Facility Type  Installation ID  Equipment  Staff
Installation Table
Installation ID  Order ID  Installation Name  Installation Type  Needed Equipment  Wanted Staff  Assigned time
Customer Table
Customer ID Customer Name Customer Address Customer Email
After Normalization
Staff Table
Staff ID Staff Name Staff Type
Equipment Table
Equipment ID Equipment Name Equipment Type
Facility Table
Facility ID Facility Type
Installation Table
Installation ID Installation Type Address Start date End date
Installation Customer Table
Installation ID Customer ID
Customer Table
Figure 3 : All the tables with records after normalization with sample records
Designing user interfaces
The figures below show the input and output interfaces that interact with the database.
02.Installation Table
Figure 5 : Installation interface
Figure 6: Staff interface
Evaluating the effectiveness of the design to meet user and system requirements
Both the entity relationship diagram and the relational model diagram were created according to the user and system requirements. An Entity Relationship Diagram (ERD) is made up of entities, attributes and relationships, and it depicts the connections between sets of entities. Entities are objects of interest, such as a person, place, thing or group of people. Attributes are the features that describe each entity. After gathering data from the documentation of the scenario, the learner selected entities and assigned attributes from the data tables that were used earlier.
Attributes:
• Customer - Customer ID, Customer name, Customer email, Customer Address
• Staff - Staff ID, Staff Name, Staff Type
• Equipment - Equipment ID, Equipment Type, Equipment Name
• Installations - Installation ID, Installation Name, Installation Type, Address, Staff
Needed, assigned time, Equipment Used
• Facilities - Facility ID, Facility Type, Number of Staff, Number of Equipment
The computers in each facility are not prominent in the system, so the learner has decided to treat them as equipment.
Looking further into the requirements of the user, the client, Polly Pipe, expects a database design that meets its data requirements. Polly Pipe focuses on installing aquariums for business customers and maintains only a modest number of records, as water installations are not in constant demand.
“Customers can request several installations, but each installation is tailor-made for a
specific customer” which explains the existence of repetitive records that were fixed with
normalization.
“Facilities are classified by type” that can be used as an attribute for the facility table.
“One or more employees are assigned to each facility” describes the cardinality
relationship between facility and staff/employee table.
Because these facilities are often very large, they can include carpenters and masons as
well as water installers which explains the expanding relationship between both facility
and staff tables.
The facilities use equipment such as aquariums, air pumps and thermostats, and there can be multiple computers in a facility. Computers could also be considered an attribute of a facility, but the learner treated them as equipment that is not used in the manufacturing process of an installation.
Considering the design, the interfaces and the normalization, the learner has managed to design the data system to uphold all of the user's conditions mentioned earlier. The quality of the data has been improved, as normalization up to the Third Normal Form has largely eliminated insertion, deletion and update anomalies. All the tables have unique identifiers, also known as primary keys, while no partial dependencies and no transitive dependencies are present.
Meanwhile the learner has decided to go with a minimalistic approach to design the user
interfaces that will help input and output the data for the user.
For the system requirements, the learner used the five assumed factors to evaluate the system.
According to Figure 07, the user can easily add a new staff member to the database, or delete and modify existing ones.
One of the most dangerous risks to a database is the accidental loss of data. To avoid such situations, the learner recommends uploading the data to a cloud server as a backup.
Just like the staff data, the customer data can easily be updated or modified to avoid data inaccuracy.
The interfaces have a global search feature that allows the user to search for any record in seconds.
The customer interface in Figure 03 has a report button which will generate a
text report regarding a particular customer from his or her personal information
to the installation data.
What is SQL?
SQL, or Structured Query Language, is a database language used to access and manipulate data. SQL statements can be grouped into four sub-languages according to how they affect the database:
• Data Query Language (DQL) statements define the SELECT commands that are used to query data in the database.
• Data Definition Language (DDL) statements allow a user to create, change or remove data objects, most commonly tables, that exist in a database. In SQL, changing and removing objects are known as altering (ALTER) and dropping (DROP) respectively; a brief sketch of these statements follows this list.
Eg:
ALTER ... (All statements beginning with ALTER)
ANALYZE
ASSOCIATE STATISTICS
AUDIT
COMMENT
CREATE ... (All statements beginning with CREATE)
DISASSOCIATE STATISTICS
DROP ... (All statements beginning with DROP)
FLASHBACK ... (All statements beginning with FLASHBACK)
• Data Manipulation Language (DML) statements are commands that manipulate data objects, mostly tables, to insert, update or delete records in the database.
Eg:
CALL
DELETE
EXPLAIN PLAN
INSERT
LOCK TABLE
MERGE
SELECT
UPDATE
• Finally, Data Control Language (DCL) statements allow a user to control access to the database by granting or revoking permissions.
Eg:
GRANT
REVOKE
(oracle, n.d.)
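As an illustration only, the sketch below shows how DDL statements of this kind could be used to build one of the normalized tables from the scenario. The column definitions are assumptions based on the Staff table described earlier, so the exact definitions used in the implemented database may differ.
CREATE TABLE Staff (
    Staff_ID INT PRIMARY KEY,          -- unique identifier for each staff member
    Staff_Name VARCHAR(100) NOT NULL,  -- the staff member's name must be supplied
    Staff_Type VARCHAR(50)             -- e.g. plumber, carpenter, mason, water installer
);
ALTER TABLE Staff ADD Staff_Phone VARCHAR(20);  -- altering: add a new column to the table
DROP TABLE Staff;                               -- dropping: remove the table entirely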
Using these statements, SQL is able to create databases, create tables, delete databases and tables, insert data into tables, update data in existing tables, delete data from existing tables, retrieve data from tables, perform analytics, and grant or revoke access permissions.
An SQL query is one or more lines of code written using the SQL statements mentioned earlier. Queries are written inside IDEs that support database objects, such as Microsoft Access, Microsoft SQL Server Management Studio or DataGrip.
For the scenario the learner is expected to create a functional database using an IDE, so the learner has used Microsoft SQL Server Management Studio 18 to develop the database. The database is named "Polly pipe" and the tables can be seen in the figures below.
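For reference, a minimal sketch of how the database could be created and selected with DDL is shown below; the actual database in the report was created through the SSMS interface, so these exact statements are an assumption.
CREATE DATABASE Pollypipe;  -- create the database for the scenario
GO
USE Pollypipe;              -- make it the current database before creating tables
GO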
Figure 8 : Using the database
Figure 10 : Creating the equipment table
05. Creating the customer table
Figure 12: Creating the facility table
Figure 14 : Creating facility_equipment table
Figure 16: Creating the installation_equipment table
Figure 18 : Creating the installation_customer table
Figure 20 : After 2 insertions
Figure 22 : After deleting the record
Figure 24 : After updating the record
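For reference, a sketch of the kind of insert, delete and update statements illustrated in Figures 20 to 24 is given below. The table and column names follow the normalized Staff table and the values are hypothetical, since the actual statements appear only as screenshots.
INSERT INTO Staff (Staff_ID, Staff_Name, Staff_Type)
VALUES (123, 'John Lennon', 'Plumber');      -- insert a new record
UPDATE Staff
SET Staff_Type = 'Water Installer'
WHERE Staff_ID = 123;                        -- update a value in the existing record
DELETE FROM Staff
WHERE Staff_ID = 123;                        -- delete the record again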
Why is the security of data important when it comes to databases? As mentioned earlier, a database may be used to store information such as data for analytics or data used to make important decisions for a company. It may also contain very sensitive data such as personal information, passwords and usernames, or even a key for an important process. So, using at least one security mechanism is very important to protect the data from both physical and internal threats, as well as from data corruption. A database may use different types of security mechanisms for protection, such as authentication, database encryption methods, database backups, physical security, access controls, web application firewalls, strong passwords, etc. For the scenario the learner has decided to assign roles to some members of the staff itself to maintain the database and to hold access privileges for the company. A database may have either user-level or role-level security. User-level security is like having one common key and username for the whole company; the advantage is not having to maintain multiple accounts for each individual, but the main disadvantage is that whenever an employee resigns from the company, the company has to change the key. Role-level security, in contrast, provides unique keys and grants roles that give different levels of access to different people in the company, with everything maintained by an administrator.
For the scenario provided, the learner has decided to grant access permissions to members working in Polly Pipe. First the learner had to log in as the system administrator (sa) to grant permissions and create new logins on the server. After doing so, the learner used T-SQL scripts to create four new roles: "db_admin", "db_security", "db_readonly" and "db_writeonly". Both the admin and the security roles have permissions to alter, create, perform shutdowns and even create new SQL connections with new data tables if needed, but the read-only role only has permission to read the data, while the write-only role can be used by an employee such as a data entry operator who has permission to read and write data to the database. The learner has also created a user called "super_user", who has all permissions granted and can be used in an emergency situation. All the evidence of the security mechanism can be seen in the figures below, and the privileges each user has are shown in Figure 27.
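A simplified sketch of how such logins, roles and permissions could be created with T-SQL is given below. The login name and password are placeholders, and the actual scripts used (shown in Figures 26 to 32) may differ.
CREATE LOGIN data_entry_login WITH PASSWORD = 'PlaceholderPassword!1';  -- server-level login (placeholder credentials)
USE Pollypipe;
CREATE USER data_entry_user FOR LOGIN data_entry_login;  -- database user mapped to the login
CREATE ROLE db_writeonly;                                -- role for data entry operators
GRANT SELECT, INSERT, UPDATE TO db_writeonly;            -- DCL: grant database-level permissions to the role
ALTER ROLE db_writeonly ADD MEMBER data_entry_user;      -- assign the user to the role
REVOKE UPDATE FROM db_writeonly;                         -- DCL: permissions can also be withdrawn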
Figure 26 : Logging in with the admin account
Figure 28 : https://docs.microsoft.com/en-us/sql/relational-databases/security/authentication-access/database-level-roles?view=sql-server-ver16
Figure 31 : creating writers’ login
After creating the logins, the roles were given permissions from the properties section
for each profile.
Figure 32 : After adding all the logins into the pollypipe database
Evaluating the effectiveness, security, user and system requirements and maintenance
The Polly Pipe company database solution was implemented using Microsoft SQL Server Management Studio 18 and maintains all the customer, installation and facility records. This implementation covers all the information required by the client. After normalization, some data that naturally belongs together, such as equipment and staff members, had to be split out into new tables to maintain atomicity, so hardly any redundancy remains; it may, however, take more time to search for records manually. From the perspective of cost, this solution costs nothing at all, as the software used in the process is freeware.
After normalization some of the tables have been split up, producing more tables than would exist in a single-table design; this can be time-consuming when searching for, inserting, updating or deleting a single record. However, the records are cleaner, as deletion, update and insertion anomalies can no longer occur, so the database can be managed without frustration. The interfaces were made in a minimalistic manner using the Visual Studio IDE; they are kept simple so that the data can be sorted later and the database can be accessed without any collision of data.
The database is mainly operated by three personnel: the administrator, the security admin and a super user, the last of which can be helpful during an emergency. The remaining employees are categorised as read-only and write-only users. The data could be secured using an encryption key, but since the data is not very sensitive the learner has used user logins and assigned roles, as shown in Figure 27; as an additional security measure the database can be backed up to an external server if needed.
The database fulfils the user requirements, as the solution is fully functional and records can be stored and altered by the database operators. The database can be accessed via the interfaces created and even has additional features such as a global search mechanism. The data is secured using a user-role permission system. The system requirements are fulfilled as well: maintenance is easy because the database occupies little storage on the system, it can be backed up easily, and it could be accessed from anywhere if hosted on a cloud facility.
Evidence of data extractions from the database for a given set of keywords.
For this the learner has used the “installations table” as it has a variety of columns.
Figure 33 : All the records before extractions
Figure 36 : Where query
Figure 40 : Between query
Figure 42 : Group BY query
To query and alter database data, one uses the SQL data manipulation language (DML).
The SELECT, INSERT, UPDATE, and DELETE SQL DML command statements are
described below.
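To show how the keywords demonstrated in Figures 33 to 42 read in practice, a sketch against the installation table is given below. The table and column names (Installation, Installation_Type, Start_date, Address and so on) are assumptions based on the normalized design shown earlier, so the actual queries in the figures may differ slightly.
SELECT * FROM Installation
WHERE Installation_Type = 'Aquarium';                     -- SELECT with WHERE
UPDATE Installation
SET Address = '12 High Street, Braintree'
WHERE Installation_ID = 1;                                -- UPDATE
SELECT * FROM Installation
WHERE Start_date BETWEEN '2022-01-01' AND '2022-06-30';   -- BETWEEN
SELECT * FROM Installation
WHERE Installation_Type IN ('Aquarium', 'Pond');          -- IN
SELECT Installation_Type, COUNT(*) AS Total
FROM Installation
GROUP BY Installation_Type                                -- GROUP BY
HAVING COUNT(*) > 1                                       -- HAVING
ORDER BY Total DESC;                                      -- ORDER BY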
The strategy, goals, timetable, estimation, deliverables, and resources needed to carry out
testing on a software product are all described in detail in a test plan. The test plan aids
in estimating the amount of work required to verify the application's quality. The test
manager carefully monitors and controls every aspect of the test plan to ensure that
software testing activities are carried out according to a specified methodology. Testing for the database scenario was done using five criteria: valid connection check, data type validation, input validation, data integrity (insert, update, delete, keys), and backup & restore.
Table 5: Test cases
Test Case ID | Main Objective | Required Steps | Input data | Output data | Expected Output | Final Status
TC4 | Data input validation: Not null constraints | Leave a record with empty fields | A record with empty fields | Error about primary key constraints | Error about primary key constraints | Pass
TC5 | Data integrity: Insertions | Insert a record into a table using an insert query with complete columns | A normal record | "Query executed successfully" message | "Query executed successfully" message | Pass
Seven out of the eight test cases were a pass. The failed test could have been a pass if the learner had logged in from a different server or a different database.
The effectiveness of the test cases and the reasoning for the choices
Database Connection :
The effectiveness of this test case depends on its capacity to guarantee that the application's "connect to server" module performs correctly. By purposefully inputting wrong credentials for SQL authentication, this test replicates a scenario where an unauthorized person tries to access the database. The importance of this test cannot be overstated, as a successful connection with wrong credentials might constitute a severe
security risk, potentially allowing unauthorized access to critical information. This test
case acts as a vital gatekeeper, ensuring the integrity and security of the database by
evaluating the system's capacity to reject incorrect authentication attempts. Successfully
passing this test offers assurance that the application has effective security mechanisms
in place to defend against unwanted access.
This test case focuses on the effectiveness of the system in managing data type
validation. Attempting to introduce data of the wrong type helps detect whether the system can appropriately recognize and reject improper data types. The value of this test rests in
preserving data correctness and integrity inside the database. If the system fails to check
data types appropriately, it might lead to data corruption, impacting the dependability of
the information stored. Successful completion of this test gives confidence that the
system is able to handle varied data types, lowering the risk of data anomalies and
ensuring the database's overall dependability.
This test case analyzes the effectiveness of the system in imposing unique key
restrictions. By attempting to add two records with the same ID key, the test replicates a
scenario where duplicate entries may jeopardize the uniqueness of primary keys. The
relevance of this test is clear in preventing data redundancy and ensuring the integrity of
the database structure. If the system enables the entry of duplicate key values, it might
lead to confusion, mistakes, and threaten the accuracy of the recorded data. Successfully
passing this test indicates that the system is resilient in enforcing unique constraints,
hence protecting the database's structural integrity.
This test case focuses on the effectiveness of the system in enforcing not null
requirements. Leaving a record with empty fields is an intentional attempt to check
whether the system successfully recognizes and rejects incomplete data submissions.
The value of this test is in assuring the completeness and correctness of the recorded
data. If the system accepts entries with empty necessary fields, it might lead to data
inconsistency and jeopardize the trustworthiness of the database. Successfully passing
this test indicates that the system appropriately enforces not null requirements, ensuring
that vital data is always present and correct.
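By way of illustration, the sketch below shows the kind of statements that would trigger the unique key and not null failures described above; the values are hypothetical and assume the Staff_Name column is declared NOT NULL, while the real test inputs appear in the TC3 and TC4 screenshots.
-- duplicate primary key: the second insert should be rejected
INSERT INTO Staff (Staff_ID, Staff_Name, Staff_Type) VALUES (123, 'John Lennon', 'Plumber');
INSERT INTO Staff (Staff_ID, Staff_Name, Staff_Type) VALUES (123, 'Paul Smith', 'Mason');
-- missing required value: should be rejected by the NOT NULL constraint
INSERT INTO Staff (Staff_ID, Staff_Name, Staff_Type) VALUES (124, NULL, 'Carpenter');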
This test case assesses the system's ability to preserve data integrity throughout insert
operations. By entering a record into a table using an insert query, the test tries to
validate that the system successfully adds data without breaking any integrity
requirements. The value of this test is clear in ensuring that new data is effectively
incorporated without damaging the existing database structure. Successful completion of
this test offers assurance that the system can manage insertions without adding mistakes
or inconsistencies, hence protecting the overall integrity of the database.
This test case tests the system's capabilities to maintain data integrity throughout delete
operations. By removing a record from a table via a query, the test checks if the system
successfully removes data without producing undesired side effects. The value of this
test is in verifying that deletions do not damage the relational structure of the database.
If not managed effectively, deletions might lead to orphaned entries or broken
relationships, weakening the database's integrity. Successful completion of this test
demonstrates that the system can conduct deletions without affecting the overall
coherence and consistency of the database.
Data Backup :
The success of this test case is vital for defending against data loss. By initiating a
backup via the prescribed procedures, the test analyzes the system's capacity to make a
trustworthy copy of the database. The value of this test is clear in disaster recovery
scenarios, when a backup acts as a key resource for restoring lost or damaged data.
Successful execution of this test offers confidence that the backup process is functioning
and that the system can rapidly produce a snapshot of the database, limiting the risk of
data loss in unanticipated circumstances.
Data Restoring :
This test case focuses on the system's capacity to recover data from a backup file,
guaranteeing business continuity in the face of data loss. By following the given
procedures to restore a database from a ".bak" file, the test analyzes the system's
efficiency in recovering data. The relevance of this test cannot be overstated, since the
ability to recover data is a vital component of a strong data management strategy.
Successful completion of this test assures that the system can rapidly and accurately
recover from data loss, minimizing downtime and ensuring the continuity of company
activities.
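For reference, a sketch of the backup and restore commands behind these two test cases is shown below; the file path is a placeholder, and in the report the steps were carried out through the SSMS interface rather than by script.
BACKUP DATABASE Pollypipe
TO DISK = 'C:\Backups\Pollypipe.bak';       -- create a full backup (placeholder path)
RESTORE DATABASE Pollypipe
FROM DISK = 'C:\Backups\Pollypipe.bak'
WITH REPLACE;                               -- restore the database from the .bak file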
To resolve the “Fail” issue observed in the data restoration test case for PollyPipe, it is
necessary to analyze the root cause of the failure and adopt remedial actions. First and
foremost, a comprehensive analysis of the database restoration process should be done
to discover any anomalies or mistakes in the processes indicated. This includes checking
the "Right click on databases and click restore database and locate the “.bak” file"
approach to confirm that it matches with the desired workflow. Additionally, the
integrity of the backup files (.bak) must be validated to assure that they are full and
uncorrupted. If any differences are identified, correcting the backup creation procedure
is critical. Furthermore, it is important to run test restores in a controlled environment to
replicate various circumstances and validate the system's capability to recover data
successfully. Addressing any faults in the restoration process quickly and adopting
rigorous testing standards will strengthen the dependability of the data restoring
capabilities for PollyPipe, providing a rapid and efficient recovery in the case of data
loss.
In the thorough test scenario applied for PollyPipe, the majority of the test cases
generated successful outcomes, with seven out of eight tests passing successfully. The
tests addressed crucial areas of database administration, including valid connection
checks, data type validation, enforcement of unique keys, validation of not null
constraints, data integrity during insertions and deletions, and the basic tasks of data
backup and restoration. The successful execution of these tests is indicative of
PollyPipe's strong database solution, ensuring safe access, accurate data representation,
and structural integrity. However, the lone failure in the "Data Input Validation: Not Null
Constraints" test raises questions about the system's capacity to consistently enforce
necessary field requirements, potentially creating a danger to data completeness and
correctness. For a water sports firm such as PollyPipe, the effectiveness of these tests is vital in maintaining a secure and trustworthy database, which is crucial for managing client information,
equipment inventory, and operating data. The good outcomes strengthen the general
reliability of the database solution, stressing its relevance in allowing smooth business
operations and maintaining data integrity for PollyPipe. Addressing the failed test is vital
to reinforce the system's resilience and ensure full data validation in all elements of its
functioning.
Figure 45 : TC1
Figure 46 : TC2
Figure 47 : TC3
Figure 48 : TC4
Figure 49 : TC5
Figure 50 : TC6
Figure 51 : TC7
Figure 52 : TC8
The database testing was done based on 5 criteria as mentioned in the above figures.
▪ Check valid connection
▪ Data type Validation
▪ Input Validation
▪ Data Integrity – Insert, update, delete, Keys
▪ Back up & restore
“A SqlConnection object represents a unique session to a SQL Server data source. With
a client/server database system, it is equivalent to a network connection to the server.
SqlConnection is used together with SqlDataAdapter and SqlCommand to increase
performance when connecting to a Microsoft SQL Server database.” (Microsoft, n.d.)
Without a valid connection from the connection portal, access to the database is not allowed unless it is authorized by the Windows user. This was therefore used as a security mechanism and was tested as a test case with a dummy account, as seen in the TC1 figure above.
II. Data type and input validation
Data type validation is the component of a database that maintains data consistency when using SQL. Check, unique, not null and primary key constraints are the four basic categories of constraints in SQL. Check constraints are used to ensure that a statement about the data is accurate for every row in a table. Thanks to the unique constraint, no two rows will ever have identical values in the constrained columns. A column's not null constraint indicates that the column must contain data by stating that it cannot be empty; a not null constraint in SQL, however, may only be applied to a single column. Last but not least, the primary key constraint combines the unique constraint with the not null constraint, so no two rows may have the same primary key, as shown in TC2, TC3 and TC4.
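A brief sketch of how these constraint types look together in a table definition is given below; the table and columns are hypothetical and serve only to illustrate the check, unique, not null and primary key constraints.
CREATE TABLE Customer_Sample (
    Customer_ID INT PRIMARY KEY,                 -- primary key: unique and not null
    Customer_Email VARCHAR(100) UNIQUE,          -- unique: no two rows may share a value
    Customer_Name VARCHAR(100) NOT NULL,         -- not null: a value must be supplied
    Customer_Age INT CHECK (Customer_Age >= 18)  -- check: the condition must hold for every row
);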
The most current values and status of shared data should be displayed on all forms and
screens for each CRUD operation. On one screen, the value shouldn't be changed while
showing an older value on another.
The end-user mostly uses the "CRUD" actions made possible by the DB Tool while the
program is running.
• C: Create – when a user 'Saves' a new record, the 'Create' (insert) operation of the DB is performed.
• R: Read/Retrieve – when a user 'Searches' or 'Views' a record, the 'Read' operation of the DB is performed.
• U: Update – when a user 'Edits' an existing record, the 'Update' operation of the DB is performed.
• D: Delete – when a user 'Removes' any record from the system, the 'Delete' operation of the DB is performed. (Anon., n.d.)
Since most of the keywords are shown in earlier figures, the learner decided to use one action from the CRUD set, the insert query, as seen in TC5.
“Backups keep your important files safe and secure from data loss. You can also encrypt
the backup file or the storage media for added security" (Anon., n.d.). The steps for backing up and restoring a database can be found on Microsoft's official website.
Evaluating the database solution regarding the user and system requirements
The user needed a database solution that supports inserting, creating, deleting and updating the information from the table samples provided earlier, and that stores the data behind a security mechanism. The database was given roles and special users to control access with limited permissions. Inside the database are the following tables: the customer table, staff table, equipment table, installation table, installation_customer table, installation_staff table, installation_equipment table, facility table, facility_staff table and facility_equipment table. These tables allow the database operator to input or extract accurate information regarding an installation. Any customer can request multiple installations without an issue, and a facility can be of different types and can contain additional employees and equipment. On the system requirements side, the database has a security mechanism as mentioned earlier. The system is compatible with older versions of SSMS, uses little storage capacity, has good security features and is reliable.
As suggestions, the learner thinks that the database should have a minimal number of tables, and that security could have been better with additional features such as encryption keys and permission certificates ("A certificate is a database-level securable contained by the database that is its parent in the permissions hierarchy" – Microsoft). The database could also have been built with a newer version of SSMS to avoid bugs or any risk to the database, and more advanced testing could have been done to avoid further problems with the database.
Gathering feedback for the implemented database solution
In order to gather feedback, the learner used Google Forms to create a survey about the database solution.
Why was a survey used? A survey is one of the most popular methods of collecting data or information regarding a product or a social standard.
• Surveys do not cost much money. Particularly for online and mobile surveys, the cost per respondent is quite low; it is frequently far lower than the cost of delivering a paper survey or a phone poll, even when incentives are provided to respondents, and the number of possible responses can reach thousands.
• A huge population's characteristics can be described through surveys. No other
research methodology can offer such a wide range of capabilities, which guarantees
a more accurate sample to collect focused findings from which to draw conclusions
and make significant judgments.
• Online surveys, email surveys, social media surveys, paper surveys, mobile
surveys, telephone surveys, and face-to-face interview surveys are just a few of the
various ways that surveys may be administered.
The learner used two of his colleagues as non-technical users to gain feedback about the database and to learn more about future improvements that could increase the performance of the system.
Below are the feedback survey results of the two colleagues, Mr. Mishayel and Mr. Gihan. Both were given the operation manual and guide to read, were shown the normalized table system, tested the system by running some SQL queries, checked the interfaces that the learner developed, and provided feedback about the database system.
The Feedback results of Mr. Mishayel
Figure 54: Feedback form page 2 of Mr. Mishayel
Figure 55: Feedback form page 3 of Mr. Mishayel
Figure 58: Feedback form page 1 of Mr. Gihan
Figure 59: Feedback form page 2 of Mr. Gihan
Figure 60 : Feedback form page 3 of Mr. Gihan
Future improvements to the database
As the developer of the database solution, the learner realised that it lacks security, since the database relies only on a username and a password, and that it needs more improvement in terms of data entry and table design. The learner therefore hopes to upgrade the database by adding encryption-key security and, based on the feedback from non-technical users, by eliminating the additional time-consuming tables to improve the efficiency of the database. The learner also hopes to add an automated backup system so the user does not have to worry about data loss. Just like the global search system, the learner also plans to add a global report feature (to generate reports for a specific record).
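As an indication of how the planned encryption improvement could look, a sketch using SQL Server's transparent data encryption is shown below; the certificate name and password are placeholders and this is not part of the implemented solution.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'PlaceholderPassword!1';  -- master key in the master database
CREATE CERTIFICATE PollypipeTDECert WITH SUBJECT = 'Pollypipe TDE certificate';
USE Pollypipe;
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE PollypipeTDECert;
ALTER DATABASE Pollypipe SET ENCRYPTION ON;  -- enable transparent data encryption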
References
Annexures
Figure 62: Class diagram for the system
Figure 67 : Level 1 Data flow diagram for the system