
Chapter 3

Database Testing
Database testing involves checking the integrity, accuracy, and reliability of data in a database system. It is an integral part of software testing and quality assurance, ensuring that application data is stored appropriately and retrieved in a consistent, accurate manner. It covers several aspects, including data validity, data integrity, performance, and security, and involves executing a series of tests and checks to confirm that the database functions as expected, data is stored correctly, and queries produce accurate results. The primary goal of database testing is to identify and address defects, inconsistencies, and other issues early, so that the database remains reliable, effective, and performant.

Importance of Database Testing:


Data Accuracy: It helps ensure that data is stored, retrieved, and manipulated correctly.
Data Integrity: Testing prevents data corruption, duplication, and other integrity issues, which
could lead to incorrect decisions.
System Reliability: Thorough testing helps identify potential issues early and ensures the
database system’s stability.


Performance Optimization: By identifying performance bottlenecks and inefficiencies, database testing helps improve the system’s responsiveness and scalability.
Compliance and Security: Rigorous testing ensures that sensitive data is protected and that the
database system complies with relevant regulations.
Cost Savings: Detecting and addressing issues early in the development cycle is more cost-
effective than dealing with them after deployment.
User Experience: Accurate data and efficient query performance contribute to a positive user
experience.
Types of Database Testing
There are several types of database testing, each aimed at a different aspect of the database’s functionality. These include:
Structural Testing: This includes testing of stored procedures, triggers, views, and the schema itself.
Functional Testing: This verifies that the operations performed by the database, such as insert, delete, update, and retrieve, function as expected.
Non-Functional Testing: This includes testing for performance, stress, compatibility, and
security of the database to ensure that it can handle data, users, and queries effectively under
various conditions.
Boundary Testing: Here, the database’s reaction to boundary values in the input domain is
tested, focusing on the limit conditions within the software, database, or among specific
partitions.
Regression Testing: This involves testing the database after modifications have been made to
ensure that previous functionality still works as expected.
Data Migration Testing: This testing ensures that the migration of data from the old system to
the new system was successful while maintaining data integrity.
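As an illustrative sketch of functional testing, the insert/retrieve/update/delete checks described above can be exercised against a throwaway database. The table name and values below are assumptions for demonstration, not part of the chapter:

```python
import sqlite3

# An in-memory database stands in for the system under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

# Insert: the row should be stored exactly as written.
conn.execute("INSERT INTO users (id, name) VALUES (1, 'alice')")

# Retrieve: the query should return the inserted value.
retrieved = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0]

# Update: the change should be visible on the next read.
conn.execute("UPDATE users SET name = 'bob' WHERE id = 1")
updated = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0]

# Delete: the row should no longer be found.
conn.execute("DELETE FROM users WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

Each operation is verified by reading the data back, rather than trusting that the statement succeeded.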
Benefits of Database Testing
Data Accuracy: Ensuring that data is stored, retrieved, and manipulated accurately.
Data Integrity: Preventing data corruption, duplication, and inconsistencies.
Early Issue Detection: Identifies defects and issues in the early stages of development, reducing
the cost and effort of fixing problems later.
Improved System Reliability: Thorough testing minimizes the likelihood of database failures,
ensuring the stability and availability of the system for users.


Enhanced Performance: Performance testing uncovers bottlenecks, allowing optimization of query execution and overall system responsiveness.
Security Enhancement: Security testing helps uncover vulnerabilities, ensuring that sensitive
data is protected.
Positive User Experience: Accurate data and efficient query performance result in a seamless
user experience.
Cost Savings: Early issue detection and prevention through testing reduce the costs associated
with fixing problems post-deployment.
Compliance Adherence: Helps ensure that the database system complies with relevant industry
regulations and standards.
Efficient Development: Provides developers with confidence that changes to the database
schema or code do not introduce unintended side effects.
Optimized Resource Utilization: Identifying performance bottlenecks and optimizing queries
can lead to more efficient resource utilization.
Risk Mitigation: Reduces the risk of critical data loss, system failures, and security breaches.
Supports Continuous Integration/Delivery: Integrates seamlessly into automated build and
deployment pipelines, enabling faster and more reliable release cycles.
Cross-System Compatibility: Ensures the database functions correctly across different
environments and configurations.
Stakeholder Confidence: Instills confidence in stakeholders, including users, management, and
investors, about the system’s reliability and quality.


How to Perform Database Testing


Understanding Requirements: The first step is to gain a thorough understanding of the
database’s requirements, including its structure, features, and expected performance.
Creating a Test Plan: Based on the requirements, a test plan should be developed that outlines
what aspects of the database will be tested, the testing methods that will be used, and the
expected outcomes.
Setting Up a Test Environment: A separate test environment should be set up that mirrors the
production environment as closely as possible.
Test Execution: The tests outlined in the test plan are then executed. This could include running
scripts to test the database’s functionality, injecting test data to test its handling of data, and
monitoring its performance under load.
Analyze Test Results: After the tests have been executed, the results should be analyzed to
identify any discrepancies between the expected and actual results.
Report Findings: Any issues or defects identified should be reported to the development team
for resolution.
Retesting and Regression Testing: Once the identified issues have been addressed, the tests
should be re-run to verify the fixes. It’s also important to perform regression testing to ensure
that no new issues have been introduced.
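The plan–execute–analyze steps above can be sketched as data-driven checks, where each query in the test plan is paired with its expected result and discrepancies are collected for reporting. The schema, queries, and expected values are hypothetical:

```python
import sqlite3

def run_test(conn, sql, expected):
    """Execute a query and compare actual rows against the expected result."""
    actual = conn.execute(sql).fetchall()
    return {"sql": sql, "passed": actual == expected,
            "expected": expected, "actual": actual}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

# The test plan as data: each entry pairs a query with its expected outcome.
plan = [
    ("SELECT COUNT(*) FROM orders", [(2,)]),
    ("SELECT total FROM orders WHERE id = 2", [(20.0,)]),
]
results = [run_test(conn, sql, exp) for sql, exp in plan]

# Any mismatch between expected and actual is a finding to report.
failures = [r for r in results if not r["passed"]]
```

Keeping the plan as data makes re-running it after fixes (retesting and regression testing) trivial.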
Tools for Database Testing
We can categorize the tools on the basis of their functionalities.


Database Unit Testing

- DbUnit
- tSQLt
- utPLSQL

Database Functional and Regression Testing

- Selenium
- TestNG

Database Performance Testing

- Apache JMeter
- HammerDB

Database Security Testing

- SQLMap
- AppScan
- Netsparker

Database Management Systems with Built-in Testing Tools

- SQL Server Data Tools (SSDT)
- Oracle SQL Developer

General Testing Frameworks with Database Integration

- JUnit
- PyTest
Common Challenges in Database Testing
Database testing, while essential, does present its own set of challenges. Some of these include:
Complexity of Data: Databases can contain a vast amount of complex data, making it difficult to
validate every piece of data effectively.
Lack of SQL Knowledge: SQL is the language of databases, and a lack of proficiency in SQL
can hinder effective database testing.
Data Privacy: Testing often requires the use of real data, which could raise privacy and
confidentiality issues.
Scalability: Large databases can be difficult to manage and test efficiently.
Synchronization Issues: If data is not properly synchronized between different database
environments, it can lead to inconsistencies and errors during testing.
Over-Reliance on GUI: Graphical user interface (GUI) based tools can sometimes mask underlying database issues that direct SQL queries would have detected.


Handling Test Data: Managing, generating, and cleaning up test data is a significant challenge
in database testing.
Performance Testing: Simulating real-world loads for performance testing can be difficult and
resource-intensive.
Data Migration Testing: Ensuring data integrity during migration between different database
systems or versions can prove challenging.
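One common way to tame the test-data challenge above is to pair every test with explicit setup and teardown of a known fixture, so each run starts clean and leaves nothing behind. This minimal sketch uses an in-memory SQLite database; the table and rows are illustrative assumptions:

```python
import sqlite3

def setup_test_data(conn):
    """Load a known fixture so every test run starts from the same state."""
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, email TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "a@test.example"), (2, "b@test.example")])

def teardown_test_data(conn):
    """Clean up so leftover rows cannot contaminate the next test."""
    conn.execute("DROP TABLE IF EXISTS customers")

conn = sqlite3.connect(":memory:")
setup_test_data(conn)
count_during = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
teardown_test_data(conn)

# After teardown, the fixture table should be gone entirely.
tables_after = conn.execute(
    "SELECT name FROM sqlite_master WHERE name = 'customers'").fetchall()
```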

Database Security
Database security consists of the controls organizations implement to prevent unauthorized
access or data breaches from database files, database management systems (DBMS), and
connected systems. Security controls consist of architecture techniques, application design,
procedures, processes, and tools that make the data more difficult to access and use.

Database security done poorly harms operational efficiency, application performance, and user experience. Security must be balanced against operational needs, with the goal of reducing risk to an acceptable level while maintaining usability.

Database security best practices and controls apply specifically to databases. However, databases
do not exist in pure isolation, so organizations must also defend the broader ecosystem. To be
adequately defended, effective database security also requires the implementation of more
general security best practices applied to related systems.

Database Security Best Practices


To protect a database, it must reside in a secured environment, protected by its own perimeter
security, and accessed by secured users. These practices specifically secure databases and
database data.

1. Separate Database Servers


By definition, web servers must be publicly accessible to be used, but this also makes them a primary target for attack. A successful attack may grant an attacker access to the server hosting the website or application, and with it access to anything else hosted on that server.

Databases should be segregated to a separate container, physical server, or virtual server to allow
for additional hardening and to prevent access if the website or application is breached. Only the
required ports on the separate server should be opened and, where possible, an organization
should change the default communication ports to make attacks more challenging to execute.

Some recommend setting up an HTTPS proxy server between the database and the queries, but
separating the web server and the database server functionally achieves the same result.
However, a proxy server may be beneficial for internal-network databases that may be queried
directly by authorized network users or devices.


To further secure the database, consider placing the database server on a separate physical or
virtual network segment, with highly restricted access privileges. Microsegmentation of this type
can prevent attackers that have gained more general network access from easily moving laterally
to access a database server that might not appear on a compromised user’s network.

2. Use Database Firewalls


Databases are only useful if they can be accessed, but that access must be protected. The first layer of defense comes from database-specific firewalls that deny access by default. The only traffic allowed through the firewall should come from the specific applications, web servers, or users that need the data, and unless there is a specific need, the firewall should also prevent the database from initiating outbound connections.

Direct access to the database should be limited or denied if the use-case will allow it. Firewall
rule changes must be controlled by change management procedures and trigger alerts for security
monitoring.

Organizations can deploy specialized database tools that include special firewalls such as
the Oracle Audit Vault and Database Firewall, dedicated physical or virtual next generation
firewall (NGFW), or web application firewall (WAF) solutions. Organizations with more limited
resources may simply deploy a hardened version of the database server’s operating system
firewall.

3. Secure Database User Access


The least possible number of users, applications, and application programming interfaces (APIs) should access the database. Any access should be granted only after network or application authorization and, even then, all access should follow the principle of least privilege and be granted for the least time possible. This best practice can be broken down into three subcategories: user authorization, privileged access, and development and operations (DevOps) use of databases.

User Authorization
Access control to the database is managed by the system administrator, or admin. The admin grants permissions by defining roles and adding user accounts to those database roles. For example, row-level security (RLS) restricts read and write access to rows of data based on a user’s identity, role memberships, or query execution context.

Specialized database security solutions may allow for centralized management of identities and
permissions, minimize password storage, and enable password rotation policies. Comprehensive
access management might not be practical for smaller organizations, but it remains important to
manage permissions via roles or groups instead of individual users.

Administrators also need to harden the database access rules:

- Null passwords should not be allowed
- Temporary installation files that may contain passwords should be deleted
- Default accounts should be deleted if not needed, or their passwords changed from the default settings
- Unique IDs should be required for all users for tracking and logging
- Users and applications should use separate accounts
- Inactive users should be disabled or deleted on a schedule
- Elevated database privileges should be logged, reported, and potentially generate security alerts
- User groups and access rights should be reviewed on a periodic basis
- Accounts should automatically lock after a number of failed logins, commonly recommended as six failed attempts
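The lockout rule in the last item above can be sketched as follows; the six-attempt threshold comes from the list, while the class and method names are hypothetical:

```python
MAX_FAILED_LOGINS = 6  # lockout threshold suggested in the text

class Account:
    def __init__(self, username):
        self.username = username
        self.failed_attempts = 0
        self.locked = False

    def record_failed_login(self):
        """Count a failed attempt and lock the account at the threshold."""
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILED_LOGINS:
            self.locked = True

    def record_successful_login(self):
        """A successful login on an unlocked account resets the counter."""
        if not self.locked:
            self.failed_attempts = 0

acct = Account("analyst")
for _ in range(5):
    acct.record_failed_login()
still_open = not acct.locked       # five failures: not yet locked
acct.record_failed_login()         # sixth failure triggers the lock
```

In a real system the counter and lock state would live in the database and generate a security alert, per the list above.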

Privileged Access
Admins should have only the bare minimum privileges needed to perform required tasks, and only for the duration they specifically need access. Privileged access should be granted temporarily and revoked as soon as it is no longer needed. Larger organizations automate access management using privileged access management (PAM) software, which provides authorized users with a temporary password, logs their activities, and prevents password sharing.

DevOps Database Use


Although not typically considered users, DevOps teams need to create test environments to
verify that applications can access and use databases correctly. Unfortunately, using live or
production database data often leads to accidental data leaks.

To avoid issues, DevOps should use the following practices:

- Sensitive data should be limited to the production environment
- Test environments should be physically and logically separated from production environments
- Test environments should use separate roles and permissions from production environments
- Developers should not get access to production environments unless absolutely necessary
- Test environments should never contain real production data; synthetic or anonymized datasets should be used instead
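One way to produce the synthetic or anonymized datasets mentioned in the last item is to replace direct identifiers with deterministic surrogates. This is a sketch of the idea, not a complete anonymization scheme; the field names and values are invented:

```python
import hashlib

def anonymize_row(row):
    """Replace direct identifiers with deterministic surrogates.

    Hashing keeps referential consistency (the same input always maps to
    the same surrogate) without exposing the original value.
    """
    digest = hashlib.sha256(row["email"].encode()).hexdigest()[:12]
    return {"id": row["id"],
            "email": f"user_{digest}@example.invalid",
            "name": f"Test User {row['id']}"}

production_rows = [
    {"id": 1, "email": "jane@corp.example", "name": "Jane Doe"},
]
test_rows = [anonymize_row(r) for r in production_rows]
```

Note that plain hashing of low-entropy fields can still be reversed by guessing; production-grade anonymization needs stronger measures.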

4. Harden The Database


Just as the server must be hardened, the database should also be hardened to prevent simple
attacks and exploits.

Database hardening varies according to the type of database platform, but the common steps
include strengthening password protection and access controls, securing network traffic, and
encrypting sensitive fields in the database.

All unused or unnecessary services or functions of the database should be removed or turned off
to prevent unrecognized exploitation.


All database security controls provided by the database should be enabled. Some will be enabled by default, and others may have specific reasons to be disabled, but each should be evaluated and the reasoning for any disabled control documented. Where possible, admins should enable row-level security and dynamic data masking for sensitive data.

DevOps should design the database so that sensitive information remains in segregated tables. Admins should also continuously audit the data to discover newly added sensitive data and determine whether the segregated tables need to be modified or additional security applied. Some regulatory or compliance standards mandate specific data-discovery requirements that must be implemented, followed, and documented to prove compliance.
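The masking of sensitive fields mentioned above can be sketched as below. This is a simplified illustration of the masking idea, not a replacement for a DBMS’s built-in dynamic data masking; the field format is an assumption:

```python
def mask_card_number(card_number):
    """Show only the last four digits, as data masking typically would."""
    digits = card_number.replace(" ", "").replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

masked = mask_card_number("4111 1111 1111 1234")
```

The masked form is still useful for display and support workflows, while the full number never leaves the segregated table.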

5. Audit And Continuously Monitor Database Activity


DevOps designs with certain expectations, but after integrating databases with applications and
moving them into a production environment, some unexpected access, user queries, or data
behavior may occur. Admins need to continuously monitor and audit database logs, data, and
activity including:

- User login logs, especially attempted and failed logins
- Locked accounts (from excessive failed login attempts)
- Database privilege escalation
- Database data extraction, copying, or deletion (particularly large-scale changes or extractions)
- Access to sensitive or regulated data (monitoring may be required for compliance)
- New account creation

Audits can often detect anomalous activity, and security teams can configure alerts on critical events, either directly or through security information and event management (SIEM) tools. Database activity monitoring (DAM) and file integrity monitoring software can provide specialized security alerts independent of the database’s native logging and audit functions.
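The failed-login monitoring described above can be expressed as a simple aggregate query over an audit table. The table layout and the alert threshold here are assumptions for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE login_log (username TEXT, success INTEGER)")
conn.executemany("INSERT INTO login_log VALUES (?, ?)", [
    ("alice", 1), ("mallory", 0), ("mallory", 0),
    ("mallory", 0), ("bob", 1), ("bob", 0),
])

# Flag accounts whose failed-login count crosses an alert threshold.
ALERT_THRESHOLD = 3
suspicious = conn.execute("""
    SELECT username, COUNT(*) AS failures
    FROM login_log
    WHERE success = 0
    GROUP BY username
    HAVING COUNT(*) >= ?
""", (ALERT_THRESHOLD,)).fetchall()
```

In practice the query would run on a schedule (or the DAM tool would watch the log continuously) and feed the SIEM.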

6. Test Your Database Security


Although audits can catch malicious activity in progress, organizations should not wait for
attacks to test their database deployments. Database vendors should be monitored for updates
and patch management processes should update databases with minimal delay.

Yet patching only addresses publicly announced vulnerabilities. Some database vendors will
offer security and configuration testing tools, such as Oracle’s Database Security Assessment
Tool, that can help identify risks. However, these tools should not be assumed to provide 100%
assurance and should be complemented by subsequent testing using vulnerability
scans and penetration tests that simulate potential attacks to expose misconfigurations,
inadvertently accessible data, and other issues.

7. Database Data Best Practices


Databases structure data, but the data contained within the database also needs to be protected.
The first step requires an organization to store only the protected data required for the business function. Eliminating excessive data or purging unnecessary historical information can minimize risk exposure.

Next, the data must be intentionally controlled. Redundancy of protected data should be
eliminated throughout the system, and shadowing of protected data outside the system of record
must be avoided wherever possible. Hashing functions can be applied to protected data elements
before storing data required for matching purposes outside of the system. Wherever possible,
protected data such as health information or credit card numbers should be dissociated from
personally identifiable information (PII).
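Hashing protected data for matching, as described above, is often done with a keyed hash (HMAC) so the stored token cannot be reversed or brute-forced without the key. This is a minimal sketch; the key value and the field being hashed are placeholders:

```python
import hashlib
import hmac

# A secret key ("pepper") held outside the data store; value is illustrative.
PEPPER = b"example-secret-key"

def match_token(value):
    """Keyed hash of a protected value, usable for equality matching only."""
    return hmac.new(PEPPER, value.lower().encode(), hashlib.sha256).hexdigest()

stored = match_token("555-12-3456")            # token stored outside the system of record
lookup_hit = match_token("555-12-3456") == stored
lookup_miss = match_token("555-99-9999") == stored
```

The token supports "is this the same value?" checks without ever storing the value itself outside the system of record.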

Encryption should also be pursued for extra protection. Many vendors provide solutions to
encrypt data at rest, data in transit, or even data in use. Within the database, DevOps can encrypt
or use data masking to obscure data within tables. Some encryption tools even allow for data to
be processed and searched without decryption so that the data always remains encrypted and
protected.

Some cloud vendors encrypt stored data by default (Oracle, for example) or provide encryption key management tools (such as Microsoft’s Azure Key Vault). However, organizations themselves bear the responsibility for ensuring adequate protection throughout the data storage and data transfer process.

7 Related System Security Best Practices


The following practices do not apply specifically to databases, so they are not solely components of database security. That does not diminish their importance: they must be in place in related systems for database security to be effective.

1. Physical Security Best Practices


Although it can sometimes be overlooked, physical security must not be assumed. An attacker’s
physical access to a data center can undermine even the best cybersecurity practices and
technology. Securing a physical environment containing servers and network equipment should
be the first best practice for fundamental IT security.

Onsite data centers require physical security measures such as cameras, locks, and staffed security personnel, and any physical access to servers should be controlled, logged, and regularly reviewed. Where regular access is not typical, any access should generate an alert.

Assets hosted in the cloud may fall outside of an organization’s direct physical control, but not
outside of an organization’s responsibility. The organization still needs to confirm adequate
physical security, which typically will be met by the cloud vendor’s compliance with the
physical security standards within a compliance guideline such as:

- ISO 27001
- ISO 20000-1
- NIST SPs (SP 800-14, SP 800-23, and SP 800-53)
- Department of Defense Information Assurance Technical Framework
- SSAE 18 SOC 1 Type II, SOC 2 Type II, and SOC 3


2. Use Web Application And Network Firewalls


Firewalls provide foundational protection for all IT assets. In addition to deploying a firewall for
the database, organizations need to deploy next generation firewalls (NGFW) to protect their
networks and web application firewalls to protect the websites and applications accessing the
database.

These more general firewalls protect the organization as a whole against attacks that affect
databases as well as other systems, such as SQL injection attacks and distributed denial of
service (DDoS) attacks.

3. User Authentication
When databases authorize users for access, the inherent assumption is that the user has already
been authenticated and proven their identity. Security best practices require the authentication or
identity verification of all types of users such as guests, employees, customers, and
administrators. User authentication security sub-categories include insider threat management,
user verification, and privileged access management (PAM).

Insider Threat Management


Some data can be so valuable that criminal organizations will pay employees to leak it, or even place their own members into jobs under false pretenses to gain access. To minimize these insider threats, organizations should conduct background checks on programmers, contractors, security professionals, database administrators, and anyone else who may be able to access or redirect sensitive information.

Once employee identities are confirmed, organizations then implement user and entity behavior
analytic (UEBA) tools, UEBA features on other security tools, and audit logs to look for signs of
inappropriate or abnormal behavior. Keep in mind that stolen credentials used by a hacker will
appear to most security tools as authorized access until unusual behavior is detected. Lastly,
establish a policy for deactivating accounts or unnecessary access when employees switch to
different roles or leave the company.

User Verification
To maintain the integrity of a proven identity, users must regularly or even constantly (in the
case of zero trust) confirm their identity. Passwords remain the most commonly used method for
identity; however, some organizations have begun to deploy passwordless authentication.

For administrator and other privileged accounts, organizations should always use multi-factor
authentication (MFA). For the most important data organizations should consider using physical
MFA such as magnetic cards, USB tokens and other methods that cannot be stolen or easily
replicated by remote attackers.

For organizations using passwords, strong passwords and password management should be used:

- Password complexity (a mix of capitalization, numbers, and special characters) or password phrases (much longer passwords) should be required
- Password length should be at least 8 characters, longer for privileged accounts
- Password hashes should be stored salted and encrypted
- Accounts should be locked after repeated failed login attempts: as many as six for normal user accounts and as few as three for privileged or administrator accounts
- Passwords should expire

Organizations that deploy password managers can require more complexity and more frequent
password expirations without worrying about users storing their passwords in unsecured or
vulnerable locations. User access should be regularly renewed to prevent access from obsolete
and forgotten users or devices.
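A password-policy check reflecting the rules above might look like the following sketch; the exact thresholds and messages are illustrative, and the stricter length for privileged accounts is an assumption:

```python
import re

def check_password_policy(password, privileged=False):
    """Return a list of policy violations; an empty list means the password passes."""
    min_length = 12 if privileged else 8   # longer minimum for privileged accounts
    problems = []
    if len(password) < min_length:
        problems.append(f"shorter than {min_length} characters")
    if not re.search(r"[A-Z]", password):
        problems.append("no uppercase letter")
    if not re.search(r"[0-9]", password):
        problems.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", password):
        problems.append("no special character")
    return problems

weak = check_password_policy("hunter2")
strong = check_password_policy("Tr0ub4dor&3x!")
```

Such a check would be enforced at account creation and password rotation, complementing (not replacing) hashing and lockout controls.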

Privileged Account Management

Administrator access abuse can cause enormous damage, so admin credentials must be protected with extra measures. Not only should the password requirements be stronger, but organizations may also need to consider privileged access management (PAM) tools that generate temporary passwords with limited privileges, so that authorized users must authenticate every time they access the database.

Whether using specialized tools or not, privileged access should enforce additional rules:

- No password sharing
- All sessions and activities should be logged and regularly reviewed
- All user privilege escalation should be logged and regularly reviewed

4. Device Security
All devices that access the database, and the network in general, need to be verified and
continuously monitored for potential compromise. Antivirus protection provides the minimum
level of protection, but for more protection organizations often deploy endpoint detection and
response (EDR) tools or extended detection and response (XDR) tools that provide more
proactive detection.

Admin devices should be further constrained by IP and MAC address restriction, whitelisting, or network access control (NAC). These measures limit the number of devices allowed to access sensitive areas, making stolen credentials less useful to an attacker.

For infrastructure related to the database (or other sensitive systems) the organization should
document all devices, applications, and tools. Furthermore, configuration files and source code
must be locked down, only accessible by protected admin accounts, and protected by change
management policies and tools.

Lastly, all systems should be monitored. Networks should be monitored by XDR or intrusion detection and prevention system (IDPS) tools. All security systems should feed alerts to security information and event management (SIEM) tools, security operations centers (SOCs), or managed detection and response (MDR) teams.

5. Application And API Security


Applications and APIs connecting to the database or other IT resources must be secured. DevOps
should begin by applying vulnerability scanning tools to internally developed websites and
applications. Larger organizations will deploy application security tools and API security tools to
further protect and monitor systems.

6. Regularly Update Your Operating System And Patches

The best security tools and strategies will be undermined by poor maintenance. All systems,
applications, tools, and firmware should be monitored for newly released patches or disclosed
vulnerabilities. Critical systems, such as those connecting to database systems, should be
prioritized for regular patch management and vulnerability management. Software supply chain
components, such as open source libraries, should also be tracked and addressed for
vulnerabilities and updates.

7. Business Continuity Best Practices


Even the best plan runs into problems. Whether the issue stems from a disgruntled employee, a
malicious hacker, a power failure, or a flood, the best practices of business
continuity and disaster recovery design systems for resilience and rapid recovery.

Redundant architecture designs maintain uptime in the event of a system failure. Servers can be
made more resilient by using active-passive redundancy for fail-over recovery or by using load
balancing servers that split potential loads over multiple servers.

Data and system backups protect against complete system failure or malicious activity. Backups
should be regular and highly protected. Best practices follow the 3-2-1 backup rule, with three
copies of backup data, two types of storage, and at least one copy stored offsite and offline.
There should be absolutely no public access to backups and backups should be encrypted and
stored separately from encryption keys.

Backups should include not just data, but also the settings, software applications, and
configurations of the supporting infrastructure to enable rapid recovery of affected systems.
Backups for mission critical infrastructure should be tested on a regular basis to verify the
effectiveness of the backup processes as well as to set benchmarks for recovery expectations.

