Chapter 3: Database Testing
Database testing is the process of verifying the integrity, accuracy, reliability, and performance of a database system. It is an integral part of software testing and quality assurance: a series of tests and checks confirms that the database functions as expected, that application data is stored appropriately and retrieved consistently, and that queries produce accurate results. Database testing addresses data validity, data integrity, performance, and related concerns. Its primary goal is to identify and address defects, inconsistencies, and other issues in the database, thereby maintaining the quality, reliability, and overall performance of the software system.
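To make this concrete, here is a minimal sketch of an automated database test written in Python with pytest and the built-in sqlite3 module; the table, columns, and values are illustrative assumptions rather than part of any particular system.

    import sqlite3
    import pytest

    @pytest.fixture
    def db():
        # Throwaway in-memory database seeded with known data.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")
        conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0), (2, 250.5)")
        conn.commit()
        yield conn
        conn.close()

    def test_data_stored_and_retrieved_accurately(db):
        # The core of database testing: what was written is what comes back.
        rows = db.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()
        assert rows == [(1, 100.0), (2, 250.5)]

    def test_integrity_constraint_enforced(db):
        # A NULL balance must violate the NOT NULL constraint.
        with pytest.raises(sqlite3.IntegrityError):
            db.execute("INSERT INTO accounts (id, balance) VALUES (3, NULL)")

Tests of this shape check data accuracy and integrity directly at the database layer, independent of any application GUI.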
Common tools used in database testing include the following.
Database Unit Testing
DbUnit
tSQLt
utPLSQL
Database Functional and Regression Testing
Selenium
TestNG
Database Performance Testing
Apache JMeter
HammerDB
Database Security Testing
SQLMap
AppScan
Netsparker
Database Management Systems with Built-in Testing Tools
JUnit
PyTest
Common Challenges in Database Testing
Database testing, while essential, does present its own set of challenges. Some of these include:
Complexity of Data: Databases can contain a vast amount of complex data, making it difficult to
validate every piece of data effectively.
Lack of SQL Knowledge: SQL is the language of databases, and a lack of proficiency in SQL
can hinder effective database testing.
Data Privacy: Testing often requires the use of real data, which could raise privacy and
confidentiality issues.
Scalability: Large databases can be difficult to manage and test efficiently.
Synchronization Issues: If data is not properly synchronized between different database
environments, it can lead to inconsistencies and errors during testing.
Overreliance on GUIs: Graphical user interface (GUI)-based tools can mask underlying database issues that direct SQL queries would have detected.
Handling Test Data: Managing, generating, and cleaning up test data is a significant challenge in database testing (see the sketch after this list).
Performance Testing: Simulating real-world loads for performance testing can be difficult and
resource-intensive.
Data Migration Testing: Ensuring data integrity during migration between different database
systems or versions can prove challenging.
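As a concrete illustration of the test-data challenge, the following sketch generates known test data and guarantees cleanup even when a test fails; it uses only the Python standard library, and the table and values are assumptions for illustration.

    import contextlib
    import sqlite3

    @contextlib.contextmanager
    def seeded_test_data(conn):
        # Generate predictable test data before the test runs...
        conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
        conn.executemany("INSERT INTO customers (id, name) VALUES (?, ?)",
                         [(1, "Alice"), (2, "Bob")])
        conn.commit()
        try:
            yield conn
        finally:
            # ...and clean it up afterwards, even if the test raised.
            conn.execute("DELETE FROM customers")
            conn.commit()

    conn = sqlite3.connect(":memory:")
    with seeded_test_data(conn):
        count, = conn.execute("SELECT COUNT(*) FROM customers").fetchone()
        assert count == 2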
Database Security
Database security consists of the controls organizations implement to protect database files, database management systems (DBMS), and connected systems from unauthorized access and data breaches. Security controls consist of architecture techniques, application design, procedures, processes, and tools that make the data more difficult to access and use.
Database security done poorly harms operational efficiency, application performance, and user experience. Security must therefore be balanced against operational needs, with the goal of reducing risk to an acceptable level while maintaining usability.
Database security best practices and controls apply specifically to databases. However, databases
do not exist in pure isolation, so organizations must also defend the broader ecosystem. To be
adequately defended, effective database security also requires the implementation of more
general security best practices applied to related systems.
Databases should be segregated onto a separate container, physical server, or virtual server to allow for additional hardening and to prevent access if the website or application is breached. Only the required ports on that server should be opened, and, where possible, organizations should change the default communication ports to make attacks more difficult to execute.
Some recommend placing an HTTPS proxy server between the database and the clients issuing queries, but separating the web server and the database server achieves functionally the same result. A proxy server may still be beneficial for internal-network databases that authorized network users or devices query directly.
To further secure the database, consider placing the database server on a separate physical or virtual network segment with highly restricted access privileges. Microsegmentation of this type can prevent attackers who have gained more general network access from easily moving laterally to a database server that might not even appear on a compromised user's view of the network.
Direct access to the database should be limited or denied where the use case allows. Firewall rule changes must be controlled by change-management procedures and should trigger alerts for security monitoring.
Organizations can deploy specialized database tools that include dedicated firewalls, such as the Oracle Audit Vault and Database Firewall, or physical or virtual next-generation firewall (NGFW) or web application firewall (WAF) solutions. Organizations with more limited resources may simply deploy a hardened version of the database server's operating system firewall.
User Authorization
Access control to the database is managed by the system administrator (admin), who grants permissions by defining roles and adding user accounts to those database roles. Some controls operate at a finer grain: row-level security (RLS), for example, restricts read and write access to individual rows of data based on a user's identity, role memberships, or query execution context.
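As an illustration, the sketch below enables PostgreSQL-style row-level security through the psycopg2 driver; the connection string, table, and column names are assumptions, and other database platforms express RLS differently.

    import psycopg2

    # Hypothetical connection; real credentials belong in a secrets manager.
    conn = psycopg2.connect("dbname=appdb user=admin")
    with conn, conn.cursor() as cur:
        # Restrict row visibility to each row's owner.
        cur.execute("ALTER TABLE orders ENABLE ROW LEVEL SECURITY")
        cur.execute("""
            CREATE POLICY orders_by_owner ON orders
            USING (owner = current_user)
        """)
    conn.close()

With the policy in place, a SELECT against orders returns only the rows whose owner column matches the connected user.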
Specialized database security solutions may allow for centralized management of identities and
permissions, minimize password storage, and enable password rotation policies. Comprehensive
access management might not be practical for smaller organizations, but it remains important to
manage permissions via roles or groups instead of individual users.
Additional account management best practices include the following:
Delete default accounts if they are not needed, or at minimum change their passwords from the default settings
Require unique IDs for all users to support tracking and logging
Use separate accounts for users and applications
Disable or delete inactive users on a schedule
Log and report elevated database privileges, and generate security alerts where warranted
Review user groups and access rights on a periodic basis
Lock accounts automatically after a number of failed logins, commonly recommended as six failed attempts
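The lockout rule just listed can be expressed in a few lines. This is a minimal sketch with an in-memory store; a real implementation would persist counters and add unlock workflows.

    MAX_FAILED_ATTEMPTS = 6       # the commonly recommended threshold
    failed_attempts = {}          # username -> consecutive failures
    locked_accounts = set()

    def record_login_attempt(username, success):
        if username in locked_accounts:
            return "locked"
        if success:
            failed_attempts[username] = 0     # reset the counter on success
            return "ok"
        failed_attempts[username] = failed_attempts.get(username, 0) + 1
        if failed_attempts[username] >= MAX_FAILED_ATTEMPTS:
            locked_accounts.add(username)     # require an admin unlock
            return "locked"
        return "failed"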
Privileged Access
Admins should have only the bare minimum privileges needed to perform required tasks, and only for the duration they specifically need access. Privileged access should be granted temporarily and revoked as soon as it is no longer needed. Larger organizations automate access management using privileged access management (PAM) software, which provides authorized users with a temporary password, logs their activities, and prevents password sharing.
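The temporary-credential idea behind PAM tools can be sketched as follows; real PAM products add vaulting, session recording, and approval workflows, so this illustrates only the expiry mechanism.

    import secrets
    import time

    TTL_SECONDS = 15 * 60    # assumed policy: credentials live for 15 minutes
    issued = {}              # password -> expiry timestamp

    def issue_temporary_password():
        password = secrets.token_urlsafe(24)   # cryptographically random
        issued[password] = time.time() + TTL_SECONDS
        return password

    def is_valid(password):
        expiry = issued.get(password)
        return expiry is not None and time.time() < expiry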
Database hardening varies according to the type of database platform, but the common steps
include strengthening password protection and access controls, securing network traffic, and
encrypting sensitive fields in the database.
All unused or unnecessary database services and functions should be removed or turned off so that they cannot be exploited unnoticed.
All database security controls provided by the database should be enabled. Some will be enabled
by default and others may have specific reasons to be disabled, but each should be evaluated and
all reasoning for disabled controls should be documented. Where possible, admins can enable row-level security and dynamic data masking for sensitive data.
DevOps should design the database so that sensitive information remains in segregated tables. Admins should also continuously audit the data to discover newly added sensitive information and determine whether the segregated tables need to be modified or additional security applied. Some regulatory or compliance standards mandate specific data discovery requirements that must be implemented, followed, and documented to prove compliance.
Audits can often detect anomalous activity, and security teams can establish alerts on critical events to warn responders or to feed security information and event management (SIEM) tools. Database activity monitoring (DAM) and file integrity monitoring software can provide specialized security alerts independent of native database logging and audit functions.
Patching, however, addresses only publicly announced vulnerabilities. Some database vendors offer security and configuration testing tools, such as Oracle's Database Security Assessment Tool, that can help identify risks. These tools should not be assumed to provide complete assurance, and they should be complemented by vulnerability scans and penetration tests that simulate potential attacks to expose misconfigurations, inadvertently accessible data, and other issues.
Eliminating excessive data or purging unnecessary historical information can minimize risk exposure.
Next, the data must be intentionally controlled. Redundancy of protected data should be
eliminated throughout the system, and shadowing of protected data outside the system of record
must be avoided wherever possible. Hashing functions can be applied to protected data elements
before storing data required for matching purposes outside of the system. Wherever possible,
protected data such as health information or credit card numbers should be dissociated from
personally identifiable information (PII).
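A minimal sketch of the hashing technique just described appears below: a keyed hash (HMAC) of a credit card number produces a stable token that external systems can match on without ever holding the raw value. The key shown is a placeholder; in practice it should come from a secrets manager.

    import hashlib
    import hmac

    MATCHING_KEY = b"replace-with-key-from-secrets-manager"  # placeholder only

    def matchable_token(card_number: str) -> str:
        # A keyed hash resists precomputed-table attacks better than a
        # bare hash of low-entropy data such as card numbers.
        return hmac.new(MATCHING_KEY, card_number.encode(), hashlib.sha256).hexdigest()

    token = matchable_token("4111111111111111")
    assert token == matchable_token("4111111111111111")  # stable, so matchable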
Encryption should also be pursued for extra protection. Many vendors provide solutions to
encrypt data at rest, data in transit, or even data in use. Within the database, DevOps can encrypt
or use data masking to obscure data within tables. Some encryption tools even allow for data to
be processed and searched without decryption so that the data always remains encrypted and
protected.
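Field-level encryption of the kind described above might look like the following sketch, which uses Fernet from the widely used cryptography package; key handling is deliberately simplified, since production keys belong in a key management service rather than in code.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, load this from a KMS or vault
    f = Fernet(key)

    # Encrypt a sensitive field before it is written to the table...
    ciphertext = f.encrypt(b"patient-ssn-123-45-6789")
    # ...and decrypt it only in contexts authorized to see the plaintext.
    assert f.decrypt(ciphertext) == b"patient-ssn-123-45-6789"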
Some cloud vendors, such as Oracle, encrypt stored data by default, while others provide encryption key management tools, such as Microsoft's Azure Key Vault. However, organizations themselves bear the responsibility for ensuring adequate protection throughout the data storage and data transfer process.
Onsite data centers require physical security measures such as cameras, locks, and staffed security personnel, and any physical access to servers should be controlled, logged, and regularly reviewed. Where regular access is not typical, alerts should be generated for any access.
Assets hosted in the cloud may fall outside of an organization’s direct physical control, but not
outside of an organization’s responsibility. The organization still needs to confirm adequate
physical security, which typically will be met by the cloud vendor’s compliance with the
physical security standards within a compliance guideline such as:
ISO 27001
ISO 20000-1
NIST SPs (SP 800-14, SP 800-23, and SP 800-53)
Department of Defense Information Assurance Technical Framework
SSAE 18 SOC 1 Type II, SOC 2 Type II and SOC 3
These more general firewalls, such as the NGFW and WAF solutions noted earlier, protect the organization as a whole against attacks that affect databases as well as other systems, such as SQL injection attacks and distributed denial-of-service (DDoS) attacks.
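Firewalls are one layer of defense against SQL injection; at the application layer, the standard complement is parameterized queries, sketched below with an illustrative table and a classic injection payload.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice')")

    user_input = "alice' OR '1'='1"   # classic injection payload

    # Unsafe: string concatenation would let the payload rewrite the query.
    # query = "SELECT * FROM users WHERE name = '" + user_input + "'"

    # Safe: the driver treats the payload as a literal value, not SQL.
    rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
    assert rows == []   # no user is literally named "alice' OR '1'='1"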
User Authentication
When databases authorize users for access, the inherent assumption is that the user has already
been authenticated and proven their identity. Security best practices require the authentication or
identity verification of all types of users such as guests, employees, customers, and
administrators. User authentication security sub-categories include insider threat management,
user verification, and privileged access management (PAM).
Once employee identities are confirmed, organizations then implement user and entity behavior analytics (UEBA) tools, UEBA features on other security tools, and audit logs to look for signs of inappropriate or abnormal behavior. Keep in mind that stolen credentials used by an attacker will appear to most security tools as authorized access until unusual behavior is detected. Lastly, establish a policy for deactivating accounts or removing unnecessary access when employees switch roles or leave the company.
User Verification
To maintain the integrity of a proven identity, users must regularly or even constantly (in the
case of zero trust) confirm their identity. Passwords remain the most commonly used method of identity verification; however, some organizations have begun to deploy passwordless authentication.
For administrator and other privileged accounts, organizations should always use multi-factor authentication (MFA). For the most important data, organizations should consider physical MFA such as magnetic cards, USB tokens, and other methods that cannot be stolen or easily replicated by remote attackers.
For organizations using passwords, strong passwords and sound password management practices should be enforced.
Organizations that deploy password managers can require more complexity and more frequent
password expirations without worrying about users storing their passwords in unsecured or
vulnerable locations. User access should be regularly renewed to prevent access from obsolete
and forgotten users or devices.
Administrator access abuse can cause enormous damage, so admin credentials must be protected with extra measures. Not only should the password requirements be stronger, but organizations may also need to consider privileged access management (PAM) tools that generate temporary passwords with limited privileges, so that authorized users must authenticate each time they access the database.
Whether using specialized tools or not, privileged access should enforce additional rules:
No password sharing
All sessions and activities logged and regularly reviewed
All privilege escalations logged and regularly reviewed
Device Security
All devices that access the database, and the network in general, need to be verified and
continuously monitored for potential compromise. Antivirus protection provides the minimum
level of protection, but for more protection organizations often deploy endpoint detection and
response (EDR) tools or extended detection and response (XDR) tools that provide more
proactive detection.
Admin devices should be further constrained by IP and MAC address restriction, whitelisting, or network access control (NAC). These measures limit the number of devices allowed to access sensitive areas, so that stolen credentials are less useful to an attacker.
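IP-based restriction of admin connections can be sketched with the standard-library ipaddress module; the allowed subnet is an assumption standing in for an organization's admin jump hosts.

    import ipaddress

    ADMIN_ALLOWLIST = [ipaddress.ip_network("10.20.30.0/28")]  # assumed subnet

    def admin_connection_allowed(client_ip: str) -> bool:
        addr = ipaddress.ip_address(client_ip)
        return any(addr in net for net in ADMIN_ALLOWLIST)

    assert admin_connection_allowed("10.20.30.5")
    assert not admin_connection_allowed("192.0.2.10")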
For infrastructure related to the database (or other sensitive systems) the organization should
document all devices, applications, and tools. Furthermore, configuration files and source code
must be locked down, only accessible by protected admin accounts, and protected by change
management policies and tools.
Lastly, all systems should be monitored. Networks should be monitored by XDR or intrusion
detection and prevention system (IDPS) tools. All security systems should feed alerts to security information and event management (SIEM) tools, security operations centers (SOCs), or managed detection and response (MDR) teams.
Applications and APIs connecting to the database or other IT resources must be secured. DevOps
should begin by applying vulnerability scanning tools to internally developed websites and
applications. Larger organizations will deploy application security tools and API security tools to
further protect and monitor systems.
The best security tools and strategies will be undermined by poor maintenance. All systems,
applications, tools, and firmware should be monitored for newly released patches or disclosed
vulnerabilities. Critical systems, such as those connecting to database systems, should be
prioritized for regular patch management and vulnerability management. Software supply chain components, such as open-source libraries, should also be tracked so that their vulnerabilities and updates are addressed promptly.
Redundant architecture designs maintain uptime in the event of a system failure. Servers can be
made more resilient by using active-passive redundancy for fail-over recovery or by using load
balancing servers that split potential loads over multiple servers.
Data and system backups protect against complete system failure or malicious activity. Backups
should be regular and highly protected. Best practices follow the 3-2-1 backup rule, with three
copies of backup data, two types of storage, and at least one copy stored offsite and offline.
There should be absolutely no public access to backups and backups should be encrypted and
stored separately from encryption keys.
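The 3-2-1 rule is easy to check mechanically. The sketch below evaluates a hypothetical backup inventory; the inventory structure is an assumption for illustration.

    backups = [
        {"location": "onsite",  "medium": "disk"},
        {"location": "onsite",  "medium": "tape"},
        {"location": "offsite", "medium": "object-storage"},
    ]

    def satisfies_3_2_1(inventory) -> bool:
        enough_copies = len(inventory) >= 3                              # three copies
        two_media = len({b["medium"] for b in inventory}) >= 2           # two storage types
        one_offsite = any(b["location"] == "offsite" for b in inventory) # one offsite
        return enough_copies and two_media and one_offsite

    assert satisfies_3_2_1(backups)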
Backups should include not just data, but also the settings, software applications, and
configurations of the supporting infrastructure to enable rapid recovery of affected systems.
Backups for mission critical infrastructure should be tested on a regular basis to verify the
effectiveness of the backup processes as well as to set benchmarks for recovery expectations.