System Design Assignment 2
Solution:
1. Strong Consistency: Guarantees that after a write, all reads return the latest
written value. All replicas are updated synchronously, so data is immediately
consistent everywhere, but writes may be slower because of the coordination
this requires.
2. Session Consistency: Guarantees that a user will always see their most recent
write during a session. However, once the session ends, consistency across users
may not be guaranteed.
3. Weak Consistency: No guarantee of consistency between replicas, meaning data
can be inconsistent temporarily. Often used in systems that value responsiveness
over perfect accuracy.
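These trade-offs can be illustrated with a toy replicated store (a minimal sketch for illustration only; real systems use consensus protocols or asynchronous replication, and the class and method names below are hypothetical):

```python
class Replica:
    """A single copy of the data store."""
    def __init__(self):
        self.data = {}

class ReplicatedStore:
    """Toy store with several replicas, showing strong vs. weak writes."""
    def __init__(self, n_replicas=3):
        self.replicas = [Replica() for _ in range(n_replicas)]

    def write_strong(self, key, value):
        # Strong consistency: the write returns only after every replica
        # has applied it, so any subsequent read sees the latest value.
        for r in self.replicas:
            r.data[key] = value

    def write_weak(self, key, value):
        # Weak consistency: only one replica is updated immediately;
        # the others remain stale until a later sync pass.
        self.replicas[0].data[key] = value

    def sync(self):
        # Background pass propagating the first replica's state to the rest.
        for r in self.replicas[1:]:
            r.data.update(self.replicas[0].data)

    def read(self, key, replica=0):
        return self.replicas[replica].data.get(key)
```

After `write_weak`, a read from another replica returns a stale value until `sync()` runs, which is exactly the temporary inconsistency that weak consistency permits.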
Solution:
Definition and Purpose:
Authentication: Confirms who the user is. It establishes the identity of a user
attempting to access the system.
Authorization: Determines what the authenticated user is allowed to do within the
application. It assigns or restricts privileges based on the user’s role or access rights.
Process and Example:
Authentication: A user logs into their account with a username and password.
Authorization: The user is then granted access to certain pages or actions within the
web application based on their role; for example, an "Admin" can view user data,
while a "User" can only access their own profile.
Order and Focus:
Authentication: Comes first; it focuses on verifying identity and confirming that the
user is who they say they are.
Authorization: Follows authentication; it focuses on what the authenticated user can
do or access within the system.
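The two steps can be sketched in Python. The user table, role names, and permission sets below are hypothetical, and a real system would use a salted password-hashing scheme such as bcrypt rather than a bare SHA-256 digest:

```python
import hashlib

# Hypothetical in-memory user table (illustration only).
USERS = {
    "alice": {"pw_hash": hashlib.sha256(b"s3cret").hexdigest(), "role": "Admin"},
    "bob":   {"pw_hash": hashlib.sha256(b"hunter2").hexdigest(), "role": "User"},
}

# Role-based permissions: what each role is allowed to do.
ROLE_PERMISSIONS = {
    "Admin": {"view_profile", "view_user_data"},
    "User":  {"view_profile"},
}

def authenticate(username, password):
    """Step 1: confirm the user is who they claim to be."""
    user = USERS.get(username)
    if user is None:
        return False
    return user["pw_hash"] == hashlib.sha256(password.encode()).hexdigest()

def authorize(username, action):
    """Step 2: check what the already-authenticated user may do."""
    role = USERS[username]["role"]
    return action in ROLE_PERMISSIONS.get(role, set())
```

Note the order: `authorize` assumes authentication has already succeeded, mirroring the point that authorization always follows authentication.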
5. What are the main types of software testing, and how do they differ in terms of
objectives and techniques?
Solution:
Unit Testing
Objective: Verify that individual units of code (functions, classes, or modules) work
correctly in isolation.
Technique: Small, automated tests written by developers, often using mocks or stubs
to isolate the unit from its dependencies.
Integration Testing
Objective: Verify that individually tested units work correctly when combined.
Technique: Modules are tested in groups, focusing on the interfaces and data
exchanged between them.
System Testing
Objective: Validate the complete and integrated application against the specified
requirements.
Technique: End-to-end tests are executed to ensure that the entire system functions
properly as a whole.
Regression Testing
Objective: Ensure that new code changes don’t break existing functionality.
Technique: Re-runs previous test cases (often automated) after updates or
enhancements to confirm no regressions.
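A regression suite can be as simple as a set of automated tests that are re-run unchanged after every update. The `apply_discount` function and its pytest-style tests below are hypothetical, purely to show the shape of such a suite:

```python
# Hypothetical function under test.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Regression tests: re-run after every change to catch breakage early.
def test_basic_discount():
    assert apply_discount(100.0, 10) == 90.0

def test_zero_discount():
    assert apply_discount(50.0, 0) == 50.0

def test_invalid_percent_rejected():
    try:
        apply_discount(50.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

If a later "enhancement" accidentally changes the rounding or removes the range check, one of these tests fails immediately, which is exactly the regression signal the text describes.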
Acceptance Testing
Objective: Verify whether the application meets the specified requirements and is
ready for deployment.
Technique: Often conducted by users or stakeholders (User Acceptance Testing),
focusing on functionality, usability, and business scenarios.
Performance Testing
Objective: Assess how the application performs under different levels of load,
identifying potential bottlenecks.
Technique: Load testing and stress testing using tools like Apache JMeter or
LoadRunner to simulate varying traffic.
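Real load tests use dedicated tools like the JMeter and LoadRunner mentioned above; the following is only a toy sketch of the idea, with a simulated request handler standing in for a real endpoint (all names are hypothetical):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request():
    """Stand-in for a real endpoint; sleeps briefly to simulate work."""
    time.sleep(0.01)
    return 200

def load_test(n_requests, concurrency):
    """Fire n_requests at the given concurrency and report latency stats."""
    latencies = []

    def timed_call(_):
        start = time.perf_counter()
        status = handle_request()
        latencies.append(time.perf_counter() - start)
        return status

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(timed_call, range(n_requests)))

    return {
        "ok": statuses.count(200),
        "avg_latency": sum(latencies) / len(latencies),
        "max_latency": max(latencies),
    }
```

Raising `concurrency` while watching `avg_latency` and `max_latency` grow is the essence of load and stress testing: finding the point where the system stops keeping up.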
Security Testing
Objective: Uncover vulnerabilities and verify that the application protects data and
functionality from malicious attacks.
Technique: Vulnerability scanning, penetration testing, and security-focused code
reviews.
Reusability
Automated Testing: Once created, automated tests can be reused in multiple testing
cycles. Automated scripts are efficient for testing during development phases.
Manual Testing: No need to write scripts, but every test is done from scratch. It's not
reusable and requires human effort for each round of testing.
Accuracy
Automated Testing: Delivers high precision and consistency, reducing human error in
repetitive tasks. Test results will be the same each time, ensuring accuracy.
Manual Testing: While capable of detecting nuanced issues, it is prone to human
error, especially when testers become fatigued or overlook details.
Cost-Effectiveness
Automated Testing: High initial setup costs, including creating the test scripts and
tools. However, it is cost-effective in the long term for projects with repetitive
testing needs.
Manual Testing: Lower upfront cost as it does not require writing scripts. However,
for large applications, it becomes costly as it relies on human effort and time for
each round of testing.
Test Coverage
Automated Testing: Can execute large numbers of test cases across many
configurations quickly, enabling broad coverage.
Manual Testing: Coverage is limited by tester time and effort, making large test
matrices impractical.
Adaptability to Changes
Automated Testing: Adapting to changes in the application may require time and
effort to update the test scripts. However, once updated, it can easily handle
frequent code changes.
Manual Testing: Easily adaptable to application changes without needing script
revisions. However, it can be inconsistent and time-consuming with frequent
updates.
7. How can continuous integration and continuous deployment (CI/CD) pipelines
enhance software quality assurance practices?
Continuous Integration (CI) ensures that code is integrated into a shared repository
several times a day. Every time code changes are pushed, automated tests are
triggered to verify the new build's quality.
Benefit: Ensures consistent testing, identifying defects early and preventing issues
from piling up. This leads to more reliable code at all times.
CI/CD allows for automated testing to run after every code commit or pull request.
This frequent testing helps in identifying and fixing bugs at an early stage.
Benefit: Minimizes the time and cost to fix bugs by addressing issues immediately,
improving overall code quality.
CI/CD provides fast feedback on each commit, helping developers identify issues in
real-time. This means that if tests fail, developers can address them before pushing
further changes.
Benefit: Promotes a proactive approach in fixing issues, which increases the
efficiency of development cycles.
The CI/CD pipeline reduces the need for manual intervention by automating testing,
building, and deployment processes.
Benefit: This minimizes the risk of human error, ensuring tests are executed
consistently every time, and reduces the time spent on manual testing and
deployment tasks.
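The core pipeline logic described above, namely run each stage in order and stop at the first failure so a broken build never reaches deployment, can be sketched as follows (the stage names and checks are hypothetical; real pipelines shell out to linters, test runners, and deploy tooling):

```python
def pipeline(commit, stages):
    """Run pipeline stages in order; stop at the first failure so a
    broken build is never deployed."""
    log = []
    for name, stage in stages:
        ok = stage(commit)
        log.append((name, ok))
        if not ok:
            break  # fail fast: later stages (including deploy) never run
    return log

# Hypothetical stage functions operating on a commit record.
def lint(commit):
    return "TODO" not in commit["diff"]

def run_tests(commit):
    return commit.get("tests_pass", False)

def deploy(commit):
    return True  # stand-in for the real deployment step

STAGES = [("lint", lint), ("test", run_tests), ("deploy", deploy)]
```

A commit with failing tests stops at the `test` stage, so `deploy` is never reached; this is the automated gatekeeping that gives CI/CD its quality-assurance value.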
Code Review
Description: Involves testing the internal workings of the software, including source
code, configuration settings, and user input validation mechanisms. Testers have full
knowledge of the system's design.
Purpose: Focuses on identifying security vulnerabilities at every level of the
application, from code logic to configuration.
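Because the tester can read the internal validation logic, tests in this style target each internal branch rather than just external behavior. A brief sketch (the `validate_username` function and its rules are hypothetical):

```python
import re

# Function under test: the tester can see both branches of this logic,
# so tests are written to exercise each one (white-box style).
def validate_username(name):
    if not (3 <= len(name) <= 20):
        return False  # length branch
    return re.fullmatch(r"[A-Za-z0-9_]+", name) is not None  # charset branch

def test_length_branch():
    assert not validate_username("ab")        # too short
    assert not validate_username("a" * 21)    # too long

def test_charset_branch():
    assert validate_username("alice_01")
    assert not validate_username("alice!")    # rejected by the regex branch
```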
9. How does penetration testing differ from vulnerability scanning, and what role
does each play in a comprehensive security testing strategy?
Solution:
Objective
Penetration Testing: Actively exploit weaknesses, simulating a real attack to show
what an adversary could actually achieve.
Vulnerability Scanning: Identify and report known vulnerabilities without attempting
to exploit them.
Approach
Penetration Testing: Conducted manually or with the help of tools by ethical hackers
(pen testers) who imitate an attacker’s behavior. This test often goes beyond known
vulnerabilities, exploring unanticipated attack vectors.
Vulnerability Scanning: Automated and routine. Tools like Nessus continuously scan
the system or network for known issues based on a database of vulnerabilities.
Depth of Testing
Penetration Testing: Deep and goal-oriented; vulnerabilities may be chained together
to demonstrate real-world impact.
Vulnerability Scanning: Broad but shallow; flags known issues without verifying
whether they are actually exploitable.
Frequency
Penetration Testing: Performed periodically (for example, annually or after major
changes), since it is time- and labor-intensive.
Vulnerability Scanning: Run frequently or continuously as part of routine monitoring.
Role in a Comprehensive Strategy
Penetration Testing: Acts as a “red team” approach, offering deeper insights into the
potential real-world impact of exploiting vulnerabilities and helping to identify new
or unique attack vectors that automated tools might not detect.
Vulnerability Scanning: Serves as an ongoing monitoring tool, helping organizations
stay on top of known security flaws and outdated systems. It is excellent for
continuous security assessment but often needs to be complemented by penetration
testing to understand the real impact of identified weaknesses.
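The difference can be illustrated with a toy scanner: vulnerability scanning is essentially a lookup of installed software against a database of known flaws. The two CVE identifiers below are real, but the tiny database is only a stand-in for what tools like Nessus maintain, and the function names are hypothetical:

```python
# Hypothetical known-vulnerability database: package -> vulnerable versions.
KNOWN_VULNS = {
    "openssl": {"1.0.1": "CVE-2014-0160 (Heartbleed)"},
    "log4j":   {"2.14.1": "CVE-2021-44228 (Log4Shell)"},
}

def scan(inventory):
    """Vulnerability scanning: mechanically match an inventory of
    installed packages against known flaws. It cannot discover novel
    attack vectors -- that is what penetration testing adds."""
    findings = []
    for pkg, version in inventory.items():
        issue = KNOWN_VULNS.get(pkg, {}).get(version)
        if issue:
            findings.append((pkg, version, issue))
    return findings
```

A pen tester would go further: take a finding like the ones above, attempt to exploit it, and then probe for weaknesses no database entry describes.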
10. What are the key principles of Structured Analysis and Structured Design
(SA/SD), and how do they contribute to effective system design?
Solution:
Separation of Data and Functions:
This principle involves clearly distinguishing data from the processes that operate on it.
Data structures are defined independently of the functions.
Contribution: It simplifies the system, improving clarity, maintainability, and scalability.
Decomposition:
The system is broken down into smaller, manageable components, starting from high-
level functions and gradually detailed into sub-functions.
Contribution: Allows for simpler development and troubleshooting by focusing on
individual, smaller components.
Modularization:
Dividing the system into well-defined, independent modules that can be developed and
tested separately.
Contribution: Enhances reusability, reduces complexity, and facilitates changes without
impacting other parts of the system.
Top-Down Approach:
The design begins at a high level and progresses to lower levels of detail.
Contribution: Ensures alignment with business requirements while providing a broad
understanding before delving into details.
Focus on Data Flow:
Systems are designed with an emphasis on how data flows through the processes, using
tools like Data Flow Diagrams (DFDs) to map data movement.
Contribution: Helps ensure data consistency and correct processing, improving system
efficiency.
Information Hiding:
Each module hides its internal workings, exposing only necessary interfaces to other
components.
Contribution: Reduces dependencies between modules, promoting flexibility and ease
of maintenance.
Iterative Refinement:
The design process is iterative, with the system being continuously refined as more
details emerge.
Contribution: Problems surface early and the design improves with each pass, before
changes become costly.

Overall Contribution to Effective System Design:
Organized and Maintainable System: SA/SD helps structure systems clearly, making
them easier to develop and modify.
Modular Approach: By emphasizing modularization, systems become easier to
maintain, test, and upgrade.
Risk Mitigation: Tools like DFDs allow potential issues to be identified early in the
design phase, preventing future problems.
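The modularization and information-hiding principles above can be sketched as a module that exposes a small interface while keeping its storage format private. The inventory example below is hypothetical:

```python
class InventoryModule:
    """Exposes a small, stable interface; the storage format is an
    internal detail that can change (e.g. to a database) without
    affecting any caller -- information hiding in practice."""

    def __init__(self):
        self._items = {}  # hidden internal representation

    def add(self, name, qty):
        """Public interface: record incoming stock."""
        self._items[name] = self._items.get(name, 0) + qty

    def available(self, name):
        """Public interface: query current stock level."""
        return self._items.get(name, 0)
```

Callers depend only on `add` and `available`, so the module can be developed, tested, and later rewritten independently, which is precisely the reduced coupling SA/SD aims for.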