
Software Quality Assurance Training Material

MANUAL TESTING



Table of Contents:

Manual Testing Topics

Chp1: Introduction to Software Testing
Chp2: Introduction to Software Development Life Cycle (SDLC) – C.O PressMan
Chp3: Introduction to ISO & SEI-CMM
Chp4: Introduction to Testing Life Cycle
Chp5: Types of Testing
Chp6: Test Metrics
Chp7: Basic Flow of Testing **
Chp8: Bug Tracking Tool
Chp9: White Box Testing
Chp10: Entrance and Exit criteria * *
Chp11: Release Notes
Chp12: Other Related Topics



Manual Testing Topics

Quick Notes on Manual Testing

• Manual Testing is a form of Black Box testing and does not require a programming language, whereas
White Box testing does require programming knowledge.
• Manual testing is done using the client/front-end application to make sure everything is working
as expected based on the requirements documents.
• A product will (in most cases) not be released without testing the application manually.
• Proper automation scripts can be developed by first testing the application manually, before
automation starts.
• Drawbacks: Manual Testing is time consuming and sometimes requires heavy resources (for example,
when testing is required on multiple operating systems, browsers etc).

Chp1: Introduction to Software Testing

What is Software Testing?


Software testing is more than just error detection; testing software is operating the software under
controlled conditions to (1) verify that it behaves "as specified", (2) detect errors, and (3)
validate that what has been specified is what the user actually wanted.
a. Verification is the checking or testing of items, including software, for conformance and
consistency by evaluating the results against pre-specified requirements. [Verification: Are we
building the system right?]
b. Error Detection: Testing should intentionally attempt to make things go wrong to determine if
things happen when they shouldn't or things don't happen when they should.
c. Validation looks at system correctness – i.e. it is the process of checking that what has been
specified is what the user actually wanted. [Validation: Are we building the right system?] In other
words, validation checks to see if we are building what the customer wants/needs, and verification
checks to see if we are building that system correctly. Both verification and validation are necessary,
but different components of any testing activity.

What is the difference between verification and validation?

Verification: Verification is a process to ensure that the software that is built matches the original
design. In other words, it checks whether the software is made according to the criteria and
specifications described in the requirement documents. It checks whether you built the product right,
as per the design. It is a low-level check, generally done in walk-through meetings.
Validation: Validation is a process to check whether the product design fits the client's needs. It
checks whether you built the right thing, i.e. whether the product is designed properly for the user.

What is software testing methodology?

There are four major milestones in the development cycle: the Planning Phase, Design Phase,
Development Phase, and Stabilization Phase. The Planning Phase culminates in the completion of
the Planning Docs Milestone (Requirements plus Functional Spec). The Design Phase culminates in
the completion of the Design Spec and Test Plan / Test Spec. The Development Phase culminates
in the Code Complete Milestone. The Stabilization Phase culminates in the Release Milestone.
During the first two phases, testing plays a supporting role, providing ideas and limited testing of the
planning and design documents. Throughout the final two stages, testing plays a key role in the
project.

TEST STAGES
Milestone 1 - Planning Phase
Milestone 2 - Design Phase
Milestone 2a - Usability Testing
Milestone 3 - Developing Phase
Milestone 3a - Unit Testing (Multiple)
Milestone 3b - Acceptance into Internal Release Testing
Milestone 3c - Internal Release Testing
Milestone 3d - Acceptance into Alpha Testing
Milestone 3e - Alpha Testing
Milestone 4 - Stabilization Phase
Milestone 4a - Acceptance into Beta Testing
Milestone 4b - Beta Testing
Milestone 4c - Release to Manufacturing (RTM)
Milestone 4d - Post Release

TEST LEVELS
Build Tests
Level 1 - Build Acceptance Tests
Level 2 - Smoke Tests
Level 2a - Bug Regression Testing

Milestone Tests
Level 3 - Critical Path Tests

Release Tests
Level 4 - Standard Tests
Level 5 - Suggested Test

BUG REGRESSION
BUG TRIAGE
SUSPENSION CRITERIA AND RESUMPTION REQUIREMENTS
TEST COMPLETENESS

Standard Conditions:
Bug Reporting & Triage Conditions

How do you create a test strategy?

The test strategy is a formal description of how a software product will be tested.
It includes an introduction, test objectives, test process, test methodology, test scope, release
criteria for testing (exit criteria), test lab configuration, resources and schedule for test activities,
acceptance criteria, test environment, test tools, test priorities, test planning, executing a test pass,
and the types of tests to be performed.

A test strategy is developed for all levels of testing, as required. The test team analyzes the
requirements, writes the test strategy and reviews the plan with the project team.
A test plan may include test cases, conditions, test environment, a list of related tasks, pass/fail
criteria and risk assessment. Inputs for this process:

• A description of the required hardware and software components, including test tools. This
information comes from the test environment, including test tool data.
• A description of roles and responsibilities of the resources required for the test and schedule
constraints. This information comes from man-hours and schedules.
• Testing methodology. This is based on known standards.
• Functional and technical requirements of the application. This information comes from
requirements, change request, technical and functional design documents.
• Requirements that the system cannot provide, e.g. system limitations.

Outputs for this process:



• An approved and signed-off test strategy document and test plan, including test cases.
• Testing issues requiring resolution. Usually this requires additional negotiation at the project
management level.

What is the general testing process?

The general testing process is the creation of a test strategy (which sometimes includes the creation
of test cases), creation of a test plan/design (which usually includes test cases and test procedures)
and the execution of tests.

(i) Black Box Testing: This testing is done to find the circumstances in which the program doesn't
behave according to the specification.
Black box testing refers to test activities using specification-based testing methods and criteria to
discover program errors based on program requirements and product specifications.

The major testing focuses:


- specification-based function errors
- specification-based component/system behavior errors
- specification-based performance errors
- user-oriented usage errors
- black box interface errors
For more info, refer BlackBoxTesting.ppt

(ii) White Box Testing: The basic idea is to test a program, based on structure of a program. We’ll
discuss more about the same later.
For more info, refer WhiteBoxTesting.ppt

(iii) Gray Box Testing: Gray box testing combines both approaches: the tester designs tests from the
specification (black box) while also using some knowledge of the internal structure (white box).

Chp2: Introduction to Software Development Life Cycle (SDLC) – C.O PressMan

(i) Initial Phase:


The details of the project are discussed and documented
(ii) Analysis Phase:
(i) Cost-benefit analysis (ii) Type of programming language required (iii) Length and
duration of project (iv) Specifications and analysis (v) No. of resources (Ex: Refer QA
Staffing Table template)
(iii) Design Phase:
The database model, main modules, forms, reports etc. are designed
(iv) Code/Implementation Phase
Developers have to code the programs according to the specifications
Build Release Engineering: This team develops the build using tools like InstallShield.
A build contains an executable (ex: setup.exe for Windows and setup.tar for Unix). The executable
contains a set of programs (ex: Registration.asp, Login.asp etc).
(v) Testing Phase:
This is the job of testers. (Ex: Testing process)
(vi) Maintenance/Support & Marketing Phase:
Once the project is tested, it is marketed to the customers and then maintained/supported



Chp3: Introduction to ISO & SEI-CMM

1. ISO – International Organization for Standardization: It provides guidelines for the selection and
use of standards for quality management and quality assurance

2. SEI-CMM – Software Engineering Institute Capability Maturity Model: CMM for s/w provides s/w
organizations with guidance on how to gain control over s/w process for developing and
maintaining s/w.

5 levels: 1. Initial 2. Repeatable 3. Defined 4. Managed 5. Optimized

1. Initial: The s/w process capability of a Level 1 organization is unpredictable because the
s/w process is constantly changed or modified as the work progresses
2. Repeatable: The s/w process capability of Level 2 organizations can be summarized
as disciplined because planning and tracking of the s/w project are stable and earlier
successes can be repeated
3. Defined: The s/w process capability of Level 3 organizations can be summarized as
standard and consistent because both s/w engineering and management activities are
stable and repeatable
4. Managed: The s/w process capability of Level 4 organizations can be summarized as
predictable because the process is measured and operates within measurable limits
5. Optimized: The s/w process capability of Level 5 organizations can be characterized
as continuously improving because Level 5 organizations continuously improve
the range of their process capability, thereby improving the process performance of
their projects

Chp4: Introduction to Testing Life Cycle

(i) System Study – System study is nothing but understanding the application by reviewing
application/product related documents like SRS, Use Cases, Functional specifications, screen
shots etc.

(ii) Build: A build contains an executable (ex: setup.exe for Windows and setup.tar for Unix). The
executable contains a set of programs (ex: Registration.asp, Login.asp etc). The build is developed by
the Build Release Engineering team. Some companies call it a Drop instead of a Build.

(iii) Testing stages: QA, Staging and Production

QA: From installation to system test, we perform testing on the QA server in the initial testing stage.
The QA server will be in the QA lab.

Staging: Once the system test passes against the QA server, we do a quick smoke test or limited system
testing against the staging server. A staging server is almost equivalent to the production server; that is,
the staging server and production server configuration/environment are almost the same.
The staging server will be in the QA lab.

Production: Once testing passes on the staging server, all files are moved to the production server, so that
the application becomes available to the public.
The production server will be at the client/customer site.

(iv) Release Candidate: A build that is a candidate for final release; it is tested before the final build (Gold Master) is declared.

(v) Use Case: A use case describes a business need in plain English and is developed by the
management team (Business Analyst etc). A use case is a technique for capturing the functional
requirements of systems and systems-of-systems. According to Bittner and Spence, "Use cases,
stated simply, allow description of sequences of events that, taken together, lead to a system
doing something useful".

(vi) Requirement: A requirement is a singular documented need of what a particular product or
service should be or do. It is a statement that identifies a necessary attribute, capability,
characteristic, or quality of a system in order for it to have value and utility to a user. A requirement
can be a description of what a system must do; this type of requirement specifies something
that the delivered system must be able to do. Another type of requirement specifies something
about the system itself, and how well it performs its functions.
Note: Requirements will be developed based on use cases. They will be developed by the
management/development team (Project Manager/Product Manager etc).

(vii) SRS – System Requirements Specification. Refer to the sample SRS template.

This document says what the development team is going to develop for the application/project.
Requirements documents will be developed based on use cases.
This document will be developed by the management/development team (Project Manager/Product
Manager etc).
Other names for requirements documents: ERD (Engineering Requirements Document), PRD
(Product Requirements Document), BRD (Business Requirements Document) etc.

(viii) Test Plan – A strategic document containing complete information about the testing process.
Refer to the sample TestPlan.
This document says what the QA team is going to test for the application/project. This document will
be developed by the QA team (QA manager/QA team etc).
This document will be developed based on the requirements documents, use cases etc.

(ix) Test Case – A verification point or checkpoint to accomplish the task of testing an
application. Refer to the sample TestCase.

Status: Pass, Fail, Not Run, N/A

Pass: The test case is passed when the expected and observed results are the same
Fail: The test case is failed when the expected and observed results are different
Not Run: The feature is ready for testing, but there was no time or no resources (manpower, s/w, h/w etc)
N/A: The feature is not ready for testing

Test cases will be developed based on the requirements documents and the test plan.
Test cases will be developed by the QA lead and QA team members.
Test cases will be written from the customer's point of view.
Test cases will cover positive, negative and boundary conditions as well as functional, user
interface, performance aspects etc (see the example below).
Test cases will be executed against each and every build.
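
A hedged illustration of positive, negative and boundary test cases in Python: the quantity field, its
valid range of 1-100 and the helper function are hypothetical, not taken from any real application.

# Hypothetical example: a quantity field that must accept whole numbers from 1 to 100.
def is_valid_quantity(value):
    return isinstance(value, int) and 1 <= value <= 100

# Positive, negative and boundary test cases with expected results.
test_cases = [
    ("TC01 positive - typical value", 50, True),
    ("TC02 boundary - lower limit", 1, True),
    ("TC03 boundary - upper limit", 100, True),
    ("TC04 negative - below lower limit", 0, False),
    ("TC05 negative - above upper limit", 101, False),
    ("TC06 negative - wrong data type", "ten", False),
]

for name, value, expected in test_cases:
    observed = is_valid_quantity(value)
    status = "Pass" if observed == expected else "Fail"
    print(name + ": expected=" + str(expected) + " observed=" + str(observed) + " -> " + status)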

(x) Test Execution – Execute all the test cases against the build. The build is released to QA weekly
(mostly every Monday).

(xi) Results – No. of test cases executed, No. of test cases passed, No. of test cases failed, No. of test
cases deferred.

(xii) Defect Tracking: If a test case fails, the bugs are filed in a Bug Tracking System (TD, DDTS,
TrackGear). We can find how many new bugs are filed, how many bugs are assigned, how many bugs
are opened, how many are fixed, how many are posted, rejected, duplicated, closed etc.

(xiii) Reports: The testing results of the application, or the bugs found during testing, are
presented in a graphical or tabular way



(xiv) Bug Review Meetings – These happen once a week; the QA and Dev teams both join.
• A bug review meeting is held when bugs are to be moved to rejected, postponed, or duplicate
status, or when their priority changes.
• The Dev team cannot reject, postpone, mark as duplicate, or change the priority of a bug until
they discuss it with the QA team.
• Bug review meetings happen once a week (Tuesday) during regular testing time, and once a day
when we are close to the release.
• The QA Lead, or someone from the development team, prepares the bug review document that
contains the list of defects/bugs that need to be discussed, and sends it to the QA and Dev teams
before the meeting.
• Only after Dev and QA discuss the bugs is the bug status confirmed/finalized.

(xv) QA Status Meetings – These happen once a week; only QA people join.
• The QA status meeting happens within the QA team. Dev and other teams do not join this
meeting.
• The QA status meeting happens once a week (Thursday).
• During this meeting, the QA manager may give details about upcoming projects and schedules.
• QA leads/testers should give an update on the tasks they are working on at that time (Ex:
test plan development 80% complete, test cases 50% developed, automation 75%
complete etc).
• The QA team can ask the QA manager about the following: if they need vacation, if they need to
upgrade memory, and/or if they need another computer etc.

(xvi) Release to Customers

(xvii) Maintenance

Chp5: Types of Testing

(i) Installation Testing: Installation testing ensures that the s/w can be installed under different
conditions like a new installation, an upgrade and a custom installation. Once installed properly, we
have to verify that the s/w is operating correctly.

(ii) Smoke/Sanity Testing: The main features of the application are tested; if this testing passes, then
we go for further types of tests. If it fails, we send the build back to the developers.
NOTE: Only once the smoke/sanity test passes does QA accept the build and start bug verification.
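
A minimal smoke-test sketch, assuming a hypothetical web application at a placeholder URL and using
the third-party pytest and requests libraries; the endpoints are invented for illustration only.

# smoke_test.py - run with: pytest smoke_test.py
import requests

BASE_URL = "http://qa-server.example.com"  # placeholder QA server

def test_home_page_is_up():
    # The main entry point of the application should respond successfully.
    response = requests.get(BASE_URL + "/", timeout=10)
    assert response.status_code == 200

def test_login_page_is_reachable():
    # A core feature (login) should at least render before deeper testing starts.
    response = requests.get(BASE_URL + "/login", timeout=10)
    assert response.status_code == 200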

(iii) Regression Testing: Performed to verify that bugs in an earlier version have been fixed, and that
fixing those bugs has not introduced new errors. This test is useful when the application changes
version. Regression testing is done whenever there is a change in code: bug fixes, adding a new
feature, or deleting/modifying an existing feature.

(iv) New Features: Make sure all the new features delivered in that particular build are working as per
the requirements.

(v) Functional Testing: Functional testing is a form of software testing that attempts to determine
whether each function of the system works as specified. The goal of this testing is to verify proper
data acceptance, processing and retrieval.

(vi) Integration Testing: It focuses on testing multiple modules working together. Testing groups of
related units. It ensures that units work together correctly.

(vii) User Interface Testing: Checks the images, graphics, links, spellings, alignments, tables,
frames etc (Tool: XENU)



(viii) Security Testing: A test performed to verify that the protection mechanisms built into a
system will in fact protect it from improper penetration, and to minimize or stop illegal
connections.
When security testing is done using the http and https protocols, the following is checked
(a small automated sketch appears at the end of this item):
1. If the URL contains http, there should not be a lock icon on the status bar.
2. If the URL contains https, there should be a lock icon on the status bar.
Examples:
• The password should be displayed in bullets/asterisks when typed in the password
field of a login page
• The password should be encrypted when it is sent over the network
• The login account should be locked after two or three attempts with a wrong password
• Session time-out:
• When you log in and leave the application idle for some time (ex: 120 sec)
and try to access it again, it should take you back to the login screen
• When you log out and the 'back' button of the browser is clicked, it should take
you back to the login screen

Changing the protocol from http to https:

1. Go to verisign.com.
2. Look for a trial version SSL (Secure Sockets Layer) certificate. Click on it, enter personal
information and complete the steps. After some time you will receive an email containing the
certificate.
3. Go to the IIS web server and apply the certificate.
4. The certificate is applied either for the whole website or only for the secure pages.
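
A hedged sketch of automating one of the http/https checks above with the third-party requests
library; the URL is a placeholder and the redirect behaviour is an assumption about the application
under test.

import requests

def test_login_redirects_to_https():
    # Requests to the plain-http login page should be redirected to https.
    response = requests.get("http://www.example.com/login", timeout=10, allow_redirects=True)
    # After following redirects, the final URL should use the https protocol.
    assert response.url.startswith("https://")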

(ix) Configuration Testing: It attempts to verify that all of the system’s functionality works under all
hardware and software configurations.

(x) Usability Testing: Usability Testing is a process that measures how well a web site or software
application allows its users to navigate, find valuable information quickly, and complete business
transactions efficiently.

(xi) Backend Testing: Testing the backend database (SQL Server, Oracle etc) to make sure data is
inserted into the right database and the right tables, and that data is populated in the correct format
(see the sketch below).
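
A small backend-testing sketch using Python's built-in sqlite3 module as a stand-in for the real
Oracle/SQL Server database; the users table, column names and test data are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, created TEXT)")

# Step 1: the front-end action (simulated here by a direct insert).
conn.execute("INSERT INTO users (email, created) VALUES (?, DATE('now'))",
             ("[email protected]",))
conn.commit()

# Step 2: query the backend and verify the data landed in the right table
# and in the expected format.
row = conn.execute("SELECT email, created FROM users WHERE email = ?",
                   ("[email protected]",)).fetchone()
assert row is not None, "Record was not inserted"
assert "@" in row[0], "Email not stored in the expected format"
assert len(row[1]) == 10, "Date not stored in YYYY-MM-DD format"
print("Backend check passed:", row)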

(xii) Performance Testing: Performance testing is performed to determine how fast some aspect of a
system performs under a particular workload. It can serve different purposes: it can demonstrate that
the system meets performance criteria, or it can compare two systems to find which performs better.

(xiii) Load Testing: This test subjects the program to multiple users using the single program entity at
the same time (a minimal sketch follows).
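
A minimal load-test sketch in Python, assuming a hypothetical login page and an arbitrary 25 simulated
users; real load tests would normally use a dedicated tool (LoadRunner, JMeter etc).

import time
from concurrent.futures import ThreadPoolExecutor
import requests

URL = "http://qa-server.example.com/login"  # placeholder page under load

def one_user(user_id):
    # Each simulated user requests the same page and records the response time.
    start = time.time()
    response = requests.get(URL, timeout=30)
    return user_id, response.status_code, round(time.time() - start, 2)

with ThreadPoolExecutor(max_workers=25) as pool:
    results = list(pool.map(one_user, range(25)))

for user_id, status, seconds in results:
    print("user", user_id, "-> HTTP", status, "in", seconds, "seconds")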

(xiv) Volume Testing: This type of testing is to feed large volumes of data into the system to
determine that the system can correctly process such amounts.

(xv) Stress Testing: A test that tries to break the system. In this testing we check how the system
handles peak usage periods.

(xvi) System Testing: System testing is the first time that the entire system is tested as a whole
against the Feature Requirement Specification (FRS) or the System Requirement Specification (SRS);
these are the documents that describe the functionality that the vendor (the entity developing the
software) and the customer have agreed upon. This is done before acceptance testing.



(xvii) User Acceptance Testing/End User Testing: This testing is done by the customer. It checks the
whole application by giving live data to the system in the user environment.

(xviii) Upgrade Testing: Upgrade testing verifies that features can be added to or removed from the
existing application (Ex: UI changes, new features, new technology etc).
Steps for upgrade testing:
1. Install the customer build, e.g. Gold Master or Gold Master plus a service pack (ex: SP1)
2. Insert test data
3. Run the version 2.0 installer on top of version 1.0
4. Make sure there are no errors; if there are, file a bug. The installer should also give two options:
one for upgrade and the other for new installation
5. Once the upgrade installation completes successfully, check the following:
• New features are added
• Old features are still working
• You are able to add new data
• The old user account is able to update new data, and a new user can update old data

(xix) Globalization Testing (G10N): Globalization basically involves the translation of web pages into
various languages, thereby allowing companies to reach a "global public".
• Internationalization (I18N) and Localization (L10N) are subsets of globalization.
• A product is called globalized when I18N & L10N are successfully completed.
• Globalization testing starts once the English version is released; it should start
within 60-90 days.
(xx) Internationalization (I18N): Internationalization is the process of developing a software product
whose core design does not make assumptions based on a locale. It potentially handles all
targeted linguistic and cultural variations (such as text orientation, date/time format, currency,
accented and double-byte characters, sorting, etc.) within a single code base.
• When we test I18N we have to make sure dates, times and currency are displayed in the local
format (see the sketch below).
• We have to find a local operating system (e.g. Chinese) and, on top of that, install the
English version of the build.
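
A hedged I18N spot-check sketch using the third-party Babel library; the release date, amount and
locales are arbitrary sample values. It prints the same date and amount formatted per locale, which is
what the checks above look for.

from datetime import date
from babel.dates import format_date
from babel.numbers import format_currency

release_day = date(2024, 3, 5)
for locale_code, currency in [("en_US", "USD"), ("fr_FR", "EUR"), ("zh_CN", "CNY")]:
    # The same date and amount rendered in each locale's own conventions.
    print(locale_code,
          format_date(release_day, locale=locale_code),
          format_currency(1234.5, currency, locale=locale_code))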

(xxi) Localization Testing (L10N): Localization means taking an internationalized product and customizing
it for a specific market. This includes translating the software strings, rearranging the UI components to
preserve the original look and feel after translation, and customizing the formats (such as date/time,
paper size, etc.), the defaults, and even the logic, depending on the targeted market. Such
customization is possible only if the application is properly internationalized; otherwise, the L10N
team faces a challenge whose significance depends on the application and the language
version. Once the application has been properly tested in one language (Ex: default – English) and is
stable, we go for localization testing, which means the same application is tested in different
languages (Ex: French, Russian, Chinese etc).
• When L10N is tested, we have to make sure all the contents (buttons/images/text etc)
are converted into the local language.
• We have to find a local operating system (e.g. Chinese) and, on top of that, install the
Chinese build.

The link below converts English characters to other languages, so that you can test your application
in a language other than English.
http://babelfish.altavista.com/tr

(xxii) Documentation Testing: Testing the user documentation to verify that it accurately describes the
behavior covered by the system test cases. Make sure the Help, Installation Guide, Tutorial, User's
Guide, Admin Guide and other related documents are properly prepared.



Chp6: Test Metrics

Test cases should be executed on all builds, and the test cases should be updated based on the test results;
then finally check the following items:
1. Total test cases executed 2. Total test cases passed 3. Total test cases failed 4. Total test cases deferred
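
A small sketch showing how these four metrics could be derived from raw test-case results; the result
values below are hypothetical sample data.

# Deriving the four metrics from raw test-case results.
results = ["Pass", "Pass", "Fail", "Pass", "Deferred", "Fail", "Pass"]

total_executed = sum(1 for r in results if r in ("Pass", "Fail"))
total_passed = results.count("Pass")
total_failed = results.count("Fail")
total_deferred = results.count("Deferred")

print("Executed:", total_executed, "Passed:", total_passed,
      "Failed:", total_failed, "Deferred:", total_deferred)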

Chp7: Basic Flow of Testing **

(i) Verify the SRS, functional specifications and UI design documents
(ii) Prepare the Test Plan
(iii) Prepare Test Cases based on the SRS, functional specifications, and UI design documents
(iv) Divide modules and test cases among testers
(v) Manual test and Automation test (automation only when the build is stable)
(vi) Check the application with all possible types of testing
(vii) File the bugs
(viii) Development team fixes the bugs and sends the build back to the testers
(ix) Testers retest the bugs, do Regression Testing (make sure fixes didn't break other
parts of the system) and close the bugs
(x) Participate in the Bug review meeting
(xi) At the end of the week QA must send all test results to the development team/higher management
for their review. Refer to the Sample Build Test Status Report template

Chp8: Bug Tracking Tool


(i) When to file a bug?
We have to file a bug when the expected and observed results are different.
Bugs can be filed either in Word, Excel or any bug tracking tool. Refer to the Sample Bug Format
(ii) Usage of a Bug Tracking Tool (DDTS, TrackGear, JIRA, Bugzilla)
(iii) Bug Tracking Life Cycle (see the sketch at the end of this chapter):
1. New
2. Assigned
3. Open -> (i) If not a valid bug: Reject; (ii) If a valid bug: Fix, Duplicate, Postpone or Priority Change
4. If the bug is Fixed -> (i) Close (ii) Re-Open
5. If the bug is re-opened, the process starts again from step 3
Refer to the Problem Management Flow Chart and Resolution Report Form
(iv) Bug Review Meetings Process
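
A simple Python sketch of the bug tracking life cycle above, modelled as a state machine; the state
names and transitions follow the numbered list (slightly simplified, e.g. priority change is omitted).

# The bug life cycle as a simple state machine.
ALLOWED = {
    "New": ["Assigned"],
    "Assigned": ["Open"],
    "Open": ["Rejected", "Fixed", "Duplicate", "Postponed"],
    "Fixed": ["Closed", "Re-Opened"],
    "Re-Opened": ["Open"],
}

def move(current, target):
    # Only transitions listed above are allowed.
    if target not in ALLOWED.get(current, []):
        raise ValueError("Illegal transition: " + current + " -> " + target)
    return target

state = "New"
for step in ["Assigned", "Open", "Fixed", "Re-Opened", "Open", "Fixed", "Closed"]:
    state = move(state, step)
    print("Bug is now:", state)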

Chp9: White Box Testing

The basic idea is to test a program, based on structure of a program.


(a) White Box Testing Objectives – The major objective of WB testing is to focus on the internal
program structure and discover internal program errors.

(b) White Box Testing Methods


(i) Basic Path Testing: Guarantee that every statement in the program is executed at least
once.
(ii) Branch Testing: Exercise the predicate nodes of a program flow graph to make sure
that each predicate node has been exercised at least once (see the sketch below).
(iii) Loop Testing: Exercise the loops of a program to make sure that both the inside and the
outside of the loop body are executed.
(iv) State Based Testing: The basic idea is to use a finite state machine as a test model to
check the state behavior of a program process.
(v) Data Flow Testing: Focuses on data value definitions and their uses along the program's
control flow. Data flow test criteria are developed to improve the data test coverage of
a program.
For more info, refer WhiteBoxTesting.ppt
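
A hedged branch-testing sketch in Python: the discount function is hypothetical, and the two assertions
together exercise both outcomes of its single predicate at least once, which is the point of the
branch-testing method listed above.

# Branch-testing sketch: one predicate node, two branches.
def discount(order_total):
    if order_total >= 100:        # predicate node
        return order_total - 10   # true branch: flat discount applied
    return order_total            # false branch: no discount

assert discount(150) == 140  # exercises the true branch
assert discount(50) == 50    # exercises the false branch
print("Both branches of the predicate were exercised at least once")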



Chp10: Entrance and Exit criteria * *

Entrance criteria define when QA can start testing (for example, after some modules of the application
have been developed), and exit criteria define when QA can stop testing (for example, after the final
build has been tested).
Unit Test Exit Criteria:
(i) All requirements documents should be baselined
(ii) Coding for the phase should be completed
Integration Entrance Criteria:
(i) All requirement/design documents should be baselined
(ii) The Test Plan must be completed
(iii) First-time delivery of the software must be completed
Integration Exit Criteria:
(i) All test cases for integration testing have been successfully executed:
- 100% of P-1 bugs fixed
- 100% of P-2 bugs fixed
- 80% of P-3 bugs fixed
Note: The above % will vary based on company/organization
System Test Entrance Criteria:
(i) Integration test exit criteria have been successfully met
(ii) All installation documentation is completed
(iii) All software has been successfully built

System Test Exit Criteria:

(i) All test cases should be executed
(ii) All test cycles should be executed
(iii) All documents are reviewed, finalized and signed off

Chp11: Release Notes

Release Notes contain information about a particular build/patch and give overall information
to test engineers/customers. This document contains the following info:
• Build/Patch number and version
• New feature or fix details – (for QA only)
• Resolved issues – (for QA only)
• Known issues
• Supported configurations etc.
Refer to the Sample Release Notes

Chp12: Other Related Topics

What is the major difference between Build and Patch testing?


Build: Install the build on the specified server and test all available features in that particular build

Patch: (i) Find the customer environment (ex: Win 2003, IE, Build 55) (ii) Find a similar environment in the QA lab
(iii) Install the customer build on the QA machine (iv) Reproduce the customer problem (v) Apply the patch by
following the instructions in the patch release notes (vi) Re-test the customer problem (this time you should not
see the problem).

Service Pack: Combination of more than one patch.

Ghost Process – Backup & Restore (Norton Ghost 2003)


Backup: Allows you to take a backup image of the machine/server
Restore: Allows you to restore the backup image taken above, so that your server/machine comes back to its
original state.
Configuration Management (tools: PVCS, Visual SourceSafe) – A version control tool used to store
individual builds as versions

MODELS: Waterfall, Incremental, Prototyping, Spiral and V Models

(i) Waterfall Model:


http://www.experiencedynamics.com/popups/popup_waterfall_method.php
Business Requirements doc (often incomplete) -> Technical Requirements doc (review tech specs) ->
Begin code (build s/w architecture) -> Develop use cases (develop interface) -> Finish coding ->
QA testing / technical testing / user acceptance testing -> Debug -> Launch
• The waterfall model is a simplistic sequential model
(Strategy->Analysis->Design->Build->Test->Transition)
• It assumes that development can follow a step-by-step process.
• You never go back to previous steps.

(ii) Incremental Model/Method:


http://scitec.uwichill.edu.bb/cmp/online/cs22l/incremental.htm
• There are a number of models typified by an incremental approach.
• Pieces are designed, implemented, and tested individually.
• The system is built up piece by piece.
• Someone has to keep the big picture in mind.

(iii) Prototyping Model:


http://searchsmb.techtarget.com/sDefinition/0,,sid44_gci755441,00.html
The Prototyping Model is a systems development method (SDM) in which a prototype (an early
approximation of a final system or product) is built, tested, and then reworked as necessary until an
acceptable prototype is finally achieved from which the complete system or product can now be developed.
This model works best in scenarios where not all of the project requirements are known in detail ahead of
time. It is an iterative, trial-and-error process that takes place between the developers and the users.

(iv) Spiral Model:


http://en.wikipedia.org/wiki/Spiral_model
The spiral model is a software development process combining elements of both design and prototyping-in-
stages, in an effort to combine advantages of top-down and bottom-up concepts. This model of development
combines the features of the prototyping model and the waterfall model. The spiral model is intended for
large, expensive, and complicated projects.

The V-Model:
http://en.wikipedia.org/wiki/V-Model_(software_development)
The V-model is a software development model which can be presumed to be the extension of the waterfall
model. Instead of moving down in a linear way, the process steps are bent upwards after the coding phase,
to form the typical V shape. The V-Model demonstrates the relationships between each phase of the
development life cycle and its associated phase of testing. The development process proceeds from the
upper left point of the V toward the right, ending at the upper right point. In the left-hand, downward-sloping
branch of the V, development personnel define business requirements, application design parameters and
design processes. At the base point of the V, the code is written. In the right-hand, upward-sloping branch of
the V, testing and debugging is done. The unit testing is carried out first, followed by bottom-up integration
testing. The extreme upper right point of the V represents product release and ongoing support.



Before Starting QA Training
Client/Server, Web based and ERP applications:
1. C/S applications are standalone applications and can be tested on different operating systems but not on
different browsers. A web based application can be tested on different clients (browsers) and operating systems.
2. Client/Server applications need a separate client application to run, whereas web applications only need a web
browser to run.
3. C/S application testing is faster than web based application testing.
4. Client/Server: Client -> Database server (Oracle/SQL Server) – 2-tier architecture
5. Web-based: Client (browser) -> Web server (IIS, Apache) / Application server (JBoss, WebLogic) -> Database
server (Oracle/SQL Server) – 3-tier architecture
6. ERP: Enterprise Resource Planning (SAP, PeopleSoft, Oracle Financials etc)

1, 2, 3 and N-tier architecture


1-Tier Architecture - A simple form of standalone application architecture where everything resides in a single program.
Contrast this to 2-tier and 3-tier architectures.
A 1-tier architecture is the most basic setup because it involves a single tier on a single machine. Think of an application
that runs on your PC: Everything you need to run the application (data storage, business logic, user interface, and so
forth) is wrapped up together. An example of a 1-tiered application is a basic word processor or a desktop file utility
program.

2-Tier Architecture - The two tiers are:

* Client application: the application on the client computer consumes the data and presents it in a readable
format to the user.
* Data server: the database serves up data based on SQL queries submitted by the application.
The client handles the display; the server handles the database.
The 2-tier approach increases scalability and separates the display and database layers.
Client/Presentation tier -> Data tier, OR Client -> Database (Oracle/SQL Server) – CLIENT/SERVER

3-Tier Architecture - We create a 3-tier architecture by inserting another program at the server level. We call
this the server application. Now the client application no longer directly queries the database; it queries the
server application, which in turn queries the data server (see the sketch below).
* Client application: the application on the client computer consumes the data and presents it in a readable
format to the user.
* Server application: receives requests from the client, queries the data server and returns the results.
* Data server: the database serves up data based on SQL queries submitted by the server application.
Client -> Server (IIS/Apache) -> Database (Oracle/SQL Server) – WEB BASED

N-Tier Architecture: Client -> Server (IIS/Apache) -> Business Logic (VeriSign etc) -> Database (Oracle/SQL Server)
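
A schematic Python sketch (plain functions, not a real web or database server) of the 3-tier flow described
above: the client talks only to the server application, which in turn talks to the data server. All names and
data are illustrative.

# Schematic 3-tier flow.
def data_server(sql):
    # Tier 3: the database answers SQL queries (hard-coded sample data here).
    return [{"id": 1, "name": "Alice"}] if "users" in sql else []

def server_application(request):
    # Tier 2: the server application receives the client's request, builds the
    # query, and returns the data in a presentable form.
    rows = data_server("SELECT id, name FROM users")
    return {"status": "ok", "users": rows}

def client_application():
    # Tier 1: the client never touches the database; it only calls the server
    # application and displays the results.
    response = server_application({"action": "list_users"})
    for user in response["users"]:
        print("User:", user["name"])

client_application()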

Application Vs Product:
Application: Software developed for a specific customer (e.g. www.safeway.com)
Product: Software developed for multiple customers (MS Office, Payroll Mgmt, Inventory Mgmt)

Build: A build contains an executable (ex: setup.exe for Windows and setup.tar for Unix). The executable
contains a set of programs (ex: Registration.asp, Login.asp etc). The build is developed by the Build Release
Engineering team. Some companies call it a Drop instead of a Build.

Test Environment:
Hotmail (server) -> Client Machine 1 -> Client Machine 2 -> …………. Client Machine N

Query: A query is a SQL statement (Insert, Update, Delete etc)

Stored Procedure: A stored procedure is a pre-compiled SQL statement.

Normalization: Normalization reduces redundancy in the database and thereby improves database performance.
There are different types of normal forms: 1st, 2nd, 3rd, 4th, BCNF etc.
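
A brief sketch of these ideas using Python's built-in sqlite3 module; the customers/orders tables are
hypothetical. The schema stores customer details once rather than repeating them on every order row (the
basic goal of normalization), and the SELECT shows a plain SQL query joining the normalized tables.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(customer_id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Acme Corp');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 99.5);
""")

# A query is just a SQL statement; this one joins the normalized tables.
for row in conn.execute("""SELECT c.name, o.order_id, o.amount
                           FROM orders o JOIN customers c
                           ON o.customer_id = c.customer_id"""):
    print(row)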
