Manual Part 1

The document outlines the processes and methodologies involved in manual testing, API testing, and database testing, detailing the software development life cycle (SDLC) stages, team structure, and project management tools. It describes various testing types, including sanity, smoke, and regression testing, and emphasizes the importance of documentation such as Business Requirements Specification (BRS) and Software Requirements Specification (SRS). Additionally, it covers different models like Waterfall, V-Model, and Agile, highlighting their respective processes and team roles.


Manual-

Manual Testing-
 Process
1. Basic SDLC (software development life cycle)
2. Waterfall model/process
3. V-Model/process
4. Agile model/process  used in 90% to 95% of projects

 Testing types
1. Sanity testing, smoke testing, regression testing, etc.

 Manual part 2 – test case design, test case execution, review, defects, etc.
 Project management tool – JIRA / HP ALM

API Testing-
 SOAP & REST service testing
 SOAPUI tool & POSTMAN tool

Database Testing-
 SQL queries

Project – 2live

Team Size–
 Project team – works on new features/functionality/modules. Ex. Paytm – Invest in Stock
 Project team size is 24 to 26 people
1. Delivery manager (1) – project delivery to the client
2. Project manager (1) – assigns project tasks, tracks the work to be performed within the timeline, handles the team
3. Business analyst (1) – the BA interacts with the client and collects the requirements/functionality/new features
4. Designer/solution architect (1) – designs the project application
5. Developers (14 to 16) – developers write the code for the application
6. Testers (5 to 6) – testers test the developed application
 Support team – handles existing application issues/defects/end-user queries/tickets. Ex. Paytm – Recharge module
 Support team size is 9 people
1. Project manager/support manager (1) – assigns support tasks, tracks the work to be performed within the timeline, handles the support team
2. Developers (5 to 6) – developers write the code/fixes for the application
3. Testers (1 to 2) – testers test the developed application

 I have worked in the project team

 Support tickets-

 Project –
 Wipro – Ex. Paytm = project team & support team
 Accenture – Ex. Groww = project team & Syntel – Ex. Groww = support team

 Service based companies – Infosys, Wipro, Persistent, etc.

 Product based companies – HSBC bank, IBM, Paytm, etc.

Manual Testing-
 Definition – checking/validating that the developed application works as per the requirements.
 Checking the completeness & correctness of the functionality manually.

SQA process-
 The SQA process is defined to measure & monitor the developed application.
 SQA stands for software quality assurance.
 SQA process parameters-
1. Client/customer requirements are fulfilled
2. Client/customer expectations (security & performance) are met
3. Delivery to the client/customer within time
4. Risk management

(Diagram: Client ↔ BA ↔ Developer & Tester)

Process-
 Who decides the process:
 Ex. Client = HSBC; the client has its own IT department and Wipro works for it  the client decides the process
 Ex. Client = Cred application; the client has no IT department and Wipro does the work  Wipro decides the process

SDLC process-
 SDLC – software development life cycle
 The SDLC process covers the development & testing of the application
 SDLC stages involved
1. Information gathering
2. Analysis
3. Design
4. Coding
5. Testing
6. Support/Maintenance
Information gathering-
 Information gathering is done by the BA
 The BA collects the requirements from the client related to the client's business
 In the information gathering stage the BA prepares a BRS document
 BRS – business requirements specification
 The BRS defines the client's business-related requirements
 The BRS document is not given to the tester
 Ex. Client business = an end-user bidding platform (cricket bidding); the client earns brokerage charges from end users on cricket, football, etc. Ex. Dream11, MPL

Analysis-
 In the analysis stage the BA works
 In the analysis stage the BA collects the requirements from the client
 The BA collects requirements related to the functionality of the application/software
 In the analysis stage the BA prepares the SRS (software requirements specification)
 The SRS defines the functional requirements that will be implemented & the system requirements that will be used.
 The SRS is also called the FRS (functional requirements specification) / CRS (customer requirements specification)
 The SRS contains
1. Functional requirements
2. Functional flow diagram
3. Use cases (1 specific requirement)
A. Description – details about the specific requirement
B. Acceptance criteria – dos & don'ts about the specific requirement / summary
4. Screenshots/snapshots/prototype
 After completing the SRS document, the BA mails the SRS document to the developers & testers
 The developers & testers analyse/understand/study the SRS document
 If the document is not clear, we set up a meeting with the BA

 Ex. Use case – BOQ – Stock module – Intraday buy & sell from a new window
 Description – The user can buy or sell through the intraday window. For the end user, an intraday option is added through which they can buy or sell shares/stocks. In intraday, the user can hold a stock/share only within 1 day, i.e. within the trading time as per your country.

 Acceptance criteria-
1. Intraday is available in the Stock module for all users
2. Intraday is active only on trading days (trade cycle time – 9.15 am to 3.15 pm)
3. Through intraday the user can first buy then sell OR first sell then buy
4. Intraday will provide margin for all stocks/shares
5. Intraday will provide margin for all futures & options
6. After trade time, an error message is shown for intraday – "Intraday not available; your order will be placed on the next trading day"

Design-
 In the design stage the designer or solution architect works.
 The designer prepares the design as per the SRS document.
 The designer prepares the HLD (high level design) & LLD (low level design).

Coding-
 In the coding stage the developer works.
 The developer works from the LLD (low level design).
 The LLD is implemented according to the SRS / as per the use cases.

Testing-
 The tester does test case design (TCD) & test case execution (TCE)
 The tester checks the functionality as per the SRS document OR as per the use cases
 In TCE, if we find a defect/bug, the tester raises it to the developer
 The developer fixes the defect/bug
 At the end the application/software is deployed/delivered to the client
 The project team gives warranty support for 1 month.

Support/Maintenance-
 After the project team's warranty time/period is completed, the project/application goes to the support team
 The support team handles existing application issues/defects/end-user queries/tickets

Interview Questions-
1. What is your team size? In your last project what was your team size?
2. What is the ratio of developers to testers?
 Answer- In my current/recent project we have a total of 25 to 26 people. We have 16 developers and 6 testers, so the ratio is roughly 3 developers : 1 tester.
3. What is the SDLC process?
4. What is the difference between SDLC & STLC?
5. What is the SRS document and what does it contain?
6. If the requirements are not clear, what is your approach?

Basic SDLC model / Fish model / Process –

 Information gathering (BRS)  Analysis (SRS)  Design (HLD, LLD)  Coding (developers)  Testing (TCD, TCE)  Support
 Each stage has a check: review (BA), review (BA), review (design), WBT (white box testing), BBT (black box testing)
 The review stages form static testing / verification / quality control; the execution stages form dynamic testing / validation / quality assurance

 Review- To check the correctness & completeness of documents.

Static testing / Verification vs. Dynamic testing / Validation-
1. Definition – Static testing/verification: the BA checks their documents, the designer checks their design & the developer checks the code/logic. Dynamic testing/validation: the tester checks the functionality of the application/software.
2. Static testing/verification covers quality control; dynamic testing/validation covers quality assurance.
3. Static testing is also called verification; dynamic testing is also called validation.
4. Static testing/verification is also called in-progress testing; dynamic testing/validation is also called end-progress testing.
5. On the static side we have WBT (white box testing), performed by the developer; on the dynamic side we have BBT (black box testing), performed by the tester.
6. WBT has 2 types: 1. Unit testing 2. Integration testing. BBT types: 1. Sanity/smoke testing 2. Functionality testing 3. Re-testing 4. Regression testing, etc.
7. In WBT we check logic, loop coverage, branch coverage, etc. In BBT we check functionality, behavioural coverage, input domain coverage, etc.
8. WBT is also called code-level testing; BBT is also called system & functional testing.
9. WBT is performed only in a +ve way; BBT is performed in both +ve & -ve ways.
10. Ex. Paytm – Recharge module: WBT tests only valid mobile numbers; BBT tests valid & invalid mobile numbers.

Waterfall Model/Process-

 The waterfall model/process is a sequential process
 Sequential process – only after one stage is completed does the next stage start
 Ex. Until the developer completes coding, the tester does not start testing

Information gathering (BRS)  Analysis (SRS)  Design (HLD, LLD)  Coding (LLD) – WBT  Testing (TCD, TCE) – BBT  Support

Drawbacks-
 In the waterfall model the time for delivery/deployment is not fixed
 In the waterfall model, going backward/backtracking is not possible (if we find a defect in testing, we can't go back to the coding stage)
 Ex. If we find a defect/bug in TCE, we can't go back to the previous stage (coding)
 Ex. Paytm – Recharge module – BSNL mobile numbers are not being accepted  the defect is only noted down and is fixed when a new feature or module is added to the application/software (new feature + old defects)

Q. What is the entry criterion of unit testing and what is the exit criterion of unit testing?
 Entry criterion – when the developer has completed coding and starts unit testing, that point is the entry criterion for unit testing.
 Exit criterion – when unit testing is completed (or when we start integration testing), that point is the exit criterion for unit testing.

 Unit testing  Integration testing  System & functional testing

V-Model-
 The V-model stands for verification & validation
 In the V-model/process, development (LCD) and testing (LCT) run in parallel
 The development & testing stages work in parallel
 In the V-model/process the deployment/delivery cycle is about 3 months
 The V-model is a plan-driven process (deployment/delivery cycle = 3 months)

LCD (life cycle development) | LCT (life cycle testing)
 Information gathering (BRS) | Assessment of development plan
 Analysis (SRS) | Prepare test plan, requirements testing
 Design – HLD | Design phase testing
 Coding – developer (LLD) | Program phase testing (WBT), test case design (TCD)
 Install build/application/software | System & functional testing (BBT) – Pune; User acceptance testing (UAT) – client, knowledge transfer (KT)
 Maintenance/Support | DRE (defect removal efficiency), CR (change request)/RFC

 DRE (defect removal efficiency) – DRE = A / (A + B)

 A = no. of defects found in BBT, B = no. of defects found in UAT
 DRE defines how deeply we have tested the application/build
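 For example (illustrative numbers, not from the source): if the testers find 18 defects in BBT and the UAT team finds 2 more, then DRE = 18 / (18 + 2) = 0.9, i.e. 90% of the defects were removed before UAT. A minimal Python sketch of the same calculation:

def dre(defects_bbt: int, defects_uat: int) -> float:
    # Defect removal efficiency: A / (A + B),
    # where A = defects found in BBT and B = defects found in UAT.
    return defects_bbt / (defects_bbt + defects_uat)

print(dre(18, 2))  # 0.9 -> 90% of the defects were caught before UAT (illustrative numbers)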
 CR (change request) / RFC (request for change)-
 Ex. Paytm – Recharge module – under the amount field, the "Recharge Amount" description should show as green text
 If any CR comes from the client, your company accepts the CR and applies an extra charge to the client

Drawbacks-
 In the V-model/process the deployment/delivery cycle is about 3 months
 The client pays extra money for every CR

Agile process/model-
 Agile means continuous development & continuous testing happen on the application/build
 The Agile process is a value-driven process (priority to the client)
 If any CR comes at any point of time, we consider the CR
 When we accept a CR, we check its impact on current development, testing & production.
1. If the CR has more impact on development/testing/production, then we inform the client
2. If the CR has less impact on development/testing/production, then we develop & test it and deploy it to the client.
 In the Agile process/model the deployment/delivery cycle is 2 to 3 weeks
 Agile subtypes/frameworks/methodologies
1. XP (extreme programming) – development focused, no separate testing
2. Scrum – (bunch of requirements  sprint-wise development, testing & delivery)
3. Kanban – support team (tickets/existing issues/bugs/CRs)
4. Lean – support team
5. FDD – feature driven development

 I have worked in the Scrum agile methodology/subtype

Agile architecture –

SDLC  Agile mapping:
 Information gathering (BRS) – BA  Product backlog (project = e.g. 2000 requirements)
 Analysis (SRS) – BA  Sprint backlog (sprint = e.g. 20 requirements) – client priority
   Sprint 1 = 20 req./US, Sprint 2 = 18 req./US, Sprint 3 = 17 req./US – decided by PM, BA & designer
 Use case (1 specific requirement)  User story (1 specific requirement); both contain 1. Description 2. Acceptance criteria
 Design (HLD)  Design
 Coding (developer – LLD)  Coding (developer – LLD)
 Testing – TCD, TCE  Testing – TCD, TCE
 Support  Support
Agile people-
 Stakeholder  Client
 Delivery manager  Solution master
 Project manager  Scrum master
 Business analyst  Product owner
 Designer  Designer
 Developer  Developer
 Tester  Tester

Agile Meeting/ Ceremonies-


1. Grooming meeting
2. Sprint planning
3. Scrum meeting/ Daily stand up
4. Sprint review meeting
5. Sprint retrospective meeting

Agile meeting – purpose – duration & people involved:
1. Grooming meeting (before the start of the sprint / any time in the sprint) – clearing doubts about / understanding the US/requirements – 1 hr – BA, PM, designer, developers & testers
2. Sprint planning meeting (one time, on the start day of the sprint) – the current sprint's work (e.g. 20 US) is added, decided by the PM, BA & designer; estimation (time span)/story points, e.g. 1 US/req. = 16 hr developer + 14 hr tester – 30 min – PM, BA, designer, developers & testers
3. Scrum meeting / daily stand-up (daily, 9.45 am to 10.00 am) – what you did yesterday, what you are doing today, issues/roadblocks – 15 min – PM, BA, designer, developers & testers
4. Sprint review meeting (last day of the sprint) – the US tested by the testers (e.g. 4 US) are demoed/reviewed for the BA/client/UAT team – 1 hr – BA/client/UAT, PM, designer, developers & testers
5. Sprint retrospective meeting (last day of the sprint; sometimes held once every 2 sprints – optional) – discussion of the good & bad points of the current sprint – 30 min – PM, designer, developers & testers, BA

Agile Day-wise Plan-

 Agile people/team = 24 to 26 (16 developers & 6 testers)
 Sprint duration – 2 weeks (Monday to Friday) = 2 * 5 = 10 working days
 Current sprint = Sprint 1 = 18 US/requirements (4 modules, each with 4 to 5 US)
 Team 1 = 3 developers + 1 tester = Recharge module (4 US)
 Team 2 = 4 developers + 1 tester = Movie ticket module (4 US)
 Team 3 = 4/5 developers + 2 testers = Investment in Stock module (6 US)
 Team 4 = 3 developers + 1 tester = FASTag module (4 US)

Week 1 (Monday to Friday)

Monday –
 Grooming meeting – US doubts cleared/understood – 1 hr
 Sprint planning meeting (30 min) –
1. Current sprint = Sprint 1 = 18 US – decided by the PM, BA & designer; tasks added
2. Estimation/story points (time span)  4 teams, 4 US assigned per team
Ex. 1 US = 12 hr development + 10 hr testing, 2 US = 16 hr dev + 8 hr testing, etc.

 1 US  coding (6–7 hr, in progress) + 1 US TCD (5 hr, completed)

Tuesday-
 Daily stand-up meeting (15 min)
 What you did yesterday (1 US – TCD completed)
 What you are doing today (1 US – TCE)
 Issues/roadblocks

 1 US  coding finished (2 hr, build sent), 2 US coding (4 hr, in progress) + 1 US TCE (6–7 hr, completed)

Wednesday-
 Daily stand-up meeting (15 min)
 What you did yesterday (1 US – TCE completed)
 What you are doing today (2 US – TCD)
 Issues/roadblocks
 2 US  coding (6 hr, completed) + 2 US TCD (6 hr, completed)

Thursday-
 Daily stand-up meeting (15 min)
 What you did yesterday (2 US – TCD completed)
 What you are doing today (2 US – TCE)
 Issues/roadblocks
 2 US  coding finished (1 hr, 2 US build sent), 3 US coding (6 hr, in progress) + 2 US TCE (6–7 hr, completed)

Friday-
 Daily stand-up meeting (15 min)
 What you did yesterday (2 US – TCE completed)
 What you are doing today (3 US – TCD)
 Issues/roadblocks

 3 US  coding (6 hr, completed) + 3 US TCD (4–5 hr, in progress)

Week 2 (Monday to Friday)-

Monday-
 Daily stand-up meeting (15 min)
 What you did yesterday (3 US – TCD in progress)
 What you are doing today (3 US – TCD completed and 3 US – TCE)
 Issues/roadblocks

 3 US  build sent; 4 US coding (6 hr, in progress) + 3 US TCD (1 hr, completed) & 3 US TCE (5–6 hr, in progress)
 3 US – in TCE, 2 defects found – raised to the developer  the developer fixes the defects  the tester retests the defects (2 defects fixed/closed) (1–2 hr)

Tuesday-
 Daily stand-up meeting (15 min)
 What you did yesterday (3 US – TCE in progress, 2 defects raised/closed)
 What you are doing today (3 US – TCE completed and 4 US – TCD)
 Issues/roadblocks

 4 US  coding (6 hr, in progress) + 3 US TCE (1 hr, completed) & 4 US TCD (4–5 hr, in progress)

Wednesday-
 Daily stand-up meeting (15 min)
 What you did yesterday (3 US – TCE completed, 4 US – TCD in progress)
 What you are doing today (4 US – TCD & 4 US – TCE)
 Issues/roadblocks

 4 US  build sent (2 hr, completed) + 4 US TCD (2 hr, completed) & 4 US TCE (4–5 hr, in progress)
 4 US – in TCE, 5 to 6 defects found  raised to the developer  3 to 4 defects fixed (3 hr)  sent for testing; the developer is not accepting the remaining defects (out of the 5 to 6)

Thursday-
 Daily stand-up meeting (15 min)
 What you did yesterday (4 US – TCE in progress, 4 US – 6 defects found)
 What you are doing today (4 US – TCE & testing the fixed defects)
 Issues/roadblocks (the developer is not accepting 5 to 6 of the defects)

 4 US  defect fixing (defects 5 to 6) (4 hr, in progress) + 4 US TCE (2 hr, completed) & 4 US defect testing (defects 3 to 4) (3–4 hr, completed)

Friday- (last day of the sprint)

 Daily stand-up meeting (15 min)
 What you did yesterday (4 US – TCE completed)
 What you are doing today (4 US – testing defects 5 to 6)
 Issues/roadblocks

 4 US  defect fixing (defects 5 to 6) (1 hr, completed) + 4 US defect testing (defects 5 to 6) (2 hr, completed)

 Sprint review meeting (1 hr)-

 The testers give a demo/review of the 4 US to the BA/client/UAT team

 Sprint retrospective meeting (30 min)-

 Current sprint (Sprint 1) – good & bad things about the process, development & testing are discussed
 Good things are carried into the next sprint

Agile Terms-
 Epic – main module name; multiple US are present under an epic/module
 Burn down chart – a graph which shows how much work is remaining w.r.t. time
 Burn up chart – a graph which shows how much work is completed w.r.t. time
 Velocity – defines how much work we are delivering/deploying to the client per sprint
 Estimation/story point – time span for a US / development & testing time for a US
 Estimation is decided depending on the following:
1. How much knowledge you have about the US
2. How much effort will be required
3. How complex the US is
 The unit for estimation/story points here is hours

Agile Advantages-
 Agile advantages
1. Sprint-wise delivery/deployment every 2 to 3 weeks
2. Automation is possible in the Agile process
3. Daily stand-up/Scrum meeting tracks work progress (in my project we do the daily stand-up 2 times – morning & evening)
4. A checkpoint is present for every module

Recharge Module-

 1 US (100 classes) – Recharge icon; 2 US (150) – Mobile no., operator, circle; 3 US (200 classes) – Browse plans tab; 4 US – Promo code; 5 US – Payment; 6 US – Transaction/thank-you message
Agile Disadvantages-

 If there are frequent changes in a requirement/US, we can't deploy/deliver that US to the client
 If your application/module depends on another application/module, we can't deploy/deliver that US to the client

 (Ex. Paytm – Travel module/Recharge module depends on external travel operators/agencies and service providers)

Interview Questions-

1. What is the difference between the V-model & Agile?
2. What is Agile & what are the Agile frameworks?
3. What are Agile, Scrum/Sprint & Epic?
4. What are the burn down chart, burn up chart & velocity?
5. What is the Scrum Agile methodology / architecture of Agile?
6. What are the different types of meetings in Agile / what are the ceremonies in Agile?

7. What is Agile & how are you following the Agile process in your organization?
 Answer- Describe the day-wise plan in Agile (2 weeks)

8. What are the advantages & disadvantages of the Agile process?
9. How do you decide the estimation in the sprint planning meeting?
 Answer- In my project, the PM opens the Agile board (JIRA/HP ALM) in the sprint planning meeting and asks every developer & tester for the time/story points for every US assigned to them.
 Estimation is decided depending on the following:
1. How much knowledge you have about the US
2. How much effort will be required
3. How complex the US is

10. What is Sprint Zero?

 Answer- Whatever extra effort is required to prepare for the next sprint is called Sprint Zero
 Ex. On the last day of Sprint 1, the PM does a rough estimation of the US (and the tasks against each US) for the next sprint (Sprint 2)

11. When do you do delivery/deployment in your project?

 Answer- In my project we work in the Scrum Agile methodology, where we work for 2 to 3 weeks (Monday to Friday)
 In my project we do delivery/deployment on Saturday/Monday

12. Who does the delivery/deployment?

 Answer- In all my projects, the developer does the delivery/deployment to all environments & to the client

13. What are Kanban & Lean?

14. What is your approach if a US is not completed (development/testing) in the current sprint?
 Answer- In my project, the PM tells the developers & testers to work on Saturday & Sunday and complete the work (most of the time the work gets completed)
 But if the work is still not completed over Saturday & Sunday, then the PM creates a US in the next sprint – US = "Remaining Work From Last Sprint"

15. What is the Scrum of Scrum Master?

 Answer- The head of all the Scrum Masters (1 Scrum of Scrum Master with, for example, 4 PMs under them)

Agile  Scrum: a bunch of US worked on together = Sprint 1

Epic – Name: New Electricity module in Paytm – 40 US – product backlog

Sprint backlog – 4 US Electricity + 4 US Recharge module + 4 US Rent payment + 4 US

Sprint 1 = 16 US = in progress

Sprint 2 = 18 US = in progress

Ex. Rewards  preferences (food/entertainment/retail/etc.)

US – preference selection for rewards for the end user

Project/Application –

 Front end / UI / GUI  manual & automation testing
 Services (SOAP / REST services)  API testing
 Database (SQL Server)  database testing

 Different types of technology in your project
1. Front end  .NET languages
2. Services  Java language
3. Backend/Database  SQL Server

Environments-
 In my project there are 4 types of environments
1. Dev environment – developers work here  coding + WBT
2. SIT environment (system integration testing) – testers  TCD, TCE, defects
3. UAT environment  client location (developers + testers)
4. Prod/Live environment  end-user application
 Flow: Dev environment  SIT environment  UAT/Pre-production environment  Prod/Live environment, with final regression before the move to Prod/Live
 IP/DB examples: Dev – 172.20.10.100, SIT – 172.20.24.172, UAT – 172.12.40.212
 Dev & SIT are at Pune (Accenture); UAT/Prod are at the client location (USA)

 In the Dev environment  the developer does coding & WBT

 The developer sends/deploys the build into the SIT environment
 Dev environment (feature – 250 lines of code)  build delivery (250 lines)  SIT environment (feature – 250 lines of code)
 The developer sends a mail to the tester (through JIRA/HP ALM)  the mail attachment contains the unit testing document
 In the SIT environment  the tester works = TCD, TCE
 In TCE, if we find defects we raise them to the developer  informed through mail (JIRA/HP ALM)
 The defect is fixed by the developer in the Dev environment  the modified build is sent  the tester rechecks the defects
 After TCE in the SIT environment, the tester prepares the test proof  mail sent to the UAT team through the JIRA/HP ALM tool

Dev open URL= 172.20.10.100:8080/paytm.com


SIT open URL= 172.20.24.172:8080/paytm.com

Dev environment Testing-


1. Unit Testing
2. Integration Testing

SIT environment Testing-


1. Sanity Testing/ Smoke Testing
2. System & functional Testing (BBT)
3. Re-testing
4. Regression Testing, etc

UAT environment Testing-


1. Alpha Testing
2. Beta Testing

Prod/Live environment Testing-


1. Production Issue

Dev environment-
 In the Dev environment the developer works
 The developer codes as per the US
 After completing the coding the developer does testing (WBT)
1. Unit testing
2. Integration testing

1. Unit Testing-
 Unit testing is performed on every sub-module
 Unit testing is performed on every US (specific functionality)
 Ex. Paytm  Recharge module  US = Promo Codes tab in Recharge; the developer has done the coding  the developer does unit testing
 The unit testing document contains  screenshots of the WBT, table names, URI/URL, etc.
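 As an illustration only (not from the source project), a developer-side unit test for such a US might look like the minimal Python sketch below; the promo-code helper and its rule are hypothetical.

import unittest

def apply_promo_code(amount, promo_percent):
    # Hypothetical helper: returns the payable amount after a percentage promo code,
    # truncating the discount to whole rupees.
    if amount <= 0 or not (0 <= promo_percent <= 100):
        raise ValueError("invalid amount or promo percentage")
    return amount - int(amount * promo_percent / 100)

class PromoCodeUnitTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        self.assertEqual(apply_promo_code(499, 10), 450)

    def test_invalid_amount_rejected(self):
        with self.assertRaises(ValueError):
            apply_promo_code(0, 10)

if __name__ == "__main__":
    unittest.main()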

2. Integration Testing-
 Integration testing is performed by the developer
 Integration testing is performed on the main module
 Integration testing is performed by combining all the sub-modules
 Ex. Paytm  Main module = Recharge module; sub-modules (US): 1 US – Recharge module icon, 2 US – Mobile no. text box, operator & amount text, 3 US – Browse plans, 4 US – Promo code, 5 US – Payment, 6 US – Thank-you message

1. Front-end integration testing-
 For front-end integration testing the developer uses function calls (OOPs)

2. Backend integration testing-
 For backend integration testing the developer uses join queries

Integration testing approaches-

A. Top-down approach-
 Testing starts from the starting/main page and moves down to the sub-module tabs/pages
 If a sub-module is still under development (or being built by another company), the developer prepares a stub
 Stub – a temporary program prepared by the developer through XML (it stands in for the missing sub-module)

B. Bottom-up approach-
 Testing starts from the sub-module tab/page and moves up to the starting/main page
 If the starting/main page is still under development (or being built by another company), the developer prepares a driver
 Driver – a temporary program prepared by the developer through XML (it stands in for the missing calling/main page)

C. Sandwich approach-
 A combination of the top-down & bottom-up approaches: some sub-modules and the main page may both be under development (or with another company), so the developer prepares both stubs & drivers (temporary programs through XML); a small illustrative sketch follows
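 The notes above describe stubs & drivers as temporary programs prepared through XML; purely as an illustration of the idea (not the project's actual code), a minimal Python sketch with hypothetical names:

# Hypothetical illustration of a stub and a driver (names invented for this sketch).
def payment_service_stub(amount):
    # Stub: stands in for a lower-level sub-module that is still under development.
    return {"status": "SUCCESS", "transaction_id": "TEST-0001"}

def recharge(mobile_no, amount, payment_service=payment_service_stub):
    # Module under test: calls the (stubbed) payment sub-module.
    result = payment_service(amount)
    return f"Recharge of {amount} for {mobile_no}: {result['status']}"

def driver():
    # Driver: temporary caller used when the real main/starting page is not ready yet.
    print(recharge("9876543210", 499))

driver()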

Interview Questions-

1. What technologies are used in your project?
2. How does the deployment process work in your organization?
3. Who does all the deployments in your organization & how?
4. How do you get the build?
5. When do you start the testing?

SIT environment-
 In the SIT environment the tester works
 The tester does TCD & TCE in the SIT environment
 For each testing type below: what the testing is, why we do it, what we do in it, and what defects/TCD/documents/mails are prepared in it, etc.
Sanity Testing-
 Performed when the developer sends us a new build and informs us through mail (JIRA/HP ALM)
 In the SIT environment, sanity testing is the first testing performed; sanity testing is also called level-zero testing
 When we get the build, the tester checks the build stability, i.e. checks whether the build is stable enough for testing or not
 Sanity testing is also called tester acceptance testing / build verification testing / build test
 In sanity testing we check
1. Validate the core functionality of the application/build
2. Validate the GUI/UI of the application/build
3. Validate the links of the application/build
4. Validate the tabs/pages
5. Validate the navigation

 In sanity testing, if the build is not stable for testing, the tester rejects the build
 In sanity testing, the tester does not log/create a defect
 In sanity testing, if the build is not stable for testing, we reject the build and inform the developer through mail (JIRA/HP ALM)
 In sanity testing, the tester does not write test cases

 Ex. Paytm – Rent Payment (module)  US  coding (500 lines) & new build created (V9.0)  new build is sent/deployed  the tester does sanity testing  checks build stability for testing  the core functionality in the build is not working  the tester rejects the new build (V9.0)  the developer fixes the core functionality & creates a new build (V9.1)  the new build (V9.1) is sent/deployed  the tester does sanity testing again

 In sanity testing the build may be unstable due to SIT environment problems, system hangs, run-time problems, core/basic functionality not working, etc.

 For sanity testing, we need 2 to 3 hr for every new build

 In sanity testing, only the tester is involved

Smoke Testing-
 Smoke testing is an advanced version of sanity testing.
 Smoke testing  when we get a new build, the tester checks the build stability, i.e. checks whether the build is stable enough for testing or not
 In smoke testing we check
1. Validate the core functionality of the application/build
2. Validate the GUI/UI of the application/build
3. Validate the links of the application/build
4. Validate the tabs/pages
5. Validate the navigation

 In smoke testing, if the build is not stable for testing, the tester rejects the build and also gives/finds the root cause of the defect/issue
 Smoke testing = sanity testing + troubleshooting  done by the tester
 Smoke testing = sanity testing + package validation  done by the developer
 How to do troubleshooting (a small log-scanning sketch follows this list)-
1. Log in to the application / open the application under test
2. Pass data or verify the functionality of the application
3. If we find an error or defect, we go to the error logs (172.20.24.172:C:\application\Windows NT\Action\Logs.txt)
4. The Logs.txt file contains all the details about the defect/issue
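 A minimal sketch of scanning such a log file for error entries (assuming a plain-text Logs.txt; the path and keywords are illustrative):

# Scan the error log for lines that point to the root cause (path/keywords illustrative).
log_path = r"C:\application\Windows NT\Action\Logs.txt"

with open(log_path, encoding="utf-8", errors="ignore") as log_file:
    for line_no, line in enumerate(log_file, start=1):
        if "ERROR" in line or "Exception" in line:
            print(f"{line_no}: {line.strip()}")  # candidate root cause for the defect report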

 In smoke testing, the tester does not write test cases

 In smoke testing the build may be unstable due to SIT environment problems, system hangs, run-time problems, core/basic functionality not working, etc.
 In smoke testing, if we find an issue, we find the root cause of the defect/issue and inform the developer through mail (JIRA/HP ALM)
 Smoke testing takes 2 to 3 hr for every new build
 In smoke testing, both the tester & the developer are involved
 In my project, we follow smoke testing

Interview Questions-

1. What does the unit testing document contain?
2. What is integration testing? How do you perform integration testing?
 Answer- Integration testing is performed on the main module. Integration testing is performed by both the developer as well as the tester.
 In the sprint it can appear as its own US, e.g. US name = "Integration testing on FASTag module"

3. What is the difference between sanity testing & smoke testing?

4. What are sanity testing & smoke testing? Which testing is followed in your organization?
 Answer- In my project, we follow smoke testing
 In smoke testing, if the build is not stable for testing, the tester rejects the build and also gives/finds the root cause of the defect/issue
 If the interviewer asks whether we perform sanity testing: in my project we follow smoke testing; we may be doing the same work that you call sanity testing.

5. Which testing do you prefer after getting a new build?
6. Which types of defects do you generally get in smoke testing?
7. How do you inform the developer in sanity testing/smoke testing?
8. What do you do in sanity testing/smoke testing? Do you write test cases for these testing types?

System & Functional Testing (BBT)-

 System and functional testing is performed after completion of smoke testing / after checking the build stability
 System and functional testing has 4 types
1. Usability testing (90 to 95%)
2. System & functionality testing
3. Performance testing – JMeter, LoadRunner, etc.
4. Security testing – JMeter, SOAPUI, etc.

Usability testing-
 In usability testing, as testers we validate the user-friendliness of the screen/application
 Usability testing has 2 types:
1. GUI (graphical user interface) / UI (user interface) testing-
 In GUI/UI testing, as a tester we validate
A. The look & feel of the screen/application
B. The ease of use (end-user ease) of the screen/application
C. The speed of the interface

2. Manual support testing-
 In manual support testing, as a tester we validate how manual text/help is supported on the screen / the responsiveness of the screen

System & Functionality Testing-

 There are 2 types of system & functionality testing-
1. Functionality testing – validates the internal features of the application/build
2. Non-functionality testing – validates the external features of the application/build

Functionality testing (BIEBSC)-

 In functionality testing, the tester validates the internal features of the application/build
 For validating internal features we perform different kinds of coverage
 Functionality testing has different types
1. Behavioural coverage testing
2. Input domain coverage testing
3. Error handling coverage testing
4. Backend coverage testing / database coverage testing
5. Service based coverage testing
6. Calculation based coverage testing

1. Behavioural coverage testing-

 The tester tests/validates the behaviour of the objects/web elements present in the application
 The tester checks the behaviour of the objects/web elements & the properties of the objects/web elements

Object/web element – behaviour & property:
 Text box – focus & un-focus
 Check box – checked & un-checked OR ticked & un-ticked
 Button – enabled & disabled
 Radio button – on & off

2. Input domain coverage testing-

 In input domain coverage testing, the tester validates different input sizes/lengths & data types
 For input domain coverage testing we use these size/length & data type techniques:
A. Boundary value analysis (BVA)
B. Equivalence class partition (ECP)
C. Decision table testing technique

Boundary value analysis (BVA)-

 In BVA we check the input size/length
 Ex. Login page – Username: only a mobile no.; Password: 4 to 6 character combination (capital & small letters and numbers); fields: Username, Password, Submit

 BVA (size/length) – pass / fail:
 Username text box – pass: 10 digits; fail: 8 digits, 11 digits
 Password text box – pass: 4, 5, 6 characters; fail: 3, 7, 8 characters
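 As an illustration only (the field rules follow the example above; the validator functions are hypothetical), these boundary checks could be written as data-driven cases:

# Boundary value analysis cases for the login example above (illustrative only).
def valid_username(value: str) -> bool:
    # Username must be a 10-digit mobile number.
    return value.isdigit() and len(value) == 10

def valid_password(value: str) -> bool:
    # Password must be 4 to 6 characters long.
    return 4 <= len(value) <= 6

bva_cases = [
    ("username", "9" * 10, True),    # on the boundary -> pass
    ("username", "9" * 8,  False),   # below the boundary -> fail
    ("username", "9" * 11, False),   # above the boundary -> fail
    ("password", "Ab1x",    True),   # length 4 -> pass
    ("password", "Ab1",     False),  # length 3 -> fail
    ("password", "Ab1xyz9", False),  # length 7 -> fail
]

for field, value, expected in bva_cases:
    actual = valid_username(value) if field == "username" else valid_password(value)
    print(field, repr(value), "PASS" if actual == expected else "FAIL")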

Equivalence class partition (ECP)-

 In ECP we validate the data type of the input passed
 Ex. Login page – Username: only a mobile no.; Password: 4 to 6 character combination (capital letters, small letters & 2 numbers)
 ECP (data type) – pass / fail:
 Username text box – pass: 0-9 (int); fail: A-Z, a-z, special characters
 Password text box – pass: A-Z, a-z, 0-9; fail: special characters (within the 4 to 6 length)
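 A matching sketch for the equivalence classes (again illustrative; the simplified validators only check digits-only vs. letters-and-digits):

# Equivalence class partition cases for the same login example (illustrative only).
def username_ok(value: str) -> bool:
    # Valid class: a 10-digit mobile number.
    return value.isdigit() and len(value) == 10

def password_ok(value: str) -> bool:
    # Simplified valid class: letters & digits only, 4 to 6 characters.
    return value.isalnum() and 4 <= len(value) <= 6

ecp_cases = [
    ("username", "9876543210", True),   # valid class: digits only
    ("username", "abcdefghij", False),  # invalid class: alphabets
    ("username", "98765@3210", False),  # invalid class: special characters
    ("password", "Ab12xy", True),       # valid class: letters + digits
    ("password", "Ab@12", False),       # invalid class: special characters
]

for field, value, expected in ecp_cases:
    actual = username_ok(value) if field == "username" else password_ok(value)
    print(field, repr(value), "PASS" if actual == expected else "FAIL")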

Decision table testing technique-

 Validates different input combinations on the objects/web elements  result
 Ex. Login page – Username: only a mobile no.; Password: 4 to 6 character combination (capital & small letters & 2 numbers); Submit
 Decision table (objects vs. rules/input combinations):
 Rule 1: Username valid, Password valid, Submit pressed  Result: Home page
 Rule 2: Username valid, Password invalid, Submit pressed  Result: Error
 Rule 3: Username invalid, Password valid, Submit pressed  Result: Error
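 A minimal data-driven sketch of the same decision table (the login behaviour here is assumed, not taken from a real system):

# Decision table for the login example above (illustrative only).
decision_table = [
    # (username_valid, password_valid) -> expected result
    ((True,  True),  "Home page"),
    ((True,  False), "Error"),
    ((False, True),  "Error"),
]

def login(username_valid: bool, password_valid: bool) -> str:
    # Assumed system behaviour: both inputs must be valid to reach the home page.
    return "Home page" if (username_valid and password_valid) else "Error"

for (u, p), expected in decision_table:
    actual = login(u, p)
    print(f"username valid={u}, password valid={p}: {actual} "
          f"({'PASS' if actual == expected else 'FAIL'})")
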
3. Error handling coverage testing-
 Validating the different types of errors generated by the objects of the application
 We pass invalid/wrong data into the objects/web elements and check the error messages
 Ex. Paytm – Recharge module
4. Backend coverage testing / database coverage testing-
 Validating that the data from front-end operations is stored correctly in the backend
 Backend coverage testing is also called database testing
 Ex. Paytm – Recharge module (front end) – Transaction ID / order ID = 56783408456

 DB – SQL Server (table/column names illustrative):
 SELECT * FROM recharge_transactions WHERE transaction_id = '56783408456';

5. Service based coverage testing-

 Validating the sequential operations of the application
 Ex. Paytm – Recharge module – Mobile no., circle, operator & amount  Promo code  Payment tab  if payment is done, a thank-you message; if payment is not done, a cancellation pop-up – Yes  Home  Recharge module
 Ex. Paytm – Recharge module – Mobile no., circle, operator & amount & click on fast forward  Payment tab  if payment is done, a thank-you message; if payment is not done, a cancellation pop-up – Yes  Home  Recharge module

6. Calculation based coverage testing-

 Validating the arithmetic calculations of the application
 Ex. Paytm  Recharge module  plan of 499 rs with a promo code (10%)  payment = 499 - 49 = 450 rs
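 A quick check of that arithmetic (assuming the discount is truncated to whole rupees, which matches the 450 rs figure above):

# Promo-code calculation for the example above (truncation to whole rupees assumed).
plan_amount = 499
discount = int(plan_amount * 0.10)   # 10% promo code -> 49
payable = plan_amount - discount
print(discount, payable)             # 49 450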

Non-functionality testing (RCCIISPG)-

 In non-functionality testing, the tester validates the external features of the application/build
 Non-functionality testing has different types
1. Recovery coverage testing
2. Compatibility coverage testing (browser compatibility testing)
3. Configuration coverage testing / hardware coverage testing
4. Installation coverage testing  we have not done this
5. Intersystem coverage testing
6. Sanitization coverage testing
7. Parallelization coverage testing  we have not done this
8. Globalization coverage testing

1. Recovery coverage testing-

 Validating whether the application handles abnormal situations
 Ex. Flipkart – Payment page/tab  while making the payment, close or refresh the page / close the application  the user should be taken back to the Add-to-cart page/tab

2. Compatibility coverage testing-

 Validating that the application works on the platforms expected by the client
 Compatibility testing has 2 types
1. Forward compatibility testing –
 The application/build is fine, but there is a problem in the OS/platform of the SIT environment
 If we have a problem in the OS of the SIT environment  it is handled by the network team

2. Backward compatibility testing –
 The OS/platform of the SIT environment is fine, but there is a problem in the application/build
 If we have a problem in the application in the SIT environment  it is handled by the tester
 Backward compatibility testing  in my project I have done "browser compatibility testing"
A. Cross-browser compatibility testing –
 Validating that the application/build supports different browsers
 Ex. Different browsers – Chrome, Firefox, Opera, Edge, Safari, etc.

B. Version compatibility testing-
 Validating that the application/build supports different versions of the same browser
 Ex. Chrome browser – V90.10, V89.00, V85.00, V70.00, etc.

3. Configuration coverage testing / hardware coverage testing-

 Validating that the application supports the hardware (printer, Bluetooth, etc.)
 Configuration coverage testing is also called hardware coverage testing

4. Installation coverage testing  we have not done this

 I have not done this testing

5. Intersystem coverage testing-

 Validating that the application connects to / interchanges data with different services/applications
 Ex. Paytm connects to mobile recharge operators, IRCTC, electricity boards (MSEB), etc.

6. Sanitization coverage testing-
 Validating that the application has no extra features/garbage values
 Ex. Paytm  Recharge module  the mobile no. text box should accept 10 digits
 If an extra feature is added without being asked for, e.g. a "+91 -" prefix before the 10 digits
 For such extra features/garbage values the tester raises defects

7. Parallelization coverage testing  we have not done this

 I have not done this testing

8. Globalization coverage testing-

 Validating that the application supports all languages
1. Standard/universal language – English language support
2. Regional languages – Hindi, Marathi, Tamil, etc. language support
 For regional language testing we use Google Translate

Re-testing-
 In re-testing the tester validates the functionality of the application by passing multiple test data
 Testing which is performed by passing multiple test data is called re-testing
 Ex. Paytm – FASTag module  multiple vehicle numbers entered (4-tyre, 6-tyre, 8-tyre, 10-tyre vehicles, etc.)
 We get the test data from the database (SQL Server) -
 If we have an old project  existing data is in the database (SQL Server), OR the tester in the SIT environment creates the test data
 If we have a new project  since we need test data for testing, the tester in the SIT environment creates the test data
 If the test data depends on another application  Ex. PhonePe – new UPI ID creation feature  mobile no., bank account no., debit card, etc. are bank-related data  this test data is provided by the BA
 Ex. Paytm  new credit card functionality  payment for a bus ticket  dummy credit card = 4000111100001111

Regression Testing-
 In re-testing or in BBT, when the tester finds a defect, the tester raises it to the developer
 The developer then fixes/resolves the defect and sends a modified build, and the tester tests it (regression testing)
 Regression testing – re-execution of tests on the modified build, to check that the defect has been fixed / works properly & there is no side impact on the interconnected modules
 Regression testing = Regret + Action
 Ex. Paytm – Recharge module  Recharge module build (V9.0 – 500 lines of code)  mobile recharge works for VI, Airtel, JIO, MTNL, but BSNL mobile numbers are not working – defect (6 to 7 hr)  defect raised to the developer  the developer fixes/resolves it in a modified build (V9.0 – 550 lines of code), BSNL should now work  the developer sends the modified build (V9.0 – 550 lines of code)  on the modified build the tester does regression testing (re-testing + regression testing)  checks that the defect has been fixed / works properly, i.e. BSNL mobile numbers work (10 to 12 test data – multiple BSNL numbers) (1 to 2 hr) & there is no side impact on the interconnected modules, i.e. VI, Airtel, JIO recharges still work

 Regression testing is performed at 2 points

1. In the SIT environment – when we find defects
2. Final regression testing – when the application is moving from one environment to another environment

 In regression testing we execute these test cases

1. Failed test cases (Ex. BSNL mobile no.)
2. High priority test cases (Ex. VI, Airtel, JIO, MTNL mobile no.)
3. Extra feature / extra functionality test cases (Ex. +91 mobile prefix)
4. If time permits, the remaining test cases

 Regression testing is performed within 2 to 3 hr

UAT / Pre-production Testing-

 UAT starts after completion of BBT in the SIT environment
 The tester working in the SIT environment prepares the test proof and sends it to the UAT team
 UAT (user acceptance testing) is also called customer acceptance testing
 UAT team – 2 developers + 1 tester (at the client location)
 UAT checks/tests the system & functionality
 UAT has 2 types
1. Alpha testing
2. Beta testing

Alpha Testing vs. Beta Testing-
 Alpha testing is performed for service based applications; beta testing is performed for product based applications
 Alpha: Ex. Paytm, Zerodha, Upstox, etc.; Beta: Ex. Splitwise, Adobe Reader, Khatabook, etc.
 In UAT, for alpha testing the developer + tester are present; for beta testing the developer + tester are not present
 In alpha testing, defects/issues in the application are fixed/resolved immediately; in beta testing they are fixed/resolved in the next version of the application
 In alpha testing the client interaction is more; in beta testing the end-user interaction is more
 Both alpha testing & beta testing check the system & functionality

 In my project, we do alpha testing  service based project

 I have done this testing by accessing a remote desktop
 Through the remote desktop, we access the UAT server & do the testing
 Normally a tester does not work in both environments (SIT & UAT)
 In my project, the UAT tester was on leave (4 to 5 months), so I did the UAT testing.
 In the same project we also have UAT testing; at that time I was not working in the SIT environment
 In one module I performed SIT testing & in another module I performed UAT testing
 If we find an issue/defect in UAT testing, the UAT tester raises the defect to the respective developer & the SIT tester.
 The SIT tester reproduces the defect in the SIT environment.
1. If the defect is found in the SIT environment, we inform the developer and ask them to fix the defect ASAP
2. If the defect is not found in the SIT environment, the SIT tester shows the test proof to the UAT tester
Final Regression –
 Regression testing is also performed as
1. Final regression testing – when the application is moving from UAT to the prod/live environment

 In final regression testing we execute these test cases

1. Failed test cases (Ex. BSNL mobile no.)
2. High priority test cases (Ex. VI, Airtel, JIO, MTNL mobile no.)
3. Extra feature / extra functionality test cases (Ex. +91 mobile prefix)
4. If time permits, the remaining test cases

Production testing-
 After completion of UAT, the developer deploys the code/build from UAT to production
 If we find a defect in the production environment, the defect is called a production defect
 A production defect occurs for 2 reasons
1. If the tester has missed some functionality while doing the testing, then the production defect is a "hot fix"
2. If the client has missed some functionality and the defect is found as a production issue, then the production defect is a "CR (change request)"

1. If the tester has missed some functionality while doing the testing, then the production defect is a "hot fix"
A. If a defect/issue is found in production, the client sends an escalation mail to the project team  the tester reproduces the defect in the SIT environment  if the defect/issue is found in the SIT environment  the tester informs the developer and asks them to fix the issue ASAP and deploy to the client (the project manager may ask the tester to send an apology mail / the reason why this happened against this production issue)
B. If a defect/issue is found in production, the client sends an escalation mail to the project team  the tester reproduces the defect in the SIT environment  if the defect is not present in the SIT environment  the tester mails/calls the PM, developer & designer saying the issue is not present in the SIT environment & attaches the test proof  the developer checks the deployment process / environment problem / configuration file  the developer fixes the issue & deploys to the client

2. If the client has missed some functionality and the defect is found as a production issue, then the production defect is a "CR (change request)"
A. If we get a CR from the client, we accept the CR, but if the CR has more impact on development, testing & production, we inform the client
B. If we get a CR from the client, we accept the CR, and if the CR has less impact on development, testing & production, we develop & test the CR and deploy it to the client.

Production issue-
 From your side (company)  production issue (hot fix); from the client's side  production issue (CR)
 In both cases the impact on development, testing & deployment/delivery is assessed


Testing Terminology-
 Scenario-based testing terms we use

Monkey Testing-
 If we have many test cases (70 to 100 test cases) & less time (1 to 2 hr) for testing, we do monkey testing, OR if we get the build at the last moment (on Friday) & have less time (1 to 2 hr) for testing, we do monkey testing
 In monkey testing, we execute the high priority / core functionality test cases of the application/build (system & functionality testing)
 We inform the client that minor defects may still be present in the application/build
 If we find issues/defects (high priority / core functionality) in monkey testing, we can't deploy that US to the client
1. If the client says "I want this deployed within this sprint", the developers & testers work on Saturday & Sunday  develop & test & deploy to the client
2. If the client says no, we can deploy it in the next sprint

Exploratory Testing-
 If we have a lot of test data but less knowledge about the application, we do exploratory testing
 Ex. If your colleague is absent (2 days of sudden leave) and the testing has to be done by another tester
 The tester does the testing by passing more data through the functionality & explores/understands the application functionality while testing

Ad-hoc Testing-
 If we have knowledge about the application but we have less test data, we do ad-hoc testing
 Ex. Application payment tab  many credit/debit cards are needed for testing, but we have only 1 credit card & 1 debit card
