Software Development Model and Methodologies HW Kotenko Liudmyla

The document discusses agile maturity evaluation of a project based on a questionnaire. Key points include: 1. The project is moving in the right direction but needs more time for grooming and communication between team members. 2. Areas that are partially or not achieved include dedicated test environments, automation script maintenance, involvement in meetings, and testability of enhancements. 3. Wasted testing time can occur from a lack of understanding goals, focusing only on positive cases, and untestable requirements. Meetings and feedback sessions are sometimes missed.

Uploaded by

Lyudmila Kotenko

Task description

1. Evaluate the agile maturity of your project against the questions on the Questionnaire sheet.

2. Add your feedback on the points that you do not achieve or only partially
achieve.
Is it really an issue for your project?
If not, why do you think so?
If yes, what are you doing on the project to change the situation?

3. Prepare a summary based on the calculated metrics and your comments.


I tried to add an explanation next to each option
on the next page.

We are moving in the right direction, but we
need more time for grooming sessions and
communication with each other; it
would be a good idea to take fewer tasks
into a sprint.
Dimension and Focus Areas

Agile Testing Organization:
- Product Owner
- Scrum Master
- Team

Agile Test Cadence:
- Flexibility

Agile Test Strategy:
- Unit Testing
- Development practices
- Automation of Functional Testing
- Integration Testing
- Test Environment Setup
- Meeting Acceptance Criteria
- Requirements Testability
- Testing Driven Development
- Definition of Done

Agile Testing Practices:
- Exploratory Testing
- Scrum Ceremonies
- Regression Testing
- End to End Automation Testing

Agile Test Automation and Tools:
- Dedicated Test Environment for Automation
- Automation Scripts Maintenance
- Automation Development Life Cycle
- Continuous Build Integration
- Non Functional Testing
- Agile Project Management Tool

Specific Goal
There is only one Product Owner for Agile team. PO is not reporting manager for the team.
Team interacts with Product Owner alone for getting clarifications on user stories.
Product Owner defines Acceptance Criteria for the user stories.
Product Owner is decisive while providing clarifications on user stories picked during sprint planning.
Product Owner does not insist on taking user stories with open questions, which are not clear to the team and
require clarification, into the sprint plan.
Does backlog grooming on a regular basis.

Is available to the team during release planning and sprint planning, and makes himself available on a need basis as well.
Acts primarily as a coach to the team and Product Owner
Has the authority to escalate to highest levels in the organization for clearing impediments.
Is a serving leader and not playing the role of Resource Manager for the team members.
Team members are not shared resources across teams and not shuffled between teams frequently.
Team size remains fairly constant and doesn’t change often.
Team size is between 5-9 members.
Team members have test automation capabilities to build and execute automation scripts.
Builds are available for testing team on demand in an automated fashion.
Features are delivered for testing within a sprint in a continuous fashion rather than all together near the end
of the sprint.
Everyone supports and ensures that testing happens continuously throughout development.
The team members are willing to change:
- Retro meetings happen on a regular basis; improvements are defined and executed by the team.
The team is able to adapt to changes in requirements in backlog.
Unit testing by developers has at least 80% code coverage.
Unit tests execution is integrated with Continuous Build Integration.
Code review is performed for each change.
Static code analysis is performed (e.g. Sonar).
Coding standards are defined and followed by developers.

Required functional regression suite is automated.


Integration between applications in an Enterprise setup is end-to-end tested using automated suites.
Deployment of the builds into the test environment and test environment configuration is automated and
requires no manual intervention to run tests.
Test scenarios creation, regression test scenarios identification, go/no-go decisions are all based on user story
acceptance criteria.
Testing team members are involved in user story elicitation, grooming and acceptance criteria definition and
they determine the testability of user stories.
Automated tests are developed in sprint.
Automated tests are developed prior to application coding.
Automated tests are refactored to end-to-end automated scenarios later to make them part of automated
regression test suite.
Successful test completion and resolution of all (or major and higher severity) identified defects is part of
Definition of Done.
Test automation of acceptance criteria of each user story is part of DoD.
Exploratory testing is done routinely to ensure the test coverage goes beyond the routine regression test
coverage.
Testing team members participate in daily scrum meetings.
Testing team members participate in sprint planning meetings.
Testing team members participate in sprint review meetings.
Testing team members participate in sprint retrospective meetings.
Testing team members participate in product backlog grooming meetings.
Functional regression testing is done for each sprint, to give confidence that the deliverables in each sprint are
production deployment ready.
Release sprints are identified, in case regression testing is not possible to do in every sprint.
End to end test scenarios are drafted for automation and then automated.
End to end automation scenarios cover integration with other applications.

Automation development and execution happens in a dedicated environment for automation testing.
Automation tests run frequently, at least once a week.
Automation tests are stabilized (95% pass rate).
Automation test result analysis is performed for each run.
The automation tool, framework and scripts have good user and training documentation.
Automation development is done with a defined and measured development process.

Automation tests are integrated with CI tool and the tests run automatically after each build deployment.
Automated regression test suites cover non-functional requirements of the application along with functional
requirements.
Agile PM tools used for managing product backlog, sprint backlog, tasks tracking, burn down charts, efforts
and stories acceptance.
Application defect tracking tool is integrated with Agile PM tool.
Agile PM tool supports tracking of user stories size in terms of story points along with effort tracking and
enables tracking of productivity (velocity) improvement over multiple sprints.
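The unit-testing goals above (at least 80% code coverage, integrated with continuous builds) can be checked mechanically. A minimal sketch, assuming a Cobertura-style XML report such as the one produced by coverage.py's `coverage xml` command (the file name `coverage.xml` and the 80% threshold are assumptions for illustration):

```python
import xml.etree.ElementTree as ET

def coverage_percent(report_path="coverage.xml"):
    """Read overall line coverage (as a percent) from a Cobertura-style
    XML report, e.g. the output of coverage.py's `coverage xml`."""
    root = ET.parse(report_path).getroot()
    # The root element stores line-rate as a fraction between 0 and 1.
    return 100.0 * float(root.get("line-rate"))

def meets_goal(report_path="coverage.xml", threshold=80.0):
    """True when the report satisfies the coverage goal, so a CI job
    could fail the build otherwise."""
    return coverage_percent(report_path) >= threshold
```

Note that coverage.py itself offers `coverage report --fail-under=80` for the same gate; the sketch only illustrates how the metric could feed a build decision.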
Maturity
Largely Achieved
Fully Achieved
Partially Achieved
Not Achieved

Partially Achieved
Not Achieved

Fully Achieved
Fully Achieved
Fully Achieved
Largely Achieved
Partially Achieved
Not Achieved
Partially Achieved
Largely Achieved
Largely Achieved

Largely Achieved
Largely Achieved

Fully Achieved
Largely Achieved
Partially Achieved
Largely Achieved
Largely Achieved
Largely Achieved
Largely Achieved

Largely Achieved
Largely Achieved

Largely Achieved

Largely Achieved

Partially Achieved
Largely Achieved
Not Achieved

Fully Achieved

Largely Achieved
Largely Achieved
Largely Achieved
Fully Achieved
Fully Achieved
Fully Achieved
Fully Achieved
Largely Achieved

Largely Achieved
Fully Achieved
Fully Achieved
Largely Achieved

Largely Achieved
Largely Achieved
Largely Achieved
Fully Achieved
Largely Achieved
Largely Achieved

Largely Achieved

Largely Achieved

Largely Achieved
Largely Achieved

Largely Achieved
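To prepare the summary metric asked for in task 3, each rating above can be mapped to a numeric score and expressed as a percentage of the maximum. The 0-3 weighting below is an assumption (the actual weights on the Questionnaire sheet may differ); a minimal sketch:

```python
# Assumed mapping from maturity rating to score (0..3).
RATING_SCORES = {
    "Fully Achieved": 3,
    "Largely Achieved": 2,
    "Partially Achieved": 1,
    "Not Achieved": 0,
}

def maturity_percent(ratings):
    """Return overall maturity as a percentage of the maximum score."""
    if not ratings:
        return 0.0
    total = sum(RATING_SCORES[r] for r in ratings)
    return 100.0 * total / (3 * len(ratings))

# Example: one answer per maturity level averages out to 50%.
sample = ["Fully Achieved", "Largely Achieved",
          "Partially Achieved", "Not Achieved"]
print(f"{maturity_percent(sample):.1f}%")  # 50.0%
```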
Comments on the fields marked Not Achieved/Partially Achieved

I do not attend such meetings because I am not a team lead.


The whole team decides this.

We need more communication with the customer.


no

We are still working on this.


We have big problems with a decreasing team size.

Looks like it is done.

Sometimes enhancements are not testable.


1. Specify where a tester can waste time.

2. Evaluate your testing process against the identified values - do you have
any activities that do not bring value in your testing process?
The tester does not know the goal of testing (element/system); the tester
focuses only on positive test cases without understanding why he is
testing; review/feedback sessions were missed; requirements are not
testable.

Sometimes I lose a lot of time gathering requirements because they are
incomplete; we do not have grooming meetings.
