Duration Testing: Guidelines/checklists

The document discusses various types of testing conducted for MFP-CPE projects including duration testing, memory testing, regression testing, performance testing, localization testing, and exploratory testing. It provides guidelines for conducting each type of testing and describes the process for logging any defects found in the Marks Information Management System (MIMS).


MFP-CPE testing focuses on the following test types: Duration Testing, Regression Testing, Performance Testing, Memory Testing, Exploratory/Ad-hoc Testing, and Localization Testing.

Duration testing

Duration testing is a specialized style of testing whose purpose is to discover defects specifically in the contention for resources and the interaction between components in the system. Each component team is responsible for component-level testing; the duration team is responsible for stress testing the system as a whole, to discover defects that occur when components do not interact properly or starve each other of resources.

Triage is the process of collecting information about an engine or emulator in a crashed state, analyzing that information, discovering as much as possible about the root cause of the failure, and then getting that information into the hands of the right people.

The lab setup consists of the products under test, Windows test PCs, and a UNIX testing environment with ClearCase, along with the tools and applications required to run duration, depending on the product's testing requirements. With the lab setup in place, the duration scripts are started on individual printers and are closely monitored for any crashes, hangs, or errors.

Guidelines/checklist:
1. Upgrade all the duration machines to the latest firmware for which the test request has been raised in MIMS.
2. Schedule all the machines for the respective freeze run in MIMS.
3. Capture all the bash data for the respective printers.
4. Configure the respective printers for Fax/DSS.
5. Start Patt duration on the scheduled printers.
6. Keep a close watch on the printers and observe for any crashes, hangs, or errors.
7. Log a crash/hang in the MIMS log page as soon as you observe it on a printer running duration.
8. Collect all the bash and gdb logs for the defects/crashes/hangs that occur on the printer.
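The crash/hang watch in steps 6 and 7 can be partially automated by scanning the duration output for failure markers. A minimal sketch, assuming a hypothetical plain-text log format (the real Patt/duration output format will differ):

```python
import re

# Hypothetical failure markers; the real duration log format may differ.
EVENT = re.compile(r"\b(crash|hang|assert|error)\b", re.IGNORECASE)

def scan_duration_log(lines):
    """Return (line_number, text) for lines that look like crashes/hangs/errors."""
    return [(n, line.strip())
            for n, line in enumerate(lines, start=1)
            if EVENT.search(line)]

sample = [
    "12:00:01 job 41 printed ok",
    "12:00:09 ERROR: engine not responding",
    "12:00:15 firmware crash detected, dumping state",
]
for n, text in scan_duration_log(sample):
    print(n, text)
```

A scan like this only flags candidate lines; each hit still has to be confirmed on the printer and logged in MIMS with the bash and gdb logs.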

Memory testing

Process: Memory tests are run the same way as duration, with the ShmHeap being captured at regular intervals. The setup also remains the same, except that ShmHeap capture is enabled in the patt.cfg file of the printer on which the memory tests are run. Memory tests are run separately and independently for each job category, and the logs produced are checked for memory leaks.

Guidelines/checklist:
1. Upgrade the machine to the freeze on which the memory tests have to be run.
2. With the ShmHeap variable enabled in patt.cfg, start duration test cases with only the particular job category for which memory leaks have to be determined.
3. Analyze the collected ShmHeap logs and look for memory leaks.

Regression Testing

Process: Regression testing for MFP-CPE involves session creation, test assignment and execution, test review, and defect logging.

Session Creation: Session creation is done in the Neo tool by the onsite test engineer when a test request is received for a freeze. The session includes all the tests for the specific product, grouped accordingly.

Test Assignment and Execution: Once the session is created, the tests are given a preliminary review and assigned to testers in Neo according to priority, i.e. the tests with greater priority are assigned first. The tester executes the assigned tests and updates Neo with the test results. These results then become visible in the Review section of Neo.

Test Review: The test analyst/engineer reviews the test results. Passing test cases are reviewed only for the Machine Time and Tech Time fields in Neo. The test analyst reassigns failed tests to different testers. Failed tests are re-executed on the current freeze and also on previous freezes; if a failure is reproducible, it is analyzed with respect to its behavior and the freeze.

Defect Logging: Once a failed test is confirmed, the defect is logged in MIMS and a MIMS defect ID is generated.
The test result is then updated with the MIMS defect ID in Neo.

Guidelines: When the reviewer encounters a failure for a test case, he/she verifies the failure and enters the details in the MIMS log page.

Data Capture during Regression: Because regression involves a large amount of test execution, data capture is required to track the progress of the program. For this purpose, the following data is captured:
- Total tests in a session
- Groups of tests in each session
- Tests reviewed per day (name of tester, name of reviewer, any defects logged)
- Tests assigned per day (name of the tester, test groups, TE, number of tests)
- Number of tests executed per day (tester, test case number)
- Pass-back test details
- Defect details
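The daily capture above amounts to simple roll-ups over per-test records. A sketch, assuming hypothetical record fields (tester, group, result) rather than the actual Neo export format:

```python
from collections import Counter

# Hypothetical per-test records; real data would come from Neo.
records = [
    {"tester": "A", "group": "copy", "result": "pass"},
    {"tester": "A", "group": "fax",  "result": "fail"},
    {"tester": "B", "group": "copy", "result": "pass"},
]

def daily_summary(records):
    """Roll up one day's execution data for the tracking sheet."""
    return {
        "total": len(records),
        "per_tester": Counter(r["tester"] for r in records),
        "per_group": Counter(r["group"] for r in records),
        "failures": [r for r in records if r["result"] == "fail"],
    }

summary = daily_summary(records)
print(summary["total"], dict(summary["per_tester"]))
```

Failures surfaced this way still go through the normal review path before being logged in MIMS.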

Performance Testing

Performance tests are conducted to measure the performance of MFP functionalities such as copy, scan to email, and scan to fax. The time to perform an MFP job under a particular load is measured and compared with other freezes. These tests should be run on a private network to avoid slowdowns due to network traffic; the private network should have email capability. Anteater should be installed with the FMTR trace set up. Enable logging and check the log file to make sure that time stamping is working; the log can then be used to calculate the time each process took.

Localization Testing

Localization testing is done as part of manual testing. The manual test team coordinates with IE (the Information Engineering group in HP) to get the strings localized. Ideally, the localized strings should be received from IE before FC. The test team can obtain the localized strings from IE on request; IE maintains a repository for this and sends the path once the strings are localized. The strings are sent to the requestor individually. Strings are localized only once per CPE release, unless there is an immediate requirement to do otherwise. The actual localization testing happens between FC and CC, because the functionality is completed only by FC and the strings are finalized based on the functionality that is implemented. The tester should change the language on each MFP and compare the strings that have been added or modified against the localized strings received from IE for that language. The correctness of the strings in the different languages is ensured by IE. The testing effort can be captured in an Excel sheet and tracked for every release. Defects found during testing are logged in MIMS.

Exploratory/Ad-hoc Testing

Exploratory testing is an informal method of testing in which the tester explores the various areas of the MFP in parallel with other manual tests. Exploratory testing, also called ad-hoc testing, is done mainly on new feature areas in order to gain more code confidence.

Guidelines:

- Product and feature specifications are the main inputs when carrying out exploratory testing, not test case procedures as in traditional testing.
- Testers are encouraged to try various options, possibilities, and combinations of features/functionalities.
- Preferred when there is large code turmoil and little time for execution.
- Expertise in the particular area is preferred.
- A higher-level test plan should be provided, without strict boundaries on the scope of test execution.
- All observations are noted and validated against the product/feature specification.
- Valid test scenarios are to be added to the regression suites.

How to log defects?

The issues, defects, and crashes observed during the various kinds of testing are logged in MIMS (Marks Information Management System), which can be accessed at 192.6.8.156.

MIMS: MIMS is the single point of access for information related to the MFP-CPE project. It has been designed with the following main objectives.

MIMS Defect Logging: At the end of each shift, the shift lead reviews the MIMS log page, checks that the defect is not already present in MIMS/DIMS, and then logs it as a MIMS defect. This helps avoid logging duplicate defects. If this review is not done and too many duplicate defects are logged, tracking and closing the duplicates can become an overhead.

Test Execution Reviews: The shift leads take part in test reviews in parallel with execution in Web Neo and submit the reviews. A review should be submitted only when the lead has ensured that all details are filled in appropriately. For failed tests, the leads review the MIMS logs, raise a defect, and ensure that the correct MIMS defect ID is entered in Web Neo; only then do they close the review for a test case.

Defect Tracking: MIMS acts as a defect tracking tool for the internal defects found by the MFP-CPE testing team. This works similarly to DIMS. MIMS also acts as a proxy to the DIMS site and helps you update DIMS defects. In addition, it manages hot sites, top issues, and beta requests created by Tech Marketing. The MIMS defects and logs section of the webpage contains the defects logged by the team. The test leads are also responsible for defect tracking and for kicking off the test-writing process for defects being fixed by the developers. The lead also ensures that all defects are logged, that Neo and MIMS are updated, and that defects are tracked to closure as applicable.
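The duplicate check the shift lead performs before logging a defect can be roughed out in code. A sketch using crude title normalization; the function names and matching rule are illustrative, not MIMS's actual logic:

```python
def normalize(title):
    """Lowercase and collapse whitespace for a crude duplicate comparison."""
    return " ".join(title.lower().split())

def is_probable_duplicate(new_title, existing_titles):
    """True if an already-logged defect has an equivalent title."""
    target = normalize(new_title)
    return any(normalize(t) == target for t in existing_titles)

# Illustrative titles, not real MIMS defects.
existing = ["Copier hangs after 500 pages", "Fax job crashes engine"]
print(is_probable_duplicate("copier HANGS after  500 pages", existing))  # True
print(is_probable_duplicate("Scan to email times out", existing))        # False
```

Exact-match screening like this only catches obvious duplicates; the shift lead's manual review of MIMS/DIMS is still the authoritative check.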

Test Management: MIMS is used to create, manage, and track test requests for the various CPE products and FW builds, including Regression, Fixed Focus, and Duration testing. It lets you view duration statistics with ease: the number of pages printed, the failure rate, and the target failure rate for different products and machines. It acts as a proxy to Neo and manages regression statistics. It links merge requests with Fixed Focus and assigns/notifies a test engineer to write FF test cases. In addition, it helps manage the test resources/machines.

Integration/Merge Request Management: The MIMS page is also used to submit and manage merge requests. It helps the team follow a review process in which the developer submits a merge request that must be reviewed and approved by another developer. All merge requests and review comments are tracked. The integrator takes the approved merge requests, merges them to the CPE branch, and builds the next FW version. MIMS keeps track of the defects and branches that went into each build.

Miscellaneous: In addition, MIMS serves the following purposes:
- To publish the schedule and versions of the various CPE releases.
- To provide support and downloads for MFP-related firmware and software.
- To serve as an archive for MFP-related documents and Feature Development Plans.
- To let other teams submit post-BR fixes to CPE products.
- To give engineers and managers a customized home page listing the defects, test requests, and review requests assigned to them.

Priority: Every defect must be assigned a priority. Priority establishes a relative ranking among all recorded defects. Defects with higher priority should receive immediate attention.

Priority | Meaning         | Definition
3        | High            | This defect is holding up development or testing, or the product cannot be released to customers. This defect has HIGHER priority than development.
2        | Medium          | Defect fixing must be balanced with development for the current release & is HIGHER priority than development for future releases.
1        | Low             | This defect has LOWER priority than development for the current release & fixing must be balanced with development for future releases.
0        | Not Prioritized | Screened, but not analyzed.

Duplicates: Duplicates are moved through the states and resolved with the resolution type "Not a defect". Apply the resolve in DIMS: select the Duplicate action on the Resolve tab, enter the defect ID that the current defect is a duplicate of, and apply. The resolution code is changed to Duplicate and the duplicate child is linked to the original parent. When the parent defect is resolved and verified, you must also verify any duplicates.

Defect Verification: Defects must be verified within 21 days of resolution. The method that originally found the defect must be re-executed by the submitter against the fixed code in order to verify the fix. If the submitter is not available, the test lead is responsible for assigning someone to re-execute the method. If both the submitter and the responsible engineer are unavailable, the screener must re-execute the method.
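The 21-day verification window above lends itself to a simple due-date check. A minimal sketch (the function names are illustrative, not part of MIMS/DIMS):

```python
from datetime import date, timedelta

VERIFICATION_WINDOW = timedelta(days=21)  # per the defect-verification rule

def verification_due(resolved_on):
    """Last date by which a resolved defect must be verified."""
    return resolved_on + VERIFICATION_WINDOW

def is_overdue(resolved_on, today):
    """True once the verification window has elapsed."""
    return today > verification_due(resolved_on)

print(verification_due(date(2024, 3, 1)))              # 2024-03-22
print(is_overdue(date(2024, 3, 1), date(2024, 4, 1)))  # True
```

A check like this could feed the customized MIMS home page, flagging defects whose verification is approaching or past due.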

Test Execution Process
