Testing
4.1 Objectives of Testing
This section introduces the concept of testing and how important it is to the successful implementation of the project. The different phases of testing are described, along with the levels of testing incorporated in this particular project. Testing is vital to the success of any system. Testing is done at different stages within the development phase. System testing makes the logical assumption that if all phases of the system are correct, the goals will be achieved successfully. Inadequate testing leads to errors that may surface long afterwards, when correction would be extremely difficult. Another objective of testing is its utility as a user-oriented vehicle before implementation. The testing of the system was done on both artificial and live data. Testing involves operating a system or application under controlled conditions and evaluating the results (e.g., if the user is in interface A of the application while using hardware B and does C, then D should not happen). The controlled conditions should include both normal and abnormal conditions. Typically, the project team includes a mix of testers and developers who work closely together, with the overall QA process being monitored by the project managers.

4.2 Types of Testing
4.2.1 Black Box Testing

Also known as functional testing, this is a software testing technique whereby the tester does not know the internal workings of the item being tested. Black-box test design treats the system as a black box, so it does not explicitly use knowledge of the internal structure. Black-box test design is usually described as focusing on testing functional requirements. Synonyms for black-box include: behavioral, functional, opaque-box and closed-box.

4.2.2 White Box Testing

White-box test design allows one to peek inside the box, and it focuses specifically on using internal knowledge of the software to guide the selection of test data. Synonyms for white-box include: structural, glass-box and clear-box.

4.2.3 Condition Testing

An improvement over white-box testing, condition testing ensures that a controlling expression has been adequately exercised while the software is under test, by constructing a constraint set for every expression and then ensuring that every member of the constraint set is included in the values presented to the expression.

4.2.4 Data Life-Cycle Testing

This is based on the consideration that in software code a variable is at some stage created, and subsequently may have its value changed or used in a controlling expression several times before being destroyed. If only locally declared Booleans used in control conditions are considered, then an examination of the source code will indicate the places in the source code where the variable is created, where it is given a value, where it is used as part of a control expression, and where it is destroyed. This approach to testing requires all feasible lifecycles of the variable to be covered while the module is under test.

4.2.5 Unit Testing

The purpose of this phase is to test the individual units of the developing software component. This phase is recursive and is repeated for as many levels of testing as there are. In the DGLW project, each individual form has been tested using client-side validation with JavaScript: each form is validated so that the user can enter only valid data at every point.

4.2.6 Functional Testing

This is done for each module / sub-module of the system. Functional testing serves as a means of validating whether the functionality of the system conforms to the original user requirement, i.e., does the module do what it was supposed to do? Separate schedules were made for functional testing. It involves preparation of test data, writing of test cases, testing for conformance to the test cases, and preparation of a bug listing for non-conformities.

4.2.7 System Testing

System testing is done when the entire system has been fully integrated. The purpose of system testing is to test how the different modules interact with each other and whether the entire system provides the functionality that was expected.
System testing consists of the following steps:
a) Program Testing
b) String Testing
c) System Testing
d) System Documentation
e) User Acceptance Testing
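As a concrete illustration of the client-side validation used in the unit testing described under 4.2.5, the sketch below shows the kind of JavaScript check a form might run before submission. The function and field names are illustrative assumptions, not taken from the actual project source.

```javascript
// Hypothetical sketch of a client-side form validator of the kind used
// when unit-testing each form: it rejects empty or mismatched fields so
// that only valid data reaches the server. All names are illustrative.
function validateRegistrationForm(fields) {
  const errors = [];
  if (!fields.loginName || fields.loginName.trim() === "") {
    errors.push("Login name is required");
  }
  if (!fields.password || fields.password.length < 5) {
    errors.push("Password must be at least 5 characters");
  }
  if (fields.password !== fields.confirmPassword) {
    errors.push("Password and confirmation do not match");
  }
  // valid only when no rule was violated
  return { valid: errors.length === 0, errors: errors };
}
```

A unit test for a form then only has to feed this function valid and invalid field sets and compare the returned flags against the expected results.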
4.3 Various Levels of Testing

Before implementation the system is tested at two levels: Level 1 and Level 2.

4.3.1 Level 1 Testing (Alpha Testing)

At this level, test data is prepared for testing. Project leaders test the system on this test data, keeping the following points in consideration:
- Proper error handling
- Exit points in code
- Exception handling
- Input / output format
- Glass-box testing
- Black-box testing

If the system passes the testing phase at Level 1, it is passed on to Level 2.
4.3.2 Level 2 Testing (Beta Testing)

Here the testing is done on the live database. If errors are detected, the system is sent back to Level 1 for modification; otherwise it is passed on to Level 3, the level at which the system actually becomes live and is implemented for the use of end users. We have also checked the proposed system for:

Recovery & Security: A forced system failure is induced to test the backup recovery procedure for file integrity. Inaccurate data are entered to see how the system responds in terms of error detection and protection. Related to file integrity is a test demonstrating that data and programs are secure from unauthorized access.

Usability, Documentation & Procedure: The usability test verifies the user-friendly nature of the system. This relates to normal operating and error-handling procedures.

4.4 Quality Assurance
Proper documentation is a must for the maintenance of any software. Apart from the in-line documentation written while coding, help files corresponding to each program were prepared so as to tackle the person-dependency of the existing system.

4.5 System Implementation
During the implementation stage the system is physically created. The necessary programs are coded, debugged and documented, and new hardware is selected, ordered and installed.

4.6 System Specification

Every computer system consists of three major elements:
1. The hardware
2. Application software, such as Visual Studio
3. The operating system

For successful operation of the package, the following must be kept in mind: too many packages should not be used, as very few systems may have all those packages installed, due to memory problems; the compatibility of the developed system would thus be reduced.

4.6.1 Hardware Requirements

Intel Pentium processor at 500 MHz or faster, a minimum of 364 MB available disk space for installation (including the IBM SDK), a minimum of 256 MB memory (512 MB recommended), and a CD-ROM drive. The table below lists the .NET hardware requirements, divided into requirements for client/desktop applications and for server-side ASP.NET applications. It lists the Microsoft minimum and recommended system specifications. In typical Microsoft tradition, they have understated their hardware recommendations: although .NET applications may run on the low-powered systems specified in the minimum column, in my experience you will be a lot happier with a faster system, so I have included a "Better" column indicating a more desirable system specification.

Platform      CPU Minimum   CPU Recommended   CPU Better   RAM Minimum   RAM Recommended   RAM Better
.NET Client   90 MHz        90+ MHz           350+ MHz     32 MB         96+ MB            128+ MB
.NET Server   133 MHz       133+ MHz          450+ MHz     128 MB        256+ MB           512+ MB
Operating System Requirements: In addition to the hardware requirements, .NET applications also have a minimum required operating system level to support the various .NET features. The table below shows the .NET Framework software requirements.

Platform: .NET Client
Operating systems: Windows 98, Windows 98 SE, Windows ME, Windows NT 4.0 Workstation (Service Pack 6a), Windows NT 4.0 Server (Service Pack 6a), Windows 2000 Professional, Windows 2000 Server, Windows 2000 Advanced Server, Windows 2000 Datacenter Server, Windows XP Home Edition, Windows XP Professional

Platform: .NET Server
Operating systems: Windows 2000 Professional (Service Pack 2), Windows 2000 Server, Windows 2000 Advanced Server, Windows 2000 Datacenter Server, Windows XP Professional, Windows 2003 Server family

Database Access Requirements
In addition to the previously mentioned hardware and software prerequisites, certain database access features used by the .NET Framework have minimum MDAC (Microsoft Data Access Components) levels that are required. MDAC is included in the installation process for the .NET Framework and Visual Studio.NET, so you don't need to worry about it in a development environment.
Platform: .NET Client
Notes: MDAC is needed by the SQL Server .NET Data Provider.
1. The internet standards are to be known in advance so that the application can be developed according to those standards. If I fail to do so, the application would fail in various environments, which is certainly undesirable.
2. The various cyber laws and obligations are to be known, as the system has to deal with the privacy of users' emails; thus the rules regarding bulk mail and email privacy should be known.
3. The system (SWEMS) policies, the personnel evaluation criteria and also human psychology are to be studied so as to formulate efficient and accurate algorithms for the calculation of loan, tax deduction, salary and leave information.
4. For the organizations whose personnel I would interview, I would require their current employee assessment techniques and formulas, and I may or may not use them for my project.
4.6.2 Software Requirements

The system can be accessed over the Intranet connecting all the nodes of the Ministry of Labour & Employment. Clients equipped with web browsers can access the system from any of the Intranet nodes. The software requirements for the project:
As I feel that the project could be developed using the .NET languages, using the .NET platform with C# for the development, I would like to have the following software for the whole project:
1. .NET Framework
2. Visual Studio .NET 2005
3. Internet Information Server (IIS 5.0)
4. MS XML Parser 4.0
5. Internet Explorer (5+), Opera (7+), Firefox
6. FusionCharts software with free license (third-party software)
7. Crystal Reports (built into Visual Studio .NET 2005)
8. Microsoft Windows XP Professional
9. SQL Server 2005
4.7 Installation
The application installation scripts have to be generated from the current server, where the application source code is saved, and installed on the main server from which the application is to be run. This was done using a special script which generates all the SQL statements to insert preliminary data (like menu entries, codes in code directories, etc.) on the server, after which the operational modules of the application were made available to the end users successfully.

4.8 Implementation
The system is still under construction; a few reports are yet to be made, after which the system will be implemented at the client side. Users will be given training to use the package, and special workshops are conducted by the developer for the purpose. According to their feedback, changes are implemented in the software.
TEST PLAN

The strategy applied to cover a full systems test of the SWEMS is shown in PSF. The developer used the system documentation to prepare all test cases, designs and procedures. Some employees who are currently working were also involved to check the system's compliance with the requirements and essential features gathered in the analysis phase. The types of testing carried out are:
1) Unit Testing and Functional Testing
2) Integration Testing
3) Regression Testing
4) GUI Testing
5) Compatibility Testing
6) Security Testing
7) User Acceptance Testing
(Descriptions of all the above testing are given below, along with test cases and results.)

POINT OF CONTACT FOR TROUBLESHOOTING PURPOSES
Name: Vivek kr. Nirala
Designation: Project developer
Contact number: +919253330097
Email id: [email protected]
USERS INVOLVED IN THE TESTING: Vivek kr. Nirala (System Developer) is the test manager and test analyst for the SWEMS. Most of the system tests were conducted by the developer himself. The other users involved in the testing are shown below in the test cases.

As stated above, the test data is derived from the documentation. The implementation of the test data and the steps taken to conduct the tests, along with the reported errors and the measures taken, are also shown below in the test cases.
Module Name: Login
Project Title: SWEMS (Smart Web Employee Management System)
Test Case Name: Login
Test Case ID: LUT1
Conducted By: Vivek Nirala
Description: Perform login validation
Testing Tool: Validation check by ASP.NET
Testing Date: 10/02/10
MODULE EXECUTION

Module Step 1.1: Run the system; it will ask for login credentials.
Expected Output: System should open the login screen.
Result from Module: System opened the login screen.
Result: Pass

Module Step 1.2: Enter user name and password (LoginID = admin, Password = admin).
Expected Output: User should be able to log in.
Result from Module: User logged in successfully.
Result: Pass
Module Name: EMPLOYEE REGISTRATION
Project Title: SWEMS (Smart Web Employee Management System)
Test Case Name: Registration
Test Case ID: EUT2
Conducted By: Parveen
Description: Perform user registration validation
Testing Tool: Validation check by ASP.NET
Testing Date: 12/02/10
MODULE EXECUTION

Module Step 2.2: Enter information.
Result: Pass

Module Step 2.3: Enter the same login name for a different user (Login Name = Eo1, First Name = Kishor, Last Name = Kunal, Password = kunal, Confirm Password = kunal).
Expected Output: System should display an error for the duplicate login name.
Result from Module: System displayed "New user added successfully".
Result: Fail
Conclusion: One error was found in the module.
Measures Taken (Module 2.3 error rectification): The developer reviewed the code of the adduser.aspx file and rectified the error by identifying the login-name code section and checking the name against the database. If the system finds a similar name in the database, it displays the error message "User already exists".
Module Name: ADD NEW EMPLOYEE
Project Name: SWEMS
Test Case Name: Add Employee
Test Case Number: E-01
Written by: Vivek kr. Nirala, Project Developer
Approved by:
Description: The responsibility of this module is to add a new employee to the system.
Testing Date:
Function to Test 1: Addition of new employee details to the system.
Expected Result: The new employee should get inserted.
Actual Result: New employee details were inserted successfully.
Status: Pass

Function to Test 2: No duplicate title should be allowed.
Expected Result: The system should give a message that the title already exists.
Actual Result: The system prompted the user that the title already exists.
Status: Pass

Function to Test 3: Yearly salary should not exceed the per-day norms.
Expected Result: The system should give a message that the yearly salary exceeded the norms of the per-day wage.
Actual Result: The system allowed the yearly salary to exceed the per-day norms without a message.
Status: Fail

Function to Test 4: Empty fields.
Expected Result: The system should prompt the user for the empty fields.
Actual Result: The system prompted the user about the empty fields.
Status: Pass
Conclusion: The new employee was added successfully, but there was an error: the system allowed the yearly salary to coincide with the per-day wage.

INTEGRATION TESTING

Different units of the proposed system (SWEMS, Smart Web Employee Management System) are combined and tested as groups in multiple ways. The developer chose a bottom-up approach, which means that integration testing starts at the bottom level. The lower-level integration test modules are described in the corresponding components' unit tests. The developer conducted integration testing to expose problems with the Net Admin Control interfaces before trouble occurs in real-world execution. Integration testing is done to test the overall Net Admin Control system.

Project Title: SWEMS
Test Case Identifier: IT-01
Test Item: Integrate the admin control with the other system operations
Environmental Needs: A performance observer
Testing Date: 11/03/10
Module Integrated: 1.1
Input Specification: 1) Start Admin Login; 2) Click Recruitment Employee; 3) Browse CV file; 4) Browse Profile; 5) Add Employee Uname; 6) Close the Add Employee window; 7) Verify Employee List
Status (P/F): Pass
Comment: The test procedure verified that the Admin Control and Browse Computer modules are working correctly.
Project Title: SWEMS
Test Case Identifier: IT-02
Test Item: Employee activity with the system
Pre-Condition: New system detected in the network
Environmental Needs: A performance observer
Testing Date: 12/03/10

Module Integrated: 2.1
Input Specification: 1) Send login credentials to an existing employee; 2) Take the default policy and implement it
Output Specification: 1) Employee logged in successfully; 2) Employee should be notified for each selected operation
Status (P/F): Pass
Comment: The test procedure verified that all selected operations are executing successfully from the employee side.
Project Title: SWEMS
Test Case Identifier: IT-03
Test Item: The test will check the integration of all the modules under the main module
Conducted By: Vivek Kr. Nirala (Developer)
Testing Date: 13/03/10

Module Integrated: 3.1
Input Specification: Authorization to modules: when a user logs in to the Smart Web Employee Management System, access should be granted depending on the access rights granted to them.
Status (P/F): Pass
Comment: The test procedure verified that the user can perform only the assigned operations (the output coming from the login module brings the desired screen to the user).

Module Integrated: 3.2. Status: Pass
Module Integrated: 3.3. Status: Pass
Module Integrated: 3.4. Input Specification: Integration between login & main menu. Status: Pass
Module Integrated: 3.5. Input Specification: Information passed from one screen to another screen. Status: Pass
REGRESSION TESTING

Regression testing is normally performed to build confidence that changes in the software have no unintended side effects; the test cases are re-run from existing test suites. During testing, some tests failed, as shown above in the unit testing. To rectify those errors the developer made changes to the system, and those changes might have created side effects in the Smart Web Employee Management System. However, the developer believes that all code changes have already been retested in the project integration tests; therefore this phase of testing would be unnecessary (as stated by Albion Government Services (2004)). So regression testing was not conducted by the developer.
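Had regression testing been carried out, it would amount to re-running the stored test cases against the changed code and flagging any that now fail. A minimal sketch, with hypothetical test cases and a stand-in function under test (none of this is SWEMS code):

```javascript
// Minimal regression harness: re-run previously passing cases after a
// code change and collect any that now fail. Cases and the function
// under test are illustrative stand-ins.
function runSuite(cases, fn) {
  return cases.map(function (c) {
    return { input: c.input, passed: fn(c.input) === c.expected };
  });
}

// Cases recorded from an earlier, passing run
const existingCases = [
  { input: "admin", expected: true },
  { input: "", expected: false }
];

// The (changed) function under test: a simple non-empty check
const isNonEmpty = function (s) { return s.length > 0; };

// Any entry left in `regressions` is an unintended side effect of the change
const results = runSuite(existingCases, isNonEmpty);
const regressions = results.filter(function (r) { return !r.passed; });
```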
8.4 GUI TESTING

GUI testing is done to test the proposed system's user interface. It checks for the appropriate use of the components.
Project Name: SWEMS
Description: This test will check the ease with which the system can be used. For each question the possible answers are Yes / No / N/A.

Menus
- Are the menu labels meaningful? Do they describe their associated actions?

Buttons
- Do the buttons have meaningful labels? Do they describe the appropriate action?
- Are the names given to the labels consistent across all the screens?
- Is the grouping of buttons appropriate (e.g., Submit button on the left and Reset on the right, horizontally)?
- Are the button sizes consistent (width and height)?

Text Boxes
- Does the maximum length of each text box match its corresponding data storage size?
- Are the check constraints available?

Combo Boxes
- Are they meaningfully placed?
- Do the combo boxes have appropriate values?
- Do the values change accordingly when operations are performed?

Screen Design
- Is the layout logical, so that the user does not have to search for typical functions?
- Are graphics and text arranged on the screens in such a way that they are easy to view and are not cluttered?
- Does the text used provide meaningful information?

Color
- Are colors used consistently when designating functionality?
- Do the colors used have sufficient contrast to reduce eye strain?
- Are the colors appealing?

Form Labels
- Are the label sizes and lengths appropriate?
- Are the labels given proper text so that they describe properly what they mean?
- Are the form label lengths sufficient to accommodate common screen resolutions?

Messages
- Are the error messages descriptive and meaningful?
- Do the error messages contain non-technical information?

Fonts
- Are the fonts consistent across all modules?
- Are the fonts used available in all types of operating systems?

Check Boxes
- Do the check boxes show their status (checked/unchecked) and work according to that?
- Do they have descriptive labels?

Radio Buttons
- Are the radio buttons grouped accordingly?
- Are they working according to their descriptions?
- Do they have descriptive labels?
Table 4: GUI testing

Conclusion: The above results show that the graphical user interface of the proposed system (SWEMS) is user friendly. Therefore the developer found that end users will definitely not feel any difficulty in accessing SWEMS (Smart Web Employee Management System).
8.5 COMPATIBILITY TESTING

Testing Date: 19/03/10

Compatibility testing of the Smart Web Employee Management System was conducted to make sure that the developed application would work properly with all the major Microsoft Windows versions currently available in the market, like Microsoft Windows 2000, Microsoft Windows XP and Microsoft Windows Vista. The outcome showed that the Smart Web Employee Management System is mostly compatible with all Microsoft Windows versions, but the best result was obtained on Microsoft Windows XP. Absolute compatibility with Microsoft Windows XP was assured, since it is the most widely used Windows version among current corporations and organizations. However, some problems were encountered on Microsoft Windows Vista, because some of the Internet Information Services restriction, blocking and unblocking operations do not seem to perform easily.

8.6 SECURITY TESTING

Project Title: Smart Web Employee Management System
Test Case Identifier: ST-01
Test Item: To check the system security
Conducted By: Vivek kr. Nirala
Testing Date: 20/03/10
Vikas Singhal was the tester for the usability of the system, because he is the system administrator at APIIT SD INDIA. Being a system administrator, he could check the interfaces against their usability in a professional manner.

Question: Can anybody access the protected area of the system (attempted access to the system's information without a proper password)?
Answer: No; without logging in, a user cannot access the protected area of the system. The user will be asked to enter login credentials to proceed.
Action to be taken: Test Pass (No Action)

Question: Can a simple employee access the admin protected area?
Answer: No; to access the admin protected area, the user must be of Admin type.
Action to be taken: Test Pass (No Action)

Table 5: Security testing
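The access rule exercised by the two security questions above can be sketched as a single check. The role names and the function below are assumptions for illustration, not the actual SWEMS code:

```javascript
// Hedged sketch of the protected-area rule verified by the security test:
// unauthenticated users are rejected outright, and only users of Admin
// type may enter the admin protected area.
function canAccessAdminArea(user) {
  if (!user || !user.loggedIn) {
    return false; // no access without logging in
  }
  return user.role === "Admin"; // simple employees are rejected
}
```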
8.7 USER ACCEPTANCE TESTING

Testing Date: 23/03/10 to 31/03/10

User acceptance testing is the final step before rolling out the application. The developer distributed the application to some APIIT students so that the end users' experience of the developed system could be observed. User acceptance testing gives the developer confidence that the application being delivered to end users meets their requirements.
Smart Web Employee Management System: USER ACCEPTANCE TESTING

Criteria (each cell gives the answer and an excellence percentage): able to log in to the Smart Web Employee Management System from a browser; able to monitor all the related info of employees and administration; able to perform policy operations; able to perform all operations.

Kishor Kunal: Yes (100%), Yes (98%), Yes (88%), Yes (75%)
Kumar Shreeram: Yes (100%), Yes (94%), Yes (90%), Yes (86%)
Anup Kumar: Yes (100%), Yes (94%), Yes (93%), Yes
Adarsh Deshratnam: Yes, Yes, Yes, Yes
SIGNOFFS
Phase: User Acceptance Test Release

Name: Kishor Kunal (Tester) | Date: 01/04/10 | Signature:
Name: Kumar Shreeram (Tester) | Date: 02/04/10 | Signature:
Name: Anup Kumar | Date: 02/04/10 | Signature:
Name: Adarsh Deshratnam | Date: 03/04/10 | Signature:
Name: Aneeta Chahal | Date: | Signature:
Test Strategy | Start Date | End Date
Unit Testing | 10th February 2010 | 10th March 2010
Integration Testing | 11th March 2010 | 13th March 2010
Regression Testing | 16th March 2010 | 16th March 2010
GUI Testing | 17th March 2010 | 18th March 2010
Compatibility Testing | 19th March 2010 | 19th March 2010
Security Testing | 20th March 2010 | 20th March 2010
User Acceptance Testing | 23rd March 2010 | 31st March 2010
PASS/FAIL CRITERIA

The system must satisfy the standard requirements for system pass/fail as stated in the requirement and design specifications. Some of the identified requirements are:
- SWEMS functions should work correctly.
- System response time should not be greater than 30 seconds.
- The GUI should be consistent and should follow HCIU principles.
- The administrator must be able to perform all the operations he is intended to do.
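The 30-second response-time criterion can be turned into an automated check; the timed operation below is a stand-in for a real SWEMS request, not actual project code:

```javascript
// Sketch of the response-time pass/fail criterion: an operation passes
// only if it completes within the 30-second limit stated above. The
// operation itself is a placeholder for an actual request.
const MAX_RESPONSE_MS = 30 * 1000;

function withinResponseLimit(operation) {
  const start = Date.now();
  operation(); // run the request being timed
  const elapsed = Date.now() - start;
  return elapsed <= MAX_RESPONSE_MS;
}
```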
Testing
Test Plans

Testing is done throughout the implementation phase of the project, and for each type of testing, whether unit or system testing, everything is tested against several criteria such as functionality, usability and, of course, user acceptance. This ongoing process is done with the use of prototypes, and the test results are documented each time testing is performed. These test reports help in prototyping for further iterations and eventually help keep the project on track. The test plan strategies used in this project consist of the following:

BLACK BOX TESTING: This is testing performed without prior knowledge of the system's internals. It is used to test the overall functionality of the system. The user is unaware of how the system works; he or she just enters the input data and checks the output. Every input will be tested for validity using various black-box testing techniques. Tester: End user.

COMPATIBILITY TESTING: COREP Wizard is supposed to be a platform-independent system running in a web browser. Thus it needs to be tested on varied operating systems and web browsers. This testing will be performed after completion of the whole system. Tester: Developer (myself).
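One of the black-box techniques mentioned above is boundary-value analysis: test data is chosen just inside and just outside the valid range stated in the specification, without looking at the implementation. The validator and the 18-to-65 range below are assumed examples, not part of COREP Wizard:

```javascript
// Illustrative boundary-value analysis for a black-box tester: the
// (assumed) specification says a valid age is an integer from 18 to 65,
// so the interesting inputs sit on and around both boundaries.
function isValidAge(age) {
  return Number.isInteger(age) && age >= 18 && age <= 65;
}

// Boundary cases derived purely from the specification
const boundaryCases = [17, 18, 19, 64, 65, 66];
const verdicts = boundaryCases.map(function (age) { return isValidAge(age); });
```

Only the cases straddling a boundary are needed to catch off-by-one errors in the validation, which is why this technique keeps black-box suites small.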
USABILITY TESTING: COREP Wizard being a multimedia-based tool developed with novice users in mind, it needs precise use of HCI usability principles and conventions. The system will be tested against these design principles. The system is also checked for feedback, visibility, constraints and other HCI guidelines. Tester: Developer and end user together.
FUNCTIONAL TESTING: After verifying the system using black-box testing, it also needs to be validated against its specifications. This test is performed by inputting varied types of erroneous data. It involves testing of the product's user interface, database management, security, installation, etc. Tester: Developer (myself).
UNIT TESTING: This test will be performed to check the functionality and reliability of every single function/module developed in the system before it is integrated with other functions/modules. This test will mostly be carried out in parallel with the development of the system. Tester: Developer (myself).
INTEGRATION TESTING: This test is performed when all the individual module-based tests are finished. Prior to this test the modules are combined together to form the complete integrated system, which is then tested for its integrity.
Tester: Developer.

ACCEPTANCE TESTING: Testing to verify that the product meets customer-specified requirements. After completion of the COREP Wizard, it will be handed over to some end users to test against the user requirements. Tester: End user.
1. Unit Testing
Unit testing is performed on single modules, which are created separately from the main system itself. Each separate module comes with a prototype which is used as a testing subject, and also for the purpose of letting third-party users use the system in order to perform user acceptance testing. The feedback and criticism received at these moments were recorded and taken into consideration for future enhancements or even changes to the module.
For conducting unit testing of COREP Wizard, the developer divided the system into the following modules:

1. Login & authentication module
2. Project initiation (create folders and files dynamically and fill in data)
3. Intro development and communication with XML
4. Theme development
5. Event module
6. Image, video and audio upload section
As the system was developed using separate modules, the basic test cases involved in the modules were tested at the time of implementation. The unit testing of two of the modules is displayed below; the rest of the test cases are omitted from the documentation as they would occupy unnecessary space.

Module 1: Login and Authentication
Table 8: Unit testing for login panel

Test Subject: Login panel
Expected Result: Redirect to index.php
Actual Result: Problem with the header command
Remarks: Replaced the header command with an echo-based redirection command

Test Subject: Login panel
Test Method: Authentication type
Expected Result: Redirect to index.php
Actual Result: Page redirected
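A unit test like the login-panel checks in Table 8 exercises the module in isolation. The sketch below shows the idea with a stand-in authenticate function and illustrative credentials (the real module is PHP; nothing here is taken from the COREP Wizard source):

```javascript
// Hypothetical unit-level sketch of a login module: authentication is
// checked against an in-memory user store, isolated from redirection
// and session handling. Names and credentials are illustrative only.
function authenticate(username, password, userStore) {
  const stored = userStore[username];
  return stored !== undefined && stored === password;
}

// In-memory store used only for the unit test
const userStore = { admin: "admin" };
```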
Module 2: Project Initiation

Test Subject: Project initiation
Expected Result: Display a list of all the existing projects, with a hyperlink to each project summary page
Actual Result: The developer had not included the word "Projects" in the link
Remarks: Link updated, working successfully

Test Subject: Project initiation
Expected Result: When creating a new project with the same name as an existing one, the process should be interrupted
Actual Result: Folder checked; existing project detected

Test Subject: Project initiation
Expected Result: The XML files are created in the project folder with predefined data

Test Subject: Project initiation
Expected Result: Session variables are created and the page is redirected to the project summary page
Actual Result: Page redirected to the project summary, but not all session variables were created
Remarks: Updated the code to create all the session variables and to check for all variables, otherwise displaying an error. Problem solved.
Integration Testing
As explained previously, this is the test to check how the modules perform when they are combined together and work in a coherent environment. The test mainly checks the calling of functions, the passing of variables, global properties and their effect on the working of individual modules. In the case of COREP Wizard, the integration was performed in three steps:
i. Integration of PHP modules
ii. iii.
In the first step, all the PHP modules were integrated to form the wizard section. The integration testing at this level was performed as follows:
Table 11: PHP Integration test

Test Subject: Connectivity using the "next step" link on each page
Test Method: Check navigation using the next-step link on each page
Actual Result: Successful navigation
Remarks: Links navigate to the appropriate page

Test Subject: Connectivity using the main menu on each page
Test Method: Check if the main menu links navigate to the proper sections
Actual Result: Successful navigation

Test Subject: Update project status
Test Method: Check if the project status is updated properly
Actual Result: Only the project status in the database was updated
Remarks: Needed to append code to update the session value as well. Successful in the 2nd iteration.
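The session-value defect found in the project-status row is typical of what integration testing catches: state set by one module must be visible to the next. A minimal sketch, with hypothetical module and session-key names rather than the actual COREP Wizard code:

```javascript
// Sketch of an integration check between two modules: the login module
// writes session state, and the project summary module depends on it.
// Module names and session keys are illustrative assumptions.
const session = {};

function loginModule(user) {
  session.user = user;      // state handed off to later modules
  session.loggedIn = true;
}

function projectSummaryModule() {
  if (!session.loggedIn) {
    throw new Error("Not logged in"); // integration failure surfaces here
  }
  return "Projects for " + session.user;
}
```

An integration test calls the modules in sequence and asserts that the second module sees everything the first one was supposed to set.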
In the second step, all the Flash files are integrated together. Every project has a file named <project name>.swf. This is the main file that shall autorun when the project disc is prepared and executed. The testing of the Flash integration is as follows:
Table 12: Flash Integration Test

Test Subject: Main Flash file (template.swf)
Actual Result: Set true as expected, except that the XML files for the main Flash file were not loaded
Remarks: The problem was solved after updating the ActionScript

Test Subject: Global menu
Test Method: Check if the background sound and the internal event-driven sounds are distinct
Expected Result: Sound and music play separately without interfering with each other
Actual Result: The background sound overlapped the internal event-driven sounds

Test Subject: Keyboard shortcuts
Actual Result: The key-press event works all right when the Escape key is pressed
Finally, the fully integrated COREP Wizard is tested. This integrated system consists of both the PHP modules and the Flash modules.
Table 13: Integration Test

Test Method: Check if the proper Flash files are displayed, selected and copied using the PHP pages
Expected Result: The appropriate Flash file is used for display, selection and copying
Actual Result: No problems occurred
Remarks: Test satisfactory
System Testing

Table 14: System testing

Test Method: Black-box testing
Expected Result: All the data inputted is published in the output file, based on the choices and input data
Actual Result: Successful test

Test Method: Test the system's performance on a low-configuration system (a configuration matching the minimum system requirements)
Expected Result: The system should work fine, except for a few performance issues like slow processing and low output quality
Actual Result: The PHP processing took more time than normal; the output Flash hung for a few seconds during some heavy processing but worked fine; the system, as projected earlier, gave below-average performance
Compatibility Testing
The system is developed to be platform-independent for deployment. Also, the user section runs in a web browser, so it is expected to work in exactly the same fashion in every web browser. Browser compatibility testing was performed on the developer's system using the following browsers:
Internet Explorer 6.0
Mozilla Firefox 2.0
Google Chrome
Apple Safari 3.2.2
Opera 8.5
The compatibility testing for OS independence was performed on the following operating systems:
Table 15: Compatibility test OS details
Microsoft Windows 98, 2000 - Owner: Mr. Saurabh Pattarkine (peer student); Configuration: Intel P4 processor, 512 MB RAM; Server: XAMPP Windows 1.7.0
Microsoft Vista - Owner: Mr. Ninad Pahune (developer); Configuration: Intel Core 2 Duo processor, 2 GB RAM; Server: XAMPP Windows 1.7.0
Ubuntu 6.04 - Owner: Mr. Amit Kumar (peer student); Configuration: Intel P4 processor, 512 MB RAM; Server: XAMPP Linux 1.7.0
Apple Mac OS X - Owner: Mr. Varadraj Kumar (peer student); Configuration: Intel Core Duo processor, 1 GB RAM; Server: XAMPP Mac OS X 1.0.1
Test Subject
Test Method
Expected Result
Actual Result
Remarks
Test Subject: Microsoft Windows 98, 2000
Test Method: Deployment of the system, black box testing
Expected Result: Suitable deployment, appropriate output on black box testing, average performance
Actual Result: System deployed and passed black box testing with above-average performance
Remarks: Test successful

Test Subject: Microsoft Vista
Test Method: Deployment of the system, black box testing
Expected Result: Suitable deployment, appropriate output on black box testing, best performance
Actual Result: System deployed and passed black box testing with best performance
Remarks: Test successful

Test Subject: Ubuntu 6.04
Test Method: Deployment of the system, black box testing
Expected Result: Suitable deployment, appropriate output on black box testing, above-average performance
Actual Result: System deployed with minor tweaks and passed black box testing with best performance
Remarks: Worked best on the *LAMP platform. Test successful.

Test Subject: Apple Mac OS X
Test Method: Deployment of the system, black box testing
Expected Result: Suitable deployment, appropriate output on black box testing, above-average performance
Actual Result: System deployed with minor tweaks that hindered performance. Passed black box testing with average performance.
Remarks: The code was reconsidered and rectified. Performance increased in the 2nd iteration.
Test Subject
Test Method
Expected Result
Actual Result
Remarks

Test Subject: Internet Explorer 6.0
Test Method: Check visibility, absolute positioning, scroll property and other HCI interface guidelines
Expected Result: Correct positioning, size and scrolling of divisions. Appropriate font size and implementation of CSS.
Remarks: CSS was reconsidered. Proper output in the 2nd iteration. Test case passed.

Test Subject: Mozilla Firefox 2.0
Test Method: Check visibility, absolute positioning, scroll property and other HCI interface guidelines
Expected Result: Correct positioning, size and scrolling of divisions. Appropriate font size and implementation of CSS.
Remarks: Test case passed.

Test Subject: Google Chrome
Test Method: Check visibility, absolute positioning, scroll property and other HCI interface guidelines
Expected Result: Correct positioning, size and scrolling of divisions. Appropriate font size and implementation of CSS.

Test Subject: Apple Safari 3.2.2
Test Method: Check visibility, absolute positioning, scroll property and other HCI interface guidelines
Expected Result: Correct positioning, size and scrolling of divisions. Appropriate font size and implementation of CSS.
Actual Result: The layout of the system appears smaller, but there are no problems with the look or usability.

Test Subject: Opera 8.5
Expected Result: Correct positioning, size and scrolling of divisions. Appropriate font size and implementation of CSS.
Remarks: Test case passed.
2. Mr. Sachin - The second interviewee for the project. Being a multimedia developer, he is familiar with the development of such projects and their requirements. As part of the multimedia industry, he proved useful in assessing the efficiency of the system.
3. Mr. Saurabh Pattarkine - A level 3 student of APIIT SD India. Being a peer student, he was able to assess the usability and the functionality implemented by the developer. Moreover, he belonged to the target audience for the year disc, so his suggestions were utilised.
4. Mr. Simanta Chakraborty - A level 1 student of APIIT SD India with an interest in multimedia development, he was well aware of the HCI principles to be implemented in the system. He proved useful in gauging the response of users of the orientation disc.
The usability and user acceptance testing were divided into three main categories, each focusing on a different area of the system to determine its success and its acceptance by the user:

Interface Design - The tester's opinion of the interface design provided by the developer.
Audio and Video - This category focuses on what the tester thinks about the audio and video in the system.
Usability - This category focuses on the HCI usability principles and on obtaining feedback from the user based on these principles.

The usability testing and the user acceptance testing together took 5 days. The testers were provided with the system and asked to fill in the user acceptance test form only after forming a thorough overview of the system.
The overall response from the users was satisfactory, and the developer was praised for proving his mettle. However, Mr. Sachin, the multimedia developer, expected a more professional level of work, as found in the market. He made a few suggestions for future development and credited the developer's effort in bringing a new idea to the market.
*The user acceptance forms are attached in the Appendix for more details.
Conclusion
The testing phase made the developer aware of the actual status of his system. Many problems and shortcomings of the system were identified and eventually sorted out; this led to extensive research into the domain and the technologies used by the developer in making COREP Wizard. The developer hereby completes the testing of the system and is satisfied with the test results obtained by himself and the test users.