The document discusses the development of a decision support system for software evaluation. It introduces the background and need for such a system given the complexities of software evaluation and selection. The aim is to design and implement a system that can assess software features, determine their effectiveness, and maintain evaluation records to help users select the best software option. It discusses related work on software evaluation criteria, multi-criteria decision support systems, and decision support systems in general. The research will analyze the current software selection process, design the proposed system, implement it using programming languages, and test its effectiveness in aiding software evaluation.


SUPPORT SYSTEM FOR SOFTWARE EVALUATION

CHAPTER ONE
INTRODUCTION
1.0 Introduction
This chapter introduces the decision support system for software evaluation.
As the first chapter of this research work, it focuses on the theoretical
background as well as the statement of the problem, the aim and objectives
of the study, the significance of the study, the scope of the study, the
organization of the research and, finally, the definition of terms. This
brings clarity to the general concept of the research project.

1.1 Theoretical Background


Making decisions concerning complex systems (e.g., the management of
organizational operations, industrial processes, or software evaluation) often
strains our cognitive capabilities. Even though individual interactions among
a system's variables may be well understood, predicting how the system will
react to an external manipulation such as a policy decision is often difficult.
Many variables are involved in complex and often subtle interdependencies,
and predicting the total outcome may be daunting. There is substantial
empirical evidence that human intuitive judgment and decision making can be
far from optimal, and that it deteriorates even further with complexity and
stress. Because the quality of decisions matters in many situations,
compensating for the deficiencies of human judgment and decision making has
been a major focus of science throughout history. Applying decision support
systems to software evaluation is therefore an important research area.

Decision Support Systems (DSS) are computer-based information systems
designed to help managers select one of many alternative solutions to a
problem. A large, sophisticated computer-based DSS can automate some
decision-making processes and analyze huge amounts of information quickly.
It helps organizations increase market share, reduce costs, increase
profitability and enhance quality. The nature of the problem itself plays
the main role in the decision-making process. A DSS is an interactive,
computer-based information system with an organized collection of models,
people, procedures, software, databases, telecommunications, and devices,
which helps decision makers solve unstructured or semi-structured business
problems. Adopting a decision support system for software evaluation
supports accurate evaluation of the software [3].

Evaluation depends on the current state of scientific knowledge and on
methodological standards. Evaluation as an aid for software development has
been applied since the last decade, when the understanding of the role of
evaluation within Human-Computer Interaction changed. The activities
“Task analysis”, “Requirement specification”, “Conceptual and formal
design”, “Prototyping” and “Implementation” are each supplemented by an
“Evaluation” activity which helps to decide whether to progress to the next
step. Software can be evaluated with respect to different aspects, for
example functionality, reliability, usability, efficiency, maintainability
and portability. Whereas in earlier times evaluation of software took place
at the end of the development phase, using experimental designs and
statistical analysis, evaluation is nowadays used as a tool for information
gathering within iterative design: “Explicit human-factors evaluations of
early interactive systems (when they were done at all) were poorly
integrated with development and therefore ineffective. They tended to be
done too late for any substantial changes to the system to still be feasible
and, in common with other human-factors contributions to development, they
were often unfavourably received.” Instruments for evaluation are not
primarily used for global evaluation of a finished product; rather, they
are applied during the development of a product. Indeed, most experts now
agree that usable software can only be developed through systematic
consideration of usability aspects within the life-cycle model. One
prominent part is the evaluation of prototypes with respect to usability
aspects, employing suitable evaluation techniques in order to find usability
errors and weaknesses of the software at an early stage [1].

The technology used is Microsoft Access 2003 and Visual Basic 6.0. The
source code below can be used to save a software evaluation record to the
database in Visual Basic 6.0:

Private Sub Command1_Click()
    ' Commit the current record in the bound ADO Data Control to the database
    On Error Resume Next
    Adodc3.Recordset.Update
    If Err.Number = 0 Then MsgBox "SAVED" Else MsgBox "Save failed: " & Err.Description
End Sub

1.2 Statement of Problem


Institutions and organizations spend huge amounts on Enterprise Resource
Planning (ERP) packages and other computer software costing hundreds of
thousands and even millions of dollars; purchasing a software solution is a
high-expenditure activity that consumes a significant portion of a
company's capital budget. The following problems are faced:
 Selecting the right solution is an exhausting process for companies.
 Selecting a software package that meets the requirements needs a full
examination of many conflicting factors and is a difficult task.
 Most times the software bought does not meet the needs of the
institution or organization despite the huge amount spent.
These problems of software ineffectiveness have led researchers to
investigate better ways of evaluating and selecting software packages.

1.3 Aim and Objectives of the Study


The aim of the study is to develop a decision support system for software
evaluation. The following are the objectives of the study:

 To design and implement a decision support system for software
evaluation.
 To develop a system that will assess software features to determine
their level of effectiveness.
 To develop a system that will maintain records of software
evaluations.

1.4 Significance of the Study


The significance of the study is as follows:
 It will enable software users and developers to evaluate the
effectiveness of a software product.
 It utilizes a simple approach to evaluation.
 The study will also serve as a useful reference material for other
researchers seeking information on the subject.

1.5 Scope of the Study

This study covers a decision support system for software evaluation. It is
limited to the capturing of the weighted mean of software features and the
determination of the best software option based on the total weight of its
features.
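The weighted-mean computation described in this scope can be sketched as follows. This is a minimal illustrative sketch: the feature names, scores and weights are assumptions made up for this example, not values from the actual system.

```python
# Illustrative weighted mean of software feature scores.
# Feature names, scores and weights below are hypothetical examples.
def weighted_mean(scores, weights):
    """Return sum(weight * score) / sum(weight) for parallel lists."""
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

feature_scores = [4, 3, 5]    # e.g. functionality, usability, reliability
feature_weights = [3, 2, 1]   # e.g. essential=3, desirable=2, nice to have=1
print(round(weighted_mean(feature_scores, feature_weights), 2))  # (12+6+5)/6 -> 3.83
```

The software option with the highest total weighted score would then be reported as the best candidate.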

1.6 Organization of Research


This research work is organized into five chapters. Chapter one is concerned
with the introduction of the research study and it presents the preliminaries,
theoretical background, statement of the problem, aim and objectives of the
study, significance of the study, scope of the study, organization of the
research and definition of terms.
Chapter two focuses on the literature review; the contributions of other
scholars on the subject matter are discussed.

Chapter three is concerned with the system analysis and design. It presents
the research methodology used in the development of the system, analyzes
the present system to identify its problems, and provides information on
the advantages and disadvantages of the proposed system. The system design
is also presented in this chapter.

Chapter four presents the system implementation and documentation: the
choice of programming language, the analysis of modules and the system
requirements for implementation.

Chapter five focuses on the summary and constraints of the study;
conclusions and recommendations are provided in this chapter based on the
study carried out.

1.7 Definition of Terms

Software: Programs and applications that can be run on a computer system,
e.g. word processing or database packages.

Evaluation: The act of considering or examining something in order to
judge its value, quality, importance, extent, or condition.

System: An assembly of computer hardware, software, and peripherals
functioning together to solve a common problem.

CHAPTER TWO
REVIEW OF RELATED LITERATURE
2.0 Introduction

This chapter is concerned with the review of related literature; the
contributions of other researchers are examined. It presents the
following:

2.1 Software Selection Process
2.2 Software Evaluation Criteria
2.3 Belyk’s and Feist’s Software Evaluation Criteria
2.4 Weighting Product Selection Factors
2.5 Multi-Criteria Decision Support System (MCDSS) for Software
Component Selection
2.6 Decision Support Systems
2.7 Software Evaluation Attributes

2.1 Software Selection Process

Many organizations are attempting to save costs by integrating third-party,
commercial off-the-shelf (COTS) packages (e.g., component libraries or
extensions) or complete COTS-based solutions (e.g., enterprise resource
planning [ERP] applications). The methods used to identify a set of
possible candidate solutions are, for the most part, rather subjective. The
individual or individuals performing the evaluation have various, distinct
experiences that will factor into the decision process, either consciously
or subconsciously. To have a successful COTS evaluation, a formal process
is needed to properly evaluate COTS products and the vendors supplying
them. In this instance, the term formal means having an established and
documented process to perform the selection and evaluation activities in a
consistent, repeatable manner.

How does an organization conduct the initial research into products that
might be candidates for use on their project? How is the initial selection
performed? Some organizations use an “intuitive approach” to select the
initial list of products. This approach uses an individual who has had past
experience with the product or who has “heard good things” about the
product. An inappropriate selection strategy for COTS products can lead to
adverse effects. It could result in a short list of COTS products that may not
be able to fulfill the required functionality; in addition, it might introduce
overhead costs in the system integration and maintenance phases [3]. One
successful method for selecting products is the use of a selection team. When
selecting a COTS component, the use of a team of technical experts—
systems/software engineers and several developers—is recommended. When
selecting a COTS-based system, however, the inclusion of business domain
experts and potential end users is recommended [4]. The use of a team
virtually eliminates a single-person perspective or bias and takes into
account the viewpoints and experiences of the evaluators in the selection and
evaluation process.

2.2 Software Evaluation Criteria

When evaluating a possible software solution, most organizations are likely
to consider the ability of the product to meet the functional requirements.
Although this is a significant first step in the evaluation process, it
should not be the only criterion considered. Two additional criteria that
should be considered are intangible factors and risk.

Intangible factors are not the traditional “quality” factors, but rather factors
that are programmatic decisions (i.e., decisions that can or will affect the
overall program during its life span) and that have an effect on the system
utilizing the software. Most of the decisions also depend on intangible
factors that are difficult to quantify. Some costs can be identified up-front,
but others—the ones that organizations need to worry about for the long term
—are hidden.

Risk is another element that should be part of the selection criteria. Many of
the risks associated with system management and operation are not in your
direct control. Each vendor that plays a role in the design, development,
acquisition, integration, deployment, maintenance, operation, or evolution of
part (or all) of your system affects the risks you face in your attempt to
survive cyber attacks, accidents, and subsystem failures. Some possible risk
factors that should be considered are listed below:

• Is the company well established?
• What is the longevity of the company?
• Is there support (training, developer, etc.) offered?
• Is your vendor flexible enough to make changes in the middle of
development?
• Is the vendor financially stable?
• How mature is the technology used?

Another risk to consider is the volatility of the COTS components.
COTS-based systems are always subject to the volatility of the COTS
components (i.e., the frequency with which vendors release new versions of
their products). Expect volatility to increase exponentially with time and
the number of components used [5].

After a product or component has been selected, continuous risk
management should be applied for the life cycle of the system that uses it.
Continuous risk management is especially important if the product or
component is being used as part of a framework. Unlike other software-
selection decisions, the selection of a framework is a long-term decision—
possibly lasting 10–15 years [4]. After a final selection has been made, the
risks associated with the product or component should be fed back into the
risk management plan. One method for mitigating the risk is to perform
continual vendor-based risk evaluations. This type of evaluation focuses only
on the vendor or vendors supplying the third-party components. Continual
risk evaluation is especially important if the component is a critical part of
the system life cycle for a mission-critical system. This activity should also
be addressed as part of a risk management plan [4].

2.3 Belyk’s and Feist’s Software Evaluation Criteria


Another evaluation scheme for software products, related to wide network
exploitation, is developed in [5]. Its main characteristics are presented
below. A series of categories and criteria is suggested for the evaluation
of online collaborative tools in the development, delivery, and online
administration of software products.

Cost (institutional and user):
• System requirements for the product operation: open platform,
platform-specific, server purchased vs. hosted.
• Bandwidth for network exploitation (modem, cable, ADSL, T-1, etc.).
• License fees (scaled per user).
• User software/hardware requirements.
• Peculiarities of the installation of the product.

Complexity (user focus):
• Technical support (user manual; frequently asked questions; online
and off-line help).
• Collaborative tools, if applicable (asynchronous: email, conferencing;
synchronous: chat, audio-conferencing, whiteboard, virtual networking).
• Usability (seamless technology; degree of intuitiveness; ease of use;
navigation; consistency; stability; functionality).

Control:
• Secured access (password protection; encryption; firewall).
• Personalization.
• Privacy.

Clarity: resolution, size, layout, etc.

Common Technical Framework: interoperability; integration; protocols;
standards supported; scalability; platform; file-sharing.

Features: administrator tools (registration; report generation).

2.4 Weighting Product Selection Factors


Before selecting specific products, institutions should consider each of the
above factors, balancing as far as possible the merits of specific products
against the general features of the system. The selection of a specific
product requires attention to:
• Software reliability.
• Availability of technical support by the institution to the users.
• Availability of support by the software supplier to the institution and
the users.
• Cost to the institution (i.e., local server support).

2.4.1 Matrix Weighted Evaluation of the Product


A quantification approach using an evaluation matrix has been applied for
software product evaluation. The evaluation responses may be weighted
using points-scoring criteria and scorecards. Results can then be compared
quantitatively according to the evaluation matrix. The review and analysis
of the responses are recommended to be performed in the following
sequence:
• Analyze each evaluation response using a ‘score card’.
• Review each requirement listed in the score card and check the
answer(s). It is recommended to use a simple ‘Yes or No’ marking, or
a combined weighting and scoring method to indicate to what degree
the score card requirements are met.
• Repeat the process, using a new scorecard for each software product.
• The evaluation criteria have to formalize the requirements towards the
software products.
• Requirement weighting: Essential (3x), Desirable (2x), Nice to have
(1x).
• The evaluation score is based upon reviewing and analyzing the score
card. The evaluation marks are: 0 = does not meet requirements; 1 =
partially meets requirements; 2 = meets requirements; 3 = exceeds
requirements.
• The total points for the evaluation are the product: weighting x
evaluation score.

An example of the weighted score card is presented in Table 1. It may
be obvious from the evaluator scorecard results which software product
should be shortlisted. The preparation of a common comparison matrix,
comprising the results of each of the scorecards above, can be a good
choice for presenting the total result in table form.

Source: Bandor, Michael S. (2006).
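The weighting and scoring method above can be sketched in code. This is an illustrative sketch only: the requirement list is invented for the example and is not taken from Table 1.

```python
# Scorecard total = requirement weighting x evaluation score, summed over requirements.
# Weightings: Essential (3x), Desirable (2x), Nice to have (1x).
# Marks: 0 = does not meet, 1 = partially meets, 2 = meets, 3 = exceeds requirements.
WEIGHTS = {"Essential": 3, "Desirable": 2, "Nice to have": 1}

def scorecard_total(requirements):
    """requirements: list of (priority, mark) pairs for one software product."""
    return sum(WEIGHTS[priority] * mark for priority, mark in requirements)

# Hypothetical scorecard for one candidate product
product = [("Essential", 2), ("Essential", 3), ("Desirable", 1), ("Nice to have", 0)]
print(scorecard_total(product))  # 3*2 + 3*3 + 2*1 + 1*0 = 17
```

Repeating this for each candidate yields the comparison matrix described above.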


2.5 Multi-Criteria Decision Support System (MCDSS) for Software
Component Selection
Numerous approaches have been proposed for the general problem of
software component evaluation and selection. Most methods for component
selection employ a variation of the standard five steps which are:

 Define criteria
 Search for components
 Shortlist candidates
 Evaluate candidates
 Analyze results and choose component
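The five steps above can be sketched as a small pipeline. The component data and the single must-have criterion are invented here for illustration and are not part of any cited method.

```python
# Illustrative run of the five-step component selection process.
components = [
    {"name": "LibA", "open_source": True,  "score": 7.5},
    {"name": "LibB", "open_source": False, "score": 9.0},
    {"name": "LibC", "open_source": True,  "score": 8.2},
]

# Step 1: define criteria (here, one hypothetical must-have: open-source licensing)
must_have = lambda c: c["open_source"]

# Steps 2-3: the search produced `components`; shortlist by the must-have criterion
shortlist = [c for c in components if must_have(c)]

# Steps 4-5: evaluate the candidates and choose the best-scoring component
chosen = max(shortlist, key=lambda c: c["score"])
print(chosen["name"])  # LibC
```

Note that LibB scores highest overall but is filtered out at the shortlisting step, which is exactly why the order of the steps matters.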

Frequently employed approaches for evaluating and selecting components
include simple scoring and weighted-sum approaches, the Analytic Hierarchy
Process (AHP), and iterative filtering. Others use methods based on utility
analysis to tackle the incommensurability of decision factors. In
particular, in cases of strict requirements on trustworthiness and reliable
selection of components, evidence-based decisions using controlled testing
are recommended [4]. For the scenario of component selection, using
goal-based requirements modeling and utility analysis is especially suitable
for a number of reasons: the decision models build strongly on quality
attributes that lend themselves to requirements engineering approaches; the
anomaly of rank reversal should be avoided; and the number of analytical
steps that, for example, the application of the AHP requires is in many
cases prohibitive. Still, the problematic aspect of all approaches for
component selection that can be considered trustworthy, i.e. evidence-based
and formalized, is the high complexity and effort involved in creating
suitable evidence. This begins with the unambiguous specification of
criteria for quality attributes, which can be quite challenging, and extends
to the evaluation of components, i.e. the process of assigning values to
decision criteria. Software quality models have provided a common language
for the high-level aspects of the selection problem; the ISO standard
25010, ‘Systems and software engineering - Systems and software Quality
Requirements and Evaluation (SQuaRE)’, is a prominent example.

2.6 Decision Support Systems


Decision support systems are interactive, computer-based systems that aid
users in judgment and choice activities. They provide data storage and
retrieval, but enhance the traditional information access and retrieval
functions with support for model building and model-based reasoning. They
support framing, modeling, and problem solving. Typical application areas
of DSSs are management and planning in business, health care, the military,
and any area in which management will encounter complex decision
situations. Decision support systems are typically used for the strategic
and tactical decisions faced by upper-level management: decisions with a
reasonably low frequency and high potential consequences, in which the time
taken for thinking through and modeling the problem pays generously in the
long run. There are three fundamental components of DSSs [3].

Database Management System (DBMS): A DBMS serves as a data bank for the
DSS. It stores large quantities of data that are relevant to the class of
problems for which the DSS has been designed and provides logical data
structures (as opposed to the physical data structures) with which the
users interact. A DBMS separates the users from the physical aspects of the
database structure and processing. It should also be capable of informing
the user of the types of data that are available and how to gain access to
them.

Model-base Management System (MBMS): The role of the MBMS is analogous to
that of a DBMS. Its primary function is to provide independence between the
specific models used in a DSS and the applications that use them. The
purpose of an MBMS is to transform data from the DBMS into information that
is useful in decision making. Since many problems that the user of a DSS
will cope with may be unstructured, the MBMS should also be capable of
assisting the user in model building.

Dialog Generation and Management System (DGMS): The main product of an
interaction with a DSS is insight. As their users are often managers who
are not computer-trained, DSSs need to be equipped with intuitive and
easy-to-use interfaces. These interfaces aid in model building, but also in
interaction with the model, such as gaining insight and recommendations
from it. The primary responsibility of a DGMS is to enhance the ability of
the system user to utilize and benefit from the DSS. In the remainder of
this section, we will use the broader term user interface rather than DGMS.

While a variety of DSSs exists, the above three components can be found in
many DSS architectures and play a prominent role in their structure.
Interaction among them is illustrated in Figure 2.1. Essentially, the user
interacts with the DSS through the DGMS. This communicates with the DBMS
and MBMS, which screen the user and the user interface from the physical
details of the model base and database implementation.

Figure 2.1: The Architecture of a Decision support system


Source: Bandor, Michael S. (2006).
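The cooperation of the three components can be illustrated with a minimal, hypothetical sketch. The class and method names below are inventions for this example, not an actual DSS implementation; the "model" is a deliberate placeholder (the mean of the submitted scores).

```python
# Minimal sketch of DBMS, MBMS and DGMS (user interface) cooperation in a DSS.
class DBMS:
    """Data bank: stores and retrieves evaluation records."""
    def __init__(self):
        self.records = {}
    def store(self, key, data):
        self.records[key] = data
    def fetch(self, key):
        return self.records[key]

class MBMS:
    """Model base: turns raw data from the DBMS into decision information."""
    def evaluate(self, scores):
        return sum(scores) / len(scores)  # placeholder model: the mean score

class DGMS:
    """Dialog component: the user's single point of interaction."""
    def __init__(self, dbms, mbms):
        self.dbms, self.mbms = dbms, mbms
    def submit_scores(self, name, scores):
        self.dbms.store(name, scores)
    def get_rating(self, name):
        return self.mbms.evaluate(self.dbms.fetch(name))

ui = DGMS(DBMS(), MBMS())
ui.submit_scores("PackageX", [3, 2, 2, 3])
print(ui.get_rating("PackageX"))  # 2.5
```

The user talks only to the DGMS, which mirrors the screening role of the dialog component described above.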

While the quality and reliability of modeling tools and the internal
architectures of DSSs are important, the most crucial aspect of DSSs is, by
far, their user interface. Systems with user interfaces that are cumbersome
or unclear, or that require unusual skills, are rarely useful and accepted
in practice. The most important result of a session with a DSS is insight
into the decision problem. In addition, when the system is based on
normative principles, it can play a tutoring role; one might hope that
users will learn the domain model and how to reason with it over time, and
improve their own thinking.

A good user interface to a DSS should support model construction and model
analysis, reasoning about the problem structure in addition to numerical
calculations, and both choice and optimization of decision variables. The
user interface is the vehicle for both model construction (or model choice)
and for investigating the results. Even if a system is based on a
theoretically sound reasoning scheme, its recommendations will only be as
good as the model they are based on. Furthermore, even if the model is a
very good approximation of reality and its recommendations are correct,
they will not be followed if they are not understood. Without
understanding, the users may accept or reject a system's advice for the
wrong reasons and the combined decision-making performance may deteriorate
even below unaided performance. A good user interface should make the model
on which the system's reasoning is based transparent to the user. Modeling
is rarely a one-shot process, and good models are usually refined and
enhanced as their users gather practical experience with the system's
recommendations. It is important to strike a careful balance between
precision and modeling effort; some parts of a model need to be very
precise while others do not. A good user interface should include tools for
examining the model and identifying its most sensitive parts, which can
subsequently be elaborated on. Systems employed in practice will need their
models refined, and a good user interface should make it easy to access,
examine, and redefine its models [3].

2.7 Software Evaluation Attributes
Attributes to be considered for software evaluation are usually classified
into several groups. Quality characteristics of the software package such
as functionality, reliability, usability, efficiency, maintainability, and
portability have been used as a group of evaluation criteria in several
studies. Among the ISO/IEC standards related to software quality, ISO/IEC
9126-1 specifically addresses quality model definition and its use as a
framework for software evaluation. Criteria related to (1) the vendor,
(2) hardware and software requirements, and (3) costs and benefits of the
software packages are commonly used across many papers. These criteria are
given in Tables 2.1, 2.2 and 2.3 respectively.

Table 2.1: Criteria Related to Vendor

Criteria (group: Vendor) and their meaning:
User manual: Availability of a user manual with indexes, important
information and the main commands.
Tutorial: Availability of a tutorial to learn how to use the software
package.
Troubleshooting guide: Availability of a troubleshooting guide.
Training: Availability of training courses to learn the package.
Maintenance and upgrading: Vendor support for upgrading and maintenance of
the software.
Consultancy: Availability of technical support and consultancy by the
vendor.
Communication: Communication between vendor and the user.
Demo: Availability of on-site demo and free trial version.
Number of installations: Number of installations of the software package.
Response time: Level of service offered by the vendor.
Length of experience: Experience of the vendor in development of the
software product.
Product history: Popularity of the vendor in the market.
Technical and business skills: Technical and business skills of the vendor.
Past business experience: Past business experience with the vendor, if any.
References: Number of references from existing customers using the product.

Table 2.2: Criteria Related to Hardware and Software

Criteria (with group) and their meaning:
Internal memory (Hardware): Primary storage.
Communication protocols (Hardware): Communication protocols supported.
External storage (Hardware): Secondary storage needed in the form of disk
space and other storage facilities.
Compatibility (Software): Compatibility with the existing software and
hardware.
Source code (Software): Availability of the source code.
Hardware platform (Hardware): Hardware platform required to run the
software.
Network configuration (Configuration): Network technology needed to run
the software package, e.g. LAN, WAN.

Table 2.3: Criteria Related to Cost and Benefits

Criteria (with group) and their meaning:
License cost (Cost): License cost of the product in terms of number of
users.
Training cost (Cost): Cost of training the users of the system.
Installation and implementation cost (Cost): Cost of installation and
implementation of the product.
Maintenance cost (Cost): Maintenance cost of the product.
Upgrading cost (Cost): Cost of upgrading the product when a new version is
launched.
Cost of hardware (Cost): Cost of machinery used to support the system,
including processor, memory and terminals.
Direct benefits (Benefits): Tangible savings in labour and equipment,
reduction in processing cost per unit and elimination of outside service
charges.
Indirect benefits (Benefits): Improvement in customer service, faster
turnaround time of processing.

Many software evaluators provide a preferred list of evaluation attributes
for evaluating a specific software package; however, the lack of a common
list is apparent. The exact meaning of a criterion is open to the
evaluator's own interpretation. Sometimes the terminology used by authors
for a criterion in one work differs from that used by other authors for the
same criterion. This may lead to ambiguity and gives an unclear picture to
the evaluator. To address this issue, generic lists of evaluation criteria
and their meanings are provided, which can be used for evaluation of any
type of software package. Although functional criteria for software
selection vary from software to software, criteria related to cost, vendor
and quality of the software may be common and can be used for the selection
of any software package. A standard list of evaluation criteria and their
definitions could overcome some of the pitfalls in the software evaluation
and selection process. There is a need to develop a framework including a
software selection methodology, an evaluation technique, evaluation
criteria, and a system to support evaluation and selection of any software
package [6].

CHAPTER THREE
SYSTEM ANALYSIS AND DESIGN
3.0 Introduction
In this chapter, the system analysis and design are discussed. The sources
of data, the methods of collection, the evaluation of the existing system
and the organizational structure of the system are presented.

3.1 Research Methodology


A software development methodology or system development methodology is a
framework that is used to structure, plan and control the process of
developing an information system. The system analysis and design
methodology used to analyze the system is the Object Oriented Analysis and
Design Methodology (OOADM). OOADM applies object orientation in analysis
and design as a software engineering approach that models a system as a
group of interacting objects. Object oriented analysis and design is the
analysis and design of a system from the object point of view.

3.2 System Analysis


It is important to analyze the existing system in order to determine areas of
improvement.

3.2.1 Analysis of the Existing System


In the existing system, software is evaluated on the basis of user experience.
However, there is no systematic method to record and categorize the
evaluation, or to compare one software product with another.

3.2.2 Problems of the Existing System
The following problems were identified in the existing system:
- No proper record is kept of software evaluations.
- It is time consuming to determine the effectiveness of software systems
manually.

3.2.3 Analysis of the Proposed System


The system to be developed will provide a scoring interface. Scores are
assigned to the attributes being evaluated, the total and average are then
computed, and the packages are compared against one another.
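The scoring step described above can be sketched as follows. This is a language-neutral illustration in Python rather than the project's Visual Basic code; the attribute names follow the evaluation input layout (Fig 3.3), and the 1-5 rating scale and package names are assumptions for the example.

```python
# Sketch of the proposed scoring interface: each package is rated on the
# eight evaluation attributes, then ranked by its average score.
ATTRIBUTES = ["Cost", "Reliability", "Graphical User Interface",
              "User Manual", "Installation", "Keyboard Shortcuts",
              "Ease of Usage", "Response Time"]

def score(ratings):
    """Return (total, average) for one software package."""
    values = [ratings[a] for a in ATTRIBUTES]   # missing attribute -> KeyError
    total = sum(values)
    return total, total / len(values)

def rank(packages):
    """Order packages by average score, best first."""
    return sorted(packages, key=lambda p: score(p["ratings"])[1], reverse=True)

pkg_a = {"name": "Package A",
         "ratings": dict(zip(ATTRIBUTES, [4, 5, 3, 4, 5, 3, 4, 4]))}
pkg_b = {"name": "Package B",
         "ratings": dict(zip(ATTRIBUTES, [3, 4, 4, 3, 4, 2, 3, 3]))}

best = rank([pkg_a, pkg_b])[0]
print(best["name"], score(best["ratings"]))
```

The comparison between packages reduces to comparing their average scores, which is the decision value the proposed system presents to the user.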

3.2.3.1 Advantages of the Proposed System


The proposed system will be easy to use and it will save time. Records of
evaluations will be kept for future reference.

3.2.3.2 Disadvantage of the Proposed System


The disadvantage of the proposed system is that it is expensive to develop
and maintain.

3.3 System Design


The system design deals with the layout of the system; it comprises the
input and output layouts, the algorithm design and the program flowchart.

Use Case Model


A use case model shows the different components of a system and their
interactions with the actors. The administrator, or anyone he authorizes, is
responsible for making full use of the system. He provides inputs to the
system and can also view the output from the database. The administrator
can register software to be evaluated, perform software evaluation and view
reports from the database. As shown in Fig 3.1, each of these three use cases
(Register Software, Evaluate Software and View Reports) uses the View
Database use case.

Fig 3.1: Use Case model

3.3.1 Input Layout (Software Registration)

The software registration form (Fig 3.2) captures the following fields:

- Registration Date
- Name of Software
- Category
- Developer(s)
- Date of Development
- Licensed
- OS Platform
- Hardware Requirements
- Software Requirements

The form carries NEW, SAVE and CLOSE command buttons.

Fig 3.2 Software Registration input layout

Input Layout (Software Evaluation)

The software evaluation form (Fig 3.3) captures the evaluation date and
software name, together with scores for:

- Cost
- Reliability
- Graphical User Interface
- User Manual
- Installation
- Keyboard Shortcuts
- Ease of Usage
- Response Time

The form computes the TOTAL and AVERAGE fields and carries SAVE
and CLOSE command buttons.

Fig 3.3 Software Evaluation input layout

3.3.2 Algorithm
Step 1: Start
Step 2: Login
Step 3: If login is successful go to step 4, else go to step 2
Step 4: Display main menu
Step 5: Input choice
Step 6: If choice is software registration go to step 7, else go to step 8
Step 7: Input registration details and save to database
Step 8: If choice is software evaluation go to step 9, else go to step 12
Step 9: Input evaluation details
Step 10: Compute total
Step 11: Compute average
Step 12: If choice is evaluation report go to step 13, else go to step 15
Step 13: Display evaluation database
Step 14: Query database by software name
Step 15: If choice is quit go to step 16, else go to step 5
Step 16: Stop
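Steps 4-16 amount to a dispatch loop over the user's menu choice. The sketch below illustrates that control flow in Python; the function names and the in-memory lists standing in for the database are hypothetical, not the project's actual code.

```python
# Console sketch of the menu-driven algorithm: each module is a function,
# and main_menu dispatches on the user's choice (steps 5-15).
registry, evaluations = [], []

def register(details):            # step 7: save registration details
    registry.append(details)

def evaluate(name, scores):       # steps 9-11: total and average
    total = sum(scores)
    evaluations.append({"name": name, "total": total,
                        "average": total / len(scores)})

def report(name):                 # steps 13-14: query by software name
    return [e for e in evaluations if e["name"] == name]

def main_menu(choice, *args):
    actions = {"registration": register, "evaluation": evaluate,
               "report": report}
    if choice == "quit":          # step 15 -> step 16
        return "stopped"
    return actions[choice](*args)

main_menu("registration", {"name": "Package A", "category": "Utility"})
main_menu("evaluation", "Package A", [4, 5, 3, 4, 5, 3, 4, 4])
print(main_menu("report", "Package A"))
```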

3.3.3 Program Flowchart

The login flowchart (Fig 3.4) begins at Start and prompts for a username
and password. If the credentials are not valid, "Invalid Username/password"
is displayed and the prompt is repeated; if they are valid, control passes to
the main menu.

Fig 3.4: Login

MAIN MENU

The main menu flowchart (Fig 3.5) tests the user's choice in turn: software
registration branches to the registration routine (SR), software evaluation
branches to the evaluation routine (SE), and evaluation report branches to
the report routine (R). If the choice is quit, the program ends; otherwise
control returns to the menu.

Fig 3.5: Main Menu

SOFTWARE REGISTRATION

The software registration flowchart (Fig 3.6, entry point SR) accepts the
software registration details, saves them to the database, and repeats while
the user chooses to continue.

Fig 3.6: Software Registration

SOFTWARE EVALUATION

The software evaluation flowchart (Fig 3.7, entry point SE) accepts the
evaluation details, computes the total and average, displays the expert
decision, saves the record to the database, and repeats while the user
chooses to continue.

Fig 3.7: Software Evaluation

REPORT

The report flowchart (Fig 3.8, entry point R) displays the database report,
queries the database by software name, and repeats while the user chooses
to continue.

Fig 3.8: Report

3.3.4 Output Format
Output layout (Software Registration):

Registration_Date | Name_of_Software | Category | Developer | Date_of_Development | ………

Output layout (Software Evaluation):

Evaluation_Date | Software_Name | Cost | Reliability | Graphical_User_Interface
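The two output layouts correspond to two database tables. The sketch below illustrates the schema and the report module's query-by-name step using SQLite; the real system stores its records in Microsoft Access through ADO, only the columns shown in the layouts are included, and the sample row is invented for illustration.

```python
# Hypothetical SQLite equivalent of the registration and evaluation tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE registration (
    registration_date TEXT, name_of_software TEXT,
    category TEXT, developer TEXT, date_of_development TEXT)""")
con.execute("""CREATE TABLE evaluation (
    evaluation_date TEXT, software_name TEXT,
    cost REAL, reliability REAL, graphical_user_interface REAL)""")

con.execute("INSERT INTO evaluation VALUES (?,?,?,?,?)",
            ("2015-11-02", "Package A", 4, 5, 3))

# The report module's "query database by software name" step becomes:
rows = con.execute("SELECT * FROM evaluation WHERE software_name = ?",
                   ("Package A",)).fetchall()
print(rows)
```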

CHAPTER FOUR
SYSTEM IMPLEMENTATION AND DOCUMENTATION
4.0 Introduction
This chapter focuses on the system implementation. It presents the system
design diagram, choice of programming language, analysis of modules,
programming environment, implementation and software testing.

4.1 System Design Diagram

The system design block diagram (Figure 4.1) places LOGIN above the
MAIN MENU, which branches into four modules: SOFTWARE
REGISTRATION and SOFTWARE EVALUATION (each with NEW,
SAVE and CLOSE commands), SOFTWARE EVALUATION REPORT
(with FIND, PRINT and CLOSE commands), and QUIT.

Figure 4.1: System Design Block Diagram

4.2 Choice of Programming Language


The programming language used is Visual Basic. The language was chosen
because it enables the creation of applications with a graphical user
interface containing controls such as text fields, combo boxes, labels and
command buttons.

4.3 Analysis of Modules
The system is made up of four main modules as shown in the system design
diagram. They are:

Software Registration: This module enables software to be registered for
evaluation.

Software Evaluation: This module enables the registered software to be
evaluated.

Software Evaluation Report: This module enables the stored evaluation
records to be viewed and queried.

Quit: This module terminates the program.

4.4 Programming Environment


The programming environment used for the development of the application
is the Windows 7 operating system, and the integrated development
environment (IDE) chosen for the development of the system is Visual
Basic 6.0.
The hardware and software requirements for successful implementation of
the system are stated below.
The hardware requirements are:
- Pentium IV computer system
- Super Video Graphics Array (SVGA) monitor
- 512 MB RAM
- Keyboard
- Mouse
- Uninterruptible power supply (UPS)

The software requirements are:
- Microsoft Visual Basic 6.0
- Microsoft Access 2003

4.5 Implementation
The system implementation method recommended and chosen by the
system developer is parallel running, so as to prevent data loss. In parallel
running, the new system and the old system are used side by side, with extra
staff recruited to run the new system; this makes it expensive. Both systems
continue to run until the new system is working properly, after which the
old one is discarded.

4.6 Software Testing


The system was tested at every stage of its development in order to detect
errors and remove them immediately. The testing was done in two phases,
as discussed below:

Modular Testing: All the modules of the system:

- Software Registration
- Software Evaluation
- Software Evaluation Report

were independently tested to ensure that they work properly.

System Testing: In this phase all the modules were integrated, and the
system with all its modules was tested to identify and remove errors that
may arise as a result of the integration.
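A modular test of the evaluation module's total and average computation could look like the following Python sketch; the `totals` helper and the test names are illustrative assumptions, not the project's actual test code.

```python
# Hypothetical modular test: verify the total/average computation that the
# evaluation module performs over the eight attribute scores.
import unittest

def totals(scores):          # mirrors the evaluation module's computation
    total = sum(scores)
    return total, total / len(scores)

class EvaluationModuleTest(unittest.TestCase):
    def test_total_and_average(self):
        self.assertEqual(totals([4, 5, 3, 4, 5, 3, 4, 4]), (32, 4.0))

    def test_rejects_empty_score_list(self):
        with self.assertRaises(ZeroDivisionError):
            totals([])

result = unittest.main(exit=False, argv=["modular"]).result
print(result.wasSuccessful())
```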

CHAPTER FIVE
SUMMARY, CONCLUSION AND RECOMMENDATION

5.0 Introduction
This chapter presents the summary, the conclusion and the constraints of
the study, and offers recommendations based on the research.

5.1 Summary
Many studies provide a preferred list of evaluation criteria for evaluating a
specific software package. The exact meaning of a criterion is open to the
evaluator’s own interpretation, and the terminology used by one author for
a criterion sometimes differs from that used by another author for the same
criterion. This can cause ambiguity and give the evaluator an unclear
picture. To address this issue we provide generic lists of evaluation criteria
and their meanings, which can be used for evaluating any type of software
package. Although functional criteria for software selection vary from
software to software, criteria related to cost, vendor and quality of the
software are common and can be used for selecting any software package.
A standard list of evaluation criteria and their definitions could overcome
some of the pitfalls in the software evaluation and selection process.

5.2 Constraints of the Study


In the course of the research work, some challenges were faced, such as:
Time: The time given for the research work was short; a research of this
magnitude requires time to be properly executed.
Materials: Few materials were found pertaining to the research study, and
this limited the bulk of the literature.

Finances: The high cost of transportation to different libraries, as well as
the high cost of internet browsing, was a limitation of the study.

5.3 Conclusion
A decision support system for software evaluation is an automated system
that accepts values for the criteria used to evaluate software and assesses
the software's effectiveness from them. The information provided helps
organizations decide on the effectiveness of the software, and the average
value presented as the result reveals the capability of the software system.
There is a need to develop a framework comprising a decision support
software selection methodology, an evaluation technique, evaluation
criteria, and a system to support the evaluation and selection of any
software package.

5.4 Recommendations
The following recommendations are offered based on the study:
• More research should be encouraged in the area of software
evaluation
• Computer programmers should be contracted to develop software
evaluation systems.
• Organizations should evaluate software before they acquire it.

REFERENCES

[1] Abdelkader, A. I. (2006). A Cooperative Intelligent Decision Support
System for Contingency Management. University Paul Sabatier of
Toulouse, France. Journal of Computer Science, 2(10): 758-764.
ISSN 1549-3636. Science Publications.

[2] Jadhav, A. S. and Sonar, R. M. (2009). Evaluating and selecting
software packages: A review. Information and Software Technology,
51(2009): 555-563.

[3] Bandor, M. S. (2006). Quantitative Methods for Software Selection
and Evaluation. Carnegie Mellon University, Technical Note
CMU/SEI-2006-TN-026, September 2006.

[4] Becker, C. and Rauber, A. (2010). Improving component selection and
monitoring with controlled experimentation and automated
measurements. Information and Software Technology, 52(6): 641-655.

[5] Becker, C., Kraxner, M. and Plangg, M. (2013). Improving decision
support for software component selection through systematic
cross-referencing and analysis of multiple decision criteria. Vienna
University of Technology. 2013 46th Hawaii International Conference
on System Sciences.

[6] Bhargava, H. and Power, D. (2015). Decision support systems and web
technologies: A status report.

[7] Carvallo, J. P., Franch, X. and Quer, C. (2007). Determining criteria
for selecting software components: Lessons learned. IEEE Software,
24(3): 84-94.

[8] International Standards Organisation (ISO) (2007). Software
engineering – Software product Quality Requirements and Evaluation
(SQuaRE) – Measurement reference model and guide (ISO/IEC
25020:2007).
APPENDIX A
SOURCE CODE
' Main menu form: menu items open the other forms.
Private Sub MNUQUI_Click(Index As Integer)
    End                         ' Quit the application
End Sub

Private Sub MNUREP_Click(Index As Integer)
    Form4.Show                  ' Software evaluation report form
End Sub

Private Sub MNUSOFTEVAL_Click(Index As Integer)
    Form3.Show                  ' Software evaluation form
End Sub

Private Sub MNUSOFTREG_Click(Index As Integer)
    Form2.Show                  ' Software registration form
End Sub

' Software registration form: NEW, SAVE and CLOSE buttons.
Private Sub Command1_Click()
    Adodc1.Recordset.AddNew     ' NEW: start a blank registration record
End Sub

Private Sub Command2_Click()
    Adodc1.Recordset.Update     ' SAVE: commit the record to the database
    MsgBox "Saved"
End Sub

Private Sub Command3_Click()
    Unload Me                   ' CLOSE the form
End Sub
' Software evaluation form.
Private Sub Command1_Click()
    Text7.Text = Combo1.Text    ' Copy the selected software name
    Adodc3.Recordset.Update     ' SAVE the evaluation record
    MsgBox "SAVED"
End Sub

Private Sub Command2_Click()
    Unload Me                   ' CLOSE the form
End Sub

Private Sub Command3_Click()
    Adodc3.Recordset.AddNew     ' NEW evaluation record
    Text1.Text = Date$          ' Stamp today's date
End Sub

Private Sub Command4_Click()
    On Error GoTo AB
    Me.PrintForm                ' PRINT the form
AB:
End Sub

Private Sub Form_Load()
    ' Fill the software-name combo box from the registration table.
    On Error GoTo AB
    Adodc1.Refresh
    Do
        Combo1.AddItem Adodc1.Recordset.Fields("NAME_OF_SOFTWARE")
        Adodc1.Recordset.MoveNext
    Loop Until Adodc1.Recordset.EOF = True
AB:
End Sub

Private Sub Text2_Click()
    ' TOTAL: sum of the eight attribute scores.
    On Error GoTo AB
    Text2.Text = CDbl(Combo3.Text) + CDbl(Combo4.Text) + _
        CDbl(Combo5.Text) + CDbl(Combo6.Text) + CDbl(Combo7.Text) + _
        CDbl(Combo8.Text) + CDbl(Combo9.Text) + CDbl(Combo10.Text)
    Exit Sub
AB:
    MsgBox "ERROR, CHECK INPUTS"
End Sub

Private Sub Text3_Click()
    ' AVERAGE: total divided by the eight attributes.
    On Error GoTo AB
    AVG1 = CDbl(Text2.Text) / 8
    Text3.Text = AVG1
    Exit Sub
AB:
    MsgBox "ERROR, CHECK INPUTS"
End Sub

APPENDIX B
OUTPUT

Fig Appendix B.1: Snapshot of main menu

Fig Appendix B.2: Software Registration snapshot

Fig Appendix B.3: Software Evaluation snapshot

Fig Appendix B.4:Software evaluation report snapshot

DESIGN AND IMPLEMENTATION OF DECISION SUPPORT SYSTEM
FOR SOFTWARE EVALUATION
(A CASE STUDY OF DIGITAL CENTER, AKWA IBOM STATE
POLYTECHNIC)

BY

JOHN, OFONIME DAVID
AKP/ASC/CSC/HND2013/0309

SUBMITTED TO
THE DEPARTMENT OF COMPUTER SCIENCE, AKWA IBOM
STATE POLYTECHNIC, IKOT OSURUA, IKOT EKPENE, AKWA
IBOM STATE

IN PARTIAL FULFILLMENT FOR THE AWARD OF HIGHER
NATIONAL DIPLOMA (HND) IN COMPUTER SCIENCE

NOVEMBER, 2015

CERTIFICATION
I hereby declare that the work presented herein was done by me, and not by
a third party. Should I be convicted of having cheated in this work, I shall
accept the verdict of Akwa Ibom State Polytechnic, Ikot Osurua, Ikot
Ekpene.

JOHN, OFONIME DAVID …………………………


(AKP/ASC/CSC/HND2013/0309) SIGNATURE/DATE

MRS. MFREKE UMOH …………………………


SUPERVISOR SIGNATURE/DATE

MR. OBONGUKO U. A. …………………………


HEAD OF DEPARTMENT SIGNATURE/DATE

……………………………. …………………………
EXTERNAL SUPERVISOR SIGNATURE/DATE

APPROVAL
This project report is approved for submission.

……………………………………….
MRS. MFREKE UMOH

DEDICATION
This project is dedicated to the almighty God, the Alpha and Omega who is
the source of all my knowledge and to my beloved parents Mr. and Mrs.
David John Udofia who taught me the steps of education.

ACKNOWLEDGEMENT
I offer my gratitude to God Almighty for His mercy, for allowing me to
achieve my dreams, and for giving me the needed guidance and protection
throughout my studies at Akwa Ibom State Polytechnic.

I am immensely grateful to Mrs. Mfreke Umoh, my project supervisor, who
labored through each of my many drafts and provided very constructive
comments, direction and encouragement.

My sincere thanks go to my beloved mother, Deputy Deaconess General
(DDG) Akon David Udofia, who provided me with moral and financial
support, prayers, love and care; my beloved father, Mr. Emmanuel David
Udofia, for encouraging me despite all the difficulties I encountered; and
most of all my sponsors, Mrs. Mma Udofia (DDG), my senior sisters
Mercy, Iniobong Etuk and Mrs. Victoria Ubong, my senior brothers
Emmanuel David, Uduak David and Ubon David, my in-law Ubong
Asuquo, and Dr. Ikobong David, who assisted me in one way or the other
during this challenging period.

I also wish to recognize my Head of Department, Mr. U. A. Obonguko, and
all my lecturers and other staff of the department who guided and advised
me during my study at Akwa Ibom State Polytechnic.

My immense thanks go to Dr. Ikobong, who spent time encouraging me in
every vital area.

Finally, my special thanks go to all my bosom friends and colleagues: Engr.
Isaac Uko, Akaninyene, Felicia, Aniekan and those too numerous to
mention.

I express my gratitude to the Almighty God for giving me strength, wisdom
and protection to complete this work; may His name be glorified forever
and ever, Amen.

JOHN OFONIME DAVID

ABSTRACT
This research work focused on a decision support system for software
evaluation. The concept of software evaluation evolved from the need to
select effective software from among the numerous packages available
today. Much of the software on the market does not really solve the
problems it was designed for very well, and this may bring about losses for
the user of the system. It is the need to select the best software that
necessitated this study: choosing the wrong software package can turn out
to be costly and adversely affect business processes. It is therefore
imperative to adopt software evaluation techniques or methodologies to
grade and select the best software based on defined evaluation criteria. Use
case modeling was adopted to represent the different components of the
system. The software development methodology used is the spiral model,
and the programming language used is Visual Basic 6.0.

TABLE OF CONTENTS
Title Page - - - - - - - - i
Certification- - - - - - - - - ii
Approval Page - - - - - - - iii
Dedication - - - - - - - - iv
Acknowledgment - - - - - - - - v-vi
Abstract - - - - - - - - vii
Table of Contents - - - - - - - - viii-x
List of Figures -- - - - - - - - xi
List of Tables - - - - - - - xii

CHAPTER ONE: INTRODUCTION


1.0 Introduction - - - - - - - 1
1.1 Theoretical Background - - - - - - 1-3
1.2 Statement of Problem - - - - - 3-4
1.3 Aim and Objectives of the Study - - - 4
1.4 Significance of the Study - - - - - - 4-5
1.5 Scope of the Study - - - - - - - 5
1.6 Organization of the Research - - - - - 5-6

1.7 Definition of Terms - - - - - - 6

CHAPTER TWO: LITERATURE REVIEW


2.0 Introduction - - - - - - - 7

2.1 Software Selection Process - - - - 7-8

2.2 Software Evaluation Criteria - - - - - 8-10

2.3 Belyk’s and Feist’s Software Evaluation Criteria - - 10-11
2.4 Weighting Product Selection Factors - - - - 11-12
2.4.1 Matrix Weighted Evaluation of the Product - - - 12-13
2.5 Multi-Criteria Decision Support System (MCDSS)
for Software Component Selection - - - - 13-14
2.6 Decision Support Systems - - - - 14-17
2.7 Software Evaluation Attributes - - - - 18-21

CHAPTER THREE: SYSTEM ANALYSIS AND DESIGN


3.0 Introduction - - - - - - - - 22
3.1 Research Methodology - - - - - 22
3.2 System Analysis - - - - - - - 22
3.2.1 Analysis of the Existing System - - - - 22
3.2.2 Problems of the Existing System - - - 23
3.2.3 Analysis of the Proposed System- -- - - 23
3.2.3.1 Advantages of the Proposed System - - 23
3.2.3.2 Disadvantages of the Proposed System - 23
3.3 System Design - - - - - - - 23-24
3.3.1 Input Layout - - - - - - 25-26
3.3.2 Algorithm - - - -- - - 26-27
3.3.3 Program Flowchart - - - - - 28-32
3.3.4 Output Layout - - - - - - 33

CHAPTER FOUR: SYSTEM IMPLEMENTATION AND
DOCUMENTATION
4.0 Introduction - - - - - - - 34
4.1 System Design Diagram - - - - - - 34
4.2 Choice of Programming Language - - - 34
4.3 Analysis of Modules - - - - - 35
4.4 Programming Environment - - - - - 35
4.4.1 Hardware Requirements - - - - - 35
4.4.2 Software Requirements - - - - - 35-36
4.5 Implementation - - - - - - 36
4.6 Software Testing - - - - - - 36

CHAPTER FIVE: SUMMARY, RECOMMENDATION AND


CONCLUSION
5.0 Introduction - - - - - - - - 37
5.1 Summary - - - - - - 37
5.2 Constraints of the Study - - - - - - 37
5.3 Conclusion - - - - - - 38
5.4 Recommendation - - - - - - 38
References - - - - - - 39-40
Appendix (A) - - - - - - 41-42
Appendix (B) - - - - - - 43-46

LIST OF FIGURES

Fig 3.1: Use Case model - - - - - - 24


Fig 3.2 Software Registration input layout - - - - 25
Fig 3.3 Software Evaluation input layout - - - 26

Fig 3.4: Login - - - - - - 28

Fig 3.5: Main Menu - - - - - - 29

Fig 3.6: Software Registration - - - - 30

Fig 3.7: Software Evaluation - - - - 31

Fig 3.8: Report - - - - - 32

Figure 4.1: System Design Block Diagram - - - 34

LIST OF TABLES

Table 2.1: Criteria Related To Vendor - - - - - 18-19


Table 2.2: Criteria Related to Hardware and Software - 19-20
Table 2.3: Criteria Related to Cost and Benefits - - - 20
