Online Survey System: A Web-Based Tool for Creating and Administering Student Evaluations Online
CSU ePress
6-2006
Recommended Citation
Idowu, Fatima Marie, "Online Survey System: A Web-Based Tool for Creating and Administering Student
Evaluations Online" (2006). Theses and Dissertations. 71.
https://csuepress.columbusstate.edu/theses_dissertations/71
Columbus State University

Online Survey System: A Web-Based Tool for Creating and Administering Student Evaluations Online

A Thesis in Computer Science

by

Fatima Marie Idowu

Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science

June 2006
I have submitted this thesis in partial fulfillment of the requirements for the degree of Master of Science.

Date    Christopher Whitehead, Assistant Professor of Computer Science, Thesis Advisor

Date    Eugen Ionascu, Associate Professor of Mathematics
Abstract
With the advancement in technology over the years, the administering of online surveys has
expanded. In particular, universities are using the online medium to administer surveys to
students, in order to evaluate faculty performances. The move of surveys to the online realm
has meant a reduction in the cost, time, and effort required of survey administrators, and an increase in the use of technology within universities. With online surveys, however, the challenges of confidentiality, anonymity, and response rates remain as prominent as they are with paper-based surveys.
This study researched the use of online surveys in education, detailing systems currently used by universities, in order to facilitate the creation of an Online Survey System that addresses these challenges. The principal idea was to create a system that would provide the
Columbus State University Computer Science department with a web-based tool for creating
surveys and administering them online. Surveys are created by faculty members, acting as Survey Administrators, and administered to students, the Survey Takers. With the system,
faculty members are able to create surveys for classes taught, providing questions and
responses deemed suitable. Once a survey is administered, students can access the survey by
logging onto the system. Upon authentication students are able to complete surveys online.
Surveys are generated dynamically, depending on the survey criteria supplied by the survey
administrator. The system stores survey criteria in a database, which are retrieved when a survey is taken. The system developed focused on providing functionality that would increase the response rate while addressing the challenges of confidentiality and anonymity. The created system was successful in its aim to provide these features, and it also makes it easy for faculty to create online surveys without the extensive technical expertise that would otherwise be required.
Table of Contents
Abstract iii
Table of Contents v
List of Figures viii
List of Tables ix
Acknowledgements x
1. Introduction 1
List of Figures

List of Tables

Acknowledgements
Firstly I would like to give glory and praise to my Heavenly Father for giving me the peace,
wisdom and strength I needed to complete this thesis. Thank you, Lord.
I want to express my deepest gratitude to Professor Whitehead for being so patient with me
and always being available when I needed help throughout the course of this project. For this I am truly grateful.
I also want to thank the committee members, Bhagyavati, Eugen Ionascu, and Paulina
Kuforiji, for their comments and suggestions throughout the process of documenting this thesis.
Finally I want to thank my parents and sisters for their love, support and prayers for me
during the course of this project. Thank you to Andrew Smith for guidance, suggestions and
constructive criticism of my work, and Amir Fynton for encouraging and constantly reassuring me.
1. Introduction

For many decades, surveys have been used for research and as a means of obtaining
feedback from consumers. In fact, surveys are probably the most commonly used research
method world-wide [Pfleeger & Kitchenham, 2001]. Surveys can be administered in two
different forms: supervised and unsupervised. The means by which a survey is administered
is dependent upon the administrator's objectives and the resources available. Supervised
surveys are those where a survey researcher is assigned a respondent, or survey taker; for
example, telephone surveys require a researcher to ask the respondent a series of questions
and then to record the answers. Unsupervised surveys take the form of automated voice-response systems or self-administered questionnaires, such as mailed or online surveys.
In light of the advancement in technology over the years, the medium used to administer
surveys has expanded to the online realm. Companies have been able to administer their
surveys online in order to obtain feedback on their products and services. Due to the nature
of the Internet, respondents' answers are recorded, totaled, and ready for analysis
immediately, thus eliminating the need for the tallying of surveys' results by hand; thereby
saving time and money. In the past few years, this trend has moved into the education
system.
Universities administer surveys to students in order to obtain feedback on the units offered at the institution. Over the years, the use of communication technology has
grown extensively in university teaching and learning. Many universities have expanded to
online curriculum and with this expansion have adopted online surveys as a method of
collecting student feedback. However, the use of online surveys for feedback on the
university's curriculum and performance of its professors is not limited to those taking
online classes. Universities are adopting online surveys as a mechanism to replace "pen and
paper" surveys across the board; online and face-to-face classes alike. Research done at
Murdoch University found that the growth in the use of online surveys developed from the
desire to align the surveys with the use of technology for classes, "to increase access to
external students, and to improve the efficiency of the survey process." [Cummings &
Ballantyne, 2000]
While online surveys are used in numerous universities all around the world, there is little research literature investigating their use in education [Cummings & Ballantyne, 2000]. Few
researchers have investigated the use and creation of online surveys for the educational field
[Cummings & Ballantyne, 2000; Pfleeger & Kitchenham, 2002b; Ha & Mars, 1998].
The University of Leicester has a research project entitled 'Exploring Online Research
Methods.' The research resulted in the development of a website that aims to provide an
"online resource which provides training for researchers who are interested in using online
research methods such as online questionnaires and online interviews." The website targets
researchers and postgraduates in higher education, and "researchers working for other
organizations such as those involved with public policy and market research," and is a great
resource providing online methodologies to those wishing to explore the online research
realm.
2. Purpose of the System / Motivation / Related Work
The common practice at Columbus State University (CSU) is to administer paper-based end
of unit surveys. This allows departments to offer students the opportunity to evaluate their
classes and the faculty members who conducted these classes on the basis of their teaching
skills and the way the class or module was delivered. The main objectives of these
evaluations are to assess the instructor's performance, gain insight into student attitudes
about course content and assignments, and gauge student satisfaction with quizzes, exams,
and the course in general. Evaluations are also used to assess changes in instructional
practices or new courses; specifically when this practice is used or the course is offered as
part of an experiment or a trial. The results of these surveys are generally used to determine
whether a class will be offered again, if the class was offered on a trial basis, or whether a new instructional practice will be retained.
The Department of Computer Science at CSU currently uses both paper-based and online
evaluations. Some faculty members prefer the use of paper-based surveys while others prefer
their online counterparts. However, these ultimately contain the same information. The
Department of Computer Science has a designated individual who creates the online surveys
for the department, and faculty are then responsible for pointing students to the appropriate survey.

The online surveys currently administered by the Department of Computer Science are static in nature, meaning that the questions are predefined and each survey contains the same questions; these questions are the same as those used in the paper-based surveys. There is no
opportunity provided for faculty to pose their own questions regarding classes that they
teach. This is somewhat unfortunate as a faculty member teaching a class will more than
likely have questions specific to the class that he/she would like to ask the students.
The intent of this project, therefore, is to produce a system that is aimed at providing CSU
Computer Science faculty members with a tool which can be used to develop online
evaluation surveys for the courses taught and to administer these surveys to students when
they deem fit; most likely near the end of the semester or half semester, for those classes
offered for only half a semester. The system will allow faculty to supply questions that they
feel are relevant to their course rather than administering a static survey which is used college
wide. The system will be web based allowing faculty to create surveys when they wish and to
make them available to students. This system will be beneficial for both students and faculty,
as students will have the opportunity to take surveys in the convenience of their own time. It
is anticipated that this convenience will allow students to think about their responses; and
thus increase the quality of their responses. Feedback from students revealed that
completing paper-based surveys in a classroom setting leads to the selection of responses that
would not normally be chosen. The reason for this is that students feel rushed to complete
surveys in the time given. Whilst this notion is plausible, it has not been validated by
research.
The system currently in place has no way of restricting students from completing a class
survey more than once or even restricting students who did not take a particular class to
have access to complete the survey. The proposed system will utilize user authentication in order to restrict users from completing a survey more than once and to make surveys available only to those for whom they are intended. An online survey system typically involves three types of users [Gaide, 2005]. These are system administrators, survey administrators and students. A survey administrator is responsible for the creation of online surveys. In the proposed system the faculty members will take on the role of survey administrators and students will take the role of survey takers. The system will have a system administrator who will be responsible for maintaining the system and managing user groups.
The success of student evaluations online is wholly dependent upon the faculty's support. It
is the faculty's responsibility to make students aware of the surveys they need to complete.
Faculty are, however, in a position where they can either undermine the survey by lack of
attention and despondent comments or promote the survey through supportive comments,
reminders and providing feedback on how past survey results were used to improve the unit
[Cummings & Ballantyne, 2000]. It is, therefore, anticipated that the system will encourage
faculty to promote surveys as it will be surveys that they created themselves and feedback
from those surveys will be more directed towards information they require about the
particular class rather than simply generic feedback about the teaching process.
Response rate is a major issue when administering surveys. Any reliable survey system
should measure and report its response rate and responses. Using the system presented in
this thesis, faculty will be able to view statistical results of completed surveys online, allowing
them to graphically see responses to the surveys instantly instead of having to wait for all
surveys to be taken and then tallied; as is the case with paper-based surveys.
The proposed system therefore aims to improve the process of administering surveys by
using the Internet as a medium and making the process more convenient for faculty and
students alike.
Two common types of surveys are paper-based and telephone surveys. However, with the
recent move made by higher education institutions to make their surveys available online, the
question arises as to the effectiveness of online surveys versus paper-based surveys. It has
been argued that a comparison of responses between online and paper surveys shows no significant difference.
There are various advantages when using an online surveying system for those involved in
the process. The common advantage to all involved is the flexibility and the "potential to improve the efficiency of the survey process." Alternatively, low response rates and technical problems are commonly cited disadvantages.
The creation of surveys and the surveying process highlight many challenges. These challenges range from confidentiality and anonymity to response rates, and they are discussed in the sections that follow.
The observed challenges that face an online surveying system are those of computer access
and literacy, security and confidentiality and response abuse, such as multiple submissions.
In asking the question of whether student feedback should be moved completely online, four factors must be considered. These are the "support of staff members involved, level of student access to information technology, the lowest level of computer literacy in the student target group and the level of" promotion the surveys receive. If surveys are moved online, they must be promoted by staff or students will be unaware of their availability. Staff
members may be reluctant to promote evaluation surveys because they do not want to
change their teaching style or be critiqued by students. This also holds true with paper-based
surveys.
The access to hardware is a limitation to consider when thinking about online surveys. The
level of access that students across a campus or class have may vary considerably. The
physical access to appropriate hardware and software for students is not the only factor to be
considered when talking about access to information technology, but also access where and
when it is convenient for the student. With regard to access to hardware, class time could be
provided where students have access to the appropriate hardware and software needed to
complete a survey. Paper-based surveys are usually administered during class time.
Nevertheless, with both online and paper surveys, the issue of convenience is still at hand.
Are a couple of minutes before the end of class necessarily a convenient time for a student
to complete a survey? In my own experience as a student, and in that of many students with whom this issue has been discussed, one would prefer to complete a course evaluation in his or her own time, as the student tends to feel rushed to complete
the survey within the few minutes at the end of a class or within the time frame given. The
given responses therefore may not necessarily be the same as those students would have
given if given the opportunity to complete surveys in their own time. This is especially true
when one is required to give comments, as a student will not have time to ponder the comments he or she wishes to make.
The level of computer literacy among students and also among staff is another factor when
considering transferring surveys online. Unless their particular area of expertise requires them to deal with information technology regularly, staff members, especially those less familiar or comfortable with using technology, will have more problems administering online
surveys. The level of competence with computers is not necessarily evenly distributed among
the student population and also the staff population. However, many high school graduates
are fairly familiar with computer technology, as nowadays it is a requirement for high school
students to take computer literacy classes. There are, however, some parts of the world where such classes may not be required, depending on the field the student wants to embark upon.
If CSU is to administer all its surveys online, faculty need to be comfortable with using an online system, and students need both the access required to complete the survey and a willingness to do so. With regards to staff comfort in
administering online surveys, it is not necessary that all staff members become survey
administrators. Those comfortable with the task can be given the responsibility of
administering surveys for themselves and other staff members. With reference to student
access to technology, most university campuses are equipped with computers available for
student use; the only issue here is the convenience to the student.
As the primary focus for this system is the Department of Computer Science, computer literacy among staff members should not be a concern. With students there is a chance that computer literacy may be low, but as they are completing a computer science degree this level will increase as they progress, and there are tutors available to assist those who need help.
When considering taking surveys online, there must be a means of informing the
respondents, students in this case, that they are required to take the survey. If a student has
the liberty to complete a survey in their own time, a mechanism should be in place to inform
the student of the requirement to complete the survey; this should be something other than
the instruction from the teacher. With existing systems, students are usually notified via
school email.
An effective survey system for student feedback should be password protected; there should
be some way of authenticating users to ensure that the intended survey taker is the one
actually taking the survey. With paper surveys, particularly those that are mailed and in the
case of students, those which students are given the opportunity to take away with them and
return at a later date, it is somewhat difficult to ensure that the correct person has completed
the survey. A student might have a friend or a sibling complete a survey for them. With authentication in place, the majority of students, being aware of security issues, will be reluctant to share their login credentials with others. A password system will therefore help ensure that the intended survey taker is the one completing the survey.
With paper-based surveys administered in class, instructors usually tell students to drop the
completed surveys off when the students are leaving or return the completed surveys to the
faculty member's mailbox at a later date. These surveys are somewhat anonymous, so there
is no way to remind those who have yet to submit them to do so. With an online system
such as the one proposed, there will be a means of identifying those students who have yet
to complete a survey for the purpose of reminding them to do so. The issue of confidentiality and anonymity will arise, however, if survey administrators have access to the identities of students who have yet to or did not complete their surveys. For this reason the proposed system will protect anonymity by having a mechanism to track those who have not completed surveys solely for the purpose of sending reminders. This information will be encapsulated so that the instructor or persons administering the survey have no information as to who reminders are sent to. The system will also have a mechanism of ensuring that each student can complete a given survey only once.

Online surveys are particularly beneficial for those taking online classes. Automated data collection is another advantage
which as a result reduces researcher time and effort. Online surveys save time by allowing
large volumes of data to be collected for the given survey continuously and imported into
statistical tools and databases, increasing the speed and accuracy of analysis. In the case of
the Department of Computer Science, the resource which is normally used to collect and
analyze the data for surveys will no longer be required, as survey responses will be
automatically stored on a database system. As a result, there will be a reduction in the cost of
administering surveys, "from less staff time required to handle forms and enter data"
[Cummings & Ballantyne, 2000] to savings on the cost of printed forms. Data can also be automatically validated for online surveys; that is, the system can return error messages requesting the correct format of data entry, resulting in fewer data entry errors.
Disadvantages of online surveys include uncertainties over the validity of the data and
sampling issues. Here validity refers to the accuracy of the specific conclusions and
inferences drawn from non-experimental data [Gunn]. For online researchers, sampling is an
issue as there is no access to a central registry or database where an accurate sampling frame
can be gathered, neither is there any way of discerning how many users are logging on from
a particular machine. For the proposed system, this is not an issue of concern, as the sample
will be the students registered for a particular class or those that meet the criteria set out by the survey administrator. A further disadvantage is that designing an online survey requires a certain level of technical expertise. In addition, the time taken to prepare an
online questionnaire can be substantial and may outweigh some of the time savings noted in
the advantages. With the proposed system however, a substantial part of the designing will
be automated. The only concern for the survey administrator will be to provide the survey
questions and administer it to students by sending them email notifications and reminders.
Another disadvantage to consider is that online surveys may need to be shorter than paper-
based surveys. Response rates for online surveys drop after 10-15 questions and are directly affected by the length of the questionnaire. Technical issues can also occur with online surveys. A server or computer can crash. There are technical variances in computers, monitors, browsers and Internet connections, all of which can affect how a survey is displayed and completed.
The construction of a survey goes far beyond the development of a questionnaire and asking
the intended audience to complete the survey. A survey should be seen as more than just an instrument for collecting data. The "survey instrument" is part of a larger survey process defined by ten activities, which include, among others:

7. Selecting participants

10. Reporting the results, statistical analysis and inference of survey results
It is necessary that a survey be designed to provide the most effective means of obtaining
information needed to address the objectives of the survey [Kitchenham and Pfleeger,
2002b]. For a survey to provide the most effective means, it should be designed in a way that
it will not be swayed by a particular faction, aspect or opinion. The survey should make
sense in the context of the population, and the administration and analysis should be within
Those conducting surveys often have some idea of what they are seeking. As a result, the
way they build the survey instrument can inadvertently reveal their biases [Kitchenham and Pfleeger, 2002b].
To avoid bias, survey construction must be done in a way that questions are neutral, the use
of words should not influence the respondent's thoughts, enough questions should be asked
to adequately cover the survey topic, attention should be paid to the order of questions (so
that answers to one do not influence responses to the next), provision should be made for
exhaustive, unbiased, mutually exclusive response categories, and instructions should be clear
and unbiased.
When constructing a survey, care must be taken as to how the questions are formulated and
structured. Questions must be formulated in a way that respondents can answer easily, and there should be a clear relationship between the intention of each question and the survey objectives [Kitchenham & Pfleeger, 2002b]. That is, the purpose of the question should be clear, or the question is of little value. It should also be noted that the number of questions that can be realistically asked in a survey depends on the
amount of time respondents are willing to spend completing it [Kitchenham & Pfleeger,
2002c].
The time it takes to complete a survey can be dramatically reduced by having standardized
answer formats, for example "strongly agree, agree, disagree, strongly disagree" [Kitchenham
& Pfleeger, 2002c]. Standardized answer formats save time, as the respondent can anticipate
that the same choices are available for each question and do not have to take the time to
read new answer choices for every question within the survey. Questions in a survey can be
either open or closed. Open questions avoid imposing restrictions on the respondent, allowing him or her to give free-form answers. Open questions are, however, difficult to code and analyze [Kitchenham & Pfleeger, 2002c].
Closed questions restrict responses but are easier to analyze. However, on the subject of
standardized responses for closed questions, each question may require responses that may
not be necessarily aligned with the standard answer set, whatever it may be.
Questions can usually be grouped into topics where each topic addresses specific objectives for the survey. It is important not to have too many questions on a survey as this can make the respondent lose interest [Kitchenham & Pfleeger, 2002c]. Kitchenham and Pfleeger further suggest identifying a topic that is addressed by many questions and removing some of the less vital ones as a way of reducing the number of questions. However, a balance should be maintained between what one wants to achieve with the survey and the willingness of the respondent to complete a longer survey.
Some researchers oppose the view that there is little difference between the response rates of
online and paper-based surveys [Nulty, 2000]. Response rate is a major issue when
administering surveys. Any reliable survey should measure and report its response rate. Cummings and Ballantyne [2000] found that when they established an online system at Murdoch University, the response rate from students was lower than that of paper-based
surveys administered at the university. Despite strategies put in place to make the system
effective, overall response rates for each semester were 30% in comparison with 65% achieved by the previous paper-based surveys.
Many factors contribute to low response rates. For example, research suggests that people
do not like to participate in surveys unless they feel it is beneficial to them in some way
[Kitchenham & Pfleeger, 2001]. Incentives are usually introduced to increase response rates.
Cummings and Ballantyne used a "cash prize draw as an incentive for students who
completed surveys for all their units online." This improved the response rate from 30% to
54%.
Due to the "nature of online survey process, response rates need particular attention"
[Cummings & Ballantyne, 2000]. Cummings and Ballantyne demonstrate that there are a
"number of useful strategies to improve response rates." These include but are not limited to
encouraging staff to promote the survey, discussing feedback and any consequential course
changes with students, and offering incentives, particularly cash. These methods have been shown to improve response rates.

The following steps were highlighted by Kitchenham and Pfleeger [2002b] to improve response rates:

1. Ensure that respondents are able to answer the survey questions (questions should be relevant to the respondent and not impertinent).

3. Ensure that respondents are motivated to answer the questions (show clearly that the results will be of use to them).
Kitchenham and Pfleeger [2001] also noted that people are more motivated to complete
surveys if they can see that the results are going to be useful to them within the educational setting, for example for a class they have taken. This can be done by informing them of changes or improvements that have resulted from previous survey feedback.
A few systems already in existence implement the same basic goals as the system that will be produced as a result of this thesis. The first is the Web-Online Feedback (WOLF) system
[Nulty, 2000]. The research aimed to "overcome the common problem with student
evaluation of teaching" [Nulty, 2000]. The research found that "qualitative components of
student evaluations (in particular student comments) were not being made available to
teaching staff concerned until sometime after the teaching has occurred." The system
developed offered a user-friendly, web-based interface that allowed users to generate web-
based questionnaires. Access to the system was 24 hours a day. The main goals for the
system were:
2. Improve on the range of different ways that people have to obtain feedback on their teaching.

3. Support the gradually changing ethos surrounding evaluation of teaching and units, from one-off sporadic evaluations to one in which evaluation is seen as an integral part of the teaching process.

5. Ability to review pre-existing questions available in item banks (for paper and pencil questions) to ensure that the issues explored and questions posed are adequately represented.
It was found that academic staff opposed questionnaires which made use of a "fixed bank of
optional items or consisted of a fixed set of compulsory items limiting them from
conducting meaningful evaluations in relation to their context" [Nulty, 2000]. Hence, as part
of the fourth goal of the system, WOLF allows people to specify their own questions.
The system was not used as widely as expected, so the overall success rate was low. However, there was success with the response rate of questionnaires created on the system by individual lecturers [Nulty, 2000].
The second system was a pilot program for an evaluation system implemented by a university in Maryland. Instead of building a system from the ground up, the researchers made use of the features available in WebCT [Denman, Robinson & White, 2004]. The feasibility of moving evaluations online was first assessed within the Physics department. The paper-based forms were mimicked within a web interface to a
database. The success of the pilot was measured by the amount saved on paper and scanning
cost (scan of paper forms). The pilot also eliminated sources of potential error (damaged or
misplaced scan sheets). This also reduced the amount of time required to produce the final
reports.
The University made use of the survey tool available on WebCT. The tool offered a
confidential electronic means of collecting evaluation data from students on courses at the
university. This information could then be transferred to the statistics lab for processing. It
was the university's desire to have a system that would send introductory emails and
reminder emails to students. The frequency of the reminders would be controlled by each
college. Staff desired a mechanism where a list was provided at the end of the evaluation
period of students who had completed evaluations; this would then be used to allocate extra credit.

The system was set up in a way that one WebCT "course" was created per department; this
was known as the evaluation space, and one 'survey' (a tool provided by WebCT) was
created for each university course. Department Representatives (DRs) were set up, assigned
and given the responsibility for very basic troubleshooting [Denman, Robinson & White,
2004]. DRs also had a tool available to them which allowed them to receive student response
rates upon request for each evaluation and overall for the department. Students had access
to surveys for the courses they were enrolled in for the semester. Access to surveys was
given to students based on criteria of the course number and section of the course(s)
enrolled in, in a particular department. Dates were also set for the release of the surveys, with the evaluation period ending on the last day of the term [Denman, Robinson & White, 2004]. Icons that linked to
surveys were provided on the WebCT homepage. Students were notified of surveys via
emails which were sent from a web site set up for DRs. Reminder messages were placed in
queues with information regarding the beginning and end dates and frequency set by the
DRs. Reminders were processed once a day and any messages which matched the criteria for
the specific day were sent to students who had not yet responded to all their surveys in the
different departments.
Denman, Robinson & White [2004] focused on student response rate as the dominant
measure the college used to determine the success in the transfer from paper to online
course evaluation. The response rate for the student population for the summer, fall, and
spring semesters were 38%, 44%, and 31% respectively; the research did not provide response rates for the previous paper-based surveys administered. The disparity of the response
rates in the spring as well as the lower percentages may have been related to the fact that
reminder emails sent during the fall semester were limited and dependency was placed on
introductory emails and a mid-way email. Certain departments had higher response rates due
to incentives such as extra credit and a $50 gift certificate being offered.
The last system reviewed was developed as "part of a joint venture between Hong Kong
University of Science and Technology, University of Hong Kong and Hong Kong
Polytechnic University" [Ha & Mars, 1998]. The project saw the development and
implementation of two web-based systems, COSSET and OSTEI. These aimed to support
"student evaluation of teaching in local higher education institutes" [Ha & Mars, 1998].
COSSET is a centrally controlled system designed for collecting and processing data for student evaluation of teaching, while OSTEI is a self-service system allowing instructors to construct their own questionnaires, gather student responses and
view the evaluation results online. During the project, the team also focused attention on
"evaluating the viability of the web as a valid and reliable medium for student evaluation of
OSTEI allows instructors to access its web site and set the necessary configurations to
conduct a questionnaire survey on the web. Once an instructor creates a questionnaire, with
the help of a question bank, students are able to access the OSTEI student site and complete
questionnaires.
The OSTEI system uses a "registration system to control access by instructors" [Ha & Mars, 1998]. Each questionnaire created is assigned an ID, and the ID along with the instructor's username uniquely identifies the questionnaire. For the system to retrieve the correct questionnaire from the database, a student must supply the questionnaire ID and the instructor's username. Because survey takers themselves are not authenticated, nothing prevents a student from completing a given questionnaire more than once. "It is also possible for anyone aware of a questionnaire ID and instructor username to complete a questionnaire" [Ha & Mars, 1998], even if it was not intended for them to complete it. For this reason, OSTEI cannot restrict a survey to its intended respondents or prevent multiple submissions.
3. Requirements Elicitation and Analysis

3.1 Introduction
This chapter describes the system in terms of its functional and non-functional
requirements. It also presents the functional model of the system. The functional model represents the use cases, which elaborate on the requirements of the system by describing the functionality of the system from an actor's point of view. An actor is an external entity
that needs to exchange information with the system. An actor can represent either a user role
or another system. The analysis object model is presented in this chapter and is represented
by class diagrams for the proposed system. The diagrams describe the entities manipulated
by the system.
This chapter begins by introducing the scope of the system and goes on to discuss the
objectives and success criteria for the proposed system. The proposed system is then
detailed along with the functional and non-functional requirements. The functional model and the analysis object model are then presented.
The Online Survey System is an application that is aimed at improving the convenience of
the student evaluation process, both for students and for faculty. It aims to alleviate the
time and cost incurred when collecting data from course evaluations and to provide staff
with the opportunity of constructing evaluations themselves. The system will be a web-based
application which will be hosted on the CSU studentwebs server. The system will allow
faculty, survey administrators, to create evaluation surveys by specifying the questions they
want on the survey and the answer formats for those questions. These criteria will be stored in a database along with the intended audience selected by the survey administrator. Once the surveys
have been created, the survey administrator will need to indicate to the system that the
surveys should be sent to the intended audience. The system will provide a list of all surveys
that the survey administrator has created along with the title they provide for the survey, the
survey creation date, the survey expiration date, which will also be provided by the survey
administrator, and the date the survey was sent to the intended audience. On selecting a
created survey, the survey administrator will be able to choose to send the survey to the
intended audience, send a reminder email or update questions or answers on the survey
provided that a notification email has not been sent out. On receipt of a survey notification,
a survey taker can click the web link contained in the email. This will bring the survey taker
to the system login and on authentication the survey taker will be provided with a list of
surveys he or she needs to complete or has already completed. When a survey is selected,
the survey will be dynamically created and displayed on the survey taker's screen. Survey
takers will be able to view responses of surveys they have previously completed. A survey
administrator will not be able to access individual student responses but can access statistical
information on the overall responses for each survey question on selection of a survey.
Online surveys provide a faster and more cost effective way of obtaining feedback from
audiences [Madge, 2006]. The objective of the application is to create a system that will allow
a certain group of users (survey administrators) to create surveys that other users can
complete online. Web-based surveys will be generated dynamically from the retrieval of
questions stored on a database. Once a survey taker selects a survey to complete, the
questions for the survey will be retrieved from the database and the survey will be rendered dynamically on the page (a minimal sketch of this generation follows the objectives below). The main objectives are to:

— Create a survey display page that displays "user specific" survey questions based on the survey selected

— Create a survey creation page that allows survey administrators to build surveys by providing questions and the user groups to whom the survey should be assigned

— Create a set of user groups, allowing users to fall into more than one group
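To make the dynamic generation concrete, the following is a minimal, hypothetical sketch of how a survey page could be built at run-time in ASP.NET C#. The table names (SurveyQuestion, Question, Answer), column names, and connection string are illustrative assumptions and not the actual schema used by the system.

// Hypothetical sketch of dynamic survey generation (not the system's actual code).
using System.Data.SqlClient;
using System.Web.UI;
using System.Web.UI.WebControls;

public static class SurveyRenderer
{
    // Reads the questions and answer options for one survey and renders each
    // question as a literal label followed by a radio-button list.
    public static void BuildSurvey(Panel surveyPanel, string connStr, int surveyId)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                "SELECT q.QuestionId, q.QuestionText, a.AnswerText " +
                "FROM SurveyQuestion sq " +
                "JOIN Question q ON q.QuestionId = sq.QuestionId " +
                "JOIN Answer a ON a.QuestionId = q.QuestionId " +
                "WHERE sq.SurveyId = @surveyId ORDER BY q.QuestionId", conn);
            cmd.Parameters.AddWithValue("@surveyId", surveyId);

            int currentQuestion = -1;
            RadioButtonList currentList = null;
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int questionId = reader.GetInt32(0);
                    if (questionId != currentQuestion)
                    {
                        // Start a new question block with its own answer list.
                        currentQuestion = questionId;
                        surveyPanel.Controls.Add(new LiteralControl(
                            "<p>" + reader.GetString(1) + "</p>"));
                        currentList = new RadioButtonList();
                        currentList.ID = "question_" + questionId;
                        surveyPanel.Controls.Add(currentList);
                    }
                    // Each row contributes one answer option to the current question.
                    currentList.Items.Add(reader.GetString(2));
                }
            }
        }
    }
}

In this sketch every question is rendered as a closed, radio-button question; open-ended questions would instead be rendered as text boxes, as discussed under answer formats in chapter 4.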
3.2.1 Overview
The system will be a web-based application that will allow students and staff members of
CSU to take surveys online. The proposed system has features from each of the systems
described in section 2.5. This system will be similar to the OSTEI, but will implement a few
additional features that will eliminate the limitations identified with the OSTEI system.
These additional features are user authentication, which will restrict survey takers from
completing a survey more than once and make surveys available to only those that they are
intended for, notification and reminder emails, which will give survey takers the notification and reminders they need of the availability of surveys, and the provision of statistical results of survey responses to survey administrators.
The following section provides an overview of the functional requirements of the system. Functional requirements deal with what the system should do or provide for the users. The functional requirements detail what facilities are required and what activities the system should carry out. In other words, functional requirements define the required functionality of the system.

1. Allow survey administrators to retrieve questions used in previous surveys, which are stored in a question pool on the database. The question pool should contain all questions used in previously created surveys.

2. The system should build surveys dynamically at run-time. That is, the survey is derived from questions stored in a database, together with the question response formats and answers selected by the survey administrator.

3. Survey administrators should be able to view statistical results of the responses to the survey.

4. Qualitative results should be made available for surveys before and after the cut-off date.

5. The system administrator should be able to add users to the survey administrator group. For example, if the department secretary wanted to create a survey, the system administrator would need to add him or her to the group of survey administrators.

6. Survey takers should be able to view surveys that they have previously completed.

7. Provide an Internet client/server application that will allow users to connect via a login page.

8. Inform users of any errors detected while using the system or writing to the database.

9. Survey administrators should be able to modify survey questions and answers before a survey notification email is sent out.
Users will be authenticated to avoid multiple submissions from the same user. The user
authentication will be the same as their Novell authentication. This will enable survey takers
to use the same authentication they use for logging onto the school network. Survey takers'
email addresses will also be retrieved from the information held on the Novell database.
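The thesis does not reproduce the authentication code, but because Novell eDirectory exposes an LDAP interface, credential checking and email lookup could plausibly be sketched as below. The directory server address, the distinguished-name structure, and the 'mail' attribute are placeholders, not the actual CSU configuration.

// Hedged sketch: validating campus credentials against a directory service over LDAP.
using System;
using System.DirectoryServices;

public static class DirectoryAuth
{
    public static bool Authenticate(string userId, string password, out string email)
    {
        email = null;
        try
        {
            // Attempt a bind as the user; an exception means bad credentials.
            string userDn = "cn=" + userId + ",ou=students,o=csu";   // placeholder DN
            using (DirectoryEntry entry = new DirectoryEntry(
                "LDAP://directory.example.edu", userDn, password,
                AuthenticationTypes.None))
            {
                object forceBind = entry.NativeObject;   // throws if the bind fails

                // Pull the user's email address for survey notifications.
                DirectorySearcher searcher = new DirectorySearcher(entry);
                searcher.Filter = "(cn=" + userId + ")";
                searcher.PropertiesToLoad.Add("mail");
                SearchResult result = searcher.FindOne();
                if (result != null && result.Properties["mail"].Count > 0)
                    email = result.Properties["mail"][0].ToString();
                return true;
            }
        }
        catch (Exception)
        {
            return false;   // bind failed: wrong user id or password
        }
    }
}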
Questions used in previous surveys should be made available for all survey administrators to reuse.

The system should be able to generate surveys dynamically, so that survey administrators are not required to build any web pages themselves.
Survey introduction and reminder notifications should be sent to inform users that the
survey is available and remind those who have yet to take the survey to do so.
The system will aggregate survey responses and present that aggregate information in the
form of charts.
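As an illustration of how such aggregate information could be produced, the hypothetical query below counts, for one survey question, how many survey takers chose each answer; the resulting dictionary is what would be handed to a charting control. Table and column names are assumptions.

// Hedged sketch: aggregating responses for one question (illustrative schema only).
using System.Collections.Generic;
using System.Data.SqlClient;

public static class ResultAggregator
{
    // Returns a map of answer text -> number of survey takers who chose it.
    public static IDictionary<string, int> CountResponses(
        string connStr, int surveyId, int questionId)
    {
        Dictionary<string, int> counts = new Dictionary<string, int>();
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                "SELECT a.AnswerText, COUNT(*) " +
                "FROM Response r JOIN Answer a ON a.AnswerId = r.AnswerId " +
                "WHERE r.SurveyId = @surveyId AND r.QuestionId = @questionId " +
                "GROUP BY a.AnswerText", conn);
            cmd.Parameters.AddWithValue("@surveyId", surveyId);
            cmd.Parameters.AddWithValue("@questionId", questionId);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    counts[reader.GetString(0)] = reader.GetInt32(1);
            }
        }
        return counts;
    }
}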
To protect anonymity of the student taking the survey, the system must encapsulate all
information regarding the students and surveys that they have completed. Survey takers'
email addresses will be encapsulated in group names so when reminders are sent, the survey
administrator is only able to view the group name of the survey takers who are being sent a
reminder. Survey takers' completed surveys are only available to the survey takers themselves and cannot be viewed by survey administrators.
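One plausible way to realize this encapsulation, sketched below, is to resolve the group name to individual email addresses only inside the reminder routine and to return nothing more than a count to the caller. The table names, SMTP host, and sender address are assumptions made for illustration.

// Hedged sketch of anonymity-preserving reminders (not the system's actual code).
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Mail;

public static class ReminderSender
{
    // Sends a reminder to every member of the named group who has not yet
    // submitted the given survey. Returns only the number of reminders sent.
    public static int SendReminders(string connStr, string smtpHost,
                                    int surveyId, string groupName, string messageBody)
    {
        List<string> addresses = new List<string>();
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                "SELECT u.Email FROM SurveyUser u " +
                "JOIN UserGroupMember m ON m.UserId = u.UserId " +
                "JOIN UserGroup g ON g.GroupId = m.GroupId " +
                "WHERE g.GroupName = @groupName " +
                "AND u.UserId NOT IN (SELECT UserId FROM CompletedSurvey " +
                "WHERE SurveyId = @surveyId)", conn);
            cmd.Parameters.AddWithValue("@groupName", groupName);
            cmd.Parameters.AddWithValue("@surveyId", surveyId);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    addresses.Add(reader.GetString(0));
            }
        }

        SmtpClient client = new SmtpClient(smtpHost);
        foreach (string address in addresses)
        {
            // One message per recipient, so recipients never see each other's addresses.
            MailMessage message = new MailMessage(
                "surveys@colstate.edu",   // placeholder sender address
                address, "Survey reminder", messageBody);
            client.Send(message);
        }
        // Only a count is exposed; the individual identities stay inside the system.
        return addresses.Count;
    }
}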
The Online Survey System must fulfill the following non-functional requirements. Non-functional requirements describe the user-visible aspects of the system that are not directly related to the functional behavior of the system [Bruegge & Dutoit, 2000].
The user interface should be similar to that of a standard online survey application and should be simple and intuitive to use.
The system should operate on the studentwebs server in the Department of Computer
Science, CSU.
The time taken for the system to load and retrieve database data will depend on the
network connection over the Internet and the performance of the server.
The user interfaces with the system via a web-based interface in the user's browser.
The system should be able to retrieve data from, and write data to, the database, and
generate and display surveys for completion with minimum delay. The fact that a database is shared by many users should not noticeably degrade performance for an individual user. The response time between a request being made by the user and the response being displayed should be kept to a minimum. Two error conditions are handled explicitly:

1. Error in writing to the database. In the case of this error, the user will be directed to an error page.

2. The client may not successfully connect to the server. In this case, the client will be informed that the connection has failed.
The bare minimum for the system is to allow survey administrators to create surveys and administer them through email notification to the intended survey takers. The system could be modified in the future by adding a 'find' functionality and also by making all created surveys available for reuse by other survey administrators.
The system will be implemented using ASP.NET C# and an SQL Server 2000 database. The
system will perform best using an Internet Explorer browser. Mozilla Firefox browser can
also be used.
The use of the system will be user authenticated. Pages within the system will also be
authenticated, restricting certain pages to certain users. The survey taker will only be able to access the pages where they can complete and view surveys, survey administrators will be able to access pages for survey creation purposes, and the system administrator will have access to all pages of the system.
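The class diagram later in this thesis names a SurveyPageBaseClass; one plausible way such a base page could enforce these per-role restrictions is sketched below. The session keys, role names, and page names are assumptions rather than the system's actual values.

// Hedged sketch of a base page enforcing role-based access (assumed details).
using System;
using System.Web.UI;

public class SurveyPageBase : Page
{
    // Derived pages state which role may view them ("SurveyTaker",
    // "SurveyAdministrator" or "SystemAdministrator").
    protected virtual string RequiredRole
    {
        get { return "SurveyTaker"; }
    }

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);

        // Not logged in at all: send the user to the login page.
        if (Session["UserId"] == null)
        {
            Response.Redirect("Login.aspx");
            return;
        }

        // Logged in, but not in the role this page requires: the system
        // administrator is allowed everywhere, everyone else is turned away.
        string role = (string)Session["Role"];
        if (role != RequiredRole && role != "SystemAdministrator")
        {
            Response.Redirect("Main.aspx");
        }
    }
}

A page restricted to survey administrators would then inherit from this base class and override RequiredRole to return "SurveyAdministrator".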
3.3.1 Scenarios
This section highlights the scenarios identified for the proposed system. A scenario is an informal, concrete, focused description of a single feature of the system from the viewpoint of a single actor [Bruegge & Dutoit, 2000].
Scenario: Creating a survey

Flow of Events

1. The user (a survey administrator) accesses the Online Survey System web site.

2. The user is redirected to the system login page if they are not authenticated.
Once the user logs in successfully, they are redirected to the main page, where
they are presented with surveys they have previously created, with the options of
3. The user selects creation of a new survey. The user is forwarded to a page where
4. The user goes through the steps of defining the survey criteria. A textbox is
available to input new questions and a button to add the new question to the survey.
5. The user inputs a question, selects a question category (if none is selected, the default 'General' category is used) and presses the 'Add' button; this displays the question in a table on the page labeled 'Survey Questions.'
6. The user selects a question category from the table which is used to display
existing database questions. This lists questions from the question pool in the
database from the selected category in the table labeled 'Existing Questions.'
From this the user checks all questions required from the chosen category. The
user then clicks the 'Add' button, which adds it to the list of questions for the
survey being created. These questions are displayed in the table labeled 'Survey
Questions.'
7. The user clicks the 'Continue' button and is brought to a page where he/she can
configure question answers. Questions already in the database are listed with
their answers displayed below them in a table (future system, user can modify
answers). New questions (questions not already in the database) do not already
have a selection of answers attached to them. A user selects the question from
the table of questions to provide an answer for and selects the format of the
answer(s), then enters the relevant answers for that particular question (the user can add as many answers as needed).
8. User then clicks the 'Create Survey' button and is directed to a page that displays
the created survey. If the user is happy with the survey, he/she can click the
finish button to create the survey permanently on the database. The user must
provide a start and expiration date for the survey to be taken by and a survey title
in the textbox provided. Once the 'Submit' button is clicked, the user is brought
back to their 'Main' page where they can select a survey to send to the specified
users group (in a future version of the system, the user could modify the user group and expiration date and select multiple user groups for a survey, which raises the question of duplicate notifications being sent to survey takers who fall into more than one group).
Scenario: Taking a survey

Flow of Events

1. The user receives an email notification that a survey is available to be taken.

2. The email details the survey to be taken and provides a link to the page where the survey can be accessed.
3. User accesses the main page of the system via the link in the email notification
4. Once authenticated, the user is sent to the main page of the system for the user
with a survey taker role. Here the user is presented with a list of surveys they
have previously taken and those they are required to take, with dates taken and
expiration dates displayed. Expired surveys and surveys already taken are
inactive.
5. The user selects a survey to complete via the survey title and is taken to a page
that displays the survey. Once completed, the user clicks the 'Submit' button and the responses are recorded in the database.
This section establishes the use cases for the system and goes on to describe each use case.
"A use case represents a complete flow of events through the system in the sense that it
describes a series of related interactions that result from the initiation of the use case"
The use case diagram for the Online Survey System is shown below in Figure 1. There are
three actors for the system, listed in Table 1; an actor is an external entity which interacts with the system.

Table 1. Actors of the Online Survey System:

Survey Taker

System Administrator

Survey Administrator
The following sections give detailed descriptions of the use cases displayed in Figure 1. These use cases were derived from the functional requirements listed in section 3.2.2.
Name: Log on
Description
A User (a System Administrator, Survey Administrator or Survey Taker) logs onto the
Survey System by entering the username and password. The system identifies the user as a system administrator, survey administrator, or survey taker and grants access accordingly.
Entry Condition:
The user accesses the main web page, which checks user authentication before entering the
system.
Normal Flow

1. The user enters a username and password, and the system retrieves the user's information from the database.

2. The system authenticates the user.

3. Successful login.
4. The system displays the main page for the appropriate user.
Alternative Flow
1. At step 2, if the user is not authenticated, the logon screen stays displayed.

2. At step 1, if the system fails to connect to the database to pull the user information for any reason, e.g. the server is not running, the system informs the user that his or her login request cannot be processed.
Name: Create survey

Description

A survey administrator creates a survey to be stored in the system database and notifies the target user group.
Entry Condition:
Authentication of Survey Administrator and selection of create survey option from main
page.
Normal Flow
2. A form is displayed where the survey administrator can create or choose survey questions.

3. User types a question(s) into the provided text box, selects a question category, and clicks the 'Add' button.

5. User clicks the 'Continue' button and is taken to a page where he/she can assign answers to each question.

6. User selects each question in turn, selects the format of the answer, enters the answer(s) for that question, and clicks the 'Add' (answers) button.

9. User provides a survey title and description and provides the survey expiration date.

10. User clicks the 'Create Survey' button. It is compulsory for the user to provide a survey title and an expiration date.
Alternative Flow
1. At step 3, user can click the display button after selecting a question category.
4. Clicks continue.
1 . At step 6 or 9, if the user wants to modify, change, or remove the selected questions,
they click the back button and make the necessary changes.
Description
Entry Condition:
Normal Flow
1. User selects a survey from the list of surveys they have previously created, using the
select link.
2. The survey questions for the chosen survey are displayed in a table, with the answers to those questions displayed below them.
Alternative Flow
1. At step 2, the user selects the question he/she wants to change, provided that the selected survey has not yet been sent out to survey takers.

1. At step 1, the user clicks the 'View Statistics' link to view the response rate of the survey.
2. A chart is displayed indicating the number of responses for the survey and the distribution of responses for each question.
Name: Send email notification

Description

Survey Administrator sends email to targeted survey takers for a particular survey to inform them that the survey is available to be taken.
Entry Condition:
Survey Administrator is authenticated and is on the main page where surveys he or she has created are displayed.
Normal Flow
1. User clicks the select link for the desired survey in the table of created surveys.
4. A textbox is displayed where the user can select the targeted user group for the survey.
5. A textbox with a prewritten message is displayed; the message contains a link to the
survey takers main page. The user modifies the message as they wish and presses the
send button.
6. User receives a confirmation that the emails have been sent to all users in the user group.
Alternative Flow
2. A textbox displaying a prewritten reminder message is displayed, along with the user group assigned to the survey.

4. Reminder emails are sent to those users in the group who have not yet taken the survey.
Name: Take survey

Description
Survey taker logs on to the system having received an email(s) saying he/she has a survey to
complete.
Entry Condition:
User is authenticated and is on the main page where surveys to be taken are displayed along with their expiration dates.
Normal Flow
Alternative Flow
1. At Step 1, if the survey expiration date has passed, the survey displayed is not
editable.
1. At Step 1, if the survey start date is after the current date, an error message is
displayed.
Name: View survey

Description
Survey taker logs on to the system to view a survey he/she has completed previously or take
a new survey.
Entry Condition:
User is authenticated and is on the main page where surveys to be taken are displayed along with surveys that have already been completed.
Normal Flow
1. User clicks the select link for the survey he/she wants to view.
2. System displays the survey; the survey is disabled so that the user cannot modify previously selected answers.
The class diagram for the Online Survey System shown in Figure 2 shows the initial classes that will implement the system. The diagram is an overview of the classes (objects) discovered and includes initial attributes and methods. The diagram also shows the relationships (associations) between the object classes. Table 2 describes each object in terms
of its responsibilities.
Table 2. Class descriptions.

Question: This class represents the question object that is pulled from the database or written to the database.

Answer: This class represents the answer object that is allocated to a question when a question is created. A question possesses an answer arraylist to hold multiple answers for a particular question.

SurveyUser: This class represents the user who is logged onto the system. It holds the user's credentials.

QuestionCategory: This class represents the category a question is assigned on creation.

UserAuth: This class is used for authenticating the user that is logging onto the system.
Figure 2 shows the classes (objects) identified for the system. In the design section, this
diagram will be refined, showing class dependencies and any other fields and methods which are required.
Figure 2. Initial analysis class diagram: Survey, Question, Answer, SurveyCategory, SurveyUser and UserGroup, with their associations (a survey has questions, questions have answers assigned to them, users are assigned to user groups, and survey users take surveys).
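A hedged sketch of the Question and Answer classes, following the description in Table 2 (a question holds an arraylist of its answers), is given below; the field and method names are assumptions, and only the overall shape follows the diagram.

// Hedged sketch of the Question and Answer analysis classes (assumed member names).
using System.Collections;

public class Answer
{
    private string text;

    public Answer(string text) { this.text = text; }

    public string Text { get { return text; } }
}

public class Question
{
    private int questionId;
    private string text;
    private string category;                       // e.g. the default "General" category
    private ArrayList answers = new ArrayList();   // holds Answer objects

    public Question(int questionId, string text, string category)
    {
        this.questionId = questionId;
        this.text = text;
        this.category = category;
    }

    public int QuestionId { get { return questionId; } }
    public string Text { get { return text; } }
    public string Category { get { return category; } }
    public ArrayList Answers { get { return answers; } }

    // Attaches one more answer option to this question.
    public void AddAnswer(Answer answer)
    {
        answers.Add(answer);
    }
}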
User Interface
The graphical user interface for the system is designed solely for the user. Figure 3 shows the login interface that the actors of the system will interact with in order to use the system. Figure 4 shows the survey creation interface that the survey administrator will interact with in order to create a survey.
Figure 3. The login interface, hosted on the CSU studentwebs server, with User ID and Password fields and a Login button.

Figure 4. The survey creation interface, with a textbox for entering a question to add to the survey, a question category selection (defaulting to 'General'), an 'Add' button, a category selection for existing questions, 'Previous' and 'Next' navigation buttons, and a 'Remove Selected' option.
Table 3 details all the system functional requirements, prioritized and cross-referenced with the use cases that realize them. Priorities are classified as follows:

Must Have - MH
Should Have - SH
Could Have - CH

For example, the requirement that survey takers should be able to view surveys that they have previously completed is prioritized SH and cross-referenced with the View Survey use case.
Summary
This chapter has dealt with requirement elicitation and requirements analysis. The functional
and non-functional requirements of the system were captured. The functional requirements
concerned with the functionality of the system were used to develop the functional model;
this is represented in UML by use case diagrams. The use cases were then described in detail
using natural language; this is so that anyone reading the report without UML knowledge will
understand the purpose of the use case model. The non-functional requirements focused on the user-visible aspects of the system that are not directly related to the functionality of the system, i.e. the platform on which the system should operate. After identifying scenarios and use cases, the initial classes required for the system functionality were identified. This chapter also established the user interface for the system and prioritized and cross-referenced the functional requirements with the use cases that realize them.
4. Design and Implementation

4.1 Introduction
This chapter is concerned with the design and implementation of the system. It discusses the
objects (classes) used to implement the system and the various techniques used to build the
system. This chapter focuses on defining the subsystem interfaces, that is, the services that each subsystem provides to the others.
The web-based application being developed will have a 3-tier architecture. The three layers
of the architecture will be highlighted and their functionality will be discussed. In particular, the user interface layer is presented and the business logic layer is defined by the names and responsibilities of its classes.
This section lists all decisions made for the system to provide the services it proposes
effectively.
Providing the title field and description field for created surveys
As with any survey that one takes, a title is necessary to provide some indication of the
purpose of the survey. The provision of a survey title and description by survey
administrators on creation of a survey will serve as a way of informing the students of the
purpose of the survey they are required to take. The survey title will be required, but the
survey description will be optional. These will be used in the email notification to students.
Allowing survey administrators to view all answers that have previously been used for
a selected question
Survey administrators will have access to answers that have been used for a selected question
by any other survey administrators. The database will record all answers that have ever been
used for a question. This will save time on survey creation and decrease the chances of data
duplication.
The application will display only those answers that have previously been selected for a question when a user is formatting answers for that question. Having all
answers displayed will cause a problem once the answer pool gets large as users will have to
search through hundreds of answers to find the one that they wish to use. Instead, users will be allowed to create new answers if they cannot find the one they wish to use, and when the new answer is written to the database the system will ensure that it does not already exist.
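A minimal sketch of this existence check, under the assumption of a single Answer table keyed by an identity column, is shown below; the table and column names are illustrative only.

// Hedged sketch of "insert the answer only if it does not already exist" (assumed schema).
using System.Data.SqlClient;

public static class AnswerPool
{
    // Returns the AnswerId of the answer text, inserting it first if it is new.
    public static int GetOrCreateAnswer(string connStr, string answerText)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Look for an existing answer with the same text.
            SqlCommand find = new SqlCommand(
                "SELECT AnswerId FROM Answer WHERE AnswerText = @text", conn);
            find.Parameters.AddWithValue("@text", answerText);
            object existing = find.ExecuteScalar();
            if (existing != null)
                return (int)existing;

            // Not found: insert it and return the new identity value.
            SqlCommand insert = new SqlCommand(
                "INSERT INTO Answer (AnswerText) VALUES (@text); " +
                "SELECT CAST(SCOPE_IDENTITY() AS int)", conn);
            insert.Parameters.AddWithValue("@text", answerText);
            return (int)insert.ExecuteScalar();
        }
    }
}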
New questions and answers written to the database during survey creation
New questions will be written to the database when the survey administrator adds them to
the survey being created. New answers, however, will not be written until the survey administrator has actually attached the answer to a question on the survey. This will help the user to retrieve new questions and answers if the connection is lost, and will limit data duplication, as the survey administrator will only be able to view answers that have previously been attached to questions.
The application will provide a three-step process for selecting answers for questions, as sketched below. In step 1 the user selects a question number. In step 2 the user selects an answer format. In step 3 the user selects responses for the selected question. The three-step process should be visible only when the 'Set Answer' link button for a question is selected. The 'add new answer' input control should remain invisible until the user clicks the 'add new answer' button.
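The show/hide behaviour described above could be wired up as in the following sketch; the panel names are assumptions, not the thesis controls.

using System.Web.UI.WebControls;

// Sketch of the visibility rules for the three-step process; the panel names are assumptions.
public class AnswerStepVisibility
{
    // Called when the 'Set Answer' link button for a question is clicked.
    public static void OnSetAnswerClicked(Panel threeStepPanel, Panel addAnswerPanel)
    {
        threeStepPanel.Visible = true;    // steps 1-3 become visible for the selected question
        addAnswerPanel.Visible = false;   // the 'add new answer' input stays hidden for now
    }

    // Called when the 'add new answer' button is clicked.
    public static void OnAddNewAnswerClicked(Panel addAnswerPanel)
    {
        addAnswerPanel.Visible = true;    // reveal the controls used to enter new answers
    }
}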
The application will provide answer formats that do not require any answers to be supplied. These are Open Ended (One Line), Open Ended (Multi Lines), Dichotomous (Yes/No), and Likert (Agree/Disagree). When a survey administrator selects one of these formats for a question, he or she will not need to choose answers for the question; the response controls for these formats are provided automatically.
To avoid null pointer exceptions and to limit the number of connections to the database, the relevant survey objects and question objects are stored in the session once their data has been retrieved from the database; a sketch of this pattern follows.
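The caching idea can be sketched as follows; the session key and the loader delegate are illustrative, not the actual thesis code.

using System.Web.SessionState;

// Sketch of caching retrieved objects in the session so the database is queried only once.
public class SessionCache
{
    public delegate object Loader();

    public static object GetOrLoad(HttpSessionState session, string key, Loader loadFromDatabase)
    {
        if (session[key] == null)
        {
            session[key] = loadFromDatabase();   // single database round trip per session
        }
        return session[key];                     // later page views reuse the stored object
    }
}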
.netCHARTING is a .NET control that enables the application to display dynamically generated data quickly and easily through a visual interface. This control is written in C# and will be integrated with the application to enable a statistical view of survey results. The control is used on the results page to chart survey response rates.
Overview
The proposed architecture for the system is the three-tier architecture usually used in web applications.
The presentation layer consists of HTML and ASP.NET pages; these create the look and feel of the user interface. The business layer uses the code-behind classes to control the flow of the application; these classes are written in the C# programming language. The code-behind classes call other C# classes to store and retrieve data from the database and, at times, forward the results to the ASP.NET pages or other code-behind objects. The data layer is the database, which holds the persistent data of the application.
These three layers are relatively independent and should be kept as separate and independent as possible. A simple sketch of how a request flows through the layers is shown below.
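As an illustration of this layering (the class, table, and column names below are assumptions, not the thesis classes), a code-behind page might call a business-layer class which in turn queries the data layer:

using System.Collections;
using System.Data.SqlClient;

// Illustrative business-layer class: called from a code-behind page, it reads from the data layer
// and returns plain objects that the presentation layer can bind to a grid or list control.
public class SurveyCatalog
{
    public static ArrayList GetSurveyTitles(string connectionString)
    {
        ArrayList titles = new ArrayList();
        SqlConnection conn = new SqlConnection(connectionString);   // data layer access
        conn.Open();
        try
        {
            SqlCommand cmd = new SqlCommand("SELECT Title FROM Surveys", conn);   // assumed table name
            SqlDataReader reader = cmd.ExecuteReader();
            while (reader.Read())
            {
                titles.Add(reader.GetString(0));
            }
            reader.Close();
        }
        finally
        {
            conn.Close();
        }
        return titles;
    }
}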
[Figure: The three-tier architecture of the system, showing the presentation layer (user interface), the business logic layer, and the data layer (query data).]
The system can be divided into three subsystems that correspond to the 3-tier architecture.
Figure 6 is a diagram of the subsystem decomposition for the online survey system.
[Figure 6: Subsystem decomposition of the online survey system, showing the Survey Application subsystem (Main, AdminMain, create, answers2, survey, page, results, SendMail, messagesent, paging, SurveyError), the Survey Application Object subsystem (Survey, SurveyObject, SurveyUser, UserGroup, UserAuth, Question, QuestionCategory, Answer, Pager, SurveyPageBaseClass, SurveyControlBaseClass, SurveyConnection), and the database subsystem.]
As mentioned in the previous section (subsystem decomposition), the system is divided into three subsystems: the survey application subsystem, the survey object subsystem, and the database subsystem. This section describes the services these subsystems provide for other subsystems. 'A service is a set of related operations that share a common purpose' [Bruegge & Dutoit, 2000].
Survey Application Subsystem
This subsystem is concerned with the initialization of the system and is responsible for interfacing with the user. The subsystem contains all the user interface files for all users of the system. It contains the interface for survey takers to complete surveys and view completed surveys, and for survey administrators to create surveys and send out email notifications for them.
Survey Object Subsystem
This subsystem interfaces the Survey Application subsystem with the Database subsystem. It contains twelve classes which work together to provide the database connection for the survey application, retrieving data when needed by the application subsystem and holding data retrieved from the database in objects that the application subsystem can use.
Database Subsystem
This subsystem is responsible for holding data and querying the data store for use by the Survey Object subsystem. The entity relationship diagram can be found in Figure 24.
This section provides the interface screen shots and class diagram for the Survey Application subsystem. The class descriptions for each class within this subsystem can be found in Appendix B. The diagram contains the attributes and operations of each class and the associations which relate the objects. The class descriptions describe each object in detail in terms of its attributes and operations and their visibility. The following are definitions for the visibility modifiers used:
Private - private methods and fields are visible only inside the class in which they are defined.
Static - static methods and fields belong to the class as a whole, rather than to any individual instance.
[Figure 7: Object model (class diagram) of the Survey Application subsystem. The classes shown correspond to the web pages of the application and are described in Appendix B.]
The user interface is the presentation layer of the system. This section highlights the web
pages of the application and details how the pages are used in order to interact with the
system.
4.6.1.1 Login
This is the main login page of the system. From this page users are authenticated and redirected to the page they initially requested. If the user came directly to the system login page, the user is directed to the default.aspx page, Figure 8. A sketch of this behaviour using ASP.NET forms authentication follows.
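The 'return to the requested page' behaviour is what ASP.NET forms authentication provides out of the box; the sketch below shows a login button handler using it, where ValidateUser and the control names stand in for the system's own credential check and are assumptions.

using System.Web.Security;
using System.Web.UI.WebControls;

// Sketch of the login flow; the control names and ValidateUser are placeholders.
public class LoginSketch
{
    public static void OnLoginClicked(TextBox userName, TextBox password, Label errorLabel)
    {
        if (ValidateUser(userName.Text, password.Text))
        {
            // Redirects back to the page originally requested, or to default.aspx
            // when the login page was opened directly.
            FormsAuthentication.RedirectFromLoginPage(userName.Text, false);
        }
        else
        {
            errorLabel.Text = "Invalid user name or password.";
        }
    }

    private static bool ValidateUser(string user, string pwd)
    {
        return user.Length > 0 && pwd.Length > 0;   // placeholder check only
    }
}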
4.6.1.2 Default
The user is directed to this page from the login.aspx page, Figure 3. Here the user is presented with three links: one for System Administrators, one for Survey Administrators, and one for Survey Takers. The user selects the appropriate link depending on which of the three user roles the user falls into. If the user selects a role to which he/she is not assigned, the user is redirected to the surveyerror.aspx page, Figure 22. When the System Administrator link is clicked, the user is redirected to the sysadminmain page. When the Survey Administrator link is clicked, the user is redirected to the adminmain.aspx page, Figure 9. When the Survey Taker link is clicked, the user is redirected to the main.aspx page,
Figure 19.
[Figure 8: The default.aspx page of the Survey System, presenting the three role links.]
4.6.1.3 AdminMain
The user is directed to this screen either from the login.aspx page, Figure 3, or from the default.aspx page, Figure 8. This page displays all surveys that have been created by the user in a table. The information displayed about each survey is the survey title, the start date of the survey, the expiration date of the survey, and the date the survey was emailed to students. For each survey in the table, the user can choose to view the survey by clicking the 'Select' link; in this case the user is redirected to the survey.aspx page, Figure 13. The user can choose to delete the survey by clicking the 'Delete' link; this will delete the survey from the database and from the user's view. The user can also choose to view the response rate of a survey by clicking the 'View Statistics' link, which redirects the user to the results.aspx page, Figure 17. Finally, on this page the user can use the 'Create Survey' button to start the survey creation process. Once the 'Create Survey' button is clicked, the user is redirected to the create.aspx page, Figure 10.
[Figure 9: The adminmain.aspx page, listing the user's surveys with Select and Delete links, start date, expiration date, and sent date columns, and a Create Survey button.]
4.6.1.4 Create
Figure 10 shows the first page in the survey creation process. The user has the option of selecting questions from the database to add to the current survey. To do so the user must select a question category from the dropdown menu in the table labeled 'Select Question from Database.' Once a question category is chosen, all questions in that category in the database are displayed in the grid. The user can select the questions he or she wants to use by checking the check boxes, or can select all the questions by clicking the 'select all' link. The user then needs to click the 'add selected' button to add the selected questions to the current survey. These questions are then added to the grid labeled 'New Survey Questions.' The user can add a new question to the survey by clicking the 'add new question' button. A panel is displayed, Figure 10, containing a textbox where the user can input the question they wish to add to the survey. By default the 'General' category is selected. The user can choose another category and press the add button. This writes the new question to the database and adds it to the grid displaying questions for the current survey. In the 'New Survey Questions' grid, the user can remove questions by selecting the 'remove all' link or by checking the questions they want to remove and clicking the 'Remove selected' button. The continue button takes the user to the answer2.aspx page, Figure 11, which is the second stage of the survey creation process. The back button takes the user to the adminmain.aspx page.
[Figure 10: The create.aspx page, showing the 'Select Question from Database' grid, the 'New Survey Questions' grid, and the add-new-question panel.]
4.6.1.5 Answer2
The answer2.aspx page, Figure 11, is used by a survey administrator to configure responses. Here the user selects a question from the grid by clicking the 'Set Answer' link and is presented with three steps. Step one is the selection of the survey question number, step two is the selection of the question response format, and step three is the selection of the responses. Once the user selects an answer format, the 'use selected' button is displayed. The user can select answers from the answer grid or add new responses by clicking the 'add new answer' button. The 'add new answer' button displays a panel where the user can select the number of answers he or she wants to add, Figure 12. Once this is selected and the 'configure' button is clicked, a textbox appears where the user can input the desired answers. The user can then press the add button; these responses are added to the answer grid, where the user can select the answers they want for the question and click the 'use selected' button. This information is updated in the question grid, which now displays the question number, the question, and the answer format. To change the question number, the user can select the question, select the empty choice from the question number dropdown list, and click the 'use selected' button. This makes the previous question number available and gives the user an opportunity to choose a new question number. Once all questions are configured, the user clicks the 'create survey' button and is redirected to the survey page.
[Figure 11: The answer2.aspx page, showing the survey question grid with 'Set Answer' links and the three-step response configuration panel.]
[Figure 12: The answer2.aspx page with the 'add new answer' panel displayed, where the number of new answers is chosen and the answer text entered.]
4.6.1.6 Survey
The survey.aspx page, Figure 13, displays the survey to the user. It displays the questions and the responses with the web controls that correspond to the answer format chosen for each question. From this page the user can choose to submit the survey, in which case the survey is written to the database and the user is redirected to the adminmain.aspx page. To submit a survey the user needs to provide a survey title and start/end dates for the survey; a sketch of this date check appears below. The user can also choose to update questions or answers by clicking one of the update buttons, or administer the survey to students by clicking the 'distribute' button. Finally, the user can choose to send out a reminder for the survey by clicking the 'reminder' button. If the user clicks an update button, he or she is redirected to the create.aspx or the answer2.aspx page. If the user chooses the 'distribute' or 'reminder' button, the user is redirected to the sendmail.aspx page.
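A sketch of the start/end date check that must pass before the survey is written to the database is given below; the class and calendar control names are assumptions.

using System;
using System.Web.UI.WebControls;

// Sketch of the date validation performed when the survey is submitted for creation.
public class SurveyDateValidation
{
    public static bool AreDatesValid(Calendar startCalendar, Calendar endCalendar, Label errorLabel)
    {
        DateTime start = startCalendar.SelectedDate;   // DateTime.MinValue when nothing is selected
        DateTime end = endCalendar.SelectedDate;

        if (start == DateTime.MinValue || end == DateTime.MinValue)
        {
            errorLabel.Text = "A start and end date must be provided for the survey.";
            return false;
        }
        if (start > end)
        {
            errorLabel.Text = "The start date must not be later than the end date.";
            return false;
        }
        return true;
    }
}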
[Figure 13: The survey.aspx page, showing the survey questions with their response controls and the calendars used to choose start and expiration dates.]
4.6.1.7 SendMail
The sendmail.aspx page, Figure 14, is used to send email notifications to students. Here the user can select the user groups to which a survey notification should be sent. This is done by pressing the 'To' link, which displays a list of survey taker groups. Once the groups are selected, the user replaces the text in the text area with a message and presses the send button. This distributes the survey notification. An email is received by each survey taker, Figure 18, providing a link to log on to the system and complete the survey. For anonymity purposes, the survey administrator cannot see the email addresses of recipients; one way to achieve this is sketched below. If a reminder is being sent, Figure 15, the user only needs to type a message and click the send button. The user is then redirected to the messagesent.aspx page, Figure 16.
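One way to keep recipient addresses hidden is to place them in the Bcc field of the message; the following sketch uses the System.Web.Mail classes of that era and is an assumption about the approach, not the thesis code itself.

using System.Web.Mail;

// Sketch of sending a survey notification; addresses are placed in Bcc so individual
// student addresses are not revealed. All values shown are placeholders.
public class NotificationSketch
{
    public static void SendNotification(string fromAddress, string bccRecipients,
                                        string surveyTitle, string body)
    {
        MailMessage message = new MailMessage();
        message.From = fromAddress;
        message.To = fromAddress;            // the visible To field shows only the sender
        message.Bcc = bccRecipients;         // semicolon-separated recipient addresses
        message.Subject = "Survey Notification: " + surveyTitle;
        message.Body = body;

        SmtpMail.SmtpServer = "localhost";   // the SendMail class sets the SMTP server to localhost
        SmtpMail.Send(message);
    }
}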
[Figure 14: The sendmail.aspx page, with the list of survey taker groups displayed, the subject 'Survey Notification' (the survey title is inserted by the system), and the message body to be sent.]
[Figure 15: The sendmail.aspx page when sending a reminder, with a pre-filled message asking the recipient to complete the survey at their earliest convenience.]
4.6.1.8 MessageSent
The messagesent.aspx page, Figure 16, displays a confirmation of the email sent and lists the user groups to which the survey notification was distributed.
[Figure 16: The messagesent.aspx page, confirming that the survey message has been sent and listing the recipient groups.]
4.6.1.9 Results
The results.aspx page, Figure 17, displays a graph of the response rate for the selected survey. The back button takes the user back to the AdminMain page.
[Figure 17: The results.aspx page, charting the response rate for the selected survey using the .netCHARTING control.]
4.6.1.10 Main
This is the main page for survey takers. This page displays, in a table, all surveys that have been assigned to the user for completion and those the user has previously completed. The information displayed is the survey title, the start date of the survey, and the expiration date of the survey. The user can select a survey by clicking the 'Select' link; in this case the user is redirected to the page.aspx page, Figure 20, where the user can complete the survey. The system will warn the user if he or she tries to select a survey before its start date, as sketched below.
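A sketch of that start-date check is shown below; the class and label names are assumptions.

using System;
using System.Web.UI.WebControls;

// Sketch of the warning shown when a survey is selected before its start date.
public class StartDateCheck
{
    public static bool CanOpenSurvey(DateTime startDate, Label warningLabel)
    {
        if (DateTime.Now < startDate)
        {
            warningLabel.Text = "This survey is not available until " +
                                startDate.ToShortDateString() + ".";
            warningLabel.Visible = true;
            return false;
        }
        return true;
    }
}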
[Figure 18: The email notification received by a survey taker, asking the student to click the link and log in to the system to complete the survey.]
[Figure 19: The main.aspx page for survey takers, listing assigned and completed surveys.]
4.6.1.11 Page
The page.aspx page, Figure 20, displays a survey for completion by a survey taker. Five survey questions are displayed per page. The user can click the previous and next links to navigate through the survey pages. Once the user has completed the survey, the user clicks the 'Submit' button, which writes the user's responses to the database. This page informs the user of the number of questions on the survey and the page number they are currently viewing. Users can also view surveys that they have previously taken on this page, Figure 21. However, the controls will be disabled, preventing users from modifying their responses; a sketch of this read-only behaviour follows.
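A sketch of how the response controls might be disabled for a completed survey is given below; it simply walks the control tree and disables the common input controls, and is an illustration rather than the thesis code.

using System.Web.UI;
using System.Web.UI.WebControls;

// Sketch: recursively disable input controls so previously submitted responses cannot be changed.
public class ReadOnlySurveyView
{
    public static void DisableInputs(Control parent)
    {
        foreach (Control child in parent.Controls)
        {
            if (child is TextBox || child is CheckBox || child is RadioButton ||
                child is DropDownList || child is CheckBoxList || child is RadioButtonList)
            {
                ((WebControl)child).Enabled = false;
            }
            if (child.HasControls())
            {
                DisableInputs(child);   // recurse into nested tables and panels
            }
        }
    }
}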
[Figure 20: The page.aspx page displaying a survey for completion, with paging controls showing the page number and the number of questions.]
[Figure 21: A previously completed survey displayed on page.aspx with its controls disabled.]
4.6.1.12 Survey Error
The surveyerror.aspx page, Figure 22, is used to display an error message when users try to access pages for which they are not authorized.
[Figure 22: The surveyerror.aspx page.]
This subsystem is concerned with managing data from the database, placing them in objects
that can be manipulated by the Survey Application subsystem. Figure 23 illustrates the class
diagram for this subsystem. The class descriptions for each class within this subsystem can
be found in Appendix C.
[Figure 23: Object model (class diagram) of the Survey Application Object subsystem, showing the Answer, Question, QuestionCategory, Survey, SurveyUser, UserGroup, UserAuth, Pager, SurveyConnection, SurveyPageBaseClass, and SurveyControlBaseClass classes.]
The database subsystem is the data access layer of the application. The tables held in the
database and the relationships can be seen in Figure 24, which shows the associations
between each table where data from the system will be stored.
[Figure 24: Entity relationship diagram showing the tables of the online survey system database and the associations between them.]
Summary
This chapter has discussed the design and implementation process for the online survey system. It described the design decisions made during the implementation of the system. The 3-tier architecture of the system was discussed. The subsystems of the system were also detailed; in particular, the responsibility of each subsystem was outlined and the object models for each subsystem were presented. The entity relationship diagram for data access was also presented.
5. System Evaluation
After implementation, an evaluation of a system is carried out. This chapter focuses on the
evaluation carried out on the system developed as part of this thesis. The evaluation
considers the functional analysis of the system, in which the specified system functionalities
are tested one by one. The results of this evaluation can be found in section 5.2.
Functionality testing ensures that the complete system complies with the functional and non-
functional requirements of the system. The testing carried out in this section is comprised of
functional testing. This uses black box techniques, and the test cases were derived from the
use case model. The functional tests were identified by inspecting the use case model in
Chapter 3 and identifying use case instances that are likely to cause failure.
The test cases were chosen by going through the use case description of each use case in the
use case model in Figure 1 and finding the features of the system which are likely to fail and
should be tested. What follows is a list of features from each use case which are likely to fail.
1. The user has been authenticated and is directed to the default page of the application, where they are provided three links and should choose the link appropriate for their user role. The System Administrator has access to all pages.
2. The user may try to access a page on the system without logging onto the system.

1. The user tries to create a new question which is already in the database.
2. When configuring answers for a question, the user selects an answer format that needs answers to be supplied for the question, but the user does not provide any answers. For example, if the user selects a multiple choice answer format for a question, the user needs to indicate what responses are to be used for the question.
3. The user selects the create survey button without providing a start or end date for the survey. Alternatively, the user selects a start date which is greater than the end date.
4. The user tries to submit a survey for creation without supplying a survey title.

1. The user logs onto the system with the intention of modifying a survey that has already been created.
2. The user selects a survey that has not been sent to recipients and tries to send a reminder.
3. The user selects the 'View Statistics' link to view statistical information on a survey that has not been sent out to students.
4. The user selects the 'View Statistics' link to view statistical information on a survey that has received no responses.

1. The user selects a survey that they have previously submitted in order to change their responses.
Expected system response: The user should be directed to the system error screen and informed that he or she does not have access to the requested page. The system should deny access to the Survey Administrator if he/she tries to access the main page of the System Administrator. The Survey Taker will be denied access to the System Administrator and Survey Administrator main pages.
Observed system response: System responds as expected.

Entry Condition: The user types in the URL to access a user main page.
Flow of test events: 1. The user is redirected to the system login page. 2. The user logs onto the system.
Expected system response: The system redirects the user to the page initially requested.
Observed system response: System responds as expected.
Entry Condition: The user is on the create.aspx page selecting questions to create a survey.

Entry Condition: The user has selected survey questions and is on the answer2.aspx page configuring question responses.
Flow of test events: 1. The user selects a question from the question grid. 2. The selected question is highlighted on the grid and a response configuration panel is displayed. 3. The user selects a question number to assign the question on the survey. 4. The user selects the multiple choice option from the answer format dropdown list.
Flow of test events: 1. The user enters the survey title in the textbox provided. 2. The user enters a description of the survey. 3. The user clicks the create survey button. 4. The user selects a start date but does not select an end date for the survey.
Expected system response: The system should inform the user of the error and should not write the survey to the database until the date issue has been rectified by the user.
Error 1: A start and end date must be provided for the survey.
Entry Condition: The user has configured answers for the survey questions and is looking at the survey on the survey screen.
Flow of test events: The user clicks the submit button to write the survey to the database.
Expected system response: The system should inform the user that a survey title is required before the survey can be submitted to the system for creation.
Entry Condition: The user is on the AdminMain.aspx page; a list of surveys which the user has previously created is on display.
Flow of test events: 1. The user clicks the 'View Statistics' link for a survey that has not been sent out to students. 2. The user is directed to the results page.
Expected system response: The system should indicate that no data could be retrieved for the survey.
Observed system response: System responds as expected.

Entry Condition: The user is on the AdminMain.aspx page; a list of surveys which the user has previously created is on display.
Flow of test events: 1. The user clicks the 'View Statistics' link for a survey that has no responses from students. 2. The user is directed to the results page.
Expected system response: The system should indicate that no data could be retrieved for the survey.
Observed system response: System responds as expected.

Flow of test events: The user selects a survey that has already been completed with the intention of resubmitting it with different responses.
Expected system response: The system should display the survey with the responses chosen by the user and not allow any modification of these responses.
Summary
This chapter has described the testing carried out on the developed system. Functional testing was carried out on the whole system. This involved finding the differences between the functional requirements and the functionality of the system. Test cases were derived from the use case model developed in Chapter 3 and the tests were recorded. Two of the test cases failed; details of these can be viewed in the Create_Survey_Test (1) and the other failed test case record.
6. Conclusion
The development of the Online Survey System was successful and provided the functionalities initially proposed. However, when considering further development and improvements for the system, a few new features were identified. These features are outlined below.
Currently the option to update questions and responses is available, but when these changes are written to the database a new survey is created. It is desirable that updates to survey questions and answers be written to the database as updates rather than as new records.
Ability to resend surveys to groups of students other than those to which they were originally sent
Once a survey has been distributed to students, a survey administrator can only send reminders to the group of students the survey was initially distributed to. The system does not allow a survey administrator to include another group of students to receive the survey.
Currently the system supports six answer formats: Likert (Agree/Disagree), Dichotomous (Yes/No), Open Ended (multi lines), Open Ended (one line), Multiple Choice (multi selection), and Multiple Choice (single selection). An extension to the system would be to support Table (multiple choice), Table (open ended), and Multiple Choice (dropdown list).
The process for changing a survey question number is a little tedious. An improvement to the system would be to make survey question number selection easier by allowing users to drag and drop survey questions to the position in the grid where they want the question to appear.
Currently survey descriptions are not being used by the system. An improvement would be to make use of the survey description, for example by displaying it to survey takers when they open a survey.
Future usability testing will assess user satisfaction with the ease of use of the system for survey administrators when creating and administering surveys, and the ease of use and convenience of the system for survey takers. All use cases identified during the development of the analysis model are cross-referenced by the system requirements; therefore, the usability testing will aim to determine from the users whether or not the requirements of the system were met.
An administration interface would provide the system administrator with the ability to create users and user groups.
This will avoid the case of a user mistakenly deleting a survey. With a delete confirmation, the user would be asked to confirm the deletion before a survey is removed.
The survey creation process should be changed to ask for the survey title at the first step rather than at the end of the process.
Although it would be ideal for every survey sent or distributed to students for the purpose of educational evaluation to be completed and returned in a timely manner, this is usually not the case. In academia, and the same is true for the non-academic world, there are many factors that contribute to recipients not completing surveys; these were discussed in Chapter 2. Due to these factors, it is difficult to get students to complete surveys in their entirety.
Despite the fact that there are several influential factors that tend to prevent students from
2. Anonymity of responses
faculty's support of issuing student surveys, whether online or paper-based. Because faculty members are responsible for distributing the surveys, their support is important. This thesis set out to develop an application that would allow faculty members to adopt the role of survey administrators and
administer surveys to students that they teach. As part of this, the intent of this project was
to produce a system aimed at providing CSU Computer Science faculty members with a tool
to create surveys and administer these surveys online. The system developed took into
consideration the factors mentioned above and implemented functionality that would
protect anonymity, provide the convenience of allowing survey administrators to create and
administer surveys and the convenience for survey takers to complete surveys in their own
time. The system also provides an analytical tool to interpret survey responses. The attributes
of the system focused on were ease of use, ease of distribution, and provision of data analysis tools. The system developed utilizes user authentication in order to restrict users from completing a survey more than once and makes surveys available only to those students specified by faculty. The system does not enforce that students answer all questions on a survey; this eliminates factors that could discourage students from completing the survey.
On evaluation of the finished product of this project, it was evident that the objectives were met successfully. All the functional requirements listed for the proposed system in Chapter 3 were met.
References
[Bruegge & Dutoit, 2000] Bruegge, Bernd & Dutoit, Allen H., Object-Oriented Software Engineering. Prentice Hall, 2000.
[Cummings & Ballantyne, 2000] Cummings, Rick & Ballantyne, Christina, Online student feedback surveys: Encouraging staff and student use, Refereed Proceedings of the Teaching Evaluation Forum, p. 29-37, August 2000.
[Denman, Robinson & White, 2004] Robinson, Paulette, White, Jason, & Denman, Daniel W., Course Evaluation Online: Putting a Structure into Place, Proceedings of the 32nd Annual ACM SIGUCCS Conference on User Services, p. 52-55, October 2004.
http://www.geog.le.ac.uk/orm/questionnaires/quesprint3.pdf
http://jcmc.indiana.edu/vol10/issue3/wright.html
Appendix A
Ten Usability Heuristics
Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms.
User control and freedom
Users often choose system functions by mistake and will need a clearly marked 'emergency exit' to leave the unwanted state.
Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing.
Error prevention
Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
Recognition rather than recall
Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
Flexibility and efficiency of use
Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users.
Aesthetic and minimalist design
Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
Help and documentation
Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Appendix B
The following sections describe the classes and attributes of the Survey Application
subsystem. Figure 7 depicts the object model of the survey application subsystem. Any
classes that are not included in these descriptions can be found in the class diagram, Figure
7.
Class _default
This object handles the functionality of the default page of the application.
Field Summary
protected LinkButton administrator - Link to the System Administrator main page.
protected LinkButton surveyadmin - Link to the Survey Administrator main page.
protected LinkButton taker - Link to the Survey Taker main page.
Method Summary
private void InitializeComponent() - Used to initialize components on the web page and load event handlers.
Class AdminMain
This object handles the functionality of the main page for the Survey Administrator. It populates a DataGrid with all surveys created by the Survey Administrator and allows for their management.
Field Summary
protected Button newSurvey - Button used for creation of new surveys.
protected DataGrid SurveyGrid - DataGrid which holds survey object items retrieved from the database for the Survey Administrator. These are all surveys that the survey administrator has created. The DataGrid displays the survey title, creation date, expiration date and the date sent for each item.
Method Summary
private void newSurvey_Click(object sender, System.EventArgs e) - This method handles the create survey button click. Once the button is clicked, the user is redirected to the create page.
Handles the 'Select', 'Delete' and 'Statistics' commands for the DataGrid. When an item is selected using the 'Select' link on the DataGrid, this method sets a session object indicating whether the selected survey had been sent to recipients and then calls the survey page, passing the survey id as a query string. When an item is selected using the 'Delete' link, this item is deleted from the grid and the database.
Class answers2
Field Summary
protected Button addAnswerbtn - Button used to add a new answer to the AnswerGrid control.
protected Table addTable - Table that holds controls for creating new answers.
protected DropDownList AnsNo - Dropdown list that holds numbers used to create textboxes for answers.
protected DropDownList answerFormat - DropDownList that holds the different answer formats that can be assigned to a question.
protected DataGrid AnswerGrid - DataGrid that displays all answers attached to the selected question.
protected int AnswerID - Used to hold an Answer object's ID.
protected Button back - Back button.
protected Table border - Table that holds the answer configuration controls.
protected Button btnQuestionAdd - DropDownList that holds question numbers.
protected Button configBut - Configure new answer(s) button.
protected Button cont - Continue button.
protected Question currentQuestion - Holds the currently selected question.
protected Label FormatType - Label that holds the response format of a question.
protected DropDownList QNoddl - Question number dropdown list, used to select a question number for the selected question.
protected DataGrid QuestionGrid - DataGrid that holds and displays the question objects for the survey being created.
protected TableCell responseCell - TableCell that holds all controls used to create a new answer.
protected Button useAnswerBut - Use answer button.
Method Summary
protected void addAnswerbtn_Click(object sender, System.EventArgs e) - Adds new answers to the AnswerGrid and attaches the answers to the question in the database.
Puts the "Select" string at the first index of the answerFormat control.
string[] numbers() - Returns a string array containing a sequence of numbers and the string "Select" at the first index. The returned array is bound to the AnsNo control.
Loads the page and binds the data sources for the QuestionGrid, the answerFormat DropDownList and the AnsNo DropDownList.
protected void QuestionGrid_ItemCommand(object source, DataGridCommandEventArgs e) - Handles the 'Select' command for the QuestionGrid, which displays the border control and binds any answers that are attached to the selected question to the AnswerGrid.
Sets the text of the FormatType for each question in the QuestionGrid; if the question has a format type, the text of the edit command link button is changed to 'Change', otherwise it remains 'Set Answer'.
Class create
Field Summary
protected HtmlTable addTable - Table that holds the control used to create a new question.
protected Button back - Back button.
protected Button btnQuestionAdd - Button used to add new questions to the SurveyGrid and database.
protected Button btnShowAddForm - Button used to show the addTable control.
protected ArrayList categories - ArrayList that holds question categories from the database.
protected Button cont - Continue button.
protected Button dbQuestAdd - Button used to add database questions to the SurveyGrid.
protected DataGrid DBQuestionGrid - DataGrid that holds questions from the question pool in the database.
protected ArrayList dbQuestions - ArrayList that holds questions retrieved from the database.
protected DropDownList eDropdown - DropDownList that holds question categories.
protected Question question - Question object.
protected DropDownList questionCatAdd - DropDownList that holds question categories.
protected Button removebtn - Remove question button, used to remove questions from the SurveyGrid and the survey being created.
protected DataGrid SurveyGrid - DataGrid that holds the questions for the survey being created.
protected ArrayList surveyQuestions - ArrayList that holds survey questions.
protected TextBox txtQuestionAdd - Textbox input for new survey questions, i.e. questions which do not already exist.
Method Summary
protected void btnBack_Click(object sender, EventArgs e)
Checks the SurveyGrid for selected questions and removes them from the grid.
Class Main
This object handles the functionality of the main page for the Survey Takers. It populates a DataGrid with all surveys that the Survey Taker has to complete and those already completed.
Field Summary
protected Label beforeStartDate - Error message label, used to display an error when a user selects a survey before its start date.
protected DataGrid SurveyGrid - DataGrid which holds survey object items retrieved from the database for the Survey Taker. These are all surveys that have been assigned to the Survey Taker, whether they have been completed or not. The DataGrid displays the survey title, expiration date and the date that the survey was completed.
Method Summary
private void Page_Load(object sender, System.EventArgs e) - Makes a request to the SurveyApplication subsystem to obtain surveys assigned to the Survey Taker the first time the page is displayed in the user's browser.
When an item is selected using the 'Select' link on the DataGrid, this method checks the start date of the selected survey and, if the current date is before the start date, sets the error message of the 'beforeStartDate' label and makes the table visible. It also sets a session object indicating whether the selected survey has expired or has already been completed by the user, and then calls the 'page' page, passing the survey id as a query string.
Class messagesent
Displays an email sent confirmation message, listing the user groups to which the survey email has been sent.
Field Summary
protected DataList recipients - DataList of recipients that the survey email has been sent to.
Method Summary
private void Page_Load(object sender, System.EventArgs e) - Loads the messagesent page, binding the 'SendTo' session object to the recipients DataList.
Class page
This object handles the functionality of the 'page' page for the Survey Taker. It dynamically generates the survey for completion.
Field Summary
protected Button back - Back button.
private int EndOfPage - The number of the last question on the current survey page.
protected Button submit - Submit survey button.
protected Label surveyTitle - Survey title label.
protected HtmlTableCell tdPages - HTML cell that holds the survey questions.
PageSize - The page size.
Method Summary
protected void back_Click(object sender, System.EventArgs e)
private HtmlTable FillPages(Question record, int tableNumber, int pageNumber, int numOfRecords) - Places the questions in the dynamically generated tables and puts these tables in the tdPages cell.
Loads the survey page and sets the survey title for the survey. Checks whether the survey has expired or has already been taken and, if so, disables the submit button.
Generates the JavaScript to handle paging for the table holding the survey questions.
Collects the user responses for each question in the current survey and creates a response entry in the database for the user.
Class SendMail
This object handles the functionality of the SendMail page for the Survey Administrator.
Field Summary
protected Label audiencelbl - Label used to identify the ListBox of groups.
protected Button back - Back button.
protected Button btnSend - Send button, used to send the email.
protected ArrayList emails - ArrayList that holds email recipient addresses.
protected ListBox groups - ListBox that holds all the user groups in the system.
protected Button select - Button used to select survey recipients from the groups ListBox.
protected LinkButton to - LinkButton used to display the groups ListBox of survey recipient groups.
protected TextBox txtBody - Textbox used to hold the body (message to be sent) of the email.
protected TextBox txtFromAddress - Textbox which holds the survey administrator's email address.
protected TextBox txtSubject - Textbox which holds the subject of the email message.
protected TextBox txtToAddress - Textbox which holds the group names of the survey recipients.
protected HtmlTable usergroups - HTML table that holds the groups ListBox.
Constructor Summary
SendMail() - Sets the SmtpServer to localhost.
Method Summary
private void back_Click(object sender, System.EventArgs e) - Redirects the user to the survey page, where the current survey is being displayed.
Handles the event of the send button. Creates a mail message, gathers the 'to', 'from', subject and body of the message from the appropriate controls and sends the email message.
Sets the email message body and from fields the first time the page is loaded.
Handles the event of the select button. Takes all the selected user groups from the groups ListBox and sets the text for the txtToAddress control. Also stores the selected groups in a 'SendTo' session object.
Class survey
Field Summary
protected HtmlTableCell answerCell - Cell which holds the question answer controls.
protected Button back - Back button.
protected HtmlTable chooseDates - Table that holds the calendars.
protected Button cont - Create survey button.
protected Survey createdSurvey - Survey object that has just been created.
protected Button distribute - Button used to go to the SendMail page.
protected Calendar enddate - Calendar used to choose an expiration date for the survey.
protected HyperLink hlNext - Link button used for going to the next page of the survey.
protected HyperLink hlPrevious - Link button used for going to the previous page of the survey.
protected Pager pager - Object used for separating the survey questions into multiple pages.
protected Button reminder - Email reminder button.
protected RequiredFieldValidator RequiredFieldValidator1 - Checks whether a survey title has been provided. If a title has not been provided, an error message is displayed.
protected Calendar startdate - Calendar used to choose a start date for the survey.
protected TextBox SurveyDescription - Textbox for the survey description.
protected DataList surveyQuestionsList - DataList which holds the survey question objects.
protected HtmlTable surveyTable - Parent control of the surveyQuestionsList.
protected TextBox SurveyTitle - Textbox for the survey title.
protected Button updateA - Update answers button.
protected Button updateQ - Update questions button.
protected HtmlTable updates - Table used to hold the update and email buttons.
Method Summary
protected void back_Click(object sender, System.EventArgs e)
Dynamically creates the controls for each question in the survey being created.
Handles the create survey button. Checks that the start and end dates chosen for the survey are valid and writes the survey to the database.
private void distribute_Click(object sender, System.EventArgs e)
Sets the 'UpdateAnswer' session variable to true and redirects the user to the answer2 page.
Sets the 'UpdateQuestion' session variable to true and redirects the user to the create page.
Class SurveyError
Handles the functionality of the error page for the Online Survey Application. Any errors that occur are displayed to the user on this page.
Field Summary
protected Label lblError - Label that holds the error message.
Method Summary
private void InitializeComponent() - Used to initialize components on the webpage and load event handlers.
Appendix C
Class Descriptions
The following are all the relevant classes that make up the survey application object
subsystem. Any classes not included can be found in the class diagram in Figure 23.
Class Answer
Field Summary
private int answerID - ID of an answer object.
Constructor Summary
Answer(string desc) - Default constructor.
Answer(SqlDataReader reader) - This constructor takes a SqlDataReader and attempts to load the object from it.
Method Summary
void AddAnswer(SurveyConnection conn) - This method is used to add an Answer to the database.
string ToString() - Returns the Answer description.
Class Pager
Object used to set pages for the survey object on the user interface.
Field Summary
private PagedDataSource pg - Pages data source.
Constructor Summary
Pager(int pagesize)
Method Summary
string Get_NextLink(Page p)
Class Question
Field Summary
private string answerFormat - String representing the answer format.
private QuestionCategory categoryID - QuestionCategory object that the question belongs to.
Constructor Summary
Question(string description)
Question(SqlDataReader reader) - This constructor takes a SqlDataReader and attempts to load the object from it.
Method Summary
void AddQuestion(SurveyConnection conn) - This method is used to add a Question to the database.
void getCategoryDescription() - Sets the category description for the Question object.
Calls a stored procedure to get the questions by category ID; returns all questions read from the SqlDataReader.
string ToString() - Returns a description of the question.
Class QuestionCategory
Field Summary
private int categoryID - Category ID.
Constructor Summary
QuestionCategory(string des, int id)
Method Summary
static ArrayList GetAllCategories(SurveyConnection conn) - Calls a stored procedure to get all of the question categories; returns an ArrayList of QuestionCategories.
Class Survey
Field Summary
private DateTime creation - Survey creation date.
private DateTime expire - Expiration date of the survey.
private ArrayList questions - ArrayList of Questions that make up the survey.
private string title - Survey title.
private ArrayList userGroups - ArrayList of user groups to whom the survey has been assigned.
Constructor Summary
Survey(ArrayList questions, string title, string description, DateTime creationdate, DateTime expirationdate, string creatorID)
Survey(SqlDataReader reader) - This constructor takes a SqlDataReader and attempts to load the object from it.
Method Summary
void AddResult(SurveyConnection conn)
void AddSurveyQuestionAnswer(SurveyConnection conn, int answerID, int surveyQuestionID) - This method is used to add an answer to a survey question.
Gets the given question response for the user for a survey.
string ToString() - Returns the survey description.
This method is used to add the sent date of a survey to the database.
Class SurveyConnection
Field Summary
private SqlConnection m_conn - SQL connection object.
Constructor Summary
SurveyConnection(string strConnectionString) - The constructor takes a connection string and automatically opens the connection.
Method Summary
bool Close() - Closes the database connection.
bool IsOpen() - Returns a boolean indicating whether the connection is open.
bool Open(string strConnectionString) - Opens a database connection. A sketch consistent with this interface is shown below.
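A minimal sketch that is consistent with the interface described above follows; the real SurveyConnection class in the thesis may differ in detail, and the class name here is marked as a sketch.

using System.Data;
using System.Data.SqlClient;

// Sketch of a connection wrapper with the Open/IsOpen/Close behaviour described above.
public class SurveyConnectionSketch
{
    private SqlConnection m_conn;

    // The constructor takes a connection string and opens the connection immediately.
    public SurveyConnectionSketch(string connectionString)
    {
        Open(connectionString);
    }

    public bool Open(string connectionString)
    {
        m_conn = new SqlConnection(connectionString);
        m_conn.Open();
        return IsOpen();
    }

    public bool IsOpen()
    {
        return m_conn != null && m_conn.State == ConnectionState.Open;
    }

    public bool Close()
    {
        if (IsOpen())
        {
            m_conn.Close();
        }
        return !IsOpen();
    }
}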
Class SurveyUser
Field Summary
private string emailAddress - User's email address.
Constructor Summary
SurveyUser(SqlDataReader reader)
Method Summary
static SurveyUser getUser(string strUid, SurveyConnection conn) - Calls a stored procedure to get the user; returns a user object.
Class UserAuth
Field Summary
private string err - Error string.
Method Summary
string GetErrorsFormatted() - Returns the error string.
bool IsUserValid(string strUid, string strPwd, bool IsValid, SurveyConnection conn) - Returns a boolean indicating whether the user is a valid user.
Class UserGroup
Field Summary
private string description
Constructor Summary
UserGroup(SqlDataReader reader) - This constructor takes a SqlDataReader and attempts to load the object from it.
Method Summary
static void AddUserGroupToSurvey(SurveyConnection conn, int surveyID, int userGroupID) - This method is used to add a user group to a given survey.
ArrayList 'emaikname'
SqlDataReader