
E-LEARNING SYSTEM

Abstract

The purpose of the E-Learning system is to automate the existing manual system with the help of computerized equipment and full-fledged computer software, fulfilling the users' requirements so that their valuable data and information can be stored for a longer period, with easy access and manipulation. The required hardware and software are easily available and easy to work with. The E-learning system, as described above, can lead to an error-free, secure, reliable and user-friendly system.

Every organization, whether big or small, has challenges to overcome in managing information about students, assignments, quizzes and so on. Every e-learning deployment has different assignment needs; therefore we design an exclusive E-learning system that is adapted to your managerial requirements. It is designed to assist in strategic planning, and will help you ensure that your organization is equipped with the right level of information and detail for your future goals. Also, for those busy executives who are always on the go, our system comes with remote access features, which allow you to manage your workforce anytime, from anywhere. These systems will ultimately allow you to better manage resources.
Chapter I

Introduction

E-learning is education delivered via the Internet, a network, or a standalone computer. It is basically the network-enabled transfer of skills and knowledge, and refers to using electronic applications and processes to learn. E-learning includes all forms of electronically supported learning and teaching. Information and communication systems, whether used for networked learning or not, serve as the specific media that implement the learning process. This often involves both out-of-classroom and in-classroom educational experiences via technology, even as devices and curricula continue to advance.

E-learning is the computer- and network-enabled transfer of skills and knowledge. E-learning applications and processes include Web-based learning, computer-based learning, virtual education opportunities and digital collaboration. Content is delivered via the Internet, intranet/extranet, audio or video tape, satellite TV, and CD-ROM. That is to say, E-learning systems contain both a learning management system and a course management system. Learning can be self-paced or instructor-led, and includes media in the form of text, images, animation, and streaming video and audio. It is commonly thought that new technologies can make a big difference in education. Young children especially can use the rich interactivity of new media to develop their skills, knowledge, and perception of the world, under their parents' supervision, of course.

Many proponents of e-learning believe that everyone must be equipped with basic
knowledge in technology, as well as use it as a medium to reach a particular goal and aim. In the
20th century, we have moved from the Industrial Age through the Information Age and now to
the Knowledge Age. Knowledge and its efficient management constitute the key to success and
survival for organizations in the highly dynamic and competitive world of today. Efficient
acquisition, storage, transfer, retrieval, application, and visualization of knowledge often
distinguish successful organizations from the unsuccessful ones.

The ability to obtain, assimilate, and apply the right knowledge effectively will become a
key skill in the next century. Learning is the key to achieving our full potential. Our survival in
the 21st century as individuals, organizations, and nations will depend upon our capacity to
learn and the application of what we learn to our daily lives. E-learning has the potential to
transform how and when employees learn. Learning will become more integrated with work and
will use shorter, more modular, just-in-time delivery systems. By leveraging workplace
technologies, e-learning is bridging the gap between learning and work. Workers can integrate
learning into work more effectively because they use the same tools and technology for learning
as they use for work.

Both employers and employees recognize that e-learning will narrow the gap between work and home, and between work and learning. E-learning is an option for any
organization looking to improve the skills and capacity of its employees. With the rapid change
in all types of working environments, especially medical and healthcare environments, there is a
constant need to rapidly train and retrain people in new technologies, products, and services
found within the environment. There is also a constant and unrelenting need for appropriate
management and leveraging of the knowledge base so that it is readily available and accessible
to all stakeholders within the workplace environment.
Chapter II

System Analysis

System analysis is the overall analysis of the system before implementation and for
arriving at a precise solution. Careful analysis of a system before implementation prevents post
implementation problems that might arise due to bad analysis of the problem statement.

Thus the necessity for systems analysis is justified. Analysis is the first crucial step: a detailed study of the various operations performed by a system and their relationships within and outside the system. Analysis defines the boundaries of the system and is followed by design and implementation.

Existing System

The current situation is limited to a few resources; students are unable to obtain knowledge beyond what the lecturer provides to them. This ultimately limits students' performance, because everything a student gets is collected from lectures in class.

Here are some of the problems of the current system:

 Students submit assignments to lecturers as hard copies or through personal emails.

 Students can only get help from lecturers if the lecturers are in their offices.

 Lecturers new to a course have to gather materials on their own.

 Students are required to be physically present in the classroom in order to gain knowledge, thereby sacrificing all other responsibilities.

Proposed system

The system will hopefully serve as a centralized database of syllabi for the courses offered at the university, allowing students and faculty (current, past and prospective) to view them. The system will bring about effective communication among students, lecturers, and the administration, by making information and other resources accessible anytime, anywhere.
Here are some expected results of the project:

 Lecturers can upload assignments and resources for their units.

 Students can download the resources and upload assignments.

 The system provides an easy-to-use way to manage course websites that include schedule information, announcements, and course discussions.
Chapter III

Feasibility Study

Preliminary investigation examines project feasibility: the likelihood that the system will be useful to the organization. The main objective of the feasibility study is to test the technical, operational and economic feasibility of adding new modules and debugging the old running system. All systems are feasible if they are given unlimited resources and infinite time. There are three aspects in the feasibility study portion of the preliminary investigation:

Operational Feasibility

The proposed system does not require additional manual involvement or labor for its maintenance. The cost of training is minimized due to the user-friendliness of the developed application, and recurring expenditure on consumables and materials is minimized.

Technical Feasibility

Keeping in mind the existing network, software and hardware already available, the application is developed in PHP with a MySQL back end and runs on a standard web server, so users need only a web browser. No additional hardware or software is required, which makes the system technically feasible.

Economic Feasibility

The system is economically feasible keeping in mind:

 A lower investment towards training.

 A one-time investment towards development.

 Minimized recurring expenditure towards training, facilities and consumables.

 The system as a whole is economically feasible over a period of time.
Chapter IV

System Design

System design concentrates on moving from problem domain to solution domain. This
important phase is composed of several steps. It provides the understanding and procedural
details necessary for implementing the system recommended in the feasibility study. Emphasis
is on translating the performance requirements into design specification.

The design of any software involves mapping the software requirements onto functional modules. Developing a real-time application or any system utility involves two processes: the first is to design the system, and the second is to construct the executable code.

Software design has evolved from an intuitive art dependent on experience into a science that provides systematic techniques for software definition. Software design is the first step in the development phase of the software life cycle.

Before designing the system, user requirements were identified, and information was gathered to verify the problem and evaluate the existing system. A feasibility study was conducted to review alternative solutions and provide cost and benefit justification. To overcome the problems found, the proposed system is recommended. At this point the design phase begins.

The process of design involves conceiving and planning out in the mind and making a drawing.
In software design, there are three distinct activities: External design, Architectural design and
detailed design. Architectural design and detailed design are collectively referred to as internal
design. External design of software involves conceiving and planning out and specifying the
externally observable characteristics of a software product.

INPUT DESIGN:

Systems design is the process of defining the architecture, components, modules, interfaces, and
data for a system to satisfy specified requirements. Systems design could be seen as the
application of systems theory to product development. There is some overlap with the disciplines
of systems analysis, systems architecture and systems engineering.
Input Design is the process of converting a user oriented description of the inputs to a computer-
based business system into a programmer-oriented specification.

• Input data were found to be available for establishing and maintaining master and transaction files and for creating output records.

• The most suitable types of input media, for either off-line or on-line devices, were selected after a study of alternative data capture techniques.

INPUT DESIGN CONSIDERATIONS

• The field length must be documented.

• The sequence of fields should match the sequence of the fields on the source document.

• The data format must be identified to the data entry operator.

Design input requirements must be comprehensive. Product complexity and the risk associated with its use dictate the amount of detail required.

• Functional requirements specify what the product does, focusing on its operational capabilities and the processing of inputs and resultant outputs.

• Performance requirements specify how much or how well the product must perform, addressing such issues as speed, strength, response times, accuracy, limits of operation, etc.

OUTPUT DESIGN:

A quality output is one which meets the requirements of the end user and presents the information clearly. In any system, the results of processing are communicated to the users and to other systems through outputs.

In output design it is determined how the information is to be displayed for immediate need, as well as the hard-copy output. Output is the most important and direct source of information for the user. Efficient and intelligent output design improves the system's relationship with the user and supports decision-making.
1. Designing computer output should proceed in an organized, well-thought-out manner; the right output must be developed while ensuring that each output element is designed so that people will find the system easy and effective to use. When analysts design computer output, they should identify the specific output that is needed to meet the requirements.

2. Select methods for presenting information.

3. Create documents, reports, or other formats that contain information produced by the system.

The output form of an information system should accomplish one or more of the following
objectives.

• Convey information about past activities, current status, or projections of the future.

• Signal important events, opportunities, problems, or warnings.

• Trigger an action.

• Confirm an action.
System Architecture

Fig 4.1 System Architecture

Data Flow Diagrams (DFD)

A data flow diagram is a graphical tool used to describe and analyze the movement of data through a system. DFDs are the central tool and the basis from which the other components are developed. The transformation of data from input to output, through processes, may be described logically and independently of the physical components associated with the system. These are known as logical data flow diagrams.

Physical data flow diagrams show the actual implementation and movement of data between people, departments and workstations. A full description of a system actually consists of a set of data flow diagrams, developed using one of two familiar notations: Yourdon, or Gane and Sarson. Each component in a DFD is labeled with a descriptive name, and each process is further identified with a number that is used for identification purposes.

The development of DFDs is done in several levels. Each process in a lower-level diagram can be broken down into a more detailed DFD at the next level. The top-level diagram is often called the context diagram. It consists of a single process, which plays a vital role in studying the current system. The process in the context-level diagram is exploded into other processes in the first-level DFD.

The idea behind the explosion of a process into more processes is that the understanding at one level of detail is exploded into greater detail at the next level. This is done until no further explosion is necessary and an adequate amount of detail is described for the analyst to understand the process.

A DFD, also known as a "bubble chart", has the purpose of clarifying system requirements and identifying the major transformations that will become programs in system design. It is thus the starting point of design, carried down to the lowest level of detail. A DFD consists of a series of bubbles joined by data flows in the system.

DFD Symbols

In the DFD, there are four symbols

1. A square defines a source (originator) or destination of system data.

2. An arrow identifies data flow; it is the pipeline through which information flows.

3. A circle or a bubble represents a process that transforms incoming data flows into outgoing data flows.

4. An open rectangle is a data store: data at rest, or a temporary repository of data.


Constructing a DFD

Several rules of thumb are used in drawing DFDs:

1. Process should be named and numbered for an easy reference. Each name should be
representative of the process.

2. The direction of flow is from top to bottom and from left to right. Data traditionally flow from the source to the destination, although they may flow back to the source. One way to indicate this is to draw a long flow line back to the source. An alternative way is to repeat the source symbol as a destination. Since it is used more than once in the DFD, it is marked with a short diagonal.

3. When a process is exploded into lower level details, they are numbered.

4. The names of data stores and destinations are written in capital letters. Process and data flow names have the first letter of each word capitalized.

A DFD typically shows the minimum contents of a data store. Each data store should contain all the data elements that flow in and out.

Questionnaires should contain all the data elements that flow in and out. Missing interfaces, redundancies and the like are then accounted for, often through interviews.
Salient Features of DFD’s

1. The DFD shows the flow of data, not of control; loops and decisions are control considerations and do not appear on a DFD.

2. The DFD does not indicate the time factor involved in any process, i.e. whether the data flows take place daily, weekly, monthly or yearly.

3. The sequence of events is not brought out on the DFD.

Rules Governing the DFD’s

Process

1) No process can have only outputs.

2) No process can have only inputs. If an object has only inputs, then it must be a sink.

3) A process has a verb phrase label.

Data Store

1) Data cannot move directly from one data store to another data store, a process must move
data.

2) Data cannot move directly from an outside source to a data store; a process must receive the data from the source and place it into the data store.

3) A data store has a noun phrase label.

Source or Sink

- origin and /or destination of data

1) Data cannot move directly from a source to a sink; it must be moved by a process.

2) A source and/or sink has a noun phrase label.


Data Flow

1) A data flow has only one direction of flow between symbols. It may flow in both directions between a process and a data store, to show a read before an update; the latter is usually indicated, however, by two separate arrows, since these happen at different times.

2) A join in a DFD means that exactly the same data comes from any of two or more different processes, data stores or sinks to a common location.

3) A data flow cannot go directly back to the same process it leaves. There must be at least one other process that handles the data flow, produces some other data flow, and returns the original data to the beginning process.

4) A Data flow to a data store means update (delete or change).

5) A data Flow from a data store means retrieve or use.

Fig 4.2 Level 0 DFD (Student)


Fig 4.3 Level 0 DFD (Professor)
Chapter V

System Requirements

The hardware and software specification lists the minimum hardware and software required to run the project. The hardware configuration specified below is by no means optimal; the software specification is likewise just the minimum requirement, and the performance of the system may be slow on such a configuration.

Hardware Requirements

 System : Pentium IV 2.4 GHz


 Hard Disk : 40 GB
 Floppy Drive : 1.44 MB
 Monitor : 15 VGA color
 Mouse : Logitech.
 Keyboard : 110 keys enhanced
 RAM : 256 MB

Software Requirements

 Operating System : Windows


 Front End : PHP
 Back End : MySQL
Chapter VI

System Implementation

Implementation is the stage in the project where the theoretical design is turned into a working system. The implementation phase constructs, installs and operates the new system. The most crucial requirement for a successful new system is that it works efficiently and effectively.

There are several activities involved while implementing a new project.

 End user Training


 End user Education
 Training on the application software

Modules

In this system there are three main users: the administrator, the lecturer and the student. Each of them has specific tasks and roles they can perform within the system. The system is sensitive to privacy protection and has therefore been designed with these matters in mind.

Administrator

The system administrator will have full access privileges that the other users do not have. Some of these include: assigning roles to users (deciding who is an administrator, lecturer or student), deleting users, adding departments and faculties, and lastly creating users.

Lecturer

The lecturer will have the privileges of uploading and downloading documents, posting news about tests, classes and assignments, starting blogs for discussions, and uploading coursework results.
Student

The student will have fewer privileges: the student will be able to upload and download documents, comment on the blogs created by lecturers, and view news posted by lecturers and administrators; lastly, students will be able to view their coursework.
Chapter VII

Software Description

PHP

Hypertext Preprocessor (PHP) is a server-side scripting language designed primarily for web development but also used as a general-purpose programming language. Originally created by Rasmus Lerdorf in 1994, the PHP reference implementation is now produced by The PHP Development Team. PHP originally stood for Personal Home Page, but it now stands for the recursive acronym PHP: Hypertext Preprocessor. PHP code may be embedded into HTML code, or it can be used in
combination with various web template systems, web content management systems and web
frameworks. PHP code is usually processed by a PHP interpreter implemented as a module in the
web server or as a Common Gateway Interface (CGI) executable. The web server combines the
results of the interpreted and executed PHP code, which may be any type of data, including
images, with the generated web page. PHP code may also be executed with a command-line
interface (CLI) and can be used to implement standalone graphical applications. The standard
PHP interpreter, powered by the Zend Engine, is free software released under the PHP License.
PHP has been widely ported and can be deployed on most web servers on almost every operating
system and platform, free of charge. The PHP language evolved without a written formal
specification or standard until 2014, leaving the canonical PHP interpreter as a de facto standard.
Since 2014 work has gone on to create a formal PHP specification. PHP is a widely-used open
source general-purpose scripting language that is especially suited for web development and can
be embedded into HTML. Instead of lots of commands to output HTML, PHP pages contain
HTML with embedded code that does "something".

The PHP code is enclosed in special start and end processing instructions that allow you
to jump into and out of "PHP mode." What distinguishes PHP from something like client-side
JavaScript is that the code is executed on the server, generating HTML which is then sent to the
client. The client would receive the results of running that script, but cannot know the underlying
code. The web server is configured to process all your HTML files with PHP. The best thing about using PHP is that it is extremely simple for a newcomer, yet it offers many advanced features for a professional programmer.
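As a minimal sketch of "jumping into and out of PHP mode", the page below mixes static HTML with server-side code; the page title and content are purely illustrative.

```php
<?php
// Everything outside the <?php ... ?> tags is sent to the client
// verbatim; the code inside runs on the server, and only its
// output reaches the browser.
$title = 'E-Learning System';
?>
<html>
<head><title><?php echo $title; ?></title></head>
<body>
<p>Generated on the server on <?php echo date('Y-m-d'); ?>.</p>
</body>
</html>
```

The client receives only the resulting HTML; the PHP source itself never leaves the server.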
Functions of PHP

 Generate dynamic page content


 Create, open, read, write, delete, and close files on the server
 Collect form data
 Send and receive cookies
 Add, delete and modify data in your database
 Can be used to control user-access
 Can encrypt data

Characteristics of PHP

Five important characteristics make PHP's practical nature possible:

 Simplicity

 Efficiency

 Security

 Flexibility

 Familiarity

MYSQL

MySQL is an open-source relational database management system (RDBMS). Its name is a combination of "My", the name of co-founder Michael Widenius' daughter, and "SQL", the abbreviation for Structured Query Language. The MySQL development project has made its source code available under the terms of the GNU General Public License, as well as under a variety of proprietary agreements. MySQL was originally owned and sponsored by a single for-profit firm, the Swedish company MySQL AB, which is now owned by Oracle Corporation. For proprietary use, several paid editions are available that offer additional functionality.
MySQL is a central component of the LAMP open-source web application software
stack (and other "AMP" stacks). LAMP is an acronym for "Linux, Apache, MySQL,
Perl/PHP/Python". Applications that use the MySQL database include: TYPO3, MODx, Joomla,
WordPress, phpBB, MyBB, and Drupal. MySQL is also used in many high-profile, large-scale
websites, including Google (though not for searches), Facebook, Twitter, Flickr, and YouTube.
MySQL is written in C and C++. Its SQL parser is written in yacc, but it uses a home-brewed
lexical analyzer. MySQL works on many system platforms, including AIX, BSDi, FreeBSD,
HP-UX, eComStation, i5/OS, IRIX, Linux, macOS, Microsoft Windows, NetBSD, Novell
NetWare, OpenBSD, OpenSolaris, OS/2 Warp, QNX, Oracle Solaris, Symbian, SunOS, SCO
OpenServer, SCO UnixWare, Sanos and Tru64. A port of MySQL to OpenVMS also exists.

MySQL is the most popular open-source relational SQL database management system, and one of the best RDBMSs for developing web-based software applications.

MySQL is a fast, easy-to-use RDBMS used by many small and large businesses. It was originally developed, marketed and supported by MySQL AB, a Swedish company. MySQL is becoming so popular for many good reasons:

 MySQL is released under an open-source license. So you have nothing to pay to use it.

 MySQL is a very powerful program in its own right. It handles a large subset of the
functionality of the most expensive and powerful database packages.

 MySQL uses a standard form of the well-known SQL data language.

 MySQL works on many operating systems and with many languages including PHP,
PERL, C, C++, JAVA, etc.

 MySQL works very quickly and works well even with large data sets.

 MySQL is very friendly to PHP, the most appreciated language for web development.

 MySQL supports large databases, up to 50 million rows or more in a table. The default
file size limit for a table is 4GB, but you can increase this (if your operating system can
handle it) to a theoretical limit of 8 million terabytes (TB).
 MySQL is customizable. The open-source GPL license allows programmers to modify
the MySQL software to fit their own specific environments.
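As a sketch of the PHP-and-MySQL pairing noted above, a parameterized query might be issued through PHP's PDO extension as follows. The database name, credentials, and the `students` table are assumptions made for illustration, not the system's real schema.

```php
<?php
// Hedged sketch: a parameterized MySQL query from PHP via PDO.
// The DSN, credentials and `students` table are illustrative assumptions.
$sql = 'SELECT name, email FROM students WHERE course_id = :course';

// With a live MySQL server the query would run like this:
//   $pdo  = new PDO('mysql:host=localhost;dbname=elearning', 'user', 'password');
//   $stmt = $pdo->prepare($sql);
//   $stmt->execute(['course' => 42]);
//   $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```

Binding the `:course` placeholder through `prepare`/`execute` keeps user input out of the SQL string, which is the usual defence against SQL injection.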

Features

 A broad subset of ANSI SQL 99, as well as extensions

 Cross-platform support

 Stored procedures, using a procedural language that closely adheres to SQL/PSM

 Triggers

 Cursors

 Updatable views

 Online DDL when using the InnoDB Storage Engine.

 Information schema

 Performance Schema that collects and aggregates statistics about server execution and
query performance for monitoring purposes.

 A set of SQL Mode options to control runtime behavior, including a strict mode to better
adhere to SQL standards.

 X/Open XA distributed transaction processing (DTP) support; two-phase commit as part of this, using the default InnoDB storage engine

 Transactions with save points when using the default InnoDB Storage Engine. The NDB
Cluster Storage Engine also supports transactions.

 ACID compliance when using InnoDB and NDB Cluster Storage Engines

 SSL support
 Query caching

 Sub-SELECTs (i.e. nested SELECTs)

 Built-in replication support (i.e., master-master replication and master-slave replication), with one master per slave and many slaves per master. Multi-master replication is provided in MySQL Cluster, and multi-master support can be added to unclustered configurations using Galera Cluster.

 Full-text indexing and searching

 Embedded database library

 Unicode support

 Partitioned tables with pruning of partitions in optimizer

 Shared-nothing clustering through MySQL Cluster

 Multiple storage engines, allowing one to choose the one that is most effective for each
table in the application.

 Native storage engines InnoDB, MyISAM, Merge, Memory (heap), Federated, Archive,
CSV, Blackhole, NDB Cluster.

 Commit grouping, gathering multiple transactions from multiple connections together to increase the number of commits per second.
Chapter VIII

System Testing

Software Testing

Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs (errors or other defects). The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. It provides a way to check the functionality of components, sub-assemblies, assemblies and/or a finished product. It is the process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations, and does not fail in an unacceptable manner. There are various types of test, and each test type addresses a specific testing requirement.

Software testing is the process of evaluating a software item to detect differences between given input and expected output, and of assessing the features of the software item. Testing assesses the quality of the product. Software testing is a process that should be done during the development process; in other words, software testing is a verification and validation process.

Types of testing

There are different levels during the process of testing. Levels of testing include the different methodologies that can be used while conducting software testing. Following are the main levels of software testing:

 Functional Testing.

 Non-Functional Testing.
Steps

I. The determination of the functionality that the intended application is meant to perform.

II. The creation of test data based on the specifications of the application.

III. The determination of the output based on the test data and the specifications of the application.

IV. The writing of test scenarios and the execution of test cases.

V. The comparison of actual and expected results based on the executed test cases.

Functional Testing

Functional Testing of the software is conducted on a complete, integrated system to evaluate the
system's compliance with its specified requirements. There are five steps that are involved when
testing an application for functionality.

An effective testing practice will see the above steps applied to the testing policies of every
organization and hence it will make sure that the organization maintains the strictest of standards
when it comes to software quality.

Unit Testing

This type of testing is performed by the developers before the setup is handed over to the testing team to formally execute the test cases. Unit testing is performed by the respective developers on the individual units of source code in their assigned areas. The developers use test data that is separate from the test data of the quality assurance team. The goal of unit testing is to isolate each part of the program and show that the individual parts are correct in terms of requirements and functionality.
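A unit test isolates one function and checks it against its requirements. The `isValidScore` helper below is a hypothetical example written for this sketch, not part of the actual system.

```php
<?php
// Hedged sketch of unit testing a single unit in isolation.
// `isValidScore` is a hypothetical helper, not the system's real code.
function isValidScore(int $score): bool
{
    return $score >= 0 && $score <= 100;
}

// Each assertion exercises one requirement of the unit,
// including both boundary values.
assert(isValidScore(0) === true);
assert(isValidScore(100) === true);
assert(isValidScore(-1) === false);
assert(isValidScore(101) === false);
```

Checking the boundary values (0 and 100) alongside values just outside them is the usual way to catch off-by-one errors in a unit like this.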

Limitations of Unit Testing

Testing cannot catch each and every bug in an application. It is impossible to evaluate every
execution path in every software application. The same is the case with unit testing.

There is a limit to the number of scenarios and test data that the developer can use to verify the source code. After the developer has exhausted all options, there is no choice but to stop unit testing and merge the code segment with the other units.

Integration Testing

The testing of combined parts of an application to determine whether they function correctly together is integration testing. There are two methods of doing integration testing: bottom-up integration testing and top-down integration testing.

S.N. Integration Testing Method

1 Bottom-up integration
This testing begins with unit testing, followed by tests of progressively higher-level combinations of units, called modules or builds.

2 Top-down integration
In this testing, the highest-level modules are tested first, and progressively lower-level modules are tested after that.

In a comprehensive software development environment, bottom-up testing is usually done first, followed by top-down testing. The process concludes with multiple tests of the complete application, preferably in scenarios designed to mimic those it will encounter in customers' computers, systems and networks.
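A bottom-up sketch of the idea, in Python with hypothetical units (the class and function names are illustrative assumptions): two already unit-tested pieces, a score store and a grader, are combined into a higher-level result module, and the combination itself is what the integration test exercises.

```python
# Lower-level unit 1 (assumed already unit-tested): an in-memory score store.
class ScoreStore:
    def __init__(self):
        self._scores = {}
    def save(self, student, score):
        self._scores[student] = score
    def load(self, student):
        return self._scores[student]

# Lower-level unit 2 (assumed already unit-tested): a grading function.
def grade(score):
    return "Pass" if score >= 50 else "Fail"

# Higher-level build that wires the two units together.
class ResultModule:
    def __init__(self, store):
        self.store = store
    def result_for(self, student):
        return grade(self.store.load(student))

# Integration test: exercise the combined parts through one call path.
store = ScoreStore()
store.save("alice", 72)
module = ResultModule(store)
assert module.result_for("alice") == "Pass"
```

Note that the assertion does not re-test either unit in isolation; it checks that data flows correctly across the seam between them, which is exactly what unit testing alone cannot show.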
System Testing

This is the next level in the testing and tests the system as a whole. Once all the components are
integrated, the application as a whole is tested rigorously to see that it meets Quality Standards.
This type of testing is performed by a specialized testing team. System testing is so important
because of the following reasons:

 System Testing is the first step in the testing life cycle where the application is tested as a
whole.

 The application is tested thoroughly to verify that it meets the functional and technical
specifications.

 The application is tested in an environment which is very close to the production
environment where the application will be deployed.

 System Testing enables us to test, verify and validate both the business requirements as
well as the application's architecture.

Regression Testing

Whenever a change is made in a software application, it is quite possible that other areas within the application have been affected by this change. Regression testing verifies that a fixed bug has not resulted in a violation of other functionality or business rules. The intent of regression testing is to ensure that a change, such as a bug fix, did not result in another fault being uncovered in the application. Regression testing is so important because of the following reasons:

 It minimizes the gaps in testing when an application with changes has to be tested.

 It tests the new changes to verify that they did not affect any other area of the application.

 It mitigates risks when regression testing is performed on the application.

 It increases test coverage without compromising timelines.

 It increases speed to market for the product.
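In practice this means keeping the earlier test cases and re-running them after every change. A minimal sketch, again with the hypothetical grade function (the upper-bound check plays the role of "the change"; all names and cases are illustrative assumptions):

```python
# Changed code: a bug fix added the upper-bound check on the score.
def grade(score):
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "Pass" if score >= 50 else "Fail"

# Regression suite: (input, expected) pairs recorded from earlier,
# passing runs of the application, pinned so the change cannot silently
# break behaviour that already worked.
REGRESSION_SUITE = [
    (0, "Fail"),
    (49, "Fail"),
    (50, "Pass"),
    (100, "Pass"),
]

def run_regression():
    # Returns the list of (input, expected, actual) failures; an empty
    # list means the change broke nothing that previously worked.
    return [(s, e, grade(s)) for s, e in REGRESSION_SUITE if grade(s) != e]

assert run_regression() == []
```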

Acceptance Testing

This is arguably the most important type of testing, as it is conducted by the Quality Assurance team, who will gauge whether the application meets the intended specifications and satisfies the client's requirements. The QA team will have a set of pre-written scenarios and test cases that will be used to test the application.

More ideas will be shared about the application and more tests can be performed on it to gauge its accuracy against the reasons why the project was initiated. Acceptance tests are intended not only to point out simple spelling mistakes, cosmetic errors or interface gaps, but also to point out any bugs in the application that would result in system crashes or major errors.

By performing acceptance tests on an application the testing team will deduce how the
application will perform in production. There are also legal and contractual requirements for
acceptance of the system.

Alpha Testing

This test is the first stage of testing and will be performed amongst the teams (developer and QA
teams). Unit testing, integration testing and system testing when combined are known as alpha
testing. During this phase, the following will be tested in the application:

 Spelling Mistakes

 Broken Links

 Unclear directions

 The Application will be tested on machines with the lowest specification to test loading
times and any latency problems.
Beta Testing

This test is performed after Alpha testing has been successfully performed. In beta testing a
sample of the intended audience tests the application. Beta testing is also known as pre-release
testing. Beta test versions of software are ideally distributed to a wide audience on the Web,
partly to give the program a "real-world" test and partly to provide a preview of the next release.
In this phase the audience will be testing the following:

 Users will install and run the application and send their feedback to the project team.

 Typographical errors, confusing application flow, and even crashes are reported.

 Using this feedback, the project team can fix the problems before releasing the software
to the actual users.

 The more issues you fix that solve real user problems, the higher the quality of your
application will be.

 Having a higher-quality application when you release to the general public will increase
customer satisfaction.

Non-Functional Testing

This section is based upon testing the application from its non-functional attributes. Non-functional testing of software involves testing the software against requirements which are non-functional in nature but important as well, such as performance, security, and user interface. Some of the important and commonly used non-functional testing types are as follows:

Performance Testing

Performance testing is mostly used to identify bottlenecks or performance issues rather than to find bugs in the software. There are different causes which contribute to lowering the performance of software:

 Network delay.
 Client side processing.

 Database transaction processing.

 Load balancing between servers.

 Data rendering.

Performance testing is considered one of the important and mandatory testing types in terms of the following aspects:

 Speed (i.e. Response Time, data rendering and accessing)

 Capacity

 Stability

 Scalability

It can be either qualitative or quantitative testing activity and can be divided into different sub
types such as Load testing and Stress testing.

Load Testing

Load testing is a process of testing the behavior of the software by applying maximum load, in terms of the software accessing and manipulating large input data. It can be done at both normal and peak load conditions. This type of testing identifies the maximum capacity of the software and its behavior at peak time. Most of the time, load testing is performed with the help of automated tools such as LoadRunner, AppLoader, IBM Rational Performance Tester, Apache JMeter, Silk Performer, Visual Studio Load Test, etc. Virtual users (VUsers) are defined in the automated testing tool and a script is executed to verify the load testing for the software. The number of users can be increased or decreased concurrently or incrementally based upon the requirements.
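The virtual-user idea can be illustrated with a toy stand-in for the tools named above: each thread plays one virtual user hammering a hypothetical operation, and throughput is measured as the user count ramps up incrementally. The operation here is only a stub (an assumption for illustration); a real tool would issue requests against the system under test.

```python
import threading
import time

def operation():
    # Stand-in for one request to the system under test.
    time.sleep(0.001)

def virtual_user(requests, counter, lock):
    # One "VUser": issues a fixed number of requests and tallies them.
    for _ in range(requests):
        operation()
        with lock:
            counter[0] += 1

def load_test(v_users, requests_per_user):
    counter, lock = [0], threading.Lock()
    threads = [
        threading.Thread(target=virtual_user, args=(requests_per_user, counter, lock))
        for _ in range(v_users)
    ]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.perf_counter() - start
    return counter[0], elapsed

# Ramp the user count up incrementally, as described above.
for users in (1, 5, 10):
    done, secs = load_test(users, 20)
    print(f"{users:2d} VUsers -> {done} requests in {secs:.2f}s")
```

A dedicated tool adds what this sketch omits: think times, response-time percentiles, ramp-up schedules and result reporting.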

Stress Testing

This testing type includes testing the software's behavior under abnormal conditions. Taking away resources, or applying load beyond the actual load limit, is stress testing.

The main intent is to test the Software by applying the load to the system and taking over the
resources used by the Software to identify the breaking point. This testing can be performed by
testing different scenarios such as:

 Shutdown or restart of Network ports randomly.

 Turning the database on or off.

 Running different processes that consume resources such as CPU, Memory, server etc.

Usability Testing

This section includes different concepts and definitions of usability testing from a software point of view. It is a black-box technique and is used to identify errors and improvements in the software by observing users through their usage and operation.

According to Nielsen, usability can be defined in terms of five factors: efficiency of use, learnability, memorability, errors/safety, and satisfaction. According to him, the usability of the product will be good and the system will be usable if it possesses these factors.

Nigel Bevan and Macleod considered that Usability is the quality requirement which can be
measured as the outcome of interactions with a computer system. This requirement can be
fulfilled and the end user will be satisfied if the intended goals are achieved effectively with the
use of proper resources.

Molich in 2000 stated that a user-friendly system should fulfill the following five goals: Easy to Learn, Easy to Remember, Efficient to Use, Satisfactory to Use and Easy to Understand.

In addition to different definitions of usability, there are some standards and quality models and
methods which define the usability in the form of attributes and sub attributes such as ISO-9126,
ISO-9241-11, ISO-13407 and IEEE std.610.12 etc.
UI vs. Usability Testing

UI testing involves testing the Graphical User Interface of the software. This testing ensures that the GUI conforms to the requirements in terms of color, alignment, size and other properties.

On the other hand, Usability testing ensures that a good and user-friendly GUI is designed and is easy to use for the end user. UI testing can be considered a sub-part of Usability testing.

Security Testing

Security testing involves testing the software in order to identify any flaws and gaps from a security and vulnerability point of view. Following are the main aspects which security testing should ensure:

 Confidentiality.

 Integrity.

 Authentication.

 Availability.

 Authorization.

 Non-repudiation.

Portability Testing

Portability testing includes testing the software with the intent that it should be reusable and can be moved from one environment to another as well. Following are the strategies that can be used for portability testing:

 Transferring installed software from one computer to another.

 Building an executable (.exe) to run the software on different platforms.


Portability testing can be considered one of the sub-parts of system testing, as this testing type includes the overall testing of software with respect to its usage in different environments.
Chapter IX

Conclusion

Our project is only a humble venture to satisfy the needs of managing project work. Several user-friendly coding practices have also been adopted. This package shall prove to be a powerful package in satisfying all the requirements of the school. The objective of software planning is to provide a framework that enables the manager to make reasonable estimates within a limited time frame at the beginning of the software project; these estimates should be updated regularly as the project progresses.
References

1. Ware, P., & Warschauer, M. (2006). Electronic feedback and second language writing. In K. Hyland and F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 105-122). New York: Cambridge University Press.
2. Warschauer, M. (1997). Computer-mediated collaborative learning: Theory and practice. Modern Language Journal, 81, 470-481.
3. Aroyo, L., Dolog, P., Houben, G-J., Kravcik, M., Naeve, A., Nilsson, M., et al. (2006). Interoperability in Personalized Adaptive Learning. Journal of Educational Technology & Society, 9 (2), 4-18.
4. Aydin, C. C., & Tirkes, G. (2010). Open source learning management systems in e-learning and Moodle. In Proceedings of IEEE EDUCON 2010 - IEEE Engineering Education 2010, Madrid, 14-16 April, 593-600.
5. Bochicchio, D. ASP.NET 4.0 in Practice.
6. Programming ASP.NET MVC 4: Developing Real-World Web Applications with ASP.NET MVC.
7. http://www.aspsnippets.com (released on 07 August 2012 by Mudassar Khan)
8. http://www.codeproject.com (retrieved on 4 July 2014)
9. Hijon, R., & Velazquez-Iturbide, A. Improving the analysis of students' participation and collaboration in Moodle.
10. Sams Teach Yourself JavaScript in 21 Days.
11. http://www.dotnettips4u.com (retrieved on 5 June 2014)
