STUDENT GRADE PREDICTION USING C4.5 DECISION TREE
ABSTRACT:
The aim of this project is to develop a system that predicts a student's final grade based on the marks scored during previous courses and years. To predict the grade, the system needs data to analyze, so the student's basic information and previous academic records are entered into the system. We use an effective data mining algorithm, the C4.5 decision tree, to predict the grade of the student. C4.5 is a program for inducing classification rules in the form of decision trees from a set of given examples; it is a software extension of the basic ID3 algorithm designed by Quinlan. The system has two kinds of users, the admin and the user, where the user is the student; both access the system with their credentials. The admin adds student details with basic information and must also add the academic details of the student, such as SSC, HSC and graduation marks. The user can view the grade, and the system generates a report containing the grade prediction produced by the C4.5 algorithm. This system can be used in schools, colleges and other educational institutes.
Advantages of the Proposed Project:
Application:
This system can be used in schools, colleges and other educational institutes.
INTRODUCTION
Every year, educational institutes admit students to various courses from different locations and educational backgrounds, and with varying merit scores in entrance examinations. Moreover, schools and junior colleges may be affiliated to different boards, each board having different subjects in its curriculum and different levels of depth in those subjects. Analyzing the past performance of admitted students provides a better perspective on the probable academic performance of students in the future, and this can very well be achieved using the concepts of data mining. For this purpose, we have analyzed the data of students enrolled in the first year of engineering. This data was obtained from the information provided by the admitted students to the institute. It includes their full name, gender, application ID, scores in board examinations of classes X and XII, scores in entrance examinations, category and admission type. We then applied the ID3 and C4.5 algorithms after pruning the dataset to predict the results of these students in their first semester as precisely as possible.
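To make the attribute-selection step of C4.5 concrete, the following is a minimal C# sketch of the gain-ratio measure that C4.5 uses in place of ID3's plain information gain. The records, attribute values and class labels here are illustrative, not our actual dataset.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Minimal sketch of C4.5's attribute-selection measure (gain ratio).
    class GainRatioSketch
    {
        // Shannon entropy of a collection of class labels.
        static double Entropy(IEnumerable<string> labels)
        {
            var counts = labels.GroupBy(l => l).Select(g => (double)g.Count()).ToList();
            double total = counts.Sum();
            return counts.Sum(c => -(c / total) * Math.Log(c / total, 2));
        }

        static void Main()
        {
            // Each record: a "merit" attribute value and a "Pass"/"Fail" class label.
            var records = new List<(string Merit, string Result)>
            {
                ("good", "Pass"), ("good", "Pass"), ("good", "Fail"),
                ("bad",  "Fail"), ("bad",  "Fail"), ("bad",  "Pass"),
            };

            double baseEntropy = Entropy(records.Select(r => r.Result));

            // Weighted entropy of the partitions induced by splitting on "merit".
            var partitions = records.GroupBy(r => r.Merit).ToList();
            double splitEntropy = partitions.Sum(p =>
                (double)p.Count() / records.Count * Entropy(p.Select(r => r.Result)));
            double infoGain = baseEntropy - splitEntropy;

            // Split information penalizes many-valued attributes: this is
            // C4.5's refinement over ID3's plain information gain.
            double splitInfo = Entropy(records.Select(r => r.Merit));
            double gainRatio = splitInfo == 0 ? 0 : infoGain / splitInfo;

            Console.WriteLine($"gain = {infoGain:F3}, gain ratio = {gainRatio:F3}");
        }
    }

C4.5 evaluates this gain ratio for every candidate attribute, splits on the best one, and recurses on each partition until the leaves are nearly pure.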
1.1 PROJECT SUMMARY
Student grade prediction using the C4.5 decision tree is a system that manages student records for the admission and examination parts.
The SMS module is a component covering many student aspects, from application through to results. The system records basic personal information, admission information and education information for each student. Leading-edge systems provide the ability to "read" applications, enter the relevant data into the applicable database fields, notify students and provide results. The student management function covers admission, examination and result processing.
In SMS, every user has a login ID and password, and all users have different permission rights to access the applications. These rights are dynamic and can be changed.
There are three main roles in the system: admin, accountant and operator. Admin has complete access to the whole system, while the accountant is only concerned with the payment of fees for the admission of students. Operator is the role responsible for the day-to-day use of the system.
When a user logs on with a particular role, only the pages allowed to that role are visible.
1.2 PURPOSE
The project handles all the information of the student regarding admission and examination. It also manages resources which were previously managed and handled by manpower. The main purpose of the project is to integrate the distinct sections of the organization in a consistent manner, so that complex functions can be handled smoothly by any technical or non-technical person. The objectives are:
Automation of admission and enrolment as per board, quota, category and available seats.
Assistance in decision-making.
To manage information of students, faculty and courses.
Consistently update information of all the students.
Reports: to gather all the related information about any part of the system.
All the above-mentioned matters are to be incorporated in the application along with some additional requirements.
The main purpose of the Admin Module is to introduce new things and configure important aspects. For example, only the admin is authorized to introduce quota, board, subject, category, etc., and only the admin is allowed to configure exams and set the fee structure, so the master screens for all of these are visible only to the admin role. The Admin Module also creates users and physical and logical locations. Thus the main purpose of the Admin Module is to manage the dynamic working of the system.
1.3 SCOPE
The scope of the project includes the following:
Any college can use this system as it is not client centric.
All admission and examination related work for the student can be done using this
system.
Deliver Electronic Workplace
Provide Bi-lingual support
Application Support & Maintenance after deployment to production
The Admin Module can be reused in other projects that have many users with different rights; hence it is reusable.
The Common Language Runtime (CLR) is an environment that manages the execution of code; in other words, it runs and maintains any code that you write. With the .NET Framework and the CLR you still write code and compile it. However, instead of compiling it into something the computer understands directly, you compile it into a language called the Microsoft Intermediate Language (MSIL). This language is a shorthand way of representing all the code you have written, and ASP.NET pages are compiled into MSIL as well. When you compile to MSIL, your application also produces metadata: descriptive information about your application that tells what the application can do, where it belongs, and so on.
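As a small illustration of this metadata, the following C# sketch uses reflection, the .NET API for reading an assembly's metadata at run time, to list a type's public methods; the class and method names are made up for the example.

    using System;
    using System.Reflection;

    // Illustrative only: reflection reads the metadata emitted alongside the MSIL.
    class MetadataSketch
    {
        public int Add(int a, int b) => a + b;

        static void Main()
        {
            Type t = typeof(MetadataSketch);
            // Enumerate the public instance methods declared by this type.
            foreach (MethodInfo m in t.GetMethods(
                BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
                Console.WriteLine($"{m.ReturnType.Name} {m.Name}");
        }
    }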
ASP.NET, the latest version of Active Server Pages, is Microsoft's technology for building dynamic, database-driven Web sites. Active Server Pages is one of the most popular technologies for building scalable, interactive Web sites, and several of the highest-traffic Web sites on the Internet employ it. Examples include Dell Online, Barnes and Noble, 1-800-Flowers, and the Microsoft site itself.
ASP.NET makes building real-world Web applications dramatically easier. ASP.NET server controls enable an HTML-like style of declarative programming that lets you build great pages with far less code than with classic ASP. Displaying data, validating user input, and uploading files are all amazingly easy. Best of all, ASP.NET pages work in all browsers -- including Netscape, Opera, AOL, and Internet Explorer.
You can harness the full power of ASP.NET using any text editor -- even Notepad! But Visual
Studio 2005 adds the productivity of Visual Basic-style development to the Web. Now you can
visually design ASP.NET Web Forms using familiar drag-drop-double-click techniques, and enjoy
full-fledged code support including statement completion and color-coding. VS.NET also provides
integrated support for debugging and deploying ASP.NET Web applications.
The Professional version of Visual Studio 2005 delivers life-cycle features to help
organizations plan, analyze, design, build, test, and coordinate teams that develop ASP.NET Web
applications. These include UML class modeling, database modeling (conceptual, logical, and
physical models), testing tools (functional, performance and scalability), and enterprise frameworks
and templates, all available within the integrated Visual Studio .NET environment.
ASP.NET output caching can dramatically improve the performance and scalability of your
application. When output caching is enabled on a page, ASP.NET executes the page just once, and
saves the result in memory in addition to sending it to the user. When another user requests the same
page, ASP.NET serves the cached result from memory without re-executing the page. Output
caching is configurable, and can be used to cache individual regions or an entire page. Output
caching can dramatically improve the performance of data-driven pages by eliminating the need to
query the database on every request.
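A minimal code-behind sketch of enabling output caching programmatically follows; the page class name is illustrative, and the declarative directive noted in the comment is the equivalent form.

    using System;
    using System.Web;
    using System.Web.UI;

    // Equivalent to the declarative form: <%@ OutputCache Duration="60" VaryByParam="None" %>
    public partial class ReportPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Cache the rendered output on the server for 60 seconds, so this
            // database-driven page executes once per minute, not once per request.
            Response.Cache.SetExpires(DateTime.Now.AddSeconds(60));
            Response.Cache.SetCacheability(HttpCacheability.Server);
            Response.Cache.SetValidUntilExpires(true);
        }
    }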
ASP.NET session state lets you share session data, that is, user-specific state values, across all machines in your Web farm. A user can now hit different servers in the web farm over multiple requests and still have full access to her session. And since business components created with the .NET Framework are free-threaded, you no longer need to worry about thread affinity.
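A minimal sketch of reading and writing such a user-specific value follows; the key and value are illustrative, and the cross-server behaviour assumes an out-of-process session mode (StateServer or SQLServer) configured in web.config.

    using System;
    using System.Web.UI;

    public partial class GradePage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Values stored in Session follow the user across requests; with an
            // out-of-process mode they follow the user across servers as well.
            if (Session["StudentId"] == null)
                Session["StudentId"] = "S-1001";   // illustrative key and value

            string id = (string)Session["StudentId"];
        }
    }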
ASP.NET automatically detects and recovers from errors like deadlocks and memory leaks to
ensure your application is always available to your users.
For example, say that your application has a small memory leak, and that after a week the leak has
tied up a significant percentage of your server's virtual memory. ASP.NET will detect this condition,
automatically start up another copy of the ASP.NET worker process, and direct all new requests to the
new process. Once the old process has finished processing its pending requests, it is gracefully
disposed and the leaked memory is released. Automatically, without administrator intervention or any
interruption of service, ASP.NET has recovered from the error.
ASP.NET now lets you update compiled components without restarting the web
server. In the past with classic COM components, the developer would have to restart the
web server each time he deployed an update. With ASP.NET, you simply copy the
component over the existing DLL -- ASP.NET will automatically detect the change and start
using the new code.
You do not have to migrate your existing applications to start using ASP.NET.
ASP.NET runs on IIS side-by-side with classic ASP on Windows 2000 and Windows XP
platforms. Your existing ASP applications continue to be processed by ASP.DLL, while new
ASP.NET pages are processed by the new ASP.NET engine. You can migrate application by
application, or single pages. And ASP.NET even lets you continue to use your existing
classic COM business components. ASP.NET represents a radical departure from previous versions of Active Server Pages.
Following are some of the significant new features of ASP.NET 2.0 Framework:
ASP.NET uses compiled code written in Common Language Runtime languages such as Visual Basic and C#. Unlike previous versions of Active Server Pages, this version does not use interpreted scripting languages such as VBScript.
It is an advanced version of .NET 1.1, which has proved to be a milestone in today's web technology. ASP.NET pages are built out of server-side controls. Web server controls enable you to represent and program against Hypertext Markup Language (HTML) elements using an intuitive object model.
ASP.NET includes a new technology called Web Services. You can use Web Services to access methods and properties and transfer database data across the Internet. ASP.NET is part of Microsoft's .NET Framework. You can access thousands of .NET classes in your code that enable you to perform such wondrously diverse tasks as generating images on the fly and saving an array to a file. ASP.NET includes page and data caching mechanisms that enable you to easily and dramatically improve the performance of your Web site.
With ASP.NET 3.5, Microsoft aims to continue its success by refining and enhancing
ASP.NET. The good news is that Microsoft hasn’t removed features, replaced functionality,
or reversed direction. Instead, almost all the changes add higher-level features that can make
your programming more productive.
• ASP.NET 1.0: This first release created the core ASP.NET platform and introduced a wide
range of essential features.
• ASP.NET 1.1: This second release added performance tune-ups and bug fixes, but no new
features.
• ASP.NET 2.0: This third release piled on a huge set of new features, all of which were built
on top of the existing ASP.NET plumbing. The overall emphasis was to supply developers
with prebuilt goodies that they could use without writing much (if any) code. Some of the
new features included built-in support for website navigation, a theming feature for
standardizing web page design, and an easier way to pull information out of a database.
• ASP.NET 3.5: This fourth release keeps the same basic engine as ASP.NET 2.0, but adds a few frills and two more dramatic changes. The most significant enhancement is the ASP.NET AJAX toolkit, which gives web developers better tools for creating highly responsive web pages that incorporate rich effects usually seen in desktop applications (such as drag-and-drop and autocomplete). The other innovation is support for LINQ, a set of language enhancements included with .NET 3.5 that allows you to search in-memory data in the same way that you query a database, as the sketch after this list illustrates.
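The following minimal C# sketch (with illustrative names and data) queries an in-memory collection using the same query syntax one would use against a database table:

    using System;
    using System.Linq;

    class LinqSketch
    {
        static void Main()
        {
            // Illustrative in-memory data, not the project's dataset.
            var students = new[]
            {
                new { Name = "A", Percentage = 72.5 },
                new { Name = "B", Percentage = 64.0 },
                new { Name = "C", Percentage = 55.0 },
            };

            // SQL-like query over plain objects: filter, sort, project.
            var distinction = from s in students
                              where s.Percentage >= 70
                              orderby s.Percentage descending
                              select s.Name;

            Console.WriteLine(string.Join(", ", distinction.ToArray()));
        }
    }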
Microsoft used the .NET 3.0 name to release a set of new technologies, including Windows
Presentation Foundation (WPF), a platform for building slick Windows applications;
Windows Workflow Foundation (WF), a platform for modelling application logic using
flowchart-style diagrams; and Windows Communication Foundation (WCF), a platform for
designing services that can be called from other computers. However, .NET 3.0 did not
include an updated version of ASP.NET.
Visual Studio has come a long way since its inception in 1997. Visual Studio 97 hit the street
with the goals of enabling developers to share and see large projects through a complete
development cycle regardless of the different languages and deployment schemes.
That was followed up by Visual Studio 6.0 with its integrated development environment and
built-in data designers for architecting large-scale and multi-tier applications, with the goals
of supporting distributed and heterogeneous environments and architectures.
1.4.2.13 LINQ
Many of the new language features and enhancements in Visual Studio 2008—both in Visual
C# and Visual Basic .NET—make many of the LINQ features possible and enable you to take
advantage of some of the LINQ capabilities.
Included with the new Visual Studio release are a number of designers that can help
developers visually create many aspects of their SQL entity classes and associations. For
example, the Object Relational Designer (O/R Designer) provides a visual interface for
creating and designing LINQ to SQL entity classes and associations of database objects.
Visual Studio 2008 also comes with the DataSet Designer, a visual tool used for creating and
manipulating typed DataSets and the associated items of which the datasets are made,
providing a visual image of the objects within the DataSets.
LINQ shipped with Visual Studio 2008 and version 3.5 of the .NET Framework. Because much of the LINQ functionality is based on the new features of the .NET Framework, this chapter explores those features and enhancements that help support LINQ and provide LINQ with the foundation it needs from a language perspective. It looks at the new language-specific features in both C# and Visual Basic .NET.
Today, Visual Studio 2008 focuses on providing developers with a rich experience for Windows Vista, the web, and Office 2007, while continuing to improve its development
languages and innovations. Visual Studio 2008 contains a number of new features, including
C# and Visual Basic .NET language features, improved data features such as multi-tier
support for typed datasets and hierarchical update capabilities, and a web application project
model.
However, the most exciting new feature of Visual Studio 2008 (in my opinion) is LINQ,
Microsoft’s new Language Integrated Query, which extends powerful query capabilities into
your favourite .NET programming language.
For designing the entire software we have divided the whole software into four main layers, and each layer provides services to the other layers, so we can easily proceed towards the target. These layers are namely:
Presentation layer
Business layer
Control layer
Data Access layer
Presentation Layer
The Presentation layer is responsible for the user interface and communicates directly with the business logic layer. Separating the presentation layer from the rest of the application enables the development of different user interfaces (i.e. Web Forms, Windows Forms, mobile devices) that all use the same business logic and database access code.
Business Layer
The business layer separates the code specific to the application, that is, the way the company does its business, from the user interface and the database-specific code. Other line-of-business applications a company builds can use the business logic layer if needed, maximizing code reuse.
Control Layer
The Control layer is responsible for communication between the business layer and the presentation layer. It connects the logic and the data with each other and gives better connectivity and separation between layers.
The project uses a Microsoft SQL Server Express Edition database.
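The following is a minimal C# sketch of the layer separation described above; the class names, the Students table and its columns are illustrative, not the project's actual schema.

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    // Shared data carrier passed between layers.
    public class StudentRecord
    {
        public string Id;
        public double Percentage;
    }

    // Data Access layer: the only layer that talks to the database.
    public class StudentDataAccess
    {
        private readonly string _connectionString;
        public StudentDataAccess(string connectionString) { _connectionString = connectionString; }

        public IList<StudentRecord> GetStudents()
        {
            var result = new List<StudentRecord>();
            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand("SELECT Id, Percentage FROM Students", conn))
            {
                conn.Open();
                using (IDataReader reader = cmd.ExecuteReader())
                    while (reader.Read())
                        result.Add(new StudentRecord
                        {
                            Id = reader.GetString(0),
                            Percentage = reader.GetDouble(1)
                        });
            }
            return result;
        }
    }

    // Business layer: application rules only; no UI code and no SQL.
    public class GradingService
    {
        private readonly StudentDataAccess _data;
        public GradingService(StudentDataAccess data) { _data = data; }

        public int CountDistinctions()
        {
            int count = 0;
            foreach (var s in _data.GetStudents())
                if (s.Percentage >= 70) count++;   // threshold from the grading scheme used later
            return count;
        }
    }

Because the presentation layer calls only GradingService, the same business and data code can serve a Web form, a Windows form or a mobile client unchanged.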
LITERATURE SURVEY
2.1.1. Classification
Classification is a data mining technique that maps data into predefined groups or classes. It
is a supervised learning method which requires labelled training data to generate rules for
classifying test data into predetermined groups or classes [2]. It is a two-phase process. The
first phase is the learning phase, where the training data is analyzed and classification rules
are generated. The next phase is the classification, where test data is classified into classes
according to the generated rules. Since classification algorithms require that classes be defined based on data attribute values, we created an attribute “class” for every student, which can have a value of either “Pass” or “Fail”.
2.1.2. Clustering
Clustering is the process of grouping a set of elements in such a way that the elements in the
same group or cluster are more similar to each other than to those in other groups or clusters
[1]. It is a common technique for statistical data analysis used in the fields of pattern
recognition, information retrieval, bioinformatics, machine learning and image analysis.
Clustering can be achieved by various algorithms that differ in what constitutes the similarity between elements of a cluster and in how the elements of the clusters are found efficiently. Most algorithms used for clustering try to create clusters with small distances among the cluster elements, intervals, dense areas of the data space or particular statistical distributions.
Missing data values cause problems during both the training phase and the classification process itself. For example, data may be unavailable due to [2]:
•Equipment malfunction
•Deletion due to inconsistency with other recorded data
Since the attributes for marks would have discrete values, specific classes were defined to produce better results. Thus, the “merit” attribute had the value “good” if the merit score of the student was 120 or above out of a maximum score of 200, and “bad” if the merit score was below 120. Also, the “percentage” attribute of a student can take one of three values: “distinction” if the percentage of marks scored by the student in the subjects of Physics, Chemistry and Mathematics was 70 or above; “first_class” if the percentage was less than 70 and greater than or equal to 60; and “second_class” if the percentage was less than 60. The attribute for admission type is labelled “type”, and its value for a student can be either “AI” (short for All-India), if the student was admitted to a seat available for All-India candidates, or “OTHER” if the student was admitted to another seat.
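A minimal C# sketch of this discretization follows; the method names are illustrative, while the thresholds are the ones stated above.

    using System;

    // Maps raw scores to the categorical values used by the classifier.
    class AttributeLabels
    {
        // Merit score out of 200 -> "good" / "bad".
        public static string MeritLabel(int meritScore) =>
            meritScore >= 120 ? "good" : "bad";

        // Physics-Chemistry-Mathematics percentage -> one of three classes.
        public static string PercentageLabel(double pcmPercentage)
        {
            if (pcmPercentage >= 70) return "distinction";
            if (pcmPercentage >= 60) return "first_class";
            return "second_class";
        }

        static void Main()
        {
            Console.WriteLine(MeritLabel(135));        // good
            Console.WriteLine(PercentageLabel(66.4));  // first_class
        }
    }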
4.3. Data Processing Using RapidMiner
The next step was to feed the pruned student database as input to RapidMiner. This helped us
in evaluating interesting results by applying classification algorithms on the student training
dataset.
4.4.1. CodeIgniter
The web application was developed using a popular PHP framework named CodeIgniter. The
application has provisions for multiple simultaneous staff registrations and staff logins. This
ensures that the work of no two staff members is interrupted during performance evaluation.
Figure 5 and Figure 6 depict the staff registration and staff login pages respectively.
Chapter 5
SYSTEM DESIGN
________________________________________________
INTRODUCTION
During analysis, the focus is on what needs to be done, independent of how it is done. During design, decisions are made about how the problem will be solved, first at a high level, then at increasingly detailed levels.
System design is the first stage in which the basic approach to solving the problem is selected. During system design the overall structure and style are decided. The system architecture is the overall organization of the system into components called subsystems. System design deals with transforming the customer requirements, as described in the SRS document, into a form that is implementable using the programming language. Certain items such as modules, relationships among identified modules, data structures, relationships between the data structures, and algorithms for implementation should be designed during this phase.
[Entity-relationship diagram: the system's master and transaction tables, including SMS_STUDENT_ADMISSION_DETAILS, SMS_QUOTA_MASTER, SMS_FACULTY_DETAIL, SMS_CATEGORY_MASTER, SMS_BOARD_MASTER, SMS_YEAR_COURSE_SEM, SMS_SPECIALITY_MASTER, SMS_COURSE_MASTER, SMS_EXAM_TYPE_MASTER, SMS_EXAM_MASTER, SMS_EXAM_DETAIL, SMS_RESULT_DATA and SMS_SUBJECT_SEMESTER_ALLOCATION, linked by keys such as STUDENT_ID, FACULTY_ID, QUOTA_ID, BOARD_ID, CATEGORY_ID, SPECIALITY_ID, COURSE_ID, EXAM_ID and YCS_ID.]
Modules:
Admin:
SYSTEM CONFIGURATION
The .NET Framework is a new computing platform that simplifies application development in the highly distributed environment of the Internet.
The CLR is described as the “execution engine” of .NET; it provides the environment within which programs run. The following features of the .NET Framework are worth describing:
Managed Code - Managed code is code that targets .NET and contains certain extra information - “metadata” - to describe itself. While both managed and unmanaged code can run in the runtime, only managed code contains the information that allows the CLR to guarantee, for instance, safe execution and interoperability.
Managed Data - With managed code comes managed data. The CLR provides memory allocation and deallocation facilities, and garbage collection. Some .NET languages use managed data by default, such as C#, Visual Basic.NET and JScript.NET, whereas others, namely C++, do not. Targeting the CLR can, depending on the language you are using, impose certain constraints on the features available. As with managed and unmanaged code, one can have both managed and unmanaged data in .NET applications - data that does not get garbage collected but instead is looked after by unmanaged code.
Common Type System - The CLR uses something called the Common Type System (CTS) to strictly enforce type safety. This ensures that all classes are compatible with each other by describing types in a common way. The CTS defines how types work within the runtime, which enables types in one language to interoperate with types in another language, including cross-language exception handling. As well as ensuring that types are only used in appropriate ways, the runtime also ensures that code does not attempt to access memory that has not been allocated to it.
Common Language Specification - The CLR provides built-in support for language
interoperability. To ensure that you can develop managed code that can be fully used by
developers using any programming language, a set of language features and rules for using
them called the Common Language Specification (CLS) has been defined. Components that
follow these rules and expose only CLS features are considered CLS-compliant.
The class library is subdivided into a number of sets (or namespaces), each providing distinct areas of functionality, with dependencies between the namespaces kept to a minimum.
[Diagram: the .NET Framework stack, with ASP.NET and Windows Forms above the operating system.]
3.3.3.1 ASP.NET
ASP.NET is the .NET framework layer that handles Web requests for specific types of files,
namely those with (.aspx or .ascx) extensions. The ASP.NET engine provides a robust object
model for creating dynamic content and is loosely integrated into the .NET framework.
Client requests for these file types cause the server to load, parse, and execute
code to return a dynamic response. For Web Forms, the response usually consists of HTML
or WML. Web Forms maintain state by round-tripping user interface and other persistent
values between the client and the server automatically for each request.
A request for a Web Form can use View State, Session State, or Application State to maintain values between requests. Both Web Forms and Web Services requests can take advantage of ASP.NET's integrated security and data access through ADO.NET, and can run code that uses system services to construct the response. The major difference between a static request and a dynamic request is that a typical static Web request references a file that the server simply reads and returns, whereas a dynamic request executes code to build the response.
ASP.NET supports all the .NET languages (currently C#, C++, VB.NET, and
JScript, but there are well over 20 different languages in development for .NET), so you will
eventually be able to write Web applications in your choice of almost any modern
programming language.
Every time an ASP.NET page is viewed, many tasks are being performed
behind the scenes. Tasks are performed at key points ("events") of the page's execution
lifecycle.
OnInit
The first event in our list to be raised is OnInit. When this event is raised, all
of the page's server controls are initialized with their property values. Post Back values are
not applied to the controls at this time.
OnLoad
The next event to be raised is OnLoad, which is the most important event of them all, as all the page's server controls will have their Post Back values by now.
Next, all the Post Back events are raised. These events are only raised when the page view is the result of a Post Back. The order in which these events are raised can't be defined or relied upon; the only consistency is that they are all raised between the OnLoad and OnPreRender events.
OnPreRender
This event is raised just prior to the page or server control's HTML output being written into the response stream that is sent to the client web browser. This is the last chance you have to make any modifications. By this point, all the server controls on the page have the final data applied.
OnUnload
This is the last event in our list to be raised; you should destroy any unmanaged objects and close any currently open database connections at this point. It is not possible to modify any controls on the page at this point, as the response stream has already been sent to the client web browser.
As each event of the page is raised, it automatically tells all its child controls to raise their own implementation of the same event. Execution flow is then passed back to the main page class to continue on to the next event, and the process is repeated for that event.
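The following minimal code-behind sketch shows where these lifecycle events can be hooked; the page class name is illustrative.

    using System;
    using System.Web.UI;

    public partial class LifecyclePage : Page
    {
        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);       // controls exist; Post Back values not yet applied
        }

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);       // Post Back values are now available on all controls
        }

        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);  // last chance to modify controls before rendering
        }

        protected override void OnUnload(EventArgs e)
        {
            base.OnUnload(e);     // response already sent; release unmanaged resources
        }
    }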
• Object-oriented
• Event-based
NAMESPACES
LANGUAGE INDEPENDENT
An ASP.NET page can be created in any language supported by the .NET Framework. Currently the .NET Framework supports VB, C#, JScript and Managed C++.
TYPES OF CONTROLS
ASP.NET has two basic types of controls: HTML server controls and Web server controls. HTML server controls are generated around specific HTML elements, and the ASP.NET engine changes the attributes of the elements based on server-side code that you provide. The ASP.NET engine takes the extra step of deciding, based upon the container of the requester, what HTML to output.
ADO.NET provides a set of classes which a script can use to interact with databases. Scripts can create instances of ADO.NET data classes and access their properties and methods. A set of classes which work with a specific type of database is known as a .NET Data Provider. ADO.NET comes with two Data Providers: the SQL Server .NET Data Provider (which provides optimised access for Microsoft SQL Server databases) and the OLEDB .NET Data Provider, which works with a range of databases. The main ADO.NET OLEDB data access classes are OleDbConnection, OleDbCommand, OleDbDataReader and OleDbDataAdapter.
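The following is a minimal C# sketch using these OLEDB provider classes; the connection string, table and column names are illustrative.

    using System;
    using System.Data.OleDb;

    class OleDbSketch
    {
        static void Main()
        {
            // Illustrative connection string for an Access database file.
            string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=students.mdb";
            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand("SELECT STUDENT_ID, GRADE FROM RESULTS", conn))
            {
                conn.Open();
                // OleDbDataReader streams rows forward-only from the query.
                using (OleDbDataReader reader = cmd.ExecuteReader())
                    while (reader.Read())
                        Console.WriteLine("{0}: {1}", reader["STUDENT_ID"], reader["GRADE"]);
            }
        }
    }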
Interoperability
Maintainability
Programmability
Performance and Scalability
Visual Studio .NET is a complete set of development tools for building ASP Web applications, XML Web services, desktop applications and mobile applications. In addition to building high-performing desktop applications, you can use Visual Studio's powerful component-based development tools and other technologies to simplify team-based design, development and deployment of enterprise solutions. Visual Basic .NET, Visual C++ .NET and Visual C# .NET all use the same integrated development environment (IDE), which allows them to share tools and facilitates the creation of mixed-language solutions.
The OLAP Services feature available in SQL Server version 7.0 is now called
SQL Server 2000 Analysis Services. The term OLAP Services has been replaced with the
term Analysis Services. Analysis Services also includes a new data mining component. The
Repository component available in SQL Server version 7.0 is now called Microsoft SQL
Server 2000 Meta Data Services.
The database objects used are of the following types. They are,
1. TABLE
2. QUERY
3. FORM
4. REPORT
5. MACRO
3.4.1.1 TABLE:
VIEWS OF TABLE:
1. Design View
2. Datasheet View
Design View
Datasheet View
To add, edit or analyse the data itself, we work in the table's Datasheet View mode.
3.4.1.2 QUERY:
A query is a question that has to be asked of the data. Access gathers the data that answers the question from one or more tables. The data that makes up the answer is either a dynaset (if you can edit it) or a snapshot (which cannot be edited). Each time we run a query, we get the latest information in the dynaset. Access either displays the dynaset or snapshot for us to view, or performs an action on it, such as deleting or updating.
3.4.1.3 FORMS:
A form is used to view and edit information in the database record by record. A form displays only the information we want to see, in the way we want to see it. Forms use familiar controls such as textboxes and checkboxes, which makes viewing and entering data easy.
Views of Form:
We can work with forms in several views; primarily there are two.
They are,
1. Design View
2. Form View
Design View
Form View
The Form View displays the whole design of the form.
3.4.1.4 REPORT:
A report is used to view and print information from the database. The report can group records into many levels and compute totals and averages by checking values from many records at once. Also, the report is attractive and distinctive because we have control over its size and appearance.
3.4.1.5 MACRO:
A macro is a set of actions, and each action in a macro does something, such as opening a form or printing a report. We write macros to automate common tasks, which makes the work easy and saves time.
3.4.1.6 MODULE:
Modules are units of code written in the Access Basic language. We can write and use modules to automate and customize the database in very sophisticated ways.
SYSTEM TESTING
TESTING METHODOLOGIES
o Unit Testing.
o Integration Testing.
o User Acceptance Testing.
o Output Testing.
o Validation Testing.
Unit Testing
Unit testing focuses verification effort on the smallest unit of software design, that is, the module. Unit testing exercises specific paths in a module's control structure to ensure complete coverage and maximum error detection. This test focuses on each module individually, ensuring that it functions properly as a unit; hence the name unit testing.
During this testing, each module is tested individually and the module interfaces are verified for consistency with the design specification. All important processing paths are tested for the expected results. All error handling paths are also tested.
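The following is a minimal sketch of a unit test for one such module; NUnit is assumed as the test framework (the report does not name one), and the pass/fail rule under test is illustrative.

    using NUnit.Framework;

    // Illustrative module under test: a simple pass/fail rule.
    public static class GradeRules
    {
        public static string Classify(double percentage) =>
            percentage >= 40 ? "Pass" : "Fail";
    }

    [TestFixture]
    public class GradeRulesTests
    {
        [Test]
        public void BoundaryOf40Passes()
        {
            // Exercise the boundary path of the module's control structure.
            Assert.AreEqual("Pass", GradeRules.Classify(40));
            Assert.AreEqual("Fail", GradeRules.Classify(39.9));
        }
    }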
Integration Testing
Integration testing addresses the issues associated with the dual problems of verification and program construction. After the software has been integrated, a set of high-order tests is conducted. The main objective of this testing process is to take unit-tested modules and build a program structure that has been dictated by the design.
Bottom-up Integration
This method begins the construction and testing with the modules at the lowest level
in the program structure. Since the modules are integrated from the bottom up, processing
required for modules subordinate to a given level is always available and the need for stubs is
eliminated. The bottom up integration strategy may be implemented with the following steps:
The low-level modules are combined into clusters that perform a specific software sub-function.
A driver (i.e. the control program for testing) is written to coordinate test case input and output.
The cluster is tested.
Drivers are removed and clusters are combined, moving upward in the program structure.
The bottom-up approach tests each module individually; each module is then integrated with a main module and tested for functionality.
User Acceptance of a system is the key factor for the success of any system. The
system under consideration is tested for user acceptance by constantly keeping in touch with
the prospective system users at the time of developing and making changes wherever
required. The system developed provides a friendly user interface that can easily be
understood even by a person who is new to the system.
After performing the validation testing, the next step is output testing of the proposed system, since no system can be useful if it does not produce the required output in the specified format. The outputs generated or displayed by the system under consideration are tested by asking the users about the format they require. Hence the output format is considered in two ways: one on screen and the other in printed format.
Text Field:
A text field can contain only a number of characters less than or equal to its size. The text fields are alphanumeric in some tables and alphabetic in other tables. An incorrect entry always flashes an error message.
Numeric Field:
The numeric field can contain only numbers from 0 to 9. An entry of any other character flashes an error message. The individual modules are checked for accuracy and for what they have to perform. Each module is subjected to a test run along with sample data. The individually tested modules are integrated into a single system. Testing involves executing the program with real data; the existence of any program defect is inferred from the output. The testing should be planned so that all the requirements are individually tested.
A successful test is one that brings out the defects for inappropriate data and produces output revealing the errors in the system.
Live test data are those that are actually extracted from organization files. After a system is
partially constructed, programmers or analysts often ask users to key in a set of data from
their normal activities. Then, the systems person uses this data as a way to partially test the
system. In other instances, programmers or analysts extract a set of live data from the files
and have them entered themselves.
Artificial test data are created solely for test purposes, since they can be generated to test all combinations of formats and values. In other words, the artificial data, which can quickly be prepared by a data-generating utility program in the information systems department, make possible the testing of all logic and control paths through the program.
The most effective test programs use artificial test data generated by persons other
than those who wrote the programs. Often, an independent team of testers formulates a
testing plan, using the systems specifications.
The package has satisfied all the requirements specified in the software requirement specification and was accepted.
Whenever a new system is developed, user training is required to educate them about
the working of the system so that it can be put to efficient use by those for whom the
system has been primarily designed. For this purpose the normal working of the
project was demonstrated to the prospective users. Its working is easily
understandable and since the expected users are people who have good knowledge of
computers, the use of this system is very easy.
7.3 MAINTENANCE
This covers a wide range of activities including correcting code and design errors. To
reduce the need for maintenance in the long run, we have more accurately defined the user’s
requirements during the process of system development. Depending on the requirements, this
system has been developed to satisfy the needs to the largest possible extent. With
development in technology, it may be possible to add many more features based on future requirements. The coding and design are simple and easy to understand, which will make maintenance easier.
TESTING STRATEGY:
A strategy for system testing integrates system test cases and design techniques into a well-planned series of steps that results in the successful construction of software. The testing strategy must incorporate test planning, test case design, test execution, and the resultant data collection and evaluation. A strategy for software testing must accommodate low-level tests that are necessary to verify that a small source code segment has been correctly implemented, as well as high-level tests that validate major system functions against user requirements.
Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. Testing presents an interesting anomaly for the software engineer. Thus, a series of tests is performed on the proposed system before the system is ready for user acceptance testing.
SYSTEM TESTING:
Software, once validated, must be combined with other system elements (e.g. hardware, people, databases). System testing verifies that all the elements combine properly and that overall system function and performance are achieved. It also tests to find discrepancies between the system and its original objectives, current specifications and system documentation.
UNIT TESTING:
In unit testing, different modules are tested against the specifications produced during the design of the modules. Unit testing is essential for verification of the code produced during the coding phase; hence the goal is to test the internal logic of the modules. Using the detailed design description as a guide, important control paths are tested to uncover errors within the boundary of the modules. This testing is carried out during the programming stage itself. In this type of testing step, each module was found to be working satisfactorily as regards the expected output from the module.
In due course, the latest technology advancements will be taken into consideration. As part of the technical build-up, many components of the networking system will be generic in nature so that future projects can either use or interact with them. The future holds a lot to offer to the development and refinement of this project.
CONCLUSION:
In this project, we presented a case study in educational data mining that shows the potential of data mining in higher education. It was especially used to improve students' performance and to detect early predictors of their final GPA. We utilized the classification technique, the decision tree in particular, to predict students' final GPA based on their grades in previous courses. We discovered classification rules to predict students' final GPA based on their grades in mandatory courses. We also evaluated the most important courses in the study plan that have a big impact on the students' final GPA.
FUTURE WORK:
In this project, prediction parameters such as the decision trees generated using RapidMiner
are not updated dynamically within the source code. In the future, we plan to make the entire implementation dynamic so that the prediction parameters are retrained whenever new training sets are fed into the web application. Also, in the current implementation, we have not considered
extra-curricular activities and other vocational courses completed by students, which we
believe may have a significant impact on the overall performance of the students.
Considering such parameters would result in better accuracy of prediction.
REFERENCES:
[2] F. Siraj and M. A. Abdoulha, "Mining enrolment data using predictive and descriptive approaches," Knowledge-Oriented Applications in Data Mining, pp. 53-72, 2007.
[4] B. K. Baradwaj and S. Pal, "Mining educational data to analyze students' performance," International Journal of Advanced Computer Science and Applications, vol. 2, 2011.
[5] Nandeshwar and S. Chaudhari. (2009). Enrollment Prediction Models Using Data Mining. [Online]. Available: http://nandeshwar.info/wp-content/uploads/2008/11/DMWVU_Proje.pdf
[6] Kabakchieva, "Predicting student performance by using data mining methods for classification," Cybernetics and Information Technologies, vol. 13, 2013.