AIPresent: Random Interval Query and Face Recognition Attendance System for
Virtual Classroom using Machine Learning (FACE2FACE)
Technical field:
Artificial Intelligence:
Artificial Intelligence (AI) is the field of computer science dedicated to solving cognitive problems commonly associated with human intelligence, such as learning, problem solving, and pattern recognition. Although the term "AI" may connote robotics or futuristic scenes, AI goes well beyond the automatons of science fiction into the non-fiction of modern-day advanced computer science. Professor Pedro Domingos, a prominent researcher in this field, describes "five tribes" of machine learning: symbolists, with origins in logic and philosophy; connectionists, stemming from neuroscience; evolutionaries, relating to evolutionary biology; Bayesians, engaged with statistics and probability; and analogizers, with origins in psychology. Recently, advances in the efficiency of statistical computation have led to Bayesians being successful at furthering the field in a number of areas, under the name "machine learning". Similarly, advances in network computation have led to connectionists furthering a subfield under the name "deep learning". Machine learning (ML) and deep learning (DL) are both computer science fields derived from the discipline of Artificial Intelligence.
Broadly, these techniques are separated into “supervised” and “unsupervised” learning
techniques, where “supervised” uses training data that includes the desired output, and
“unsupervised” uses training data without the desired output.
AI becomes “smarter” and learns faster with more data, and every day, businesses are generating
this fuel for running machine learning and deep learning solutions, whether collected and
extracted from a data warehouse like Amazon Redshift, ground-truthed through the power of
“the crowd” with Mechanical Turk, or dynamically mined through Kinesis Streams. Further,
with the advent of IoT, sensor technology exponentially adds to the amount of data to be
analyzed -- data from sources and places and objects and events that have previously been nearly
untouched.
Deep Learning:
Deep Learning is a branch of machine learning that involves layering algorithms in an effort to
gain greater understanding of the data. The algorithms are no longer limited to creating an explainable set of relationships as a more basic regression would. Instead, deep learning relies
on these layers of non-linear algorithms to create distributed representations that interact based
on a series of factors. Given large sets of training data, deep learning algorithms begin to be able
to identify the relationships between elements. These relationships may be between shapes,
colors, words, and more. From this, the system can then be used to create predictions. Within
machine learning and artificial intelligence, the power of deep learning stems from the system
being able to identify more relationships than humans could practically code in software, or
relationships that humans may not even be able to perceive. After sufficient training, this allows
the network of algorithms to begin to make predictions or interpretations of very complex data.
Image and Video Classification, Segmentation
Convolutional Neural Networks outperform humans on many vision tasks, including object classification. Given millions of labeled pictures, the system of algorithms is able to begin identifying the subject of the image. Many photo-storage services include facial recognition driven by Deep Learning.
Background:
Existing System
Zoom, Google Meet, Microsoft Teams, and Cisco Webex Meetings are used to create virtual
classrooms.
Attendance in these virtual classrooms is typically taken through manual attendance calling, self-reporting attendance systems (using tools like Google Forms), video calling students, short quizzes or polls, questions and discussions with randomly selected students, and timed assignments.
In the case of physical classrooms, biometric-based attendance monitoring systems are essentially based on face, fingerprint, and iris recognition technologies. Facial recognition is a technology that is capable of recognizing a person based on their face. It employs machine learning algorithms which find, capture, store and analyze facial features in order to match them with images of individuals in a pre-existing database. Early approaches mainly focused on extracting different types of hand-crafted features with domain experts in computer vision and training effective classifiers for detection with traditional machine learning algorithms. Such methods are limited in that they often require computer vision experts to craft effective features, and each individual component is optimized separately, making the whole detection pipeline often sub-optimal. There are many existing face recognition (FR) methods that achieve good performance.
Support Vector Machine (SVM)
Support Vector Machines (SVM) are a popular training tool which can be used to generate a
model based on several classes of data, and then distinguish between them. For the basic two-
class classification problem, the goal of an SVM is to separate the two classes by a function
induced from available examples. In the case of facial recognition, a class represents a unique
face, and the SVM attempts to find what best separates the multiple feature vectors of one unique
face from those of another unique face.
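As an illustration of this idea, the following sketch trains a multi-class SVM on pre-computed face feature vectors with scikit-learn; the embedding dimension, sample counts and student names are illustrative assumptions, not project data.

import numpy as np
from sklearn.svm import SVC

# Hypothetical data: 128-dimensional feature vectors for the faces of three students.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(30, 128))                  # 30 face feature vectors
y_train = np.repeat(["alice", "bob", "carol"], 10)    # one class per unique face

clf = SVC(kernel="linear", probability=True)          # find the separating function
clf.fit(X_train, y_train)

X_query = rng.normal(size=(1, 128))                   # feature vector of an unknown face
print(clf.predict(X_query), clf.predict_proba(X_query).max())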
Principal Component Analysis (PCA)
One of the most used and cited statistical methods is Principal Component Analysis, a mathematical procedure that performs dimensionality reduction by extracting the principal components of multi-dimensional data. Principal Component Analysis reduces to an eigenvalue and eigenvector problem on a matrix. PCA is used in a wide variety of applications such as digital image processing, computer vision and pattern recognition. The main principle of PCA is reducing the dimensionality of a dataset consisting of a large number of interrelated features while retaining as much as possible of the variation in the data.
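A minimal sketch of PCA applied to flattened face images (the classic "eigenfaces" setup), assuming scikit-learn and illustrative image sizes rather than real enrollment data:

import numpy as np
from sklearn.decomposition import PCA

faces = np.random.rand(100, 64 * 64)        # hypothetical: 100 flattened 64x64 face images

pca = PCA(n_components=50, whiten=True)     # keep the 50 leading principal components
reduced = pca.fit_transform(faces)          # eigenvalue/eigenvector problem solved internally

print(reduced.shape)                        # (100, 50): reduced-dimensional representation
print(pca.explained_variance_ratio_.sum())  # fraction of the variation that is retained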
Linear Discriminant Analysis (LDA)
LDA is widely used to find the linear combination of features that preserves class separability. Unlike PCA, LDA tries to model the difference between classes, obtaining a set of projection vectors for each class. Linear Discriminant Analysis is related to Fisher discriminant analysis and is used to describe the local features of images. Features are extracted from the pixels of the images; these features are known as shape features, color features and texture features. LDA is used to identify the linear separating vectors between the features of the patterns in the images. The procedure maximizes the between-class scatter while minimizing the within-class variance in face identification.
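A short sketch of LDA projecting face feature vectors onto class-separating directions, again using scikit-learn with illustrative data:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 128))                    # hypothetical face feature vectors
y = np.repeat(["alice", "bob", "carol"], 20)      # class label = identity of the face

lda = LinearDiscriminantAnalysis(n_components=2)  # at most (number of classes - 1) components
X_proj = lda.fit_transform(X, y)                  # maximizes between-class scatter while
print(X_proj.shape)                               # minimizing within-class variance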
Neural Network (NN)
Neural networks have long been used for pattern recognition and classification. Kohonen was the first to show that a neural network could be used to recognize aligned and normalized faces. There are methods which perform feature extraction using neural networks, and many methods which combine neural networks with tools like PCA or LDA to build a hybrid classifier for face recognition. Examples include Feed-Forward Neural Networks with additional bias, Self-Organizing Maps with PCA, and Convolutional Neural Networks with a multi-layer perceptron. These combinations can increase the efficiency of the models.
K-Nearest Neighbors
One of the basic classification algorithms in machine learning is the k-NN algorithm. In machine learning, the k-NN algorithm is considered a supervised type of learning. It is commonly used for finding similar items in search applications. By constructing a vector representation of objects and then comparing them using appropriate distance metrics, the similarities between the items are determined.
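The following sketch shows the idea for face recognition: enrolled face embeddings form the gallery, and a probe embedding is labelled by its nearest neighbours (scikit-learn, with illustrative data and names):

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
gallery = rng.normal(size=(45, 128))              # hypothetical enrolled face embeddings
labels = np.repeat(["alice", "bob", "carol"], 15)

knn = KNeighborsClassifier(n_neighbors=3, metric="euclidean")
knn.fit(gallery, labels)

probe = rng.normal(size=(1, 128))                 # embedding captured at query time
print(knn.predict(probe))                         # identity of the closest stored faces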
Other Application used FR
Face recognition applications include security systems and smart home automation systems. Face recognition based voting systems have also been proposed.
Disadvantages
Calling students' names in virtual classrooms to take attendance is both trivial and time-consuming.
Students may resort to unethical activities such as not attending the class while still keeping their status as 'online'.
A student can go offline at any time without letting the teacher know.
It is not easy to find out whether the student is really attending the class.
A student might have turned off the video camera.
All the existing processes consume valuable lecturing time and affect teaching efficiency.
The accuracy of the system is not 100%.
Face detection and loading of training data are somewhat slow.
Faces can only be detected from a limited distance.
The instructor and training-set manager still have to do some work manually.
Reliance on handcrafted features.
High computational complexity.
Proposed System
The proposed system introduces the novel feature of randomness in an AI-based face recognition system to effectively track and manage students' attendance and engagement in virtual classrooms. It enhances the efficacy of attendance management in virtual classrooms by integrating two ancillary modalities: students' real-time responses to CAPTCHAs, Concept QA and UIN (Unique Identification Number) queries. It monitors students' attendance and engagement during virtual learning without affecting their focus on learning.
It proposes two ancillary modalities: verifying students' responses to subject and UIN (Unique Identification Number) queries at random intervals of time (a scheduling sketch follows this list).
It develops a user-friendly attendance recording system for teachers that can automatically record students' attendance and generate attendance reports for virtual classrooms.
Deep learning in the form of Convolutional Neural Networks (CNNs) is used to perform the face recognition.
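The random-interval behaviour could be scheduled roughly as sketched below; this is a simplified illustration, not the actual AIPresent code, and capture_and_verify is a hypothetical placeholder for the real face-capture and query pipeline.

import random
import time

def capture_and_verify(student_id):
    # Placeholder: grab a frame from the student's video stream, run face
    # recognition, and issue a CAPTCHA / Concept QA / UIN query.
    print(time.strftime("%H:%M:%S"), "attendance check for", student_id)

def run_session(student_id, session_minutes=50, min_gap=300, max_gap=600):
    # Trigger checks at unpredictable moments during the class session.
    deadline = time.time() + session_minutes * 60
    while time.time() < deadline:
        time.sleep(random.randint(min_gap, max_gap))   # random gap (seconds)
        capture_and_verify(student_id)

run_session("UIN-2021-001", session_minutes=1, min_gap=5, max_gap=10)   # short demo run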
DCNN
CNNs are a category of neural networks that have proven very effective in areas such as image recognition and classification. CNNs are a type of feed-forward neural network made up of many layers. CNNs consist of filters (or kernels, or neurons) that have learnable weights (parameters) and biases. Each filter takes some inputs, performs convolution and optionally follows it with a non-linearity. A typical CNN architecture can be seen in Fig. 1. The structure of a CNN contains convolutional, pooling, Rectified Linear Unit (ReLU), and fully connected layers.
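A minimal sketch of such a layered architecture written with the Keras API; the input size, layer widths and number of enrolled students are illustrative assumptions rather than the project's actual network:

from tensorflow import keras
from tensorflow.keras import layers

num_students = 40   # hypothetical number of enrolled identities

model = keras.Sequential([
    layers.Input(shape=(64, 64, 1)),            # 64x64 grayscale face crops (assumed)
    layers.Conv2D(32, 3, activation="relu"),    # convolution + ReLU non-linearity
    layers.MaxPooling2D(),                      # pooling layer
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),       # fully connected layer
    layers.Dense(num_students, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()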
Advantages
Randomness ensures that students cannot predict at which instant of time the attendance is registered.
Provides a highly efficient and robust attendance management system for virtual learning.
Monitors students' attendance and engagement during virtual learning without affecting their focus on learning.
Students' attention and engagement in virtual learning are enhanced.
Introduces the novel feature of randomness.
Uses a face-embedding learning approach that yielded a recognition accuracy of 98.95%.
Provides authorized access.
Ease of use.
Multiple face detection.
Provides methods to maximize the number of faces extracted from an image.
Manipulates and recognizes faces in real time using live video data.
Multipurpose software: can be used in different places such as universities, colleges, schools and even businesses.
Objects:
The key objective of AIPresent is to develop a robust system that monitors students' attendance and engagement in a virtual classroom at random intervals of time. It encompasses a novel design using a deep CNN (Convolutional Neural Network) model to capture face biometrics randomly from students' video streams and record their attendance automatically. Thus, the main component of the proposed model is a face recognition module built using AI-DL tools. AIPresent also incorporates ancillary submodules for assessing students' responses to CAPTCHAs and UIN queries, to ensure active engagement in virtual classrooms.
[System architecture overview: the college server provides timetable viewing/generation and question-bank loading; AIPresent performs enrollment, automatic random-time face capture, preprocessing, face detection, feature extraction, face recognition and classification, with database matching on the server for virtual-classroom attendance and attentiveness prediction (including speaker identification and auto QA loading); results are available to the college admin, staff, students and parents.]
Detailed description:
SOFTWARE DESCRIPTION
Python 3.7.4
Python is a general-purpose interpreted, interactive, object-oriented, and high-level programming
language. It was created by Guido van Rossum between 1985 and 1990. Like Perl, Python source code is freely available, under the GPL-compatible Python Software Foundation License.
Python is a high-level, interpreted, interactive and object-oriented scripting language. Python is
designed to be highly readable. It uses English keywords frequently, whereas other languages use punctuation, and it has fewer syntactical constructions than other languages.
Python is widely used by students and working professionals, especially in the web development domain. Some of the key advantages of Python are listed below:
Python is Interpreted − Python is processed at runtime by the interpreter. You do not need to
compile your program before executing it. This is similar to Perl and PHP.
Python is Interactive − You can actually sit at a Python prompt and interact with the interpreter
directly to write your programs.
Python is Object-Oriented − Python supports Object-Oriented style or technique of programming
that encapsulates code within objects.
Python is a Beginner's Language − Python is a great language for beginner-level programmers and supports the development of a wide range of applications, from simple text processing to web browsers to games.
The Python Package Index (PyPI) hosts thousands of third-party modules for Python. Both
Python's standard library and the community-contributed modules allow for endless possibilities.
The most basic use case for Python is as a scripting and automation language. Python isn’t just a
replacement for shell scripts or batch files; it is also used to automate interactions with web
browsers or application GUIs or to do system provisioning and configuration in tools such
as Ansible and Salt. But scripting and automation represent only the tip of the iceberg with
Python.
General application programming with Python
You can create both command-line and cross-platform GUI applications with Python and deploy
them as self-contained executables. Python doesn’t have the native ability to generate a
standalone binary from a script, but third-party packages like cx_Freeze and PyInstaller can be
used to accomplish that.
Data science and machine learning with Python
Sophisticated data analysis has become one of the fastest-moving areas of IT and one of Python's star use cases. The vast majority of the libraries used for data science or machine learning have Python interfaces, making the language the most popular high-level command interface to machine learning libraries and other numerical algorithms.
Web services and RESTful APIs in Python
Python’s native libraries and third-party web frameworks provide fast and convenient ways to
create everything from simple REST APIs in a few lines of code to full-blown, data-driven sites.
Python’s latest versions have strong support for asynchronous operations, letting sites
handle tens of thousands of requests per second with the right libraries.
Metaprogramming and code generation in Python
In Python, everything in the language is an object, including Python modules and libraries
themselves. This lets Python work as a highly efficient code generator, making it possible to
write applications that manipulate their own functions and have the kind of extensibility that
would be difficult or impossible to pull off in other languages.
Python can also be used to drive code-generation systems, such as LLVM, to efficiently create
code in other languages.
“Glue code” in Python
Python is often described as a “glue language,” meaning it can let disparate code (typically
libraries with C language interfaces) interoperate. Its use in data science and machine learning is
in this vein, but that’s just one incarnation of the general idea. If you have applications or
program domains that you would like to hitch up, but cannot talk to each other directly, you can
use Python to connect them.
Python 2 vs. Python 3
Python is available in two versions, which are different enough to trip up many new users.
Python 2.x, the older “legacy” branch, will continue to be supported (that is, receive official
updates) through 2020, and it might persist unofficially after that. Python 3.x, the current and
future incarnation of the language, has many useful and important features not found in Python
2.x, such as new syntax features (e.g., the “walrus operator”), better concurrency controls, and a
more efficient interpreter.
Python 3 adoption was slowed for the longest time by the relative lack of third-party library
support. Many Python libraries supported only Python 2, making it difficult to switch. But over
the last couple of years, the number of libraries supporting only Python 2 has dwindled; all of the
most popular libraries are now compatible with both Python 2 and Python 3. Today, Python 3 is
the best choice for new projects; there is no reason to pick Python 2 unless you have no choice. If
you are stuck with Python 2, you have various strategies at your disposal.
Python’s libraries
The success of Python rests on a rich ecosystem of first- and third-party software. Python
benefits from both a strong standard library and a generous assortment of easily obtained and
readily used libraries from third-party developers. Python has been enriched by decades of
expansion and contribution.
Python’s standard library provides modules for common programming tasks—math, string
handling, file and directory access, networking, asynchronous operations, threading, multiprocess management, and so on. But it also includes modules that manage common,
high-level programming tasks needed by modern applications: reading and writing structured file
formats like JSON and XML, manipulating compressed files, working with internet protocols
and data formats (webpages, URLs, email). Most any external code that exposes a C-compatible
foreign function interface can be accessed with Python’s ctypes module.
The default Python distribution also provides a rudimentary, but useful, cross-platform GUI
library via Tkinter, and an embedded copy of the SQLite 3 database.
The thousands of third-party libraries, available through the Python Package Index (PyPI),
constitute the strongest showcase for Python’s popularity and versatility.
For example:
The Beautiful Soup library provides an all-in-one toolbox for scraping HTML—even tricky,
broken HTML—and extracting data from it.
Requests makes working with HTTP requests at scale painless and simple.
Frameworks like Flask and Django allow rapid development of web services that encompass
both simple and advanced use cases.
Multiple cloud services can be managed through Python’s object model using Apache Libcloud.
NumPy, Pandas, and Matplotlib accelerate math and statistical operations, and make it easy to
create visualizations of data.
Python’s compromises Like C#, Java, and Go, Python has garbage-collected memory
management, meaning the programmer doesn’t have to implement code to track and release
objects. Normally, garbage collection happens automatically in the background, but if that poses
a performance problem, you can trigger it manually or disable it entirely, or declare whole
regions of objects exempt from garbage collection as a performance enhancement.
An important aspect of Python is its dynamism. Everything in the language, including functions and modules themselves, is handled as an object. This comes at the expense of speed (more on
that later), but makes it far easier to write high-level code. Developers can perform complex
object manipulations with only a few instructions, and even treat parts of an application as
abstractions that can be altered if needed.
Python’s use of significant whitespace has been cited as both one of Python’s best and worst
attributes. The indentation on the second line below isn’t just for readability; it is part of
Python’s syntax. Python interpreters will reject programs that don’t use proper indentation to
indicate control flow.
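For example, in this generic two-line snippet the indented second line is what marks it as the body of the if statement:

if 2 + 2 == 4:
    print("this indented line runs as the body of the if")   # indentation defines the block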
Syntactical white space might cause noses to wrinkle, and some people do reject Python for this
reason. But strict indentation rules are far less obtrusive in practice than they might seem in
theory, even with the most minimal of code editors, and the result is code that is cleaner and
more readable.
Another potential turnoff, especially for those coming from languages like C or Java, is how
Python handles variable typing. By default, Python uses dynamic or “duck” typing—great for
quick coding, but potentially problematic in large code bases. That said, Python has recently
added support for optional compile-time type hinting, so projects that might benefit from static
typing can use it.
What is MySQL? – An Introduction to Database Management Systems
Database management is the most important part when you have huge amounts of data around you. MySQL is one of the most popular relational databases used to store and handle data. This section covers the following topics:
What are Data & Database?
Database Management System & Types of DBMS
Structured Query Language (SQL)
MySQL & its features
MySQL Data Types
What are Data & Database?
Suppose a company needs to store the names of hundreds of employees working in the
company in such a way that all the employees can be individually identified. Then, the company
collects the data of all those employees. Now, when I say data, I mean that the company collects
distinct pieces of information about an object. So, that object could be a real-world entity such as
people, or any object such as a mouse, laptop etc.
Database Management System & Types of DBMS
A Database Management System (DBMS) is a software application that interacts with
the user, applications and the database itself to capture and analyze data. The data stored in the
database can be modified, retrieved and deleted, and can be of any type like strings, numbers,
images etc.
Types of DBMS
There are mainly 4 types of DBMS, which are Hierarchical, Relational, Network, and
Object-Oriented DBMS.
Hierarchical DBMS: As the name suggests, this type of DBMS has a style of
predecessor-successor type of relationship. So, it has a structure similar to that of a
tree, wherein the nodes represent records and the branches of the tree represent fields.
Relational DBMS (RDBMS): This type of DBMS, uses a structure that allows the
users to identify and access data in relation to another piece of data in the database.
Network DBMS: This type of DBMS supports many to many relations wherein
multiple member records can be linked.
Object-oriented DBMS: This type of DBMS uses small individual software called
objects. Each object contains a piece of data, and the instructions for the actions to be
done with the data.
Structured Query Language (SQL)
SQL is the core of a relational database which is used for accessing and managing the
database. By using SQL, you can add, update or delete rows of data, retrieve subsets of
information, modify databases and perform many actions. The different subsets of SQL are as
follows:
DDL (Data Definition Language) – It allows you to perform various operations on database objects, such as CREATE, ALTER and DROP.
DML (Data Manipulation Language) – It allows you to access and manipulate data. It
helps you to insert, update, delete and retrieve data from the database.
DCL (Data Control Language) – It allows you to control access to the database.
Example – Grant or Revoke access permissions.
TCL (Transaction Control Language) – It allows you to deal with the transaction of
the database. Example – Commit, Rollback, save point, Set Transaction.
Using MySQL
Of course, there’s not a lot of point to being able to change HTML output dynamically unless
you also have a means to track the changes that users make as they use your website. In the early
days of the Web, many sites used “flat” text files to store data such as usernames and passwords.
But this approach could cause problems if the file wasn’t correctly locked against corruption
from multiple simultaneous accesses. Also, a flat file can get only so big before it becomes
unwieldy to manage—not to mention the difficulty of trying to merge files and perform complex
searches in any kind of reasonable time. That’s where relational databases with structured
querying become essential. And MySQL, being free to use and installed on vast numbers of
Internet web servers, rises superbly to the occasion. It is a robust and exceptionally fast database
management system that uses English-like commands. The highest level of MySQL structure is a
database, within which you can have one or more tables that contain your data. For example,
let’s suppose you are working on a table called users, within which you have created columns for
surname, first name, and
email, and you now wish to add another user. One command that you might use to do this is:
INSERT INTO users VALUES('Smith', 'John', '[email protected]');
Of course, as mentioned earlier, you will have issued other commands to create the database and table and to set up all the correct fields, but the INSERT command here shows how simple it can be to add new data to a database. The INSERT command is an example of SQL (which stands for Structured Query Language), a language designed in the early 1970s and reminiscent of one of the oldest programming languages, COBOL. It is well suited, however, to database queries, which is why it is still in use after all this time.
It's equally easy to look up data. Let's assume that you have an email address for a user and you need to look up that person's name. To do this, you could issue a MySQL query such as:
SELECT surname, firstname FROM users WHERE email='[email protected]';
MySQL will then return Smith, John and any other pairs of names that may be associated with that email address in the database. As you'd expect, there's quite a
bit more that you can do with MySQL than just simple INSERT and SELECT commands. For
example, you can join multiple tables according to various criteria, ask for results in a variety of
different orders, make partial matches when you know only part of the string that you are
searching for, return only the nth result, and a lot more. Using PHP, you can make all these calls
directly to MySQL without having to run the MySQL program yourself or use its command-line
interface. This means you can save the results in arrays for processing and perform multiple
lookups, each dependent on the results returned from earlier ones, to drill right down to the item
of data you need. For even more power, as you’ll see later, there are additional functions built
right into MySQL that you can call up for common operations and extra speed.
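Since this project uses Python rather than PHP, the same INSERT and SELECT queries can be issued from Python through a connector library. The sketch below assumes the PyMySQL package and hypothetical credentials, table columns and email address:

import pymysql   # assumed third-party connector (pip install pymysql)

conn = pymysql.connect(host="localhost", user="aipresent",
                       password="secret", database="attendance")
try:
    with conn.cursor() as cur:
        # Add a user, then look the name up again by email (parameterized queries).
        cur.execute("INSERT INTO users (surname, firstname, email) VALUES (%s, %s, %s)",
                    ("Smith", "John", "jsmith@example.com"))
        conn.commit()
        cur.execute("SELECT surname, firstname FROM users WHERE email = %s",
                    ("jsmith@example.com",))
        print(cur.fetchall())
finally:
    conn.close()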
The Apache Web Server
In addition to PHP, MySQL, JavaScript, and CSS, there’s actually a fifth hero in the dynamic
Web: the web server. In this project, that means the Apache web server. We've
discussed a little of what a web server does during the HTTP server/client exchange, but it
actually does much more behind the scenes. For example, Apache doesn’t serve up just HTML
files—it handles a wide range of files, from images and Flash files to MP3 audio files, RSS
(Really Simple Syndication) feeds, and more. Each element a web client encounters in an HTML
page is also requested from the server, which then serves it up. But these objects don’t have to be
static files, such as GIF images. They can all be generated by programs such as PHP scripts.
That’s right: PHP can even create images and other files for you, either on the fly or in advance
to serve up later. To do this, you normally have modules either precompiled into Apache or PHP
or called up at runtime. One such module is the GD library (short for Graphics Draw), which
PHP uses to create and handle graphics.
Apache also supports a huge range of modules of its own. In addition to the PHP module, the
most important for your purposes as a web programmer are the modules that handle security.
Other examples are the Rewrite module, which enables the web server to handle a varying range
of URL types and rewrite them to its own internal requirements, and the Proxy module, which
you can use to serve up often-requested pages from a cache to ease the load on the server. Some of these modules can be used to enhance the features provided by the core technologies covered here.
About Open Source
Whether or not being open source is the reason these technologies are so popular has often been debated, but PHP, MySQL,
and Apache are the three most commonly used tools in their categories. What can be said,
though, is that being open source means that they have been developed in the community by
teams of programmers writing the features they themselves want and need, with the original code
available for all to see and change. Bugs can be found and security breaches can be prevented
before they happen. There’s another benefit: all these programs are free to use. There’s no
worrying about having to purchase additional licenses if you have to scale up your website and
add more servers. And you don’t need to check the budget before deciding whether to upgrade to
the latest versions of these products.
What Is a WAMP, MAMP, or LAMP?
WAMP, MAMP, and LAMP are abbreviations for “Windows, Apache, MySQL, and PHP,”
“Mac, Apache, MySQL, and PHP,” and “Linux, Apache, MySQL, and PHP,” respectively. These abbreviations describe a fully functioning setup used for
developing dynamic Internet web pages. WAMPs, MAMPs, and LAMPs come in the form of a
package that binds the bundled programs together so that you don’t have to install and set them
up separately. This means you can simply download and install a single program and follow a
few easy prompts to get your web development server up and running in the quickest time with
the minimum hassle. During installation, several default settings are created for you. The security
configurations of such an installation will not be as tight as on a production web server, because
it is optimized for local use. For these reasons, you should never install such a setup as a
production server. However, for developing and testing websites and applications, one of these
installations should be entirely sufficient.
Using an IDE
As good as dedicated program editors can be for your programming productivity, their utility
pales into insignificance when compared to Integrated Development Environments (IDEs), which
offer many additional features such as in-editor debugging and program testing, as well as
function descriptions and much more.
Web Framework
Web Application Framework or simply Web Framework represents a collection of libraries and
modules that enables a web application developer to write applications without having to bother
about low-level details such as protocols, thread management etc.
Flask
Flask is a web framework. This means flask provides you with tools, libraries and technologies
that allow you to build a web application. This web application can be some web pages, a blog, a
wiki or go as big as a web-based calendar application or a commercial website.
Flask is often referred to as a micro framework. It aims to keep the core of an application simple
yet extensible. Flask does not have a built-in abstraction layer for database handling, nor does it have built-in form validation support. Instead, Flask supports extensions that add such functionality to the application. Although Flask is rather young compared to
most Python frameworks, it holds a great promise and has already gained popularity among
Python web developers. Let’s take a closer look into Flask, so-called “micro” framework for
Python.
Flask was designed to be easy to use and extend. The idea behind Flask is to build a solid
foundation for web applications of different complexity. From then on you are free to plug in any
extensions you think you need. Also, you are free to build your own modules. Flask is great for
all kinds of projects. It's especially good for prototyping.
Flask belongs to the category of micro-frameworks: frameworks with little to no dependency on external libraries. This has pros and cons. The pros are that the framework is light and there are few dependencies to update and watch for security bugs; the cons are that at times you will have to do more work yourself or grow the list of dependencies by adding plugins. In the case of Flask, its dependencies are:
WSGI
Web Server Gateway Interface (WSGI) has been adopted as a standard for Python web
application development. WSGI is a specification for a universal interface between the web
server and the web applications.
Werkzeug
It is a WSGI toolkit, which implements requests, response objects, and other utility functions.
This enables building a web framework on top of it. The Flask framework uses Werkzeug as one
of its bases.
Jinja2
Jinja2 is a popular templating engine for Python. A web templating system combines a template
with a certain data source to render dynamic web pages.
Flask offers the following salient features:
built-in development server and fast debugger
integrated support for unit testing
RESTful request dispatching
Jinja2 templating
support for secure cookies (client-side sessions)
WSGI 1.0 compliant
Unicode based
Plus, Flask gives you much more control over the development stage of your project. It follows the principles of minimalism and lets you decide how you will build your application.
Flask has a lightweight and modular design, so it is easy to transform it into the web framework you need with a few extensions without weighing it down.
ORM-agnostic: you can plug in your favorite ORM, e.g., SQLAlchemy.
The basic foundation API is nicely shaped and coherent.
The Flask documentation is comprehensive, full of examples and well structured. You can even try out some sample applications to really get a feel for Flask.
It is super easy to deploy Flask in production (Flask is 100% WSGI 1.0 compliant).
HTTP request handling functionality
High Flexibility
To sum up, Flask is one of the most polished and feature-rich micro frameworks available. Still
young, Flask has a thriving community, first-class extensions, and an elegant API. Flask comes
with all the benefits of fast templates, strong WSGI features, thorough unit testability at the web application and library level, and extensive documentation. So next time you are starting a new
project where you need some good features and a vast number of extensions, definitely check out
Flask.
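As a concrete illustration, a complete Flask application fits in a few lines; the route and response below are a generic sketch, not the actual AIPresent web interface:

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/attendance/<uin>")
def attendance(uin):
    # Hypothetical endpoint: in AIPresent this would look up the student's record.
    return jsonify({"uin": uin, "status": "present"})

if __name__ == "__main__":
    app.run(debug=True)   # built-in development server with the debugger enabled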
Claims:
The Random Interval Attendance Management System (AIPresent) is an innovation based on Artificial Intelligence and Deep Learning, specially designed to help teachers and instructors across the globe in the effective management of attendance during virtual learning. AIPresent facilitates precise and automatic tracking of students' attendance in virtual classrooms. It incorporates a customized face recognition module along with specially designed ancillary submodules. Both the face recognition and the sub-modalities are used for students' attendance monitoring in virtual classrooms. The submodules check students' responses to CAPTCHAs, Concept QA and UIN queries. The system captures face biometrics from the video streams of participants and gathers the timely responses of students to Concept QA and UIN queries at random intervals of time. An intelligible and adaptive weighting strategy is employed for finalizing the decisions from the three modalities. AIPresent can be integrated with any existing virtual meeting platform through an application interface such as a web page or a dedicated app.
Abstract:
The COVID-19 pandemic outbreak has resulted in an unprecedented crisis across the globe. The pandemic created an enormous demand for innovative technologies to solve crisis-specific problems in different sectors of society. In the case of the education sector and allied learning technologies, significant issues have emerged while substituting face-to-face learning with online virtual learning. Several countries have closed educational institutions temporarily to alleviate the spread of COVID-19. The closure of educational institutions compelled teachers across the globe to use online meeting platforms extensively. The virtual classrooms created by online meeting platforms have been adopted as the only alternative to face-to-face interaction in physical classrooms. In this regard, students' attendance management in virtual classes is a major challenge encountered by teachers. Student attendance is a measure of their engagement in a course, which has a direct relationship with their active learning. However, during virtual learning, it is exceptionally challenging to keep track of the attendance of students. Calling students' names in virtual classrooms to take attendance is both trivial and time-consuming. Thus, in the backdrop of the COVID-19 pandemic and the extensive usage of virtual meeting platforms, there is a crisis-specific, immediate necessity to develop a proper tracking system to monitor students' attendance and engagement during virtual learning. In this project, we address this pandemic-induced crucial necessity by introducing a novel approach. In order to realize a highly efficient and robust attendance management system for virtual learning, we introduce the Random Interval Query and Face Recognition Attendance Management System (hereafter, AIPresent). To the best of our knowledge, no such automated system has been proposed so far for tracking students' attendance and ensuring their engagement during virtual learning.