
VISVESVARAYA TECHNOLOGICAL UNIVERSITY,

BELAGAVI – 590018

Project On
“REAL-TIME FACE DETECTION AND
ATTENDANCE SYSTEM USING NVIDIA
JETSON NANO KIT”
A Report submitted in Partial Fulfillment of the Requirements for the Award of the Degree of
BACHELOR OF ENGINEERING
In
COMPUTER SCIENCE AND ENGINEERING
(DATA SCIENCE)

Submitted by
Prashanth J N 4AL23CD032

Under Supervision of
Prof. Harish Kunder
Dept. of Artificial Intelligence & Machine Learning

DEPARTMENT OF ARTIFICIAL INTELLIGENCE & MACHINE LEARNING

ALVA’S INSTITUTE OF ENGINEERING & TECHNOLOGY, MIJAR


(Unit of Alva’s Education Foundation ®, Moodbidri) Affiliated to Visvesvaraya
Technological University, Belagavi, Approved by AICTE, New Delhi, Recognized by
Government of Karnataka. Accredited by NAAC with A+ Grade
Shobavana Campus, Mijar, Moodbidri, D.K., Karnataka

2024 – 2025

DEPARTMENT OF ARTIFICIAL INTELLIGENCE & MACHINE LEARNING

CERTIFICATE
This is to certify that the “Project Work” entitled “NVIDIA JETSON DEVELOPER
KIT, REAL-TIME FACE DETECTION AND ATTENDANCE SYSTEM” has been
successfully completed by PRASHANTH J N (4AL23CD032) in partial fulfillment for the award of the
Degree of Bachelor of Engineering in Artificial Intelligence and Machine Learning of
Visvesvaraya Technological University, Belagavi, during the year 2024–2025. It is certified that all
corrections/suggestions indicated have been incorporated in the report. The project report has been
approved as it satisfies the academic requirements in respect of the project work prescribed for the
award of the Bachelor of Engineering Degree.

Prof. Harish Kunder


HOD, Dept. of AIML, DS
ACKNOWLEDGEMENT
The satisfaction and euphoria that accompany the successful completion of any task would be
incomplete without the mention of the people who made it possible. Success is the epitome of hard
work and perseverance, but above all it rests on encouraging guidance.

So, with gratitude, we acknowledge all those whose guidance and encouragement served as a
beacon of light and crowned the effort with success.

I sincerely thank Prof. Harish Kunder, Associate Professor & HOD, Department of Artificial
Intelligence and Machine Learning, who has been the constant driving force behind the
completion of this project work.

I thank our beloved Principal Dr. Peter Fernandes, for his constant help and support throughout.

I am indebted to the Management of Alva’s Institute of Engineering and Technology, Mijar,
Moodbidri, for providing an environment which helped us in completing our mini project.

Also, I thank all the friends, family members, and teaching and non-teaching staff of the
Department of Artificial Intelligence and Machine Learning (Data Science) for the help rendered.

PRASHANTH J N
4AL23CD032
ABSTRACT

This project aims to build a real-time face detection and attendance system using the NVIDIA Jetson
Nano Developer Kit. It uses computer vision techniques to detect faces from live video feeds, recognize
known individuals, and log attendance automatically.

The system leverages the face_recognition library along with OpenCV for handling video
processing. A lightweight local CSV file is used to store and track attendance.

The project demonstrates an efficient and cost-effective edge AI solution for educational institutions or
workplaces to automate attendance using facial recognition.

This project provided practical experience in deploying AI models on embedded hardware, working with
real-time video processing, and integrating software components to form a complete solution.
TABLE OF CONTENTS

1. Introduction
2. Overview of NVIDIA Jetson Nano
3. Prepare for Setup
4. Write the Image to the microSD Card
5. Different Types of NVIDIA Kits
6. Applications and Use Cases
7. Project on Live Image Detection
   Objectives
   Methodology
   Tools Used
8. Challenges Faced
9. Outcomes
10. Conclusion
11. References
INTRODUCTION
Over the years, facial recognition technology has emerged as one of the most transformative innovations in
the field of computer vision and artificial intelligence. With its ability to identify and verify individuals based
on their facial features, it has found widespread applications across multiple domains, including surveillance,
security, financial systems, smart homes, and automated identity verification processes. Among these
applications, the concept of automating attendance systems using facial recognition has gained significant
traction, particularly in educational institutions and corporate environments, where traditional manual or
biometric methods are either inefficient or vulnerable to proxy attendance and manipulation.

This project focuses on building a real-time Face Detection and Attendance System that leverages the
capabilities of the NVIDIA Jetson Nano Developer Kit—a powerful yet compact edge AI computing platform
designed for embedded AI applications. The Jetson Nano supports GPU-accelerated computing, enabling the
execution of deep learning algorithms in real time with high efficiency. Its low cost and energy efficiency
make it an ideal choice for deploying AI models at the edge, especially in environments where cloud
connectivity may not be reliable or data privacy is a concern.

The goal of this project was to design and implement a system that can continuously capture video from a live
feed, detect and recognize faces using pre-trained models, and then automatically log the presence of identified
individuals into a database. This real-time, contactless attendance system aims to replace conventional
methods such as manual roll calls, RFID cards, and fingerprint scanners, thereby reducing time consumption
and the risk of infection due to shared physical contact—an especially important consideration in post-
pandemic scenarios.

The system is implemented using Python along with the face_recognition and OpenCV libraries, with a
lightweight SQLite database for local storage. It uses the face_recognition library, built on top of dlib, to detect
and encode facial features, which are then compared in real time with a dataset of known faces. Recognized
individuals are marked present along with the timestamp of detection.
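At its core, this step amounts to a few calls to the face_recognition API. The following minimal sketch illustrates the idea, assuming one reference image of a known person; the file names and the 0.6 tolerance are illustrative placeholders rather than values taken from the project code.

import face_recognition

# Encode one known face from a reference image (illustrative file name).
known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Encode whatever faces appear in a captured frame and compare them.
frame = face_recognition.load_image_file("captured_frame.jpg")
for encoding in face_recognition.face_encodings(frame):
    # compare_faces returns one boolean per known encoding.
    if face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]:
        print("Recognized: known_person")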

Through this project, students gained valuable experience in configuring and deploying embedded systems,
understanding hardware-software integration, working with AI and deep learning models, and handling
challenges associated with real-time video stream processing. The implementation also offered insight into
edge computing, where inference is performed locally on the device without the need for cloud processing.
As a result, the system operates efficiently even in low-connectivity environments.

This face detection and attendance solution not only contributes to smart automation in academic and corporate
setups but also serves as a foundation for building scalable AI-based monitoring systems. With further
enhancement, it can be expanded to support features such as spoof detection, mask detection, or cloud-based
dashboards for attendance analytics.



OVERVIEW:
JETSON NANO DEVELOPER KIT
The NVIDIA® Jetson Nano™ Developer Kit is a compact yet powerful AI computer
designed for makers, students, hobbyists, and developers who are eager to explore the world
of artificial intelligence. It offers an affordable and accessible platform to learn, prototype,
and deploy AI solutions with ease. With its impressive computing capabilities packed into
a small form factor, the Jetson Nano enables users to build a wide range of AI-powered
projects, from smart devices and intelligent robots to computer vision applications and edge
AI systems. By following this brief guide, you’ll be fully equipped to set up your developer
kit and take your first steps toward creating innovative, real-world AI applications, exciting
autonomous robots, and much more.

Figure 3.1 Jetson Nano kit

Included in the Box:

• NVIDIA Jetson module and reference carrier board


• Small paper card with quick start and support information

• Folded paper stand

Items not Included:

• microSD card (32GB UHS-1 minimum recommended)


• USB keyboard and mouse



Prepare for setup
Before you can start building exciting AI projects with your Jetson Nano, it's important to
properly set up your developer kit. This section covers everything you’ll need to get ready
for your first project:
Items Needed for Getting Started:
microSD Card: The Jetson Nano Developer Kit uses a microSD card as its primary boot
device and for all main storage. Choosing the right card is crucial for the best performance
and a smooth experience. You should use a high-quality microSD card with a minimum
capacity of 32 GB and support for UHS-1 speed or better.
A larger and faster card will allow for quicker data handling and smoother operation. You will
need to flash the microSD card with the appropriate operating system image and essential
software before use. Detailed instructions for flashing your microSD card are provided in
the next section.

Micro-USB Power Supply: The developer kit must be powered through its Micro-USB
port using a reliable, high-quality power supply capable of consistently delivering 5V⎓2A.
It’s important to note that not every power adapter labeled "5V⎓2A" can actually maintain
that output under real-world conditions. Insufficient power can lead to instability, boot
failures, or performance issues.
For a tested and reliable option, NVIDIA recommends Adafruit’s 5V 2.5A Switching Power
Supply with a 20AWG Micro-USB Cable (GEO151UB-6025). This specific model is
designed to address common problems associated with standard USB power supplies,
ensuring a stable and sufficient power supply for the Jetson Nano. For more information,
please refer to the product details linked on Adafruit’s website.

Note:
We can usually find the claimed power output of a USB power supply printed directly on
its label. However, actual performance can vary between products, even if they list the same
specifications.



Write the Image to the microSD Card

Before you can start using your Jetson Nano Developer Kit, you need to prepare your
microSD card by writing the correct operating system image to it. This process requires a
computer with an internet connection and the ability to read and write SD cards, either
through a built-in card reader or a USB adapter.
Step 1: Download the image: First, download the latest Jetson Nano Developer Kit SD
Card Image from the official NVIDIA website. Make sure to note the location where the
file is saved on your computer—you will need it shortly.
Step 2: Write the Image to the microSD Card: The instructions for writing the image
depend on your computer’s operating system: Windows, macOS, or Linux. Here, we'll walk
through the Windows setup process.
Instructions for Windows:
A. Format the microSD Card
Before writing the image, it's important to properly format your microSD card:
1. Download and install the SD Memory Card Formatter from the SD Association.
2. Launch the SD Memory Card Formatter application.
3. Select the correct drive corresponding to your microSD card (be very careful to
select the right one).
4. Choose "Quick format" as the formatting option.
5. Leave the "Volume label" field blank.
6. Click "Format" to start formatting the card, and click "Yes" when prompted with a
warning dialog.
B. Flash the Image Using Etcher
Next, use Etcher (a simple and reliable tool) to write the downloaded image to the microSD
card:
1. Download, install, and launch Etcher (available for free).
2. Click "Select image" and browse to choose the zipped image file you downloaded
earlier.
3. Insert your microSD card into your computer if it isn’t already inserted.
4. If Windows displays a dialog box saying it doesn’t recognize the card or asks to
format it, click Cancel.
5. In Etcher, click "Select drive" and choose your microSD card. Double-check that
you are selecting the correct device to avoid overwriting other drives.



6. Click "Flash!" to start the process. Writing and validating the image typically takes
about 10 minutes if your microSD card is connected via a USB 3.0 port.
Step 3: Continue with Developer Kit Setup: Once the flashing process is complete and
the microSD card is safely removed, you are ready to proceed to setting up your Jetson
Nano Developer Kit!

4.1 Different Types of NVIDIA Kits


NVIDIA provides a wide range of development kits designed for education, prototyping,
and commercial deployment. These kits enable developers, researchers, and students to
build powerful AI applications across industries. Some of the major NVIDIA development
kits include:
1. Jetson Nano – A low-cost, entry-level AI kit perfect for beginners, education, and
small projects like object detection and robotics.

Figure 4.3.1 Jetson Nano

2. Jetson Xavier NX – Offers higher computational power, ideal for edge AI applications
requiring more processing capability, such as surveillance, medical imaging, and smart retail.

Figure 4.3.2 Jetson Xavier NX



3. Jetson AGX Orin – Designed for advanced robotics, autonomous machines, and
industrial AI, this kit provides top-tier GPU performance with energy efficiency.

Figure 4.3.3 Jetson AGX Orin

4. NVIDIA Clara Developer Kit – Tailored for healthcare AI, used in radiology,
genomics, and drug discovery.

Figure 4.3.4 NVIDIA Clara Developer Kit



5. NVIDIA DRIVE Kits – Automotive-focused kits for autonomous vehicles,
providing a platform for perception, mapping, and decision-making.

Figure 4.3.5 NVIDIA DRIVE Kits

6. NVIDIA Isaac Kits – Designed for robotics and simulation, these kits support
high-level autonomy in machines like delivery bots and factory robots.

Figure 4.3.6. NVIDIA Isaac Kits

These kits are supported by the NVIDIA JetPack SDK, which includes libraries for AI,
computer vision, multimedia processing, and deep learning frameworks.



4.2 Applications and Use Cases
NVIDIA’s development kits are used in a broad spectrum of real-world
applications:
• Education and Training: Jetson Nano is widely used in schools,
colleges, and workshops to teach AI and robotics fundamentals.
• Robotics: Kits like Jetson Xavier and Isaac are employed in building
drones, robotic arms, and autonomous navigation systems.

• Healthcare: AI-enabled research in hospitals and labs.


• Smart Cities: Jetson AGX Orin is used in surveillance, traffic
monitoring, and crowd analytics.
• Automotive: DRIVE kits power in-vehicle infotainment, self-driving car
algorithms, and safety systems.
• Industrial IoT: NVIDIA edge AI kits are used in manufacturing for
predictive maintenance, defect detection, and process automation.
• Agriculture: AI kits assist in crop monitoring, weed detection, and
autonomous tractors or drones for farming.
The versatility and performance of NVIDIA’s kits have made them the
backbone of many emerging technologies and research projects, enabling
rapid prototyping and commercial-scale deployment.



PROJECT ON LIVE IMAGE DETECTOR

The Face Detection and Attendance System project was conceptualized and executed as part of
the NVIDIA Jetson Nano-based training initiative. The project aimed to provide students with
practical, hands-on exposure to edge AI deployment using real-time facial recognition
techniques. The core objective was to develop a fully functional, real-time system that could
detect and recognize human faces from a live video feed and automatically log attendance
based on recognition results.

Unlike generic object detection models, this project utilized a specialized pipeline built on
Python libraries such as face_recognition, dlib, and OpenCV, optimized to run on the Jetson
Nano Developer Kit. The face recognition engine compares live camera input against a pre-
encoded dataset of known individuals, and upon successful identification, records the
recognized person's name along with a timestamp into a local SQLite database. This automated
process replaces traditional attendance systems, offering a contactless and efficient alternative
suitable for classrooms, offices, and restricted access environments.
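A brief sketch of such timestamped logging is shown below; the database file name, the table layout, and the mark_attendance helper are assumptions made for this illustration rather than the project's actual schema.

import sqlite3
from datetime import datetime

def mark_attendance(name, db_path="attendance.db"):
    # Insert one timestamped attendance row for a recognized person.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS attendance (name TEXT, timestamp TEXT)")
    conn.execute(
        "INSERT INTO attendance VALUES (?, ?)",
        (name, datetime.now().strftime("%Y-%m-%d %H:%M:%S")),
    )
    conn.commit()
    conn.close()

# Example: called whenever the recognizer identifies a known person.
mark_attendance("Prashanth")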

By leveraging the GPU-accelerated capabilities of Jetson Nano, students were able to process
high-resolution video frames in real-time with minimal latency, making the system highly
responsive and reliable. This efficiency was made possible by using hardware-accelerated
inference through CUDA, which allowed the system to maintain detection performance even
with constrained computational resources.

Throughout the implementation, students explored how to fine-tune system performance under
varying real-world conditions. They dealt with challenges such as varying lighting, camera
positioning, multiple faces in a frame, and system resource management. Parameters such as
detection confidence, matching thresholds, and bounding box overlays were adjusted to
improve recognition accuracy and avoid false positives or missed identifications.

The final system integrated a real-time display window that showed live camera feed with
overlays indicating detected faces, their names, and recognition confidence levels. Behind the
scenes, every recognized face was also logged in a database, allowing the retrieval of daily
attendance records for analysis or reporting. The interface can be further extended with web-
based dashboards for administrative access or cloud synchronization features.

This project not only provided students with a robust learning experience in embedded AI
development and computer vision, but also equipped them with skills in system integration,
real-time performance tuning, and practical application of AI models. The solution developed
is applicable in real-world scenarios like smart classrooms, secure office entry systems, and
automated monitoring setups, forming a strong foundation for future enhancements in AI-based
identity verification and automation.



OBJECTIVES:
➢ To implement a real-time face detection and recognition system using Jetson Nano.
➢ To automate attendance logging based on facial recognition with time stamps.
➢ To optimize detection performance using GPU acceleration and edge deployment.
➢ To gain hands-on experience in deploying AI models on embedded platforms.

Methodology:
The Face Detection and Attendance System project followed a structured, step-by-step
implementation pipeline that enabled students to understand and deploy real-time face recognition
models on the Jetson Nano platform. The process began with flashing the Jetson Nano using the
JetPack SDK, which includes essential components such as CUDA, cuDNN, and TensorRT. Once
the OS was installed, the necessary Python libraries such as face_recognition, OpenCV, dlib, and
SQLite3 were set up to enable face detection, recognition, and database storage.
A custom dataset of known faces was created and encoded using the face_recognition library. These
facial encodings were stored and later compared to faces detected from real-time video input. The
face recognition logic was integrated into a Python script that continuously accessed a live video
feed and detected known individuals.
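The sketch below illustrates one way this encoding step can be organized, assuming a known_faces/ folder that holds one image per person named after that person; the folder layout and function name are assumptions for illustration.

import os
import face_recognition

def load_known_faces(folder="known_faces"):
    # Build parallel lists of names and 128-dimensional face encodings from a folder of images.
    names, encodings = [], []
    for filename in os.listdir(folder):
        if not filename.lower().endswith((".jpg", ".jpeg", ".png")):
            continue
        image = face_recognition.load_image_file(os.path.join(folder, filename))
        found = face_recognition.face_encodings(image)
        if found:                                   # skip images where no face was detected
            names.append(os.path.splitext(filename)[0])
            encodings.append(found[0])              # use the first detected face
    return names, encodings

known_names, known_encodings = load_known_faces()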
Next, students explored various input sources for detection. These included:

• Static images, used for initial verification of face encodings and detection logic

• Recorded video files to simulate continuous input and assess system performance
• Live camera streams, accessed via MIPI CSI (csi://0) or USB webcam (/dev/video0)

These input modes were chosen to expose students to different types of vision data and help them
understand how facial recognition models behave under varied input formats and stream
conditions.
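The sketch below shows how these two camera sources are commonly opened with OpenCV on the Jetson Nano; the GStreamer pipeline string is a widely used pattern for the CSI camera and is shown here as an assumption rather than the project's exact configuration.

import cv2

def open_camera(use_csi=False):
    # Open either the CSI camera (via a GStreamer pipeline) or a USB webcam at /dev/video0.
    if use_csi:
        # Typical nvarguscamerasrc pipeline for a MIPI CSI camera on the Jetson (illustrative).
        pipeline = (
            "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1280, height=720, "
            "framerate=30/1 ! nvvidconv ! video/x-raw, format=BGRx ! "
            "videoconvert ! video/x-raw, format=BGR ! appsink"
        )
        return cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    return cv2.VideoCapture(0)

cap = open_camera(use_csi=False)
ret, frame = cap.read()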
For model execution, a custom Python script utilizing the face_recognition and OpenCV libraries
was used.
It was configured with adjustable parameters including face distance thresholds, display options (e.g.,
bounding boxes, labels, time stamps), and optional flags to save attendance logs to a SQLite database.
As the models ran, the real-time output was rendered using OpenCV windows.
Detected faces were highlighted with rectangles and labeled with their names and recognition
confidence scores. In addition to the live video overlay, attendance records were written to a local
database with time-stamped entries, enabling students to review logs and assess recognition reliability
post-execution.
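A condensed sketch of such a recognition-and-overlay loop is given below. It assumes the known_names/known_encodings lists and the mark_attendance helper sketched earlier; the 0.6 match threshold and the window name are illustrative choices, not the project's tuned settings.

import cv2
import face_recognition

video = cv2.VideoCapture(0)                 # USB webcam; a CSI pipeline could be used instead
while True:
    ret, frame = video.read()
    if not ret:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)      # face_recognition expects RGB images
    locations = face_recognition.face_locations(rgb)
    encodings = face_recognition.face_encodings(rgb, locations)

    for (top, right, bottom, left), encoding in zip(locations, encodings):
        distances = face_recognition.face_distance(known_encodings, encoding)
        name = "Unknown"
        if len(distances) and distances.min() < 0.6:  # illustrative match threshold
            name = known_names[distances.argmin()]
            mark_attendance(name)                     # log the recognized person
        cv2.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
        cv2.putText(frame, name, (left, top - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

    cv2.imshow("Attendance", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):             # press 'q' to stop the feed
        break
video.release()
cv2.destroyAllWindows()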
Finally, a thorough analysis phase was conducted. Students reviewed terminal logs that displayed
recognition results, match confidence values, and attendance entries. They compared recognition



behavior under different lighting conditions, face angles, and movement speeds to evaluate system
accuracy, latency, and robustness. This analysis provided a critical understanding of the strengths and
limitations of deploying face recognition models in dynamic, real-world environments.

System Setup and Flashing:


The project began with flashing the Jetson Nano Developer Kit using the official JetPack SDK,
which includes the essential components for deploying deep learning models:
➢ Jetson Linux (Ubuntu-based)
➢ CUDA Toolkit
➢ cuDNN
➢ TensorRT
➢ Multimedia API support
Once the Jetson Nano was successfully flashed and booted, a USB webcam or CSI camera
was configured and tested for real-time image capture. The development environment was
then set up by installing the necessary Python packages. Initial trials were conducted on static
images and short video clips to verify the accuracy of face detection and recognition. Once
validated, the real-time camera feed was activated, and the live detection and attendance
logging pipeline was deployed and tested under varied environmental conditions.
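Before moving to live detection, a short sanity check such as the following can confirm that the key libraries import and run correctly on the flashed system; the test image name is an illustrative assumption.

import cv2
import dlib
import face_recognition

print("OpenCV version:", cv2.__version__)
print("dlib version:", dlib.__version__)
print("dlib built with CUDA:", dlib.DLIB_USE_CUDA)

# Quick functional check on a single test image (illustrative file name).
image = face_recognition.load_image_file("test_face.jpg")
print("Faces found in test image:", len(face_recognition.face_locations(image)))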
Tools and Their Roles:

• Jetson Nano: A compact edge AI hardware platform responsible for executing face
detection and recognition models locally in real time.
• JetPack SDK: Provides the essential libraries and drivers including
CUDA, cuDNN, and TensorRT, enabling GPU-accelerated inference on the
Jetson Nano.
• Python: Used for scripting the face detection and attendance pipeline,
managing camera input, running recognition logic, and logging data into
the database.
• face_recognition Library: A Python library built on dlib used for
encoding, comparing, and recognizing faces with high accuracy.
• OpenCV: Handles real-time video capture from the camera, displays the
GUI window with bounding boxes and labels, and assists in frame-by-frame
image processing.
• CSV File: A simple, lightweight file format used to store attendance records by saving
recognized names and timestamps in a structured, tabular format.
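For the CSV option, a minimal sketch of the logging helper might look as follows; the file name, column layout, and the once-per-day de-duplication check are assumptions made for illustration.

import csv
from datetime import datetime

def mark_attendance_csv(name, path="attendance.csv"):
    # Append a name/date/time row, skipping duplicates for the same person on the same day.
    today = datetime.now().strftime("%Y-%m-%d")
    try:
        with open(path, newline="") as f:
            already_marked = any(
                row and row[0] == name and row[1] == today for row in csv.reader(f)
            )
    except FileNotFoundError:
        already_marked = False
    if not already_marked:
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([name, today, datetime.now().strftime("%H:%M:%S")])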



Challenges Faced:

• Limited Memory and Storage: The Jetson Nano’s 2GB RAM and limited storage
made it difficult to handle large face datasets and high-resolution video streams, leading
to system lag during real-time detection.

• Library Installation Difficulties: Installing essential libraries like dlib, face_recognition,
and OpenCV on the Jetson Nano was challenging due to compatibility issues with the
ARM-based architecture and required manual compilation.

• Camera and Device Access Errors: The system sometimes failed to detect
the CSI/USB camera, which was resolved by verifying device ports, updating
permissions, and restarting the video capture service.

• Recognition Accuracy in Varying Conditions: The face recognition model's performance
dropped in poor lighting or at unusual angles. This was improved by adjusting face distance
thresholds and preprocessing image data.
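The sketch below illustrates the kind of tuning this refers to, combining simple luminance equalization with a stricter match tolerance; both the preprocessing choice and the 0.5 value are illustrative assumptions rather than the project's final settings.

import cv2
import face_recognition

def preprocess(frame_bgr):
    # Equalize the luminance channel to reduce the effect of poor lighting, returning an RGB image.
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2RGB)

TOLERANCE = 0.5   # stricter than the library's ~0.6 default (illustrative value)

def recognize(frame_bgr, known_names, known_encodings):
    # Return one name (or "Unknown") per face found in the preprocessed frame.
    rgb = preprocess(frame_bgr)
    results = []
    for encoding in face_recognition.face_encodings(rgb):
        matches = face_recognition.compare_faces(known_encodings, encoding, tolerance=TOLERANCE)
        results.append(known_names[matches.index(True)] if True in matches else "Unknown")
    return results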

Outcomes:
The Face Detection and Attendance System was successfully implemented and deployed on the
Jetson Nano. The system:
• Accurately detected and recognized known faces from live video streams in real time.
• Automatically logged attendance with name and timestamp into a local file/database.
• Provided hands-on experience in deploying face recognition on embedded GPU hardware.
• Served as a functional prototype for smart, contactless attendance systems in real-world
environments.



Figure: Detecting the face and marking attendance for known faces

Figure: Marking unknown faces as unknown



CONCLUSION

The development of the Face Detection and Attendance System using NVIDIA Jetson Nano
provided an impactful and hands-on opportunity to integrate artificial intelligence with embedded
systems in a practical and purposeful application. The core objective of the project—to automate
attendance using real-time face recognition—was successfully achieved, resulting in a working
prototype capable of identifying individuals from a live camera feed and recording their presence
in a structured, timestamped database.

The project helped bridge the gap between theory and practice. Through this experience, students
explored how facial recognition works at the algorithmic level and learned how to deploy those
models in resource-constrained environments like the Jetson Nano. Key tasks included
configuring the JetPack SDK, setting up camera input, installing essential libraries
(face_recognition, OpenCV, SQLite), and handling real-time video stream processing. Each of
these steps provided technical insight into real-world AI deployment on embedded platforms.

One of the most significant learning outcomes was the ability to handle limitations imposed by
the hardware itself. The Jetson Nano’s 2GB RAM and limited storage presented challenges that
required careful optimization of image resolution, memory management, and library installations.
These obstacles not only strengthened troubleshooting skills but also reinforced an understanding
of system constraints when developing AI solutions at the edge.

Additionally, the project promoted collaborative problem-solving as students encountered and
resolved issues related to camera initialization, frame drops, recognition accuracy under different
lighting conditions, and duplication in attendance logging. The process of testing, debugging, and
improving the system under realistic conditions instilled a strong sense of technical ownership
and resilience.

Beyond the technical achievements, this project also emphasized the practical value of AI in
everyday tasks. By replacing manual attendance methods with a contactless, automated system, it
showcased how computer vision and embedded hardware can be used to increase efficiency,
accuracy, and safety in academic and workplace environments. The system developed can serve
as a foundation for future enhancements such as cloud integration, GUI dashboards, security-
based access control, or mask detection features.

In conclusion, this project has not only fulfilled its academic requirements but also demonstrated
the feasibility and advantages of applying edge AI in meaningful, real-time applications. It has
prepared students to tackle more advanced projects involving embedded intelligence, deep
learning deployment, and smart automation. The experience gained through this work will serve
as a valuable stepping stone for future explorations and innovations in artificial intelligence,
computer vision, and edge computing.



REFERENCES
1. NVIDIA Jetson Nano Developer Kit Documentation – NVIDIA Corporation.
   https://developer.nvidia.com/embedded/jetson-nano-developer-kit
2. face_recognition Library – GitHub Repository.
   https://github.com/ageitgey/face_recognition
3. OpenCV-Python Documentation – OpenCV Team.
   https://docs.opencv.org/
4. dlib C++ Library – Davis King.
   http://dlib.net/
5. SQLite Documentation – SQLite.org.
   https://www.sqlite.org/docs.html
6. J. Redmon and A. Farhadi, “YOLOv3: An Incremental Improvement,” arXiv:1804.02767, 2018.
   https://arxiv.org/abs/1804.02767
7. S. J. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, 3rd Edition,
   Pearson Education, 2016.
8. V. Choudhary, S. Saini, and P. Kumar, “Face Recognition for Smart Attendance System
   using Machine Learning,” International Journal of Computer Applications, vol. 182, no. 40, 2019.
9. JetPack SDK Documentation – NVIDIA Developer Portal.
   https://developer.nvidia.com/embedded/jetpack
