AUGMENTED HORIZONS OF EXPLORING NEW REALITIES IN EDUCATION
PROJECT REPORT
Submitted by
ABHIJITHA K 21DI01
KIRAN TS 21DI14
LAKSHMANDEV VK 21DI15
RAJARAJAN C 21IH12
DIPLOMA IN INFORMATION TECHNOLOGY
STATE BOARD OF TECHNICAL EDUCATION
GOVERNMENT OF TAMIL NADU
APRIL 2024
DEPARTMENT OF INFORMATION TECHNOLOGY
PSG POLYTECHNIC COLLEGE
(Autonomous and an ISO 9001: 2015 certified Institution)
COIMBATORE – 641 004
PSG POLYTECHNIC COLLEGE
(Autonomous and an ISO 9001: 2015 certified Institution)
DEPARTMENT OF INFORMATION TECHNOLOGY
CERTIFICATE
ABHIJITHA K 21DI01
KIRAN TS 21DI14
LAKSHMANDEV VK 21DI15
RAJARAJAN C 21IH12
ABHIJITHA K
KIRAN TS
LAKSHMANDEV VK
RAJARAJAN C
DIPLOMA IN INFORMATION TECHNOLOGY
of the State Board of Technical Education,
Government of Tamil Nadu during the academic year 2024
Certified that the candidate was examined by us in the Project viva-voce examination held on
……………
ACKNOWLEDGEMENT
First and foremost, we would like to thank the Almighty God for giving us the strength,
knowledge, ability, and opportunity to undertake this project and to persevere and
complete it with satisfaction.
We are ineffably indebted to our principal for giving us this opportunity and encouraging us
to accomplish this project.
We are highly indebted to Mr. A. Kathiresan for his valuable guidance and constant
supervision. Without his able guidance, this project would not have been possible, and we
shall be eternally grateful to him for his assistance.
We acknowledge with a deep sense of reverence, our special gratitude towards our Head of
the Department Mr. A. Kathiresan, Department of Information Technology for his guidance,
inspiration, and suggestions in our quest for knowledge.
We would like to express our special gratitude and thanks to the special machines laboratory
and technicians for giving us such attention and time.
We would like to express our gratitude towards our parents for their tremendous contribution
in helping us reach this stage in our life. This would not have been possible without their
unwavering and unselfish love, cooperation, and encouragement given to us at all times.
We have put considerable effort into this project. However, it would not have been possible
without the kind support and help of many individuals. We would like to extend our sincere
thanks to all of them.
Any omission in this brief acknowledgment does not mean a lack of gratitude.
ABSTRACT
The project's primary goal is to augment the learning process by seamlessly integrating digital
information into the physical world. Leveraging Unity's powerful development environment,
the application aims to provide learners with a dynamic three-dimensional educational
experience.
Methodologically, Unity is chosen for its versatility and widespread adoption, facilitating the
seamless integration of AR elements across platforms. The development process focuses on
creating 3D models, animations, and interactive elements, meticulously aligned with specific
learning objectives.
TABLE OF CONTENTS

TOPICS                                                                PAGE NO.

Certificate Page .................................................... II
Acknowledgement ..................................................... III
Abstract ............................................................ IV
Table of Contents ................................................... V
List of Figures ..................................................... VIII
1. INTRODUCTION ..................................................... 01
   1.2 Objective .................................................... 02
   1.4 Summary ...................................................... 04
2. LITERATURE REVIEW ................................................ 05
   2.2 Summary ...................................................... 06
3. SYSTEM REQUIREMENTS .............................................. 07
   3.2.1.1 Unity .................................................... 09
   3.2.1.2 Blender .................................................. 11
   3.4 Summary ...................................................... 15
4. SYSTEM DESIGN .................................................... 16
   4.4 Summary ...................................................... 23
5. IMPLEMENTATION ................................................... 24
   5.3 Summary ...................................................... 28
6. TESTING .......................................................... 29
   6.2 Summary ...................................................... 30
7. CODING AND OUTPUT ................................................ 32
   7.1.1 Coding ..................................................... 32
   7.1.2 Description ................................................ 33
   7.2 Output ....................................................... 34
   7.2.2 Sonometer .................................................. 38
PROJECT DEVELOPMENT ................................................. 40
8. CONCLUSION ....................................................... 41
BIBLIOGRAPHY ........................................................ 43
LIST OF FIGURES

FIGURE NO.  TITLE                                                 PAGE NO.

3.3   BLENDER ....................................................... 11
4.1   BLOCK DIAGRAM ................................................. 21
4.2   DFD 0 LEVEL ................................................... 22
7.1   EARTH ......................................................... 34
7.2   URANUS ........................................................ 35
7.3   MARS .......................................................... 35
7.4   MERCURY ....................................................... 36
7.5   NEPTUNE ....................................................... 36
7.6   SUN ........................................................... 37
7.7   SATURN ........................................................ 37
7.8   VENUS ......................................................... 38
7.9   SONOMETER ..................................................... 38
7.10  LATHE ......................................................... 39
7.11  MOTOR ......................................................... 39
7.12  DRAGON ON AR .................................................. 40
CHAPTER 1
INTRODUCTION
Education has always been a dynamic field, constantly evolving to incorporate new
technologies and methodologies that enhance the learning process. In recent years, one such
groundbreaking technology, Augmented Reality (AR), has emerged as a powerful tool with the
potential to revolutionize education. Augmented Reality, a blend of digital and physical worlds,
overlays digital content onto the real world through devices like smartphones, tablets, or AR
glasses. This technology introduces a new dimension to learning, offering immersive, interactive,
and engaging experiences for students of all ages and across various disciplines.
AR's potential in education is boundless, offering a diverse range of applications that can be
tailored to cater to different learning styles and objectives. By seamlessly integrating virtual
elements into the physical environment, AR bridges the gap between theoretical knowledge and
real-world applications, providing students with a unique opportunity to explore, interact, and
understand complex concepts in a more tangible and memorable way.
One of the key advantages of AR in education is its ability to make abstract or complex subjects
more accessible. For instance, in biology classes, students can use AR to dissect virtual
organisms, gaining a deeper understanding of anatomical structures and biological processes
without the need for physical specimens. Similarly, in physics, AR simulations can bring
complex physical phenomena to life, allowing students to experiment with concepts like gravity,
motion, and electricity in a controlled, interactive environment.
AR also addresses the diverse learning needs of students by providing customizable experiences.
Educators can adapt AR content to cater
to different learning styles, ensuring that visual, auditory, and kinesthetic learners all have
opportunities to excel. For example, a history lesson on ancient civilizations could incorporate
AR reconstructions of ancient cities, allowing visual learners to explore, auditory learners to
listen to historical narratives, and kinesthetic learners to physically interact with virtual artifacts.
1.2 OBJECTIVE
The primary objective of integrating Augmented Reality (AR) into education is to enhance
and enrich the learning experience for students across various subjects and disciplines. This is
achieved through several key goals:
Accessible Experiential Learning: AR brings virtual field trips, simulations, and experiments
into the classroom, making experiential learning more accessible and inclusive. Students can
explore places and scenarios that would otherwise be impractical or impossible to visit
physically.
Inspiring Curiosity and Lifelong Learning: AR stimulates curiosity and a sense of wonder in
students. It encourages exploration, inquiry, and a desire to seek out knowledge independently,
fostering a lifelong love for learning.
Measurable Learning Outcomes: AR in education allows for the collection of data on student
interactions and performance. This data can be used to assess learning outcomes, identify areas
for improvement, and tailor instructional approaches for better results.
1.4 SUMMARY:
CHAPTER 2
LITERATURE REVIEW
Title: "Augmented Reality in Education and Training: Pedagogical Approaches and Illustrative
Case Studies"
Abstract: This review offers a comprehensive overview of AR technology and suggests five
pivotal directions for its application in education. The authors highlight how AR has the
potential to revolutionize educational practices by providing immersive, interactive, and
contextual learning experiences, ultimately leading to improved learning outcomes.
Title: "A Review on the Use of Augmented Reality in Education: From the Perspective of
Motivation, Cognitive Load, Presence, and Practical Implementation"
Authors: Bacca, J., Baldiris, S., Fabregat, R., Graf, S., & Kinshuk
Abstract: Bacca et al. consider crucial factors such as motivation, cognitive load, and presence
when evaluating the effectiveness of AR in education. The review discusses practical
implementation strategies and offers a nuanced understanding of how AR influences the learning
environment. It highlights the potential of AR to create immersive and interactive learning
experiences.
Authors: Azuma, R. T.
Journal: ISMAR
2.2 SUMMARY:
The literature survey on Augmented Reality (AR) in education reveals a growing body of
research and practical applications that highlight the transformative impact of AR on the learning
experience. Studies consistently emphasize the potential of AR to enhance student engagement,
understanding, and retention of educational content. AR is found to be particularly effective in
science, technology, engineering, and mathematics (STEM) subjects, providing interactive
simulations and 3D visualizations that facilitate deeper comprehension. Additionally, literature
suggests that AR fosters a student-centric learning environment, catering to diverse learning
styles and encouraging active participation. Challenges such as hardware limitations, integration
into curricula, and the need for teacher training are also acknowledged. Overall, the literature
survey underscores the significant role of AR in reshaping education by offering innovative and
immersive learning opportunities that go beyond traditional teaching methods. As the field of AR
in education continues to advance, research and practical implementations contribute to a
growing understanding of its benefits, challenges, and the potential for revolutionizing the
educational landscape.
CHAPTER 3
SYSTEM REQUIREMENTS
Development Machine:
AR Device:
Depending on the target platform (smartphones, tablets, AR glasses), choose hardware with
AR capabilities. For example, ARKit for iOS devices, ARCore for Android.
Development Environment:
Operating System:
AR SDKs/Frameworks:
AR Foundation in Unity, with ARKit (iOS) and ARCore (Android) as the platform providers.
Programming Languages:
C# for Unity scripting.
Version Control:
3D Modeling Tools:
Blender, Maya, 3ds Max, or other 3D modeling tools for creating assets.
AR Content Creation:
Tools for creating AR content, including 3D models, animations, and AR markers.
In the development and maintenance of the AR application, paramount attention must be given
to key facets such as data security, privacy compliance, and comprehensive documentation and
support. Robust data security measures, including encryption and secure authentication protocols,
must be implemented to safeguard user data from unauthorized access or breaches, while ensuring
compliance with data protection regulations like GDPR and HIPAA is essential to uphold user
privacy and trust. User documentation in the form of manuals or guides should be provided to
facilitate user navigation and usage of the application, while developer documentation detailing the
codebase, APIs, and custom features is crucial for future maintenance and development efforts. By
addressing these aspects holistically, the AR application can ensure a seamless user experience,
maintain data integrity, and support sustainable growth and innovation.
3.2.1.1 UNITY

Unity Engine is a versatile and user-friendly platform renowned for game and interactive
application development. Its cross-platform capabilities streamline development across various
devices and operating systems. With support for multiple languages, including C#, Unity offers a
flexible coding environment. The Unity Asset Store provides a wealth of pre-made assets and
tools, expediting the development process. Whether crafting 2D games or complex 3D
simulations, Unity's toolset is adaptable to various dimensions. Its robust physics engine and
animation systems allow for lifelike movements and interactions. With advanced graphics
features and dedicated frameworks for VR and AR, Unity is a go-to choice for immersive
experiences and modern interactive applications.
UNITY IN AR:
Unity is a powerful engine for developing AR (Augmented Reality) applications due to its
specialized framework called Unity AR Foundation. Here's how Unity works in AR
applications:
Taking the lead in tracking and detecting real-world objects and surfaces, AR Foundation utilizes the
device's camera to comprehend the geometry of the environment. This advanced functionality
enables virtual objects to interact convincingly with their surroundings, elevating the immersion of
the AR experience to new heights.
Harnessing Unity's scripting capabilities, often implemented in C#, developers script the behavior of
virtual objects. These objects can be anchored to physical surfaces or dynamically placed within the
AR environment, offering users an immersive and engaging experience.
Unity further empowers developers by facilitating various forms of user interaction, including
gestures, touch controls, and voice commands. These intuitive features enable users to seamlessly
manipulate and engage with virtual objects, enriching their overall immersion and enjoyment of the
AR experience.
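As an illustrative sketch of how such touch interaction can be wired up with AR Foundation's raycasting (the class name, the contentPrefab field, and the placement logic are assumptions for illustration, not the project's actual code), a component along these lines places a model where a screen tap hits a detected plane:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToPlace : MonoBehaviour
{
    [SerializeField] private GameObject contentPrefab;      // model to spawn (assigned in the Inspector)
    [SerializeField] private ARRaycastManager raycastManager;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the tap position against planes detected in the physical environment.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(contentPrefab, hitPose.position, hitPose.rotation);
        }
    }
}

The ARRaycastManager referenced here would typically sit on the AR Session Origin so that hit poses are expressed in the tracked space.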
Unity's comprehensive suite extends beyond mere visual fidelity, ensuring virtual objects are
seamlessly integrated into the real-world environment through its powerful rendering engine. With
meticulous attention to lighting, shadows, and reflections, Unity enhances the AR experience by
imbuing it with a heightened sense of realism.
In addition to graphics and rendering, Unity offers robust tools for crafting intuitive AR user
interfaces (UI). Developers can seamlessly integrate elements like information displays, menus, and
interactive components, enriching the user experience and facilitating seamless interaction within
the AR environment.
Unity's integrated development environment (IDE) provides developers with a suite of testing and
debugging tools tailored specifically for AR applications. This empowers developers to simulate AR
experiences on their computers and conduct thorough testing on real devices, ensuring a smooth and
bug-free user experience.
Once the AR application is meticulously developed, tested, and debugged, Unity facilitates seamless
deployment across various target platforms, including iOS and Android. With support for building
and deploying applications through their respective app stores, Unity simplifies the final steps of
bringing AR experiences to the hands of users worldwide.
3.2.1.2 BLENDER

Blender is a free and open-source 3D creation suite used to model, animate, render, and
simulate dynamic environments. Its intuitive interface and extensive documentation make it
accessible to professionals and novices alike. With real-time rendering and an integrated game
engine, Blender caters to game developers and filmmakers, offering a comprehensive suite of tools.
Moreover, its scripting capabilities enable customization, enhancing its adaptability for various
projects. From architectural visualization to visual effects, Blender stands as a cost-effective and
powerful solution for 3D content creation.
WORKING IN BLENDER:
Blender 3D is a powerful open-source 3D creation suite that is used for various purposes, including
3D modeling, animation, rendering, and even game development. If you're working in Blender 3D,
you can do a wide range of tasks, from creating 3D models to producing animated films or games.
Here are some common tasks and features you might encounter while working in Blender 3D:
Blender stands as a versatile powerhouse in the realm of 3D modeling, offering a plethora of tools
and features to cater to diverse creative needs. From intricate objects and characters to entire
environments, Blender's robust suite enables users to sculpt, extrude, and texture with precision,
resulting in stunningly realistic models.
Animation in Blender transcends simple movement, delving into the realm of character rigging and
keyframe animation. Users can breathe life into their creations with dynamic character animations
and complex rigging systems, paving the way for immersive storytelling and compelling visual
narratives.
Rendering capabilities in Blender are nothing short of breathtaking, with its built-in rendering
engines, Cycles and Eevee, offering both photorealistic and real-time rendering options. Users can
achieve visually stunning results, whether rendering animations or still images, elevating their
projects to new heights of visual fidelity.
Texture and material application in Blender add depth and realism to 3D models, allowing users to
emulate various materials like wood, metal, or glass with astonishing accuracy. Lighting tools further
enhance scenes, enabling users to set up and control different types of lighting to achieve desired
atmospheres and visual effects.
Blender's physics simulation capabilities extend the creative possibilities, offering simulations for
smoke, fluid, cloth, and more. Users can create lifelike physical interactions, adding an extra layer of
realism to their scenes.
Compositing features in Blender provide users with a powerful node-based compositor for post-
processing renders, adding effects, and manipulating the final output. Additionally, Blender serves as
a comprehensive tool for video editing and post-production work, including cutting, splicing, and
adding effects to videos.
For game developers, Blender's integrated game engine facilitates the creation of interactive 3D
games, allowing for the building, animating, and scripting of game assets directly within Blender's
interface.
Blender also caters to 3D printing enthusiasts with tools to prepare models for 3D printing, ensuring
models are checked and fixed for printability.
With a Python scripting API, Blender offers extensive scripting capabilities, enabling users to
automate tasks, create custom tools, and extend Blender's functionality to suit their specific needs.
Furthermore, Blender's flexibility is further enhanced by its vast library of add-ons, allowing users to
expand its capabilities through community-developed or custom-built add-ons. In essence, Blender
stands as a comprehensive and indispensable tool for 3D modeling, animation, rendering, and
beyond, empowering users to bring their creative visions to life with unparalleled depth and
precision.
Functional Requirements:
In developing an AR application, several key components must be addressed to ensure a seamless
and immersive user experience. First and foremost, image recognition algorithms should be
implemented to enable real-time recognition and tracking of predefined images or patterns. This
functionality lays the foundation for integrating AR interaction, allowing virtual objects to be placed
in the real world based on recognized images. Interactive features such as tapping or dragging virtual
objects in response to recognized images further enhance user engagement.
Markerless tracking capabilities are essential for providing AR experiences without the need for
predefined markers, enhancing flexibility and usability. Object manipulation through gestures or
touch interactions adds another layer of interactivity, empowering users to interact naturally with
virtual objects within the AR environment.
Scene understanding is crucial for recognizing the environment and adapting AR content
accordingly, ensuring seamless integration with real-world surroundings. Cross-platform
compatibility is paramount, requiring integration with AR Foundation to ensure the application
works seamlessly on various platforms such as iOS and Android.
A user-friendly interface is essential for configuring and interacting with AR features, providing
intuitive controls and feedback to enhance usability. Performance optimization is critical for ensuring
smooth performance, considering factors such as frame rate, rendering, and responsiveness to deliver
a seamless and immersive AR experience to users across different devices and platforms. By
addressing these components comprehensively, developers can create compelling AR applications
that captivate and delight users with their innovative and immersive experiences.
Non-Functional Requirements:
Performance is key, requiring the application to run smoothly with minimal latency to provide users
with a seamless experience. Scalability is crucial to accommodate varying levels of complexity and
a growing user base or data load.
Reliability and security are equally important, protecting user data and preserving the integrity
of the AR application.
Usability is essential, demanding an intuitive and easy-to-navigate user interface that promotes a
positive user experience. Compatibility with a range of devices is crucial, considering differences in
screen sizes, resolutions, and hardware capabilities.
Planning for easy updates and maintenance is essential, allowing for future improvements and bug
fixes to be implemented seamlessly. By addressing these considerations comprehensively,
developers can create a robust and successful AR application that meets the needs of users while
ensuring a positive and engaging experience.
3.4 SUMMARY:
The collaborative workflow between Blender and Unity in the context of AR Foundation
involves a seamless integration of 3D content creation and game development. Blender, a versatile
open-source 3D graphics software, serves as a comprehensive tool for modeling, sculpting, and
animating objects and scenes. Artists use Blender to craft detailed 3D assets, characters, and
animations. Unity, a popular game development engine, complements Blender by providing a
platform for integrating these assets into AR applications using AR Foundation. This workflow
enables developers to import Blender-created assets into Unity, where they can be arranged,
scripted, and deployed for augmented reality experiences. Unity's AR Foundation extends its
capabilities to incorporate AR features, such as image recognition and tracking. The synergy
between Blender and Unity within the AR Foundation framework facilitates a streamlined process
for creating immersive AR content, showcasing the collaborative power of these tools in reshaping
interactive and visually compelling augmented reality experiences.
CHAPTER 4
SYSTEM DESIGN
Content enrichment stands out as a significant feature, as AR applications augment textbooks and
learning materials with additional multimedia content. By simply scanning specific pages or markers
with a mobile device, students gain access to 3D models, videos, and interactive simulations,
enriching their understanding of the subject matter.
Virtual labs and simulations bring scientific concepts to life, enabling students to conduct
experiments virtually. For instance, in chemistry, students can interact with virtual chemical
reactions, observe molecular structures, and grasp complex scientific concepts within a controlled
digital environment.
Historical and cultural immersion takes learners on captivating journeys to different time periods
and locations through AR applications. Students can explore historical sites, ancient civilizations,
and cultural landmarks via immersive 3D reconstructions and interactive experiences, fostering a
deeper understanding of history and culture.
Interactive educational games provide engaging learning experiences, offering students interactive
challenges, quizzes, and puzzles. These games not only make learning enjoyable but also promote
active participation and knowledge retention.
Language learning and vocabulary building are facilitated through AR applications that leverage
visual cues in the real world. By pointing their devices at objects, learners can associate words with
objects, facilitating vocabulary acquisition through translations or audio pronunciations, thus
enhancing language proficiency.
Augmented Reality (AR) applications have transformed education across various disciplines,
providing innovative tools and immersive experiences for students. In art education, AR allows
students to create and manipulate virtual sculptures, paintings, and designs, fostering creativity
and enabling experimentation with different artistic styles. Moreover, AR simulates virtual field
trips, overcoming accessibility barriers and allowing students to explore museums, historical
sites, and natural landmarks from within the classroom. In STEM subjects, AR facilitates
interactive learning experiences in physics, biology, mathematics, and engineering, enhancing
understanding and engagement. These applications can be customized to accommodate
different learning styles and abilities, providing personalized learning experiences that adapt to
individual needs. By integrating gamification elements, AR further enhances engagement and
turns educational experiences into immersive adventures. Additionally, AR supports
collaborative learning by enabling multiple users to interact with the same augmented content
simultaneously, promoting teamwork and communication skills. Beyond traditional subjects,
AR offers a unique approach to geography education, allowing students to explore geographical
features, ecosystems, and environmental changes through interactive maps and overlays.
Overall, AR is reshaping the educational landscape, making learning more accessible,
engaging, and interactive for students of all ages and backgrounds. Its multifaceted applications
continue to revolutionize teaching and learning, paving the way for a more dynamic and
inclusive educational experience.
The existing system of Augmented Reality (AR) applications in education offers numerous benefits
that significantly enhance the learning experience. Firstly, AR captivates students' attention through
interactive, dynamic content, surpassing traditional textbooks and making learning more engaging
and enjoyable. This enhanced engagement fosters a deeper level of interest in the subject matter,
leading to increased participation and motivation among students.
Finally, AR promotes inclusivity by accommodating diverse learning styles and needs. It offers
personalized learning experiences tailored to individual preferences and paces, ensuring that each
student can access educational content in a way that suits them best. Additionally, AR facilitates
collaborative learning environments where students can work together on projects and activities,
fostering teamwork, communication skills, and peer-to-peer learning. Through these capabilities,
AR not only transforms the educational experience but also equips students with valuable skills and
knowledge essential for success in the modern workforce.
Augmented Reality (AR) applications offer significant advantages in education, particularly in terms
of cost-efficiency and adaptability across diverse subjects. Despite initial investment considerations,
AR technology can lead to long-term cost reductions by minimizing the need for physical resources
such as lab equipment or printed educational materials. This financial efficiency makes AR an
attractive option for educational institutions seeking innovative ways to deliver content while
managing budgetary constraints.
Furthermore, AR's adaptability to a wide range of subjects makes it a valuable tool in various
educational contexts. Whether in science, history, art, mathematics, or beyond, AR applications
provide versatile learning experiences that engage students and enhance comprehension. By
leveraging AR across multiple disciplines, educators can offer dynamic and interactive lessons that
cater to diverse learning styles, fostering deeper understanding and retention of key concepts.
Additionally, AR applications offer data-driven insights that empower educators to make informed
decisions about instructional strategies and student progress. By collecting data on student
interactions, AR platforms enable educators to assess learning outcomes, identify areas for
improvement, and personalize instruction to meet individual needs effectively. Moreover, by
embracing AR technology in education, students are better prepared to navigate and embrace new
technologies in the future, equipping them with essential skills for success in a rapidly evolving
digital landscape.
Integrating AR into existing curricula may also pose challenges, requiring careful planning and
alignment with educational standards. Not all subjects may seamlessly lend themselves to AR
applications, limiting its potential for widespread curriculum integration. Additionally,
inequities in access to AR technology may exacerbate existing disparities, with some students
lacking access to the required devices, leading to disparities in access and opportunities for
learning. Thus, while AR offers exciting possibilities for education, addressing these challenges
is crucial to ensure equitable access and effective integration into educational practices.
4.2 PROPOSED SYSTEM
To build the proposed system, first install Unity and import the AR Foundation package. Once
the environment is set up, create a new Unity project and define the essential AR Foundation
components, such as ARSession and ARSessionOrigin, which manage the AR experience and
represent the tracked space in the real world.
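A minimal sketch of how the session's availability could be verified at startup is given below; the component and field names are illustrative assumptions rather than code taken from this project, and the ARSession is assumed to start disabled in the scene:

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARAvailabilityCheck : MonoBehaviour
{
    [SerializeField] private ARSession arSession;   // the ARSession component in the scene

    private IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Ask the device whether AR is supported (may trigger a support download on Android).
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            arSession.enabled = true;   // enable the session so tracking can begin
        }
    }
}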
Design an intuitive user interface (UI) with elements such as buttons and labels, enhancing user
engagement. Implement user interactions, such as tapping or dragging, to manipulate AR
objects seamlessly. Consider incorporating gestures and touch controls to make the experience
more immersive. Provide visual feedback to users for successful interactions, like highlighting
selected objects or displaying relevant information.
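One simple way such visual feedback could be realised is sketched below, assuming the selectable object carries a collider; the component name and highlight colour are placeholders, and a production version would more likely route touches through a physics raycast:

using UnityEngine;

public class SelectionHighlight : MonoBehaviour
{
    [SerializeField] private Color highlightColor = Color.yellow;

    private Renderer objectRenderer;
    private Color originalColor;
    private bool selected;

    private void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
        originalColor = objectRenderer.material.color;
    }

    // Invoked by Unity when the object's collider is tapped or clicked.
    private void OnMouseDown()
    {
        selected = !selected;
        objectRenderer.material.color = selected ? highlightColor : originalColor;
    }
}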
Testing is a critical phase in AR app development. Test your app on ARKit and ARCore
supported devices to identify and resolve any platform-specific issues. Debug and optimize your
app for performance, ensuring a smooth user experience. Configure build settings and deploy
your AR app to the App Store for iOS and Google Play for Android, making it accessible to a
broader audience.
In the iterative process, gather user feedback to enhance your AR app continually. Stay informed
about updates to AR Foundation, ARKit, and ARCore for potential improvements and new
features. Thoroughly document your code and provide clear instructions for users and future
developers, facilitating understanding and future development. As AR technology evolves, this
comprehensive approach ensures your AR app remains at the forefront of innovation and user
satisfaction.
Figure 4.1: Block Diagram
A Data Flow Diagram (DFD) is a graphical representation of the flow of data within a
system. While AR applications, especially those using AR Foundation in Unity, might not have
traditional data flow in the same way as information systems, we can represent the flow of
information and interactions within the AR system. Here's a simplified DFD for an AR application
using AR Foundation in Unity:
Figure 4.2: DFD 0 Level

Description:
User Input:
Represents any input from the user, such as gestures, taps, or other interactions.
AR Foundation Module:
This module encompasses the AR Foundation framework in Unity, responsible for handling
AR functionalities like tracking, rendering, and interactions.
Image Recognition:
This block represents algorithms responsible for recognizing images or patterns in the real-
world environment.
Virtual Object Rendering:
Denotes the rendering of virtual objects in the AR scene based on the image recognition results.
4.4 SUMMARY:
The existing system may lack certain features, exhibit performance bottlenecks, or have
limitations in terms of user experience. In the context of AR, the image processing may be less
accurate, the interaction with virtual objects might be limited, and the overall system may not adapt
well to different environmental conditions. The need for enhanced features, improved performance,
and a more user-friendly interface becomes apparent through the limitations of the existing
system.
The proposed system outlines the improvements and new features introduced to overcome the
shortcomings of the existing system. This may include advancements in image recognition
algorithms for more accurate tracking, enhanced AR interactions for users, improved rendering
quality, and adaptability to diverse real-world environments. Additionally, the proposed system
could address issues related to performance, security, and usability, providing a more robust and
satisfying AR experience. The introduction of new technologies, optimized code, and a refined user
interface contributes to the overall effectiveness of the proposed AR application.
In summary, the proposed system represents an evolution from the limitations of the existing
system, introducing advancements in image processing, AR interactions, and overall system
performance to deliver an upgraded and more feature-rich AR experience. The proposed system
aims to address user needs and expectations, providing a solution that not only overcomes existing
challenges but also sets the stage for future improvements and innovations in AR technology.
CHAPTER 5
IMPLEMENTATION
5.1 WORKING OF 3D MODEL:
Blender, an open-source 3D creation suite, offers a versatile platform for modeling, sculpting,
and rendering. After installing Blender, you're greeted by a dynamic interface. The 3D viewport is
central, allowing you to navigate with ease using the mouse and keyboard shortcuts. Adding a
mesh, such as a cube or sphere, is as simple as pressing Shift + A or using the "Add" menu.
Transitioning to Edit Mode provides granular control, enabling manipulation of vertices, edges,
and faces.
The transformative capabilities of Blender shine through in Edit Mode, where you can move,
rotate, and scale components using shortcuts like G, R, and S. Constrain these transformations
along specific axes by appending X, Y, or Z after the action. To refine the model's surfaces,
consider applying a Subdivision Surface modifier, accessible through the modifier panel. This step
introduces a level of smoothness, crucial for achieving realistic and visually appealing 3D models.
In the "Shading" workspace, the creative process expands to materials and textures. Assign
materials to the model, and add textures for a more nuanced appearance. UV mapping, crucial for
precise texture application, involves unwrapping the model's UVs in Edit Mode and adjusting them
in the dedicated UV Editor. For those delving into more organic forms, Blender's Sculpt Mode
offers a suite of brushes for detailed sculpting, expanding the creative possibilities beyond
traditional geometric shapes.
As the 3D model takes shape, attention turns to lighting, a critical factor for achieving realism in
renders. In the "Layout" or "Rendering" workspace, lights are added and configured in the "Object
Data Properties" panel. Proper lighting emphasizes the model's details and establishes a mood
within the scene. With a lit stage set, the next step involves camera setup. Placing and adjusting the
camera's position, rotation, and focal length define the viewpoint for rendering.
The rendering process in Blender involves choosing between the Eevee and Cycles rendering
engines, each offering unique strengths. Eevee excels in real-time rendering and is ideal for quick
previews, while Cycles focuses on accurate light interaction and is suitable for high- quality final
renders. Render settings are configured in the respective engines' panels, allowing customization
according to project requirements.
With the environment configured, introduce AR Foundation's core components into the scene.
Drag and drop the AR Session prefab, which acts as the central manager for the AR experience,
and the AR Session Origin prefab, representing the tracked space in the real world. This initial
setup is pivotal for building an AR application that seamlessly integrates with the physical
environment. It forms the canvas upon which the AR experience will unfold, combining virtual and
real-world elements.
Stability is paramount in AR applications. Integrate AR Anchors into the design to attach virtual
objects to real-world points, ensuring they remain fixed in space relative to the physical
environment. This step enhances the user experience, providing a sense of persistence and stability
to virtual elements within the dynamic context of the AR environment.
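A short sketch of this idea is given below, assuming AR Foundation's ARAnchor component is available in the project; the helper name and parameters are illustrative:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public static class AnchorUtility
{
    // Instantiates a prefab at the given pose and attaches an ARAnchor so the object
    // stays fixed relative to the physical environment as tracking is refined.
    public static GameObject PlaceAnchored(GameObject prefab, Pose pose)
    {
        GameObject instance = Object.Instantiate(prefab, pose.position, pose.rotation);

        if (instance.GetComponent<ARAnchor>() == null)
        {
            instance.AddComponent<ARAnchor>();   // registers the object with the anchor subsystem
        }
        return instance;
    }
}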
Designing the user interface (UI) is the next crucial step. Consider the user experience and
implement intuitive controls, buttons, or other interactive elements to facilitate engagement with
the AR content. Implementation of AR interactions involves incorporating gestures and touch
controls, making the user experience more natural and immersive. Visual feedback mechanisms are
essential for guiding users through the AR experience, providing cues for successful interactions,
or conveying information relevant to the virtual elements.
As the application matures through testing and iteration, prepare it for deployment. Configure build
settings within Unity to target specific platforms and publish the AR app to the App Store for iOS
and Google Play for Android. Ensure that the app adheres to platform-specific guidelines and
requirements for a seamless deployment process.
Continuously stay informed about updates to AR Foundation, ARKit, and ARCore for potential
enhancements and new features. Regularly iterate on the application based on user feedback,
emerging technologies, and evolving best practices. Robust documentation of code and application
features facilitates future development and ensures that the AR application remains at the forefront
of innovation and user satisfaction.
Embarking on the journey of enabling Unity on a phone involves a comprehensive set of steps
beginning with the installation of Unity Hub and a compatible Unity version. Unity Hub serves as a
centralized management tool for Unity versions, streamlining the process of project creation and
version control. Once a Unity version supporting the target platform—whether Android or iOS— is
installed, the creation of a new project unfolds within Unity Hub. This initial phase is crucial for
project setup, requiring decisions on templates, such as 3D or 2D, and meticulous configuration of
project settings.
As the development environment takes shape within the Unity editor, the subsequent step involves
crafting the desired application. This process entails importing assets, shaping scenes, and
implementing functionality through the use of C# scripts. Unity's intuitive interface empowers
developers to visualize and fine-tune their creations, fostering an iterative and creative development
cycle. Design and functionality coalesce as the project matures, setting the stage for the
application's deployment.
With the foundation laid, the focus shifts to configuring build settings within Unity. The Build
Settings menu becomes the gateway to specifying the target platform, initiating a switch to Android
or iOS, and adjusting platform-specific settings such as package names or bundle identifiers.
Building the application finalizes the encapsulation of the project into a standalone file, ready for
deployment.
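As a hedged example of how such a build could be automated from the Unity editor (the menu path, scene path, and package identifier below are placeholders rather than the project's real settings):

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class BuildScript
{
    [MenuItem("Build/Build Android APK")]
    public static void BuildAndroid()
    {
        PlayerSettings.applicationIdentifier = "com.example.areducation";   // placeholder package name

        BuildPlayerOptions options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" },   // placeholder scene path
            locationPathName = "Builds/AREducation.apk",
            target = BuildTarget.Android,
            options = BuildOptions.None
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log("Build finished with result: " + report.summary.result);
    }
}
#endif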
The deployment phase involves the transfer of the built application to the intended mobile device.
For Android, this necessitates connecting the device to the development machine, enabling USB
debugging, and potentially installing required drivers. iOS deployment, on the other hand, demands
a Mac with Xcode installed and, if necessary, enrollment in the Apple Developer Program to
deploy on a physical iOS device. The actual running of the application on the mobile device marks
the fruition of the development efforts, bringing the Unity-powered creation to life.
Testing and debugging become paramount in this iterative process. Unity offers a suite of
development tools for real-time testing and debugging, facilitating the identification and resolution
of issues. For Android devices, establishing a direct connection between the Unity editor and the
phone enhances the efficiency of this phase. Meanwhile, for iOS, integrating the project with
Xcode provides developers with a deeper level of insight into performance and behavior.
Security considerations play a role, especially on Android devices, where ensuring the phone's
security settings permit the installation of applications from external sources is essential. Finally, a
commitment to documentation and staying informed about Unity updates ensures that the
development workflow remains aligned with best practices and platform-specific nuances. This
comprehensive approach, from initial setup to deployment, testing, and refinement, empowers
developers to harness the full potential of Unity on mobile devices, creating immersive and
engaging applications.
5.3 SUMMARY:
The collaborative workflow between Blender and Unity in the context of AR Foundation involves
a seamless integration of 3D content creation and game development. Blender, a versatile open-
source 3D graphics software, serves as a comprehensive tool for modeling, sculpting, and
animating objects and scenes. Artists use Blender to craft detailed 3D assets, characters, and
animations. Unity, a popular game development engine, complements Blender by providing a
platform for integrating these assets into AR applications using AR Foundation. This workflow
enables developers to import Blender-created assets into Unity, where they can be arranged,
scripted, and deployed for augmented reality experiences. Unity's AR Foundation extends its
capabilities to incorporate AR features, such as image recognition and tracking. The synergy
between Blender and Unity within the AR Foundation framework facilitates a streamlined process
for creating immersive AR content, showcasing the collaborative power of these tools in reshaping
interactive and visually compelling augmented reality experiences.
CHAPTER 6
SYSTEM TESTING
6.1 TESTING LEVELS:
Integration testing for an Augmented Reality (AR) application utilizing image processing in
Unity involves validating the seamless interaction and cooperation between different components
or modules within the application. The objective is to ensure that these integrated elements work
harmoniously to deliver the intended functionality. In the context of image processing, this
includes testing the flow of data and operations between modules responsible for image
recognition, tracking, rendering, and other AR features. By assessing how these components
collaborate, integration testing helps identify potential issues such as data inconsistencies,
communication errors, or interoperability challenges. Through this process, developers can catch
and address integration-related issues early in the development lifecycle, ensuring that the AR
application functions cohesively and delivers a unified and reliable user experience.
System testing for an Augmented Reality (AR) application using image processing in Unity
is a comprehensive evaluation of the entire application as a unified system. This testing phase aims
to verify that all individual components, including image recognition, tracking, rendering, and user
interaction, work harmoniously to meet specified requirements. System testing involves testing the
application in various scenarios and conditions, assessing its behavior under different
environmental settings and user interactions. It ensures that the AR application functions as
intended, placing virtual objects accurately based on image recognition results, and adapting to
real-world changes. Additionally, system testing addresses aspects such as performance, security,
and overall user satisfaction, providing a thorough examination of the application's reliability and
functionality before deployment. Any discovered issues are addressed to guarantee a robust and
user-friendly AR experience.
Acceptance testing for an Augmented Reality (AR) application with image processing in
Unity is the final phase of testing before deployment, focusing on validating that the application
meets the specified business requirements and user expectations. This testing assesses whether the
AR application delivers the intended features and functionalities in a real-world context. Typically
involving end-users or stakeholders, acceptance testing evaluates the application's usability,
performance, and adherence to predefined criteria. Users interact with the AR features, such as
image recognition and object rendering, providing feedback on the overall user experience. Any
necessary adjustments are made based on this feedback to ensure that the application aligns with
business goals and fulfills user needs. Successful acceptance testing indicates that the AR
application is ready for deployment, having undergone thorough evaluation from the perspective of
those who will ultimately use and benefit from it.
Performance testing evaluates how the AR application behaves under load, verifying that tracking,
rendering, and interaction remain fluid without
compromising the device's resources. By measuring and optimizing key performance indicators,
developers can enhance the application's efficiency and responsiveness, ensuring it meets the
required standards and provides a seamless AR experience for users.
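For illustration, a lightweight monitor along the following lines could log frame-rate drops during on-device testing; the smoothing factor and the 30 FPS threshold are arbitrary assumptions rather than project requirements:

using UnityEngine;

public class FrameRateMonitor : MonoBehaviour
{
    private float smoothedDeltaTime;

    private void Update()
    {
        // Exponentially smooth the frame time to reduce per-frame noise.
        smoothedDeltaTime += (Time.unscaledDeltaTime - smoothedDeltaTime) * 0.1f;
        float fps = 1f / smoothedDeltaTime;

        if (fps < 30f)   // placeholder threshold for an unacceptable AR frame rate
        {
            Debug.LogWarning("Frame rate dropped to " + fps.ToString("F1") + " FPS");
        }
    }
}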
6.2 SUMMARY:
CHAPTER 7
CODING AND OUTPUT

7.1.1 CODING:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.ARFoundation;
7.1.2 DESCRIPTION:
The provided C# script is designed for Unity using the AR Foundation package, facilitating image
tracking in augmented reality (AR) applications. Upon initialization, the script creates a dictionary
to manage GameObjects corresponding to images and instantiates them based on a specified array.
The script subscribes to the `trackedImagesChanged` event of the ARTrackedImageManager in the
`OnEnable` method and unsubscribes in the `OnDisable` method.
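Because the body of the script is not reproduced above, the following is a hedged reconstruction that follows the behaviour described: a dictionary keyed by reference-image name, objects instantiated from a serialized array, and event subscription in OnEnable/OnDisable. The class, field, and prefab names are illustrative assumptions, not the project's actual identifiers.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARTrackedImageManager))]
public class ImageTrackedObjectSpawner : MonoBehaviour
{
    // Prefabs whose names match the reference-image names in the image library.
    [SerializeField] private GameObject[] prefabsToSpawn;

    private readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();
    private ARTrackedImageManager trackedImageManager;

    private void Awake()
    {
        trackedImageManager = GetComponent<ARTrackedImageManager>();

        // Pre-instantiate one (initially hidden) object per prefab, keyed by its name.
        foreach (GameObject prefab in prefabsToSpawn)
        {
            GameObject instance = Instantiate(prefab, Vector3.zero, Quaternion.identity);
            instance.name = prefab.name;
            instance.SetActive(false);
            spawned[prefab.name] = instance;
        }
    }

    private void OnEnable() => trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    private void OnDisable() => trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    private void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage trackedImage in args.added)
            UpdateSpawnedObject(trackedImage);

        foreach (ARTrackedImage trackedImage in args.updated)
            UpdateSpawnedObject(trackedImage);

        foreach (ARTrackedImage trackedImage in args.removed)
        {
            if (spawned.TryGetValue(trackedImage.referenceImage.name, out GameObject removedObject))
                removedObject.SetActive(false);
        }
    }

    private void UpdateSpawnedObject(ARTrackedImage trackedImage)
    {
        if (!spawned.TryGetValue(trackedImage.referenceImage.name, out GameObject instance))
            return;

        // Follow the tracked image and show the model only while tracking is active.
        instance.transform.SetPositionAndRotation(trackedImage.transform.position,
                                                  trackedImage.transform.rotation);
        instance.SetActive(trackedImage.trackingState == TrackingState.Tracking);
    }
}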
7.2 OUTPUT:
EARTH:
URANUS:
MARS:
MERCURY:
NEPTUNE:
SUN:
SATURN:
VENUS:
7.2.2 SONOMETER:
7.2.3 LATHE:
7.2.4 MOTOR:
PROJECT DEVELOPMENT
The " Augmented Horizons of Exploring New Realities in Education" project represents a
groundbreaking endeavor poised to redefine the educational landscape. By harnessing the
robust capabilities of the Unity game engine, this initiative aims to seamlessly integrate
augmented reality technology into the learning process. Through dynamic digital overlays,
students will gain access to a wealth of interactive 3D models, simulations, and virtual field
trips, transcending the limitations of traditional educational materials. The core objective of
this project is to foster deeper understanding and engagement among learners of varied
backgrounds and learning styles.
Meticulous attention has been given to ensuring that the educational content aligns seamlessly
with established curriculum standards, enhancing the relevance and applicability of the
augmented reality experiences. Moreover, a user-centric approach to interface design
guarantees an intuitive and immersive learning experience. Rigorous testing protocols,
including functional and usability testing, have been implemented to ensure the application's
robustness and user-friendliness. Additionally, the project adheres to strict legal and ethical
considerations, safeguarding user privacy and intellectual property rights.
As the project nears its completion, there is a keen anticipation for the transformative impact it
is poised to make in educational spheres. Not only does it promise to enhance traditional
learning methods, but it also lays the foundation for future advancements in augmented reality
technology. Through this project, education is poised to transcend conventional boundaries,
ushering in a new era of interactive and engaging learning experiences.
CHAPTER 8
CONCLUSION
In conclusion, the integration of Augmented Reality (AR) into education using the Unity
platform is a remarkable stride towards revolutionizing the learning experience. This innovative
approach marries advanced technology with educational content, providing students with
interactive and immersive lessons that transcend the boundaries of traditional teaching materials.
By seamlessly merging the physical and digital realms, AR in education stimulates deeper
understanding, engagement, and retention of knowledge, accommodating diverse learning styles.
The project's meticulous attention to aligning educational content with established curriculum
standards underscores its commitment to enhancing the educational process. Furthermore, the
user-centric design ensures an intuitive and accessible learning experience for all students.
Rigorous testing protocols, alongside adherence to legal and ethical considerations, affirm the
project's dedication to delivering a high-quality, responsible, and secure application.
FUTURE SCOPE
The future scope of Augmented Reality (AR) in education is incredibly promising, poised to
revolutionize the way knowledge is acquired and assimilated. As AR technology continues to
advance, it is expected to bring about a paradigm shift in traditional learning methods. Students
will soon find themselves immersed in interactive educational experiences where virtual objects
seamlessly blend with the physical world, enhancing comprehension and retention of complex
concepts. With the advent of AR glasses and wearables, the learning process will become even
more seamless, offering a hands-free and intuitive augmented learning environment. This
technology's potential for personalized learning paths is particularly exciting, as AR applications
can adapt content to suit individual learning styles and preferences.
Moreover, AR's capacity for real-time collaboration among students, regardless of their physical
location, opens up new frontiers for cooperative learning and problem-solving. As the boundaries
of AR's capabilities continue to expand, we can anticipate its integration into an even wider range
of subjects and disciplines, creating opportunities for more interactive and engaging lessons. In
essence, the future of AR in education promises a dynamic and inclusive learning environment
that caters to the diverse needs and preferences of students, ultimately reshaping the landscape of
education.
BIBLIOGRAPHY
[1] Billinghurst, M., & Dunser, A. (2012). Augmented Reality in the Classroom. In Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 441-442). IEEE.
[2] Dede, C., & Richards, J. (2017). 21st century skills, education, and competitiveness: A resource and policy guide. Routledge.
[3] FitzGerald, E., Adams, S., Ferguson, R., Gaved, M., Herodotou, C., Hillaire, G., ... & Wimpenny, K. (2018). Augmented reality and mobile learning: The state of the art. International Journal of Mobile and Blended Learning, 10(4), 1-17.
[5] Liarokapis, F., & White, M. (2016). Educational augmented reality applications: Where we are and what is next. Journal of Interactive Learning Research, 27(4), 325-343.