AR Learning App Development
PROJECT REPORT 2024-2025
COMPUTER ENGINEERING
Submitted by
SANJAY V 23507823
YAHIYA S 23507837
who carried out the project work under my supervision during the academic
year 2024-2025.
We would like to express our sincere thanks to our Principal, Rev. Fr. J. JOHN
JOSEPH, M.E., for enabling us to carry out our project and for providing adequate time to
complete our report.
We would also like to remember and thank our guide, Mrs. R. RAMYA, M.Tech., for
her able guidance in completing our report successfully.
We owe a deep sense of gratitude to Mrs. R. RAMYA, M.Tech., Head of the
Department of Computer Engineering, for her valuable guidance and motivation, which helped us
to complete this project on time.
We are very thankful to the teaching staff of the Computer Department for their
passionate support and for the appreciation they gave in helping us achieve our goal. We
heartily thank our library staff and management for their extensive support in providing the
information and resources that helped us to complete the report successfully.
Also, we would like to record our deepest gratitude to our parents for their constant
encouragement, which motivated us to complete our report on time. Last but not least, our
sincere thanks to OUR PARENTS AND FRIENDS, who have been the source of our
strength throughout our lives.
ABSTRACT
Augmented Reality (AR) has emerged as a transformative force in education,
revolutionizing traditional learning paradigms. This report presents an innovative educational
application developed with Unity, harnessing AR's potential to create an immersive and interactive
learning environment.
The project's primary goal is to augment the learning process by seamlessly integrating
digital information into the physical world. Leveraging Unity's powerful development
environment, the application aims to provide learners with a dynamic three-dimensional
educational experience.
Methodologically, Unity is chosen for its versatility and widespread adoption, facilitating
the seamless integration of AR elements across platforms. The development process focuses on
creating 3D models, animations, and interactive elements, meticulously aligned with specific
learning objectives.
LIST OF FIGURES
Fig 3.1 Block Diagram 20
LIST OF ABBREVIATIONS
ABBREVIATION MEANING
AR Augmented Reality
SDK Software Development Kit
API Application Programming Interface
DFD Data Flow Diagram
GPU Graphics Processing Unit
RAM Random Access Memory
SSD Solid State Drive
IDE Integrated Development Environment
UI User Interface
VR Virtual Reality
iOS Apple’s Operating System
3D Three-Dimensional
2D Two-Dimensional
CPU Central Processing Unit
OS Operating System
HIPAA Health Insurance Portability and Accountability Act
GDPR General Data Protection Regulation
Xcode Apple’s Integrated Development Environment for macOS
C# C-Sharp Programming Language
ARKit Apple’s AR Development Framework
ARCore Google’s AR Development Framework
ML Machine Learning
TABLE OF CONTENTS
LIST OF FIGURES V
LIST OF ABBREVIATIONS VI
1 INTRODUCTION 1
1.1 INTRODUCTION TO PROJECT 1
1.2 OBJECTIVE 2
1.3 PROBLEM STATEMENT 4
1.4 SUMMARY 4
2 SYSTEM REQUIREMENTS 5
2.1 HARDWARE REQUIREMENTS 5
2.2 SOFTWARE REQUIREMENTS 5
2.2.1 SOFTWARE DESCRIPTION 7
2.2.1.1 UNITY 7
2.2.1.2 BLENDER 10
2.3 FUNCTIONAL AND NON-FUNCTIONAL REQUIREMENTS 11
2.4 SUMMARY 13
3 SYSTEM DESIGN 14
3.1 EXISTING SYSTEM 14
3.1.1 ADVANTAGE OF EXISTING SYSTEM 15
3.1.2 DRAWBACK OF EXISTING SYSTEM 17
3.2 PROPOSED SYSTEM 17
3.3 DATA FLOW DIAGRAM 20
3.4 SUMMARY 21
4 IMPLEMENTATION 22
4.1 WORKING OF 3D MODEL 22
4.2 WORKING IN UNITY 23
4.2.1 UNITY IN PHONE 25
4.3 SUMMARY 26
5 SYSTEM TESTING 27
5.1 TESTING LEVELS 27
5.1.1 UNIT TESTING 27
5.1.2 INTEGRATION TESTING 27
5.1.3 SYSTEM TESTING 27
5.1.4 ACCEPTANCE TESTING 28
5.1.5 PERFORMANCE TESTING 28
5.2 SUMMARY 29
6 CODING AND OUTPUT 31
7 LITERATURE SURVEY 39
7.1 LITERATURE SURVEY 39
7.2 SUMMARY 40
8 CONCLUSION 41
CHAPTER 1
INTRODUCTION
1.1 INTRODUCTION TO PROJECT
AR also addresses the diverse learning needs of students by providing customizable
experiences suited to different learning styles, ensuring that visual, auditory, and kinesthetic
learners all have opportunities to excel. For example, a history lesson on ancient civilizations
could incorporate AR reconstructions of ancient cities, allowing visual learners to explore,
auditory learners to listen to historical narratives, and kinesthetic learners to physically interact
with virtual artifacts.
1.2 OBJECTIVE
Customized Learning Experiences: AR technology can be tailored to accommodate diverse learning
styles and preferences. It offers flexibility in content delivery, allowing
educators to adapt materials to suit individual student needs, ensuring a more inclusive and
effective learning environment.
Accessible Experiential Learning: AR brings virtual field trips, simulations, and experiments
into the classroom, making experiential learning more accessible and inclusive. Students can
explore places and scenarios that would otherwise be impractical or impossible to visit physically.
Inspiring Curiosity and Lifelong Learning: AR stimulates curiosity and a sense of wonder in
students. It encourages exploration, inquiry, and a desire to seek out knowledge independently,
fostering a lifelong love for learning.
Measurable Learning Outcomes: AR in education allows for the collection of data on student
interactions and performance. This data can be used to assess learning outcomes, identify
areas for improvement, and tailor instructional approaches for better results.
1.3 PROBLEM STATEMENT
1.4 SUMMARY
CHAPTER 2
SYSTEM REQUIREMENTS
AR Device:
Operating System
AR SDKs/Frameworks
Programming Languages
For iOS: Swift or Objective-C
For Android: Kotlin or Java (both are supported by ARCore)
For cross-platform: C# (Unity) or C++ (Unreal Engine)
Version Control
3D Modeling Tools
Blender, Maya, 3ds Max, or other 3D modeling tools for creating assets.
AR Content Creation
Tools for creating AR content, including 3D models, animations, and AR markers.
In the development and maintenance of the AR application, paramount attention must be given
to key facets such as data security, privacy compliance, and comprehensive documentation and
support. Robust data security measures, including encryption and secure authentication protocols,
must be implemented to safeguard user data from unauthorized access or breaches, while ensuring
compliance with data protection regulations like GDPR and HIPAA is essential to uphold user
privacy and trust. User documentation in the form of manuals or guides should be provided to
facilitate user navigation and usage of the application, while developer documentation detailing the
codebase, APIs, and custom features is crucial for future maintenance and development efforts. By
addressing these aspects holistically, the AR application can ensure a seamless user experience,
maintain data integrity, and support sustainable growth and innovation.
2.2.1 SOFTWARE DESCRIPTION
2.2.1.1 UNITY
Unity Engine is a versatile and user-friendly platform renowned for game and interactive
application development. Its cross-platform capabilities streamline development across various
devices and operating systems. With support for multiple languages, including C#, Unity offers a
flexible coding environment. The Unity Asset Store provides a wealth of pre-made assets and
tools, expediting the development process. Whether crafting 2D games or complex 3D
simulations, Unity's toolset is adaptable to various dimensions. Its robust physics engine and
animation systems allow for lifelike movements and interactions. With advanced graphics
features and dedicated frameworks for VR and AR, Unity is a go-to choice for immersive
experiences and modern interactive applications.
UNITY IN AR
Unity is a powerful engine for developing AR (Augmented Reality) applications due to its
specialized framework called Unity AR Foundation. Here's how Unity works in AR
applications:
Unity's AR Foundation emerges as a groundbreaking tool, streamlining AR development by
offering a unified API compatible with diverse platforms such as ARKit for iOS and ARCore for
Android. This seamless compatibility enables developers to effortlessly create AR experiences,
extending their applications across a wide range of AR-capable devices like smartphones, tablets,
and AR glasses.
Taking the lead in tracking and detecting real-world objects and surfaces, AR Foundation
utilizes the device's camera to comprehend the geometry of the environment. This advanced
functionality enables virtual objects to interact convincingly with their surroundings, elevating the
immersion of the AR experience to new heights.
Harnessing Unity's scripting capabilities, often implemented in C#, developers script the
behavior of virtual objects. These objects can be anchored to physical surfaces or dynamically placed
within the AR environment, offering users an immersive and engaging experience.
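As an illustration of the surface tracking and object scripting described above, the following
minimal sketch raycasts a screen tap against the planes AR Foundation has detected and places (or
repositions) a prefab at the hit point. The class name TapToPlace, the placedPrefab field, and the
assumption that an ARRaycastManager and ARPlaneManager sit on the AR Session Origin are
illustrative choices rather than part of this project's codebase.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: on a screen tap, raycast against detected planes and
// place (or move) a prefab at the hit pose.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private GameObject placedPrefab;   // assigned in the Inspector
    private ARRaycastManager raycastManager;
    private GameObject spawnedObject;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    private void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch position against planes tracked by AR Foundation.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (spawnedObject == null)
            {
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            }
            else
            {
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            }
        }
    }
}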
Unity's comprehensive suite extends beyond mere visual fidelity, ensuring virtual objects are
seamlessly integrated into the real-world environment through its powerful rendering engine. With
meticulous attention to lighting, shadows, and reflections, Unity enhances the AR experience by
imbuing it with a heightened sense of realism.
In addition to graphics and rendering, Unity offers robust tools for crafting intuitive AR user
interfaces (UI). Developers can seamlessly integrate elements like information displays, menus, and
interactive components, enriching the user experience and facilitating seamless interaction within the
AR environment.
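As a small, hedged example of such a UI element, the sketch below drives a screen-space
information panel: selection code elsewhere in the application could call ShowInfo with details of
the tapped model, and a Close button hides the panel again. InfoPanelController and its serialized
fields are assumed names, not taken from the project's code.

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of a screen-space AR info panel built with Unity UI.
public class InfoPanelController : MonoBehaviour
{
    [SerializeField] private GameObject panel;          // panel object on a Canvas
    [SerializeField] private Text titleLabel;           // short name of the selected model
    [SerializeField] private Text descriptionLabel;     // longer explanatory text

    // Called by tap-handling code when a virtual object is selected.
    public void ShowInfo(string title, string description)
    {
        titleLabel.text = title;
        descriptionLabel.text = description;
        panel.SetActive(true);
    }

    // Wired to the panel's Close button through the Button component's OnClick event.
    public void Hide()
    {
        panel.SetActive(false);
    }
}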
Once the AR application is meticulously developed, tested, and debugged, Unity facilitates
seamless deployment across various target platforms, including iOS and Android, with support for
building and deploying applications through their respective app stores.
2.2.1.2 BLENDER
Blender is a versatile and open-source 3D modeling and animation software renowned
for its robust capabilities. It empowers users to create intricate 3D models, animate characters,
and simulate dynamic environments. Its intuitive interface and extensive documentation make it
accessible to professionals and novices alike. With real-time rendering and an integrated game
engine, Blender caters to game developers and filmmakers, offering a comprehensive suite of
tools. Moreover, its scripting capabilities enable customization, enhancing its adaptability for
various projects. From architectural visualization to visual effects, Blender stands as a cost-
effective and powerful solution for 3D content creation.
WORKING IN BLENDER
Blender 3D is a powerful open-source 3D creation suite that is used for various
purposes, including 3D modeling, animation, rendering, and even game development. If you're
working in Blender 3D, you can do a wide range of tasks, from creating 3D models to producing
animated films or games. Here are some common tasks and features you might encounter
while working in Blender 3D:
Animation in Blender transcends simple movement, delving into the realm of character
rigging and keyframe animation. Users can breathe life into their creations with dynamic character
animations and complex rigging systems, paving the way for immersive storytelling and compelling
visual narratives.
Texture and material application in Blender add depth and realism to 3D models, allowing users to
emulate various materials like wood, metal, or glass with astonishing accuracy. Lighting tools further
enhance scenes, enabling users to set up and control different types of lighting to achieve desired
atmospheres and visual effects.
Blender's physics simulation capabilities extend the creative possibilities, offering simulations
for smoke, fluid, cloth, and more. Users can create lifelike physical interactions, adding an extra layer
of realism to their scenes.
Compositing features in Blender provide users with a powerful node-based compositor for
post-processing renders, adding effects, and manipulating the final output. Additionally, Blender
serves as a comprehensive tool for video editing and post-production work, including cutting, splicing,
and adding effects to videos.
For game developers, Blender's integrated game engine facilitates the creation of interactive
3D games, allowing for the building, animating, and scripting of game assets directly within
Blender's interface.
Blender also caters to 3D printing enthusiasts with tools to prepare models for 3D printing,
ensuring models are checked and fixed for printability.
With a Python scripting API, Blender offers extensive scripting capabilities, enabling users to
automate tasks, create custom tools, and extend Blender's functionality to suit their specific needs.
Furthermore, Blender's flexibility is enhanced by its vast library of add-ons, allowing users to
expand its capabilities through community-developed or custom-built add-ons. In essence, Blender
stands as a comprehensive and indispensable tool for 3D modeling, animation, rendering, and beyond,
empowering users to bring their creative visions to life with unparalleled depth and precision.
2.3 FUNCTIONAL AND NON-FUNCTIONAL REQUIREMENTS
Functional Requirements
Markerless tracking capabilities are essential for providing AR experiences without the need
for predefined markers, enhancing flexibility and usability. Object manipulation through gestures or
touch interactions adds another layer of interactivity, empowering users to interact naturally with
virtual objects within the AR environment.
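The sketch below illustrates one such gesture: a two-finger pinch that scales a placed virtual
object, clamped to sensible limits. The class name and the scale bounds are illustrative assumptions,
not values taken from the project.

using UnityEngine;

// Illustrative sketch: scale the object this script is attached to with a pinch gesture.
public class PinchToScale : MonoBehaviour
{
    [SerializeField] private float minScale = 0.1f;
    [SerializeField] private float maxScale = 3.0f;

    private void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Finger spacing this frame and in the previous frame.
        float currentDistance = Vector2.Distance(t0.position, t1.position);
        float previousDistance = Vector2.Distance(t0.position - t0.deltaPosition,
                                                  t1.position - t1.deltaPosition);
        if (Mathf.Approximately(previousDistance, 0f)) return;

        // Scale by the ratio of the two distances, clamped to the configured limits.
        float factor = currentDistance / previousDistance;
        float newScale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
        transform.localScale = Vector3.one * newScale;
    }
}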
Scene understanding is crucial for recognizing the environment and adapting AR content
accordingly, ensuring seamless integration with real-world surroundings. Cross-platform
compatibility is paramount, requiring integration with AR Foundation to ensure the application works
seamlessly on various platforms such as iOS and Android.
Non-Functional Requirements
Performance is key, requiring the application to run smoothly with minimal latency to
provide users with a seamless experience. Scalability is crucial to accommodate varying levels of
complexity and a growing user base or data load.
Compatibility across devices is also important, supporting a wide range of
screen sizes, resolutions, and hardware capabilities.
Planning for easy updates and maintenance is essential, allowing for future improvements
and bug fixes to be implemented seamlessly. By addressing these considerations comprehensively,
developers can create a robust and successful AR application that meets the needs of users while
ensuring a positive and engaging experience.
2.4 SUMMARY
The collaborative workflow between Blender and Unity in the context of AR Foundation
involves a seamless integration of 3D content creation and game development. Blender, a versatile
open-source 3D graphics software, serves as a comprehensive tool for modeling, sculpting, and
animating objects and scenes. Artists use Blender to craft detailed 3D assets, characters, and
animations. Unity, a popular game development engine, complements Blender by providing a
platform for integrating these assets into AR applications using AR Foundation. This workflow
enables developers to import Blender-created assets into Unity, where they can be arranged,
scripted, and deployed for augmented reality experiences. Unity's AR Foundation extends its
capabilities to incorporate AR features, such as image recognition and tracking. The synergy
between Blender and Unity within the AR Foundation framework facilitates a streamlined process
for creating immersive AR content, showcasing the collaborative power of these tools in reshaping
interactive and visually compelling augmented reality experiences.
CHAPTER 3
SYSTEM DESIGN
3.1 EXISTING SYSTEM
Virtual labs and simulations bring scientific concepts to life, enabling students to conduct
experiments virtually. For instance, in chemistry, students can interact with virtual chemical
reactions, observe molecular structures, and grasp complex scientific concepts within a controlled
digital environment.
Historical and cultural immersion takes learners on captivating journeys to different time
periods and locations through AR applications. Students can explore historical sites, ancient
civilizations, and cultural landmarks via immersive 3D reconstructions and interactive experiences,
fostering a deeper understanding of history and culture.
Language learning and vocabulary building are facilitated through AR applications that
leverage visual cues in the real world. By pointing their devices at objects, learners can associate
words with objects, facilitating vocabulary acquisition through translations or audio pronunciations,
thus enhancing language proficiency.
Geography and Environmental Education
Augmented Reality (AR) applications have transformed education across various disciplines,
providing innovative tools and immersive experiences for students. In art education, AR allows
students to create and manipulate virtual sculptures, paintings, and designs, fostering creativity
and enabling experimentation with different artistic styles. Moreover, AR simulates virtual field
trips, overcoming accessibility barriers and allowing students to explore museums, historical
sites, and natural landmarks from within the classroom. In STEM subjects, AR facilitates
interactive learning experiences in physics, biology, mathematics, and engineering, enhancing
understanding and engagement. These applications can be customized to accommodate different
learning styles and abilities, providing personalized learning experiences that adapt to individual
needs. By integrating gamification elements, AR further enhances engagement and turns
educational experiences into immersive adventures. Additionally, AR supports collaborative
learning by enabling multiple users to interact with the same augmented content simultaneously,
promoting teamwork and communication skills. Beyond traditional subjects, AR offers a unique
approach to geography education, allowing students to explore geographical features,
ecosystems, and environmental changes through interactive maps and overlays. Overall, AR is
reshaping the educational landscape, making learning more accessible, engaging, and interactive
for students of all ages and backgrounds. Its multifaceted applications continue to revolutionize
teaching and learning, paving the way for a more dynamic and inclusive educational experience.
3.1.1 ADVANTAGE OF EXISTING SYSTEM
The existing system of Augmented Reality (AR) applications in education offers numerous
benefits that significantly enhance the learning experience. Firstly, AR captivates students' attention
through interactive, dynamic content, surpassing traditional textbooks and making learning more
engaging and enjoyable. This enhanced engagement fosters a deeper level of interest in the subject
matter. AR also enables personalized learning experiences tailored to individual preferences and
paces, ensuring that
each student can access educational content in a way that suits them best. Additionally, AR
facilitates collaborative learning environments where students can work together on projects and
activities, fostering teamwork, communication skills, and peer-to-peer learning. Through these
capabilities, AR not only transforms the educational experience but also equips students with
valuable skills and knowledge essential for success in the modern workforce.
Furthermore, AR's adaptability to a wide range of subjects makes it a valuable tool in various
educational contexts. Whether in science, history, art, mathematics, or beyond, AR applications
provide versatile learning experiences that engage students and enhance comprehension. By
leveraging AR across multiple disciplines, educators can offer dynamic and interactive lessons that
cater to diverse learning styles, fostering deeper understanding and retention of key concepts.
3.1.2 DRAWBACK OF EXISTING SYSTEM
Integrating AR into existing curricula may also pose challenges, requiring careful planning
and alignment with educational standards. Not all subjects may seamlessly lend themselves to AR
applications, limiting its potential for widespread curriculum integration. Additionally, inequities
in access to AR technology may exacerbate existing disparities, with some students lacking
access to the required devices, leading to disparities in access and opportunities for learning.
Thus, while AR offers exciting possibilities for education, addressing these challenges is crucial
to ensure equitable access and effective integration into educational practices.
3.2 PROPOSED SYSTEM
Design an intuitive user interface (UI) with elements such as buttons and labels, enhancing
user engagement. Implement user interactions, such as tapping or dragging, to manipulate AR
objects seamlessly. Consider incorporating gestures and touch controls to make the experience more
immersive. Provide visual feedback to users for successful interactions, like highlighting selected
objects or displaying relevant information.
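A minimal sketch of such feedback is shown below: it tints the selected object's material and
restores the original colour on deselection, assuming the material exposes a standard colour
property. SelectionHighlight and the highlight colour are illustrative names and values.

using UnityEngine;

// Illustrative sketch: tint the object's material while it is selected.
public class SelectionHighlight : MonoBehaviour
{
    [SerializeField] private Color highlightColor = Color.yellow;
    private Renderer objectRenderer;
    private Color originalColor;

    private void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
        originalColor = objectRenderer.material.color;   // material is instanced at runtime
    }

    // Called by selection code when the object is tapped or deselected.
    public void SetSelected(bool selected)
    {
        objectRenderer.material.color = selected ? highlightColor : originalColor;
    }
}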
Testing is a critical phase in AR app development. Test your app on ARKit and ARCore
supported devices to identify and resolve any platform-specific issues. Debug and optimize your app
for performance, ensuring a smooth user experience. Configure build settings and deploy your
AR app to the App Store for iOS and Google Play for Android, making it accessible to a broader
audience.
In the iterative process, gather user feedback to enhance your AR app continually. Stay
informed about updates to AR Foundation, ARKit, and ARCore for potential improvements and
new features. Thoroughly document your code and provide clear instructions for users and future
developers, facilitating understanding and future development. As AR technology evolves, this
comprehensive approach ensures your AR app remains at the forefront of innovation and user
satisfaction.
Fig 3.1 Block Diagram
3.3 DATA FLOW DIAGRAM
A Data Flow Diagram (DFD) is a graphical representation of the flow of data within a
system. While AR applications, especially those using AR Foundation in Unity, might not have
traditional data flow in the same way as information systems, we can represent the flow of
information and interactions within the AR system. Here's a simplified DFD for an AR application
using AR Foundation in Unity:
Description
User Input
Represents any input from the user, such as gestures, taps, or other interactions.
AR Foundation Module
Denotes the rendering of virtual objects in the AR scene based on the image
recognition results.
3.4 SUMMARY
The existing system may lack certain features, exhibit performance bottlenecks, or have
limitations in terms of user experience. In the context of AR, the image processing may be less
accurate, the interaction with virtual objects might be limited, and the overall system may not adapt
well to different environmental conditions. The need for enhanced features, improved performance,
and a more user-friendly interface becomes apparent through the limitations of the existing
system.
The proposed system outlines the improvements and new features introduced to overcome
the shortcomings of the existing system. This may include advancements in image recognition
algorithms for more accurate tracking, enhanced AR interactions for users, improved rendering
quality, and adaptability to diverse real-world environments. Additionally, the proposed system
could address issues related to performance, security, and usability, providing a more robust and
satisfying AR experience. The introduction of new technologies, optimized code, and a refined user
interface contributes to the overall effectiveness of the proposed AR application.
In summary, the proposed system represents an evolution from the limitations of the existing
system, introducing advancements in image processing, AR interactions, and overall system
performance to deliver an upgraded and more feature-rich AR experience. The proposed system
aims to address user needs and expectations, providing a solution that not only overcomes existing
challenges but also sets the stage for future improvements and innovations in AR technology.
CHAPTER 4
IMPLEMENTATION
4.1 WORKING OF 3D MODEL
The transformative capabilities of Blender shine through in Edit Mode, where you can
move, rotate, and scale components using shortcuts like G, R, and S. Constrain these
transformations along specific axes by appending X, Y, or Z after the action. To refine the model's
surfaces, consider applying a Subdivision Surface modifier, accessible through the modifier panel.
This step introduces a level of smoothness, crucial for achieving realistic and visually appealing
3D models.
In the "Shading" workspace, the creative process expands to materials and textures. Assign
materials to the model, and add textures for a more nuanced appearance. UV mapping, crucial for
precise texture application, involves unwrapping the model's UVs in Edit Mode and adjusting them
in the dedicated UV Editor. For those delving into more organic forms, Blender's Sculpt Mode
offers a suite of brushes for detailed sculpting, expanding the creative possibilities beyond
traditional geometric shapes.
As the 3D model takes shape, attention turns to lighting, a critical factor for achieving
realism in renders. In the "Layout" or "Rendering" workspace, lights are added and configured in
the "Object Data Properties" panel. Proper lighting emphasizes the model's details and establishes a
mood within the scene. With a lit stage set, the next step involves camera setup. Placing and
adjusting the camera's position, rotation, and focal length define the viewpoint for rendering.
The rendering process in Blender involves choosing between the Eevee and Cycles
rendering engines, each offering unique strengths. Eevee excels in real-time rendering and is ideal
for quick previews, while Cycles focuses on accurate light interaction and is suitable for high-
quality final renders. Render settings are configured in the respective engines' panels,
allowing customization according to project requirements.
4.2 WORKING IN UNITY
With the environment configured, introduce AR Foundation's core components into the
scene. Drag and drop the AR Session prefab, which acts as the central manager for the AR
experience, and the AR Session Origin prefab, representing the tracked space in the real world. This
initial setup is pivotal for building an AR application that seamlessly integrates with the physical
environment. It forms the canvas upon which the AR experience will unfold, combining virtual and
real-world elements.
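A small start-up check, sketched below, can complement this setup by asking AR Foundation
whether the device supports AR before the session is enabled. The class name, the serialized
ARSession reference, and the log message are assumptions made for illustration.

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: verify AR support on the device before enabling the AR Session.
public class ARAvailabilityCheck : MonoBehaviour
{
    [SerializeField] private ARSession arSession;   // the AR Session object in the scene

    private IEnumerator Start()
    {
        // Ask AR Foundation whether ARCore/ARKit support is available on this device.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
            arSession.enabled = false;
        }
        else
        {
            // Enabling the session starts the platform AR runtime.
            arSession.enabled = true;
        }
    }
}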
Stability is paramount in AR applications. Integrate AR Anchors into the design to attach
virtual objects to real-world points, ensuring they remain fixed in space relative to the physical
environment. This step enhances the user experience, providing a sense of persistence and stability
to virtual elements within the dynamic context of the AR environment.
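The following sketch shows one way this anchoring step might look: a screen tap is raycast onto a
tracked plane, an anchor is attached at the hit pose through the ARAnchorManager, and the virtual
content is parented to that anchor so it stays fixed to the physical point. Class and field names here
are illustrative, not taken from the project.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: anchor content to the real-world point the user taps.
public class AnchorPlacer : MonoBehaviour
{
    [SerializeField] private GameObject contentPrefab;
    private ARRaycastManager raycastManager;
    private ARAnchorManager anchorManager;
    private ARPlaneManager planeManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
        anchorManager = GetComponent<ARAnchorManager>();
        planeManager = GetComponent<ARPlaneManager>();
    }

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
        {
            ARRaycastHit hit = hits[0];
            ARPlane plane = planeManager.GetPlane(hit.trackableId);

            // Attach an anchor to the plane at the hit pose; AR Foundation keeps it
            // locked to that physical location as tracking is refined.
            ARAnchor anchor = anchorManager.AttachAnchor(plane, hit.pose);
            if (anchor != null)
            {
                Instantiate(contentPrefab, anchor.transform);
            }
        }
    }
}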
Designing the user interface (UI) is the next crucial step. Consider the user experience and
implement intuitive controls, buttons, or other interactive elements to facilitate engagement with the
AR content. Implementation of AR interactions involves incorporating gestures and touch controls,
making the user experience more natural and immersive. Visual feedback mechanisms are essential
for guiding users through the AR experience, providing cues for successful interactions, or
conveying information relevant to the virtual elements.
As the application matures through testing and iteration, prepare it for deployment.
Configure build settings within Unity to target specific platforms and publish the AR app to the
App Store for iOS and Google Play for Android. Ensure that the app adheres to platform-specific
guidelines and requirements for a seamless deployment process.
Continuously stay informed about updates to AR Foundation, ARKit, and ARCore for
potential enhancements and new features. Regularly iterate on the application based on user
feedback, emerging technologies, and evolving best practices. Robust documentation of code
and application features facilitates future development and ensures that the AR application
remains at the forefront of innovation and user satisfaction.
4.2.1 UNITY IN PHONE
As the development environment takes shape within the Unity editor, the subsequent step
involves crafting the desired application. This process entails importing assets, shaping scenes, and
implementing functionality through the use of C# scripts. Unity's intuitive interface empowers
developers to visualize and fine-tune their creations, fostering an iterative and creative development
cycle. The unity of design and functionality coalesce as the project matures, setting the stage for the
application's deployment.
With the foundation laid, the focus shifts to configuring build settings within Unity. The
Build Settings menu becomes the gateway to specifying the target platform, initiating a switch to
Android or iOS, and adjusting platform-specific settings such as package names or bundle
identifiers. Building the application finalizes the encapsulation of the project into a standalone file,
ready for deployment.
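For teams that prefer to script this step, the editor-only sketch below performs the same Android
build through BuildPipeline instead of the Build Settings window. The scene path, package
identifier, and output location are placeholder values, and the script is assumed to live in an Editor
folder.

using UnityEditor;
using UnityEngine;

// Illustrative editor script: build an Android APK from a menu item.
public static class AndroidBuildScript
{
    [MenuItem("Build/Build Android APK")]
    public static void BuildAndroid()
    {
        // Platform-specific identification, normally set in Player Settings.
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.Android, "com.example.arlearning");

        BuildPlayerOptions options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/ARScene.unity" },   // placeholder scene path
            locationPathName = "Builds/ARLearning.apk",         // placeholder output file
            target = BuildTarget.Android,
            options = BuildOptions.None
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log("Build finished with result: " + report.summary.result);
    }
}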
The deployment phase involves the transfer of the built application to the intended mobile
device. For Android, this necessitates connecting the device to the development machine, enabling
USB debugging, and potentially installing required drivers. iOS deployment, on the other hand,
demands a Mac with Xcode installed and, if necessary, enrollment in the Apple Developer Program
to deploy on a physical iOS device. The actual running of the application on the mobile device
marks the fruition of the development efforts, bringing the Unity-powered creation to life.
Testing and debugging become paramount in this iterative process. Unity offers a suite of
development tools for real-time testing and debugging, facilitating the identification and resolution
of issues. For Android devices, establishing a direct connection between the Unity editor and the
phone enhances the efficiency of this phase. Meanwhile, for iOS, integrating the project with
Xcode provides developers with a deeper level of insight into performance and behavior.
Security considerations play a role, especially on Android devices, where ensuring the
phone's security settings permit the installation of applications from external sources is essential.
Finally, a commitment to documentation and staying informed about Unity updates ensures that the
development workflow remains aligned with best practices and platform-specific nuances. This
comprehensive approach, from initial setup to deployment, testing, and refinement, empowers
developers to harness the full potential of Unity on mobile devices, creating immersive and
engaging applications.
4.3 SUMMARY
The collaborative workflow between Blender and Unity in the context of AR Foundation
involves a seamless integration of 3D content creation and game development. Blender, a versatile
open-source 3D graphics software, serves as a comprehensive tool for modeling, sculpting, and
animating objects and scenes. Artists use Blender to craft detailed 3D assets, characters, and
animations. Unity, a popular game development engine, complements Blender by providing a
platform for integrating these assets into AR applications using AR Foundation. This workflow
enables developers to import Blender-created assets into Unity, where they can be arranged,
scripted, and deployed for augmented reality experiences. Unity's AR Foundation extends its
capabilities to incorporate AR features, such as image recognition and tracking. The synergy
between Blender and Unity within the AR Foundation framework facilitates a streamlined process
for creating immersive AR content, showcasing the collaborative power of these tools in reshaping
interactive and visually compelling augmented reality experiences.
CHAPTER 5
SYSTEM TESTING
5.1 TESTING LEVELS
5.1.1 UNIT TESTING
5.1.2 INTEGRATION TESTING
Integration testing for an Augmented Reality (AR) application utilizing image processing in
Unity involves validating the seamless interaction and cooperation between different components or
modules within the application. The objective is to ensure that these integrated elements work
harmoniously to deliver the intended functionality. In the context of image processing, this includes
testing the flow of data and operations between modules responsible for image recognition, tracking,
rendering, and other AR features. By assessing how these components collaborate, integration testing
helps identify potential issues such as data inconsistencies, communication errors, or interoperability
challenges. Through this process, developers can catch and address integration-related issues early in
the development life cycle, ensuring that the AR application functions cohesively and delivers a
unified and reliable user experience.
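Such checks can be automated with the Unity Test Framework. The sketch below is a hypothetical
edit-mode test of the prefab-lookup behaviour used by the image-tracking script in Chapter 6; the
ActivateOnly helper is a stand-in for the project's actual logic rather than a copy of it.

using System.Collections.Generic;
using NUnit.Framework;
using UnityEngine;

// Illustrative edit-mode test: only the recognised image's object should stay active.
public class ImageLookupTests
{
    private static void ActivateOnly(Dictionary<string, GameObject> prefabs, string name)
    {
        foreach (var pair in prefabs)
        {
            pair.Value.SetActive(pair.Key == name);
        }
    }

    [Test]
    public void ActivateOnly_EnablesMatchingPrefab_AndDisablesOthers()
    {
        var prefabs = new Dictionary<string, GameObject>
        {
            { "Earth", new GameObject("Earth") },
            { "Mars", new GameObject("Mars") }
        };

        ActivateOnly(prefabs, "Earth");

        Assert.IsTrue(prefabs["Earth"].activeSelf);
        Assert.IsFalse(prefabs["Mars"].activeSelf);

        // Clean up the temporary objects created for the test.
        Object.DestroyImmediate(prefabs["Earth"]);
        Object.DestroyImmediate(prefabs["Mars"]);
    }
}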
5.1.3 SYSTEM TESTING
System testing for an Augmented Reality (AR) application using image processing in Unity
is a comprehensive evaluation of the entire application as a unified system. This testing phase aims to
verify that all integrated components, from image recognition and rendering through to user
interaction, work harmoniously to meet specified requirements. System testing involves testing the
application in various scenarios and conditions, assessing its behavior under different
environmental settings and user interactions. It ensures that the AR application functions as
intended, placing virtual objects accurately based on image recognition results, and adapting to real-
world changes. Additionally, system testing addresses aspects such as performance, security, and
overall user satisfaction, providing a thorough examination of the application's reliability and
functionality before deployment. Any discovered issues are addressed to guarantee a robust and
user-friendly AR experience.
5.1.5 PERFORMANCE TESTING
Performance testing evaluates whether the application runs smoothly and responsively without
compromising the device's resources. By measuring and optimizing key performance
indicators, developers can enhance the application's efficiency and responsiveness,
ensuring it meets the required standards and provides a seamless AR experience for
users.
5.2 SUMMARY
CHAPTER 6
CODING AND OUTPUT
6.1.1 CODING
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.ARFoundation;

// Excerpt from the image-tracking script: the enclosing MonoBehaviour declares the
// Dictionary<string, GameObject> spawnedPrefab and subscribes imageChanged to the
// ARTrackedImageManager's trackedImagesChanged event in OnEnable (see section 6.1.2).
private void imageChanged(ARTrackedImagesChangedEventArgs eventArgs)
{
    foreach (ARTrackedImage trackedImage in eventArgs.added)
    {
        updateImage(trackedImage);
    }
    foreach (ARTrackedImage trackedImage in eventArgs.updated)
    {
        updateImage(trackedImage);
    }
    foreach (ARTrackedImage trackedImage in eventArgs.removed)
    {
        // Hide the prefab whose reference image is no longer tracked.
        spawnedPrefab[trackedImage.referenceImage.name].SetActive(false);
    }
}
private void updateImage(ARTrackedImage trackedImage)
{
    string name = trackedImage.referenceImage.name;
    Vector3 position = trackedImage.transform.position;
    // Move the matching prefab onto the tracked image and activate it.
    GameObject prefab = spawnedPrefab[name];
    prefab.transform.position = position;
    prefab.SetActive(true);
    // Deactivate every other prefab so only the recognised image's model stays visible.
    foreach (KeyValuePair<string, GameObject> pair in spawnedPrefab)
    {
        if (pair.Key != name)
        {
            pair.Value.SetActive(false);
        }
    }
}
6.1.2 DESCRIPTION
The provided C# script is designed for Unity using the AR Foundation package, facilitating image
tracking in augmented reality (AR) applications. Upon initialization, the script creates a
dictionary to manage GameObjects corresponding to images and instantiates them based on a
specified array. The script subscribes to the `trackedImagesChanged` event of the
ARTrackedImageManager in the `OnEnable` method and unsubscribes in the `OnDisable`
method to dynamically respond to changes in tracked images. The `imageChanged` method handles
added, updated, and removed tracked images, calling the `updateImage` method to position and
activate the associated prefab. The `updateImage` method ensures that only the relevant prefab is
visible by deactivating others. Overall, this script establishes a foundation for AR image tracking,
enabling the dynamic placement and manipulation of GameObjects in response to changes in
the AR environment.
6.2 OUTPUT
6.2.1 SOLAR SYSTEM
EARTH
URANUS
MARS
MERCURY
NEPTUNE
SUN
SATURN
VENUS
6.2.2 SONOMETER
6.2.3 LATHE
6.2.4 MOTOR
PROJECT DEVELOPMENT
The " Augmented Horizons of Exploring New Realities in Education" project representsa
groundbreaking endeavor poised to redefine the educational landscape. By harnessing therobust
capabilities of the Unity game engine, this initiative aims to seamlessly integrate augmented
reality technology into the learning process. Through dynamic digital overlays,students will gain
access to a wealth of interactive 3D models, simulations, and virtual fieldtrips, transcending the
limitations of traditional educational materials. The core objective of this project is to foster
deeper understanding and engagement among learners of varied backgrounds and learning
styles.
Meticulous attention has been given to ensuring that the educational content aligns
seamlessly with established curriculum standards, enhancing the relevance and applicability of
the augmented reality experiences. Moreover, a user-centric approach to interface design
guarantees an intuitive and immersive learning experience. Rigorous testing protocols,
including functional and usability testing, have been implemented to ensure the
application's robustness and user-friendliness. Additionally, the project adheres to strict legal and
ethical considerations, safeguarding user privacy and intellectual property rights.
As the project nears its completion, there is a keen anticipation for the transformative
impact it is poised to make in educational spheres. Not only does it promise to enhance traditional
learning methods, but it also lays the foundation for future advancements in augmented reality
technology. Through this project, education is poised to transcend conventional boundaries,
ushering in a new era of interactive and engaging learning experiences.
CHAPTER 7
LITERATURE SURVEY
7.1 LITERATURE SURVEY
Title: "Augmented Reality in Education and Training: Pedagogical Approaches and Illustrative
Case Studies"
Abstract: Parsons and Caris provide a comprehensive exploration of pedagogical approaches and
present illustrative case studies that demonstrate the integration of AR in education and training.
They emphasize how AR enhances learning experiences by providing interactive and engaging
content, leading to deeper understanding and retention of educational material.
Abstract: This review offers a comprehensive overview of AR technology and suggests five
pivotal directions for its application in education. The authors highlight how AR has the potential
to revolutionize educational practices by providing immersive, interactive, and contextual
learning experiences, ultimately leading to improved learning outcomes.
Title: "A Review on the Use of Augmented Reality in Education: From the Perspective of
Motivation, Cognitive Load, Presence, and Practical Implementation"
Authors: Bacca, J., Baldiris, S., Fabregat, R., Graf, S., & Kinshuk
Abstract: Bacca et al. consider crucial factors such as motivation, cognitive load, and presence
when evaluating the effectiveness of AR in education. The review discusses practical
implementation strategies and offers a nuanced understanding of how AR influences the
learning environment. It highlights the potential of AR to create immersive and interactive
learning experiences.
Authors: Azuma, R. T.
Journal: ISMAR
7.2 SUMMARY
The literature survey on Augmented Reality (AR) in education reveals a growing body of
research and practical applications that highlight the transformative impact of AR on the
learning experience. Studies consistently emphasize the potential of AR to enhance student
engagement, understanding, and retention of educational content. AR is found to be
particularly effective in science, technology, engineering, and mathematics (STEM) subjects,
providing interactive simulations and 3D visualizations that facilitate deeper comprehension.
Additionally, literature suggests that AR fosters a student-centric learning environment,
catering to diverse learning styles and encouraging active participation. Challenges such as
hardware limitations, integration into curricula, and the need for teacher training are also
acknowledged. Overall, the literature survey underscores the significant role of AR in
reshaping education by offering innovative and immersive learning opportunities that go
beyond traditional teaching methods. As the field of AR in education continues to advance,
research and practical implementations contribute to a growing understanding of its benefits,
challenges, and the potential for revolutionizing the educational landscape.
CHAPTER 8
CONCLUSION
In conclusion, the integration of Augmented Reality (AR) into education using the
Unity platform is a remarkable stride towards revolutionizing the learning experience. This
innovative approach marries advanced technology with educational content, providing
students with interactive and immersive lessons that transcend the boundaries of traditional
teaching materials. By seamlessly merging the physical and digital realms, AR in education
stimulates deeper understanding, engagement, and retention of knowledge, accommodating
diverse learning styles.
FUTURE SCOPE