AR/VR Lab Manual
DHIRAJLAL GANDHI COLLEGE OF TECHNOLOGY
Salem Airport (Opp.), Salem – 636309. Ph: (04290) 233333. www.dgct.ac.in
BONAFIDE CERTIFICATE
Name: …………………………………………………………
Degree: …………………………………………………………
Branch: …………………………………………………………
Semester: …….… Year: ……….… Section: ………..
Reg No: …………………………………………………………
Certified that this is the Bonafide record of the work done by the above
student in
…………………………………………………………………………………………………………………..
Laboratory during the academic year …………………………………
INTERNAL EXAMINER                                EXTERNAL EXAMINER
LAB MANNERS
Students must be present in proper dress code and wear the ID card.
Students should enter the log-in and log-out time in the log register without
fail.
Students are not allowed to download pictures, music, videos, or files
without the permission of the respective lab in-charge.
Students should wear their own lab coats and bring observation notebooks
to the laboratory classes regularly.
Record of experiments done in a particular class should be submitted in the
next lab class.
Students who do not submit the record notebook in time will not be
allowed to do the next experiment and will not be given attendance for that
laboratory class.
Students will not be allowed to leave the laboratory until they complete the
experiment.
Students are advised to switch off the monitors and CPU when they leave
the lab.
Students are advised to arrange the chairs properly when they leave the
lab.
CCS333 AUGMENTED REALITY / VIRTUAL REALITY
OBJECTIVES:
Develop simple AR and VR applications using development tools.
Explore real-world applications of AR and VR across various domains.
Study future trends, challenges, and ethical issues in AR/VR.
LIST OF EXPERIMENTS:
1. Study of tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender.
2. Use the primitive objects and apply various projection by handling camera.
3. Download objects from asset store and apply various lighting and effects.
4. Model three dimensional objects using various modelling techniques and
apply textures over them.
5. Create three dimensional realistic scenes and develop simple virtual reality
enabled mobile applications which have limited interactivity.
6. Add audio and text special effects to the developed application.
7. Develop VR enabled applications using motion trackers and sensors
incorporating full haptic interactivity.
8. Develop AR enabled applications with interactivity like E learning
environment, Virtual walkthroughs and visualization of historic places.
9. Develop AR enabled simple applications like human anatomy visualization,
DNA/RNA structure visualization and surgery simulation.
10. Develop simple MR enabled gaming applications.
COURSE OUTCOMES:
CO1: Understand the basic concepts of AR and VR
CO2: Understand the tools and technologies related to AR/VR
CO3: Know the working principle of AR/VR related sensor devices
CO4: Design various models using modeling techniques
CO5: Develop AR/VR applications in different domains
EX NO | DATE | NAME OF THE EXPERIMENT | PAGE NO | DATE OF COMPLETION | MARKS | STAFF SIGNATURE | REMARKS
EXNO: 1
DATE:
Study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender
AIM:
To study tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender.
PROCEDURE:
1) Unity:
i. Explore the Scene view, Hierarchy, Inspector, and Game view.
ii. Use the Asset Store to browse and import basic 3D assets.
2) Maya:
i. Use primitive tools (cube, sphere, etc.) to model simple shapes.
ii. Explore object transformation (move, scale, rotate) and apply basic materials.
3) 3DS MAX:
4) AR Toolkit:
i. Print a sample marker and test camera tracking by overlaying 3D content.
5) Vuforia:
i. Integrate Vuforia with Unity and test image tracking using a webcam or mobile device.
6) Blender:
i. Navigate the 3D viewport using the mouse and shortcut keys (a short scripting example follows below).
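Blender also exposes a built-in Python scripting API (bpy). The following is a minimal sketch, an illustrative addition to this study exercise, that adds a primitive and lists the objects in the scene:

import bpy

# Add a cube primitive at the origin (equivalent to Add > Mesh > Cube)
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))

# Print every object currently in the scene with its type
for obj in bpy.context.scene.objects:
    print(obj.name, obj.type)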
RESULT:
Thus, the study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender was completed successfully.
EXNO: 2
DATE:
Use the primitive objects and apply various projections by handling the camera
AIM:
To use the primitive objects and apply various projection types by handling the camera.
ALGORITHM:
i. Step 1 : Create primitive objects
ii. Step 2 : Manipulate primitive objects
iii. Step 3 : Camera Handling
iv. Step 4 : Applying projection types
v. Step 5 : Stop the program
PROCEDURE:
a. Open Blender and start a new project.
b. To add a primitive object like a cube, sphere, etc., press Shift + A (or) click on the 'Add' menu at the top-left of the 3D viewport.
c. Select the type of primitive you want to add (e.g. cube, sphere, cone).
d. To move an object, select it and press G, then move your mouse (or) type in specific coordinates.
e. To scale an object, select it and press S, then move your mouse (or) type in specific scale values.
f. To rotate an object, select it and press R, then move your mouse (or) type in specific rotation angles.
g. Select the camera in the 3D viewport (right-click on it).
h. To move the camera, press G, then move your mouse (or) type in specific coordinates.
i. To rotate the camera, press R, then move your mouse (or) type in specific rotation angles.
j. To adjust the camera focal length, select the camera and open the "Object Data" tab.
k. In the camera settings (under the Object Data tab for the camera), you can adjust the "Lens" value for different perspectives.
l. For orthographic projection, set the orthographic scale value to control the size of objects, making them appear uniform in size regardless of distance (a scripted version of this camera handling is sketched below).
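The same camera handling can be scripted through Blender's Python API. Below is a minimal sketch, an illustrative addition to the procedure above, that adds a primitive and switches the camera between perspective and orthographic projection:

import bpy

# Add a primitive object to view through the camera
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))

cam = bpy.context.scene.camera  # the active scene camera

# Perspective projection: objects shrink with distance; "Lens" is the focal length in mm
cam.data.type = 'PERSP'
cam.data.lens = 50

# Orthographic projection: uniform size regardless of distance;
# the orthographic scale controls how much of the scene is visible
cam.data.type = 'ORTHO'
cam.data.ortho_scale = 6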
OUTPUT:
RESULT:
Thus, the use of primitive objects and the application of various projection types by handling the camera was completed successfully.
EXNO: 3
DATE:
Download objects from the asset store and apply various lighting and shading effects
AIM:
To download objects from the asset store and apply various lighting and shading effects using Blender.
ALGORITHM:
i. Step 1 : Download assets from the Unity Asset Store
ii. Step 2 : Export assets from Unity
iii. Step 3 : Import assets into Blender
iv. Step 4 : Apply Lighting and Shading Effects
v. Step 5 : Configure Rendering settings
vi. Step 6 : Render the scene
vii. Step 7 : Post process the scene
viii. Step 8 : Export the rendered image
ix. Step 9 : Stop the program
PROCEDURE:
a. Open Unity and navigate to the Unity Asset Store within the editor.
b. Search for and download the objects/assets you want to use.
c. After downloading, import the assets into your Unity project.
d. Select the assets in the Project tab.
e. Right-click and choose "Export Package". This will create a Unity package file (.unitypackage) that contains the assets.
f. Open Blender and ensure you have the necessary add-ons enabled, like "Import Export Unity Package".
g. Go to File > Import > Unity (.unitypackage) and select the package you exported from Unity.
h. Choose the object you want to import into Blender.
i. Once the object is in Blender, you can apply various lighting and shading effects.
j. Set up lighting by adding lamps (or other light sources) to your scene to illuminate the object.
k. Adjust the rendering settings in Blender, including resolution, output format and sampling settings, to achieve the desired quality (see the scripted lighting sketch below).
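The lighting and render settings from steps j and k can also be applied with a short script. This is a minimal sketch, an illustrative addition that assumes the imported object is already in the scene:

import bpy

# Add a sun lamp for broad, directional illumination
bpy.ops.object.light_add(type='SUN', location=(0, 0, 10))
sun = bpy.context.object
sun.data.energy = 3.0

# Add an area light as a soft fill from the side
bpy.ops.object.light_add(type='AREA', location=(4, -4, 5))
fill = bpy.context.object
fill.data.energy = 200.0

# Basic render settings: resolution, output format and sampling
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.image_settings.file_format = 'PNG'
scene.render.engine = 'CYCLES'
scene.cycles.samples = 128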
OUTPUT:
RESULT:
Thus, downloading objects from the asset store and applying various lighting and shading effects was completed successfully.
EXNO: 4
DATE:
Model three dimensional objects using various modelling techniques and apply textures over them
AIM:
To model three dimensional objects using various modelling techniques and apply textures over them.
ALGORITHM:
i. Step 1 : Model the 3D object
ii. Step 2 : Apply materials
iii. Step 3 : UV unwrap the model
iv. Step 4 : Apply textures
v. Step 5 : Render the scene
PROCEDURE:
a. Launch Blender and delete the default cube.
b. Add a new mesh (e.g. a sphere, cube, (or) custom shape) using the "Add" menu (or) Shift + A.
c. Use various tools like extrude, scale, rotate and subdivide to shape your objects.
d. You can also sculpt, use modifiers, (or) create objects from curves (or) text.
e. Add a texture node (e.g. Image Texture) and connect it to the material's shader nodes.
f. Go to Edit Mode, select all vertices and unwrap using the "UV" menu > "Unwrap".
g. In the Shader Editor, add a Texture Coordinate node and a Mapping node.
h. Connect the UV output of the Texture Coordinate node to the Vector input of the Image Texture node.
i. For more advanced textures, you can use the Texture Paint workspace to directly paint on your model.
j. Go to the "Render" tab to see how your object looks in different lighting conditions.
k. Render the final image using the "Render" button (a scripted version of steps e to h follows below).
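The unwrap and texture hookup in steps e to h can equally be done in a script. This is a minimal sketch, an illustrative addition that operates on the active object; the image path is a hypothetical placeholder:

import bpy

obj = bpy.context.active_object

# Unwrap the mesh: enter Edit Mode, select all vertices, run Unwrap
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap()
bpy.ops.object.mode_set(mode='OBJECT')

# Build a node-based material with an Image Texture feeding Base Color
mat = bpy.data.materials.new(name="TexturedMat")
mat.use_nodes = True
bsdf = mat.node_tree.nodes["Principled BSDF"]
tex = mat.node_tree.nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load("//textures/diffuse.png")  # hypothetical path
mat.node_tree.links.new(bsdf.inputs['Base Color'], tex.outputs['Color'])

# Assign the material to the object
obj.data.materials.append(mat)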
OUTPUT:
TEXTURING :
TEXTURE PAINTING :
EXTRUDE :
UV UNWRAPPING:
RESULT:
Thus, modelling three dimensional objects using various modelling techniques and applying textures over them was completed successfully.
EXNO: 5
DATE:
Create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
AIM:
To create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity.
ALGORITHM:
i. Step 1 : Model the scene
ii. Step 2 : Add texture to the scene
iii. Step 3 : Add Lighting and rendering to the scene
iv. Step 4 : Animate the scene
v. Step 5 : Export the Scene to VR
vi. Step 6 : Use Unity or Unreal for integration
PROGRAM:
import bpy

# Guard: this script is meant to be run from Blender's Text Editor
if bpy.context.space_data is None or bpy.context.space_data.type != 'TEXT_EDITOR':
    raise Exception("This script is intended to be run in the Text Editor")

def vr_interaction(scene):
    # Treat the active object as a stand-in VR controller
    controller = bpy.context.active_object
    if controller:
        # "button_pressed" is a custom property set on the controller object
        if "button_pressed" in controller and controller["button_pressed"]:
            print("Button 0 is pressed!")
        position = controller.location
        print("Controller position:", position)

# Register the handler so it runs before every frame change
if vr_interaction not in bpy.app.handlers.frame_change_pre:
    bpy.app.handlers.frame_change_pre.append(vr_interaction)
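For Step 5 of the algorithm (exporting the scene to VR), the scene can be exported to glTF, a format commonly consumed by VR and mobile engines. A minimal sketch:

import bpy

# Export the whole scene as a binary glTF (.glb); the file is
# written next to the .blend file
bpy.ops.export_scene.gltf(filepath="//vr_scene.glb")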
OUTPUT:
RESULT:
Thus, three dimensional realistic scenes were created and a simple VR enabled application with limited interactivity was developed successfully.
EXNO: 6
DATE:
Add audio and text special effects to the developed application
AIM:
To add audio and text special effects to the developed application.
ALGORITHM:
i. Step 1 : Open Blender
ii. Step 2 : Switch to video editing layout
iii. Step 3 : Import Audio
iv. Step 4 : Add text special effect
v. Step 5 : Render the final result
vi. Step 6 : Stop the program
PROGRAM:
import bpy

# Start from an empty file
bpy.ops.wm.read_factory_settings(use_empty=True)

# Add a 3D text object and give it some depth
bpy.ops.object.text_add(location=(0, 0, 0))
txt = bpy.context.object
txt.data.body = "HELLO, WORLD"
txt.data.extrude = 0.1

# Red material for the text
mat = bpy.data.materials.new(name="TextMat")
mat.diffuse_color = (1, 0, 0, 1)
txt.data.materials.append(mat)

# Animate the text sliding from x = -5 to x = 5 over 50 frames
txt.location.x = -5
txt.keyframe_insert(data_path="location", index=0, frame=1)
txt.location.x = 5
txt.keyframe_insert(data_path="location", index=0, frame=50)

# Load an audio file and drop it into the sequence editor on channel 1
snd = bpy.data.sounds.load("C:/path/to/audio.mp3", check_existing=True)
seq = bpy.context.scene.sequence_editor_create()
seq.sequences.new_sound("Audio", snd.filepath, channel=1, frame_start=1)

# Render the animation to an MP4 file next to the .blend file
scene = bpy.context.scene
scene.render.filepath = "//text_anim.mp4"
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
bpy.ops.render.render(animation=True)
OUTPUT:
RESULT:
Thus, audio and text special effects were added to the developed application successfully.
EXNO: 7
DATE:
Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
AIM:
To develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity using Blender.
ALGORITHM:
i. Step 1 : Open Blender
ii. Step 2 : Setup VR Environment
iii. Step 3 : Use motion tracking and sensors in blender
iv. Step 4 : Implement haptic feedback through blender’s game engine
v. Step 5 : Develop interactive scenes by blender’s logic editor
vi. Step 6 : Create VR friendly UI elements for user interaction
vii. Step 7 : Test and deploy VR Application
viii. Step 8 : Stop the program
PROGRAM:
import bpy

# Stand-in for a real haptic device SDK; a real application would
# import the vendor's Python bindings instead of defining this class
class HapticDevice:
    def connect(self):
        print("Haptic device connected.")

    def get_position(self):
        # A fixed position for demonstration; a real device would
        # report the tracked position of the haptic stylus
        return (1.0, 2.0, 3.0)

device = HapticDevice()
device.connect()

obj = bpy.context.active_object

def update_object_from_haptic(scene):
    # Move the active object to the position reported by the device
    pos = device.get_position()
    obj.location = pos
    print(f"Object moved to: {pos}")

# Register the handler so it runs before every frame change
if update_object_from_haptic not in bpy.app.handlers.frame_change_pre:
    bpy.app.handlers.frame_change_pre.append(update_object_from_haptic)
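Real motion trackers typically stream pose data over a transport such as UDP rather than being defined in the script itself. The following is a hedged sketch; the port and the JSON [x, y, z] packet format are hypothetical assumptions, not any specific tracker's protocol. It polls the stream with a Blender timer:

import bpy
import socket
import json

# Hypothetical tracker streaming JSON [x, y, z] positions over UDP
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 9000))  # assumed address and port
sock.setblocking(False)

def poll_tracker():
    obj = bpy.context.active_object
    try:
        data, _ = sock.recvfrom(1024)
        x, y, z = json.loads(data)
        if obj:
            obj.location = (x, y, z)
    except BlockingIOError:
        pass  # no packet arrived this tick
    return 0.02  # run again in 20 ms

bpy.app.timers.register(poll_tracker)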
OUTPUT:
RESULT:
Thus, VR enabled applications using motion trackers and sensors incorporating full haptic interactivity were developed successfully.
EXNO: 8
DATE:
Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
AIM:
To develop AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places.
PROCEDURE:
i. Step 1: Create an E-learning environment within Blender.
ii. Step 2: Design the 3D models and animation for the educational content.
iii. Step 3: Create a scene with educational content.
iv. Step 4: Write a Python script in Blender.
v. Step 5: In this script, a cube is created in the scene and text is added to it.
vi. Step 6: The camera and lighting are set up for rendering, and the scene is then rendered as a short MP4 animation.
PROGRAM:
import bpy

# Clear existing mesh objects
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# Create a cube to carry the educational content
bpy.ops.mesh.primitive_cube_add(size=2)

# Add a text label above the cube (the educational content from Step 5)
bpy.ops.object.text_add(location=(0, 0, 1.5))
bpy.context.object.data.body = "E-LEARNING"

# Light the scene with a sun lamp
bpy.ops.object.light_add(type='SUN', location=(0, 0, 5))

# Add a camera and animate it moving closer over 50 frames
bpy.ops.object.camera_add(location=(0, -5, 2), rotation=(1.0472, 0, 0.7854))
cam = bpy.context.object
cam.keyframe_insert(data_path="location", frame=1)
cam.location = (0, -2, 3)
cam.keyframe_insert(data_path="location", frame=50)

# Render the animation as an MP4 video
scene = bpy.context.scene
scene.camera = cam
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.frame_end = 50
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.filepath = "//short_camera_anim.mp4"
bpy.ops.render.render(animation=True)
OUTPUT:
RESULT:
Thus, AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places were developed successfully.
EXNO: 9
DATE:
Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
AIM:
To develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation using Blender.
PROCEDURE:
i. Step 1: Gather reference images or diagrams of the anatomy to be modelled.
ii. Step 2: Model the anatomical structures using Blender.
iii. Step 3: Use sculpting tools for organic shapes and details.
iv. Step 4: Apply textures to simulate skin, muscles, and other tissues.
v. Step 5: Rig muscles for realistic deformation during motion.
vi. Step 6: Animate your model to demonstrate movement or physiological processes.
vii. Step 7: Render your scene to produce the final images or animations.
viii. Step 8: Use Blender's compositing features for any necessary post-processing effects.
ix. Step 9: Document your process and include any relevant notes for educational purposes.
PROGRAM:
import bpy

# Clear existing mesh objects
bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# A UV sphere as the base primitive for an anatomical structure
bpy.ops.mesh.primitive_uv_sphere_add(radius=1, location=(0, 0, 0))
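To hint at the DNA/RNA structure visualization named in the aim, the base sphere can be repeated along two intertwined helices. This is a minimal sketch, an illustrative addition; the turn count and spacing are arbitrary choices:

import bpy
import math

# Place small spheres along two intertwined helices, approximating
# the two backbones of a DNA double helix
turns, steps = 3, 60
for i in range(steps):
    t = turns * 2 * math.pi * i / steps
    z = i * 0.15
    for phase in (0.0, math.pi):  # second strand half a turn behind
        x = math.cos(t + phase)
        y = math.sin(t + phase)
        bpy.ops.mesh.primitive_uv_sphere_add(radius=0.12, location=(x, y, z))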
OUTPUT:
RESULT:
Thus, AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation were developed successfully.
EXNO: 10
DATE:
Develop simple MR enabled gaming applications
AIM:
To develop simple MR enabled gaming applications using Blender.
PROCEDURE:
i. Step 1: Open Blender and clear the default mesh objects.
ii. Step 2: Add the game objects (a cube and a ground plane).
iii. Step 3: Apply simple materials to the objects.
iv. Step 4: Set the world background and add a camera.
v. Step 5: Render the scene using the Cycles engine.
PROGRAM:
import bpy

# Clear existing mesh objects
bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# The "player" cube sitting on a ground plane
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 1))
cube = bpy.context.object
bpy.ops.mesh.primitive_plane_add(size=10, location=(0, 0, 0))
plane = bpy.context.object

# Simple (non-node) materials: orange cube, grey ground
mat_cube = bpy.data.materials.new(name="CubeMat")
mat_cube.use_nodes = False
mat_cube.diffuse_color = (1, 0.5, 0.2, 1)
cube.data.materials.append(mat_cube)

mat_plane = bpy.data.materials.new(name="PlaneMat")
mat_plane.use_nodes = False
mat_plane.diffuse_color = (0.7, 0.7, 0.7, 1)
plane.data.materials.append(mat_plane)

# Dark grey world background
world = bpy.context.scene.world
world.use_nodes = True
bg = world.node_tree.nodes["Background"]
bg.inputs[0].default_value = (0.2, 0.2, 0.2, 1.0)

# Camera looking down at the scene
bpy.ops.object.camera_add(location=(5, -5, 3), rotation=(1, 0, 0.8))
bpy.context.scene.camera = bpy.context.object

# Render a still image with Cycles
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.samples = 200
bpy.ops.render.render(write_still=True)
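Game-like motion can be layered on top with keyframes. A minimal sketch, an illustrative addition in which the bounce height and frame count are arbitrary, that makes the cube bounce:

import bpy
import math

# Animate the cube created above bouncing on the Z axis over 50 frames
cube = bpy.context.scene.objects["Cube"]  # default name of the cube added earlier
for frame in range(1, 51):
    cube.location.z = 1 + abs(math.sin(frame * 0.3)) * 2
    cube.keyframe_insert(data_path="location", index=2, frame=frame)

bpy.context.scene.frame_end = 50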
OUTPUT:
RESULT:
Thus, the development of a simple MR enabled gaming application was completed successfully.