AR/VR LAB MANUAL

DHIRAJLAL GANDHI COLLEGE OF TECHNOLOGY
Salem Airport (Opp.) Salem–636309 Ph:(04290) 233333 www.dgct.ac.in
BONAFIDE CERTIFICATE

Name: …………………………………………………………
Degree: …………………………………………………………
Branch: …………………………………………………………
Semester: …….… Year: ……….… Section: ………..
Reg No: …………………………………………………………
Certified that this is the Bonafide record of the work done by the above
student in
…………………………………………………………………………………………………………………..
Laboratory during the academic year …………………………………

LAB-IN-CHARGE HEAD OF THE DEPARTMENT

Submitted for University Practical Examination held on……………………………………

INTERNAL EXAMINER                                        EXTERNAL EXAMINER

LAB MANNERS

• Students must be present in proper dress code and wear the ID card.
• Students should enter the log-in and log-out time in the log register without fail.
• Students are not allowed to download pictures, music, videos, or files without the permission of the respective lab in-charge.
• Students should wear their own lab coats and bring observation notebooks to the laboratory classes regularly.
• Records of experiments done in a particular class should be submitted in the next lab class.
• Students who do not submit the record notebook in time will not be allowed to do the next experiment and will not be given attendance for that laboratory class.
• Students will not be allowed to leave the laboratory until they complete the experiment.
• Students are advised to switch off the monitors and CPUs when they leave the lab.
• Students are advised to arrange the chairs properly when they leave the lab.

CCS333 AUGMENTED REALITY / VIRTUAL REALITY
OBJECTIVES:
• Develop simple AR and VR applications using development tools.
• Explore real-world applications of AR and VR across various domains.
• Study future trends, challenges, and ethical issues in AR/VR.
LIST OF EXPERIMENTS:
1. Study of tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender.
2. Use the primitive objects and apply various projection by handling camera.
3. Download objects from asset store and apply various lighting and effects.
4. Model three dimensional objects using various modelling techniques and
apply textures over them.
5. Create three dimensional realistic scenes and develop simple virtual reality
enabled mobile applications which have limited interactivity.
6. Add audio and text special effects to the developed application.
7. Develop VR enabled applications using motion trackers and sensors
incorporating full haptic interactivity.
8. Develop AR enabled applications with interactivity like E learning
environment, Virtual walkthroughs and visualization of historic places.
9. Develop AR enabled simple applications like human anatomy visualization,
DNA/RNA structure visualization and surgery simulation.
10. Develop simple MR enabled gaming applications.
COURSE OUTCOMES:
• CO1: Understand the basic concepts of AR and VR.
• CO2: Understand the tools and technologies related to AR/VR.
• CO3: Know the working principle of AR/VR related sensor devices.
• CO4: Design various models using modelling techniques.
• CO5: Develop AR/VR applications in different domains.

EX NO | DATE | NAME OF THE EXPERIMENT | PAGE NO | DATE OF COMPLETION | MARKS | STAFF SIGNATURE | REMARKS

1. Study of tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender.
2. Use the primitive objects and apply various projection by handling camera.
3. Download objects from asset store and apply various lighting and effects.
4. Model three dimensional objects using various modelling techniques and apply textures over them.
5. Create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity.
6. Add audio and text special effects to the developed application.
7. Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity.
8. Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places.
9. Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation.
10. Develop simple MR enabled gaming applications.

RECORD COMPLETION DATE :

AVERAGE MARK SCORED :                                INCHARGE :

EX NO: 1
DATE:
Study of tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender

AIM:
To study tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender.

PROCEDURE:
1)Unity:

i. Launch Unity Hub and create a new 3D project.

ii. Explore the Scene view, Hierarchy, Inspector, and Game view.

iii. Use the Asset Store to browse and import basic 3D assets.

2)Maya:

i. Open Maya and create a new project.

ii. Use primitive tools (cube, sphere, etc.) to model simple shapes.

iii. Explore object transformation (move, scale, rotate) and apply basic materials.

3)3DS MAX:

i. Launch 3DS MAX and open a new scene.

ii. Create objects using standard primitives.

iii. Apply modifiers from the modifier stack to alter shapes.

4)AR Toolkit:

i. Install AR Toolkit and launch a sample marker-based tracking project.

ii. Print a sample marker and test camera tracking by overlaying 3D content.

5)Vuforia:

i. Sign in to the Vuforia Developer Portal.

ii. Create a license key and target database.

iii. Integrate Vuforia with Unity and test image tracking using webcam or mobile.

6)Blender:

i. Open Blender and explore Object Mode and Edit Mode.

ii. Add a primitive (e.g., cube) and apply transformations.

iii. Navigate the 3D viewport using mouse and shortcut keys.
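
For the Blender part, the same exploration can also be scripted; a minimal sketch using Blender's bundled Python API (bpy), intended to be run from Blender's Python console or Text Editor:

import bpy

# A minimal sketch: add a primitive, transform it, and toggle between modes.
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))
cube = bpy.context.object
cube.rotation_euler = (0.0, 0.0, 0.785)     # rotate about 45 degrees around Z
cube.scale = (1.0, 1.0, 2.0)                # stretch along Z
bpy.ops.object.mode_set(mode='EDIT')        # switch to Edit Mode
bpy.ops.object.mode_set(mode='OBJECT')      # and back to Object Mode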

RESULT:
Thus, the study of tools like Unity, Maya, 3DS MAX, AR toolkit, Vuforia and Blender was completed successfully.

EX NO: 2
DATE:
Use the primitive objects and apply various projection by handling camera

AIM:

To use the primitive objects and apply various projection types by handling the camera.

ALGORITHM:
i. Step 1 : Create primitive objects
ii. Step 2 : Manipulate primitive objects
iii. Step 3 : Camera Handling
iv. Step 4 : Applying projection types
v. Step 5 : Stop the program

PROCEDURE:
a. Open Blender and start a new project.
b. To add a primitive object (cube, sphere, etc.), press Shift+A or click the "Add" menu at the top left of the 3D viewport.
c. Select the type of primitive you want to add (e.g., cube, sphere, cone).
d. To move an object, select it and press G, then move your mouse or type specific coordinates.
e. To scale an object, select it and press S, then move your mouse or type specific scale values.
f. To rotate an object, select it and press R, then move your mouse or type specific rotation angles.
g. Select the camera in the 3D viewport (right-click on it).
h. To move the camera, press G, then move your mouse or type specific coordinates.
i. To rotate the camera, press R, then move your mouse or type specific rotation angles.
j. To adjust the camera focal length, select the camera and open the "Object Data" tab.
k. In the camera settings (under the Object Data tab), adjust the "Lens" value for different perspective projections.
l. For orthographic projection, switch the camera type to "Orthographic" and set the orthographic scale value to control the size of objects, making them appear uniform in size regardless of distance. (A scripted equivalent of these steps is sketched below.)
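
A minimal sketch of the same camera handling using Blender's bpy API, assuming the scene already has an active camera (the lens and scale values are illustrative):

import bpy

# Add a primitive, then switch the scene camera between projection types.
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))

cam = bpy.context.scene.camera       # the scene's active camera
cam.data.type = 'PERSP'              # perspective projection
cam.data.lens = 35.0                 # focal length in millimetres

cam.data.type = 'ORTHO'              # orthographic projection
cam.data.ortho_scale = 6.0           # controls apparent object size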

OUTPUT:

RESULT:
Thus, using the primitive objects and applying various projection types by handling the camera was completed successfully.

EX NO: 3
DATE:
Download objects from asset store and apply various lighting and shading effects

AIM:
To download objects from the asset store and apply various lighting and shading effects using Blender.

ALGORITHM:
i. Step 1 : Download assets from the Unity Asset Store
ii. Step 2 : Export assets from Unity
iii. Step 3 : Import assets into Blender
iv. Step 4 : Apply Lighting and Shading Effects
v. Step 5 : Configure Rendering settings
vi. Step 6 : Render the scene
vii. Step 7 : Post process the scene
viii. Step 8 : Export the rendered image
ix. Step 9 : Stop the program

PROCEDURE:
a. Open Unity and navigate to the Unity Asset Store within the editor.
b. Search for and download the objects/assets you want to use.
c. After downloading, import the assets into your Unity project.
d. Select the assets in the Project tab.
e. Right-click and choose "Export Package". This creates a Unity package file (.unitypackage) that contains the assets.
f. Open Blender and ensure the necessary add-ons are enabled, such as an "import Unity package" add-on.
g. Go to File > Import and select the package you exported from Unity.
h. Choose the objects you want to import into Blender.
i. Once the object is in Blender, you can apply various lighting and shading effects.
j. Set up lighting by adding lights (lamps) to your scene to illuminate the object.
k. Adjust the rendering settings in Blender, including resolution, output format, and sampling settings, to achieve the desired quality.
l. Set up your camera and angle to frame your scene.
m. Click the "Render" button to render the scene.
n. You can further enhance the rendered image by applying post-processing effects using the Compositor in Blender.
o. After rendering is complete and you are satisfied with the result, export the image or animation to your desired format. (A scripted version of the lighting and render steps is sketched below.)
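
A minimal sketch of steps j through m with bpy, assuming an object is already in the scene; the light energy and output path are placeholder assumptions:

import bpy

# Add an area light and brighten it.
bpy.ops.object.light_add(type='AREA', location=(4, -4, 6))
bpy.context.object.data.energy = 500.0      # light power in watts (illustrative)

# Configure resolution/output and render a still image.
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.image_settings.file_format = 'PNG'
scene.render.filepath = "//lit_asset.png"   # placeholder path next to the .blend
bpy.ops.render.render(write_still=True)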

OUTPUT :

20
RESULT:
Thus, downloading objects from the asset store and applying various lighting and shading effects was completed successfully.

EX NO: 4
DATE:
Model three dimensional objects using various modelling techniques and apply textures over them

AIM:
To model three dimensional objects using various modelling techniques and apply textures over them.

ALGORITHM:

i. Step 1 : Create a new object


ii. Step 2 : Use modelling techniques for the new object
iii. Step 3 : Add texture to the object
iv. Step 4 : Use UV unwrapping to the object
v. Step 5 : Use texture painting to the object
vi. Step 6 : Use materials and shading to the object
vii. Step 7 : Render the final image
viii. Step 8 : Stop the program

PROCEDURE:
a. Launch Blender and delete the default cube.

b. Add a new mesh (e.g., a sphere, cube, or custom shape) using the "Add" menu or Shift+A.

c. Use tools like extrude, scale, rotate, and subdivide to shape your objects.

d. You can also sculpt, use modifiers, or create objects from curves or text.

e. To apply textures, switch to the "Shading" workspace.

f. Select your object in the 3D viewport.

g. In the Shader Editor, create or select a material for your object.

h. Add a texture node (e.g., Image Texture) and connect it to the material's shader nodes.

i. Load an image texture by clicking "Open" and selecting an image file.

j. For precise texture placement, UV unwrap your object.

k. Go to Edit Mode, select all vertices, and unwrap using the "UV" menu > "Unwrap".

l. In the Shader Editor, add a Texture Coordinate node and a Mapping node.

m. Connect the UV output of the Texture Coordinate node, through the Mapping node, to the Vector input of the Image Texture node.

n. Adjust the Mapping node settings (location, rotation, scale) to control the texture placement.

o. For more advanced textures, use the Texture Paint workspace to paint directly on your model.

p. Adjust material properties like roughness, specular, and normal maps to achieve the desired surface appearance.

q. Go to the "Render" properties to see how your object looks under different lighting conditions.

r. Render the final image using the "Render" button. (A scripted version of the unwrap and texture steps is sketched below.)
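
A minimal sketch of steps j through m with bpy, unwrapping the active object and wiring an Image Texture into its Principled BSDF; the texture path is a placeholder assumption, so that line is left commented out:

import bpy

# UV-unwrap the active object.
obj = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap()
bpy.ops.object.mode_set(mode='OBJECT')

# Build a node-based material and connect an image texture to Base Color.
mat = bpy.data.materials.new(name="TexturedMat")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
tex = nodes.new('ShaderNodeTexImage')
# tex.image = bpy.data.images.load("//textures/diffuse.png")  # placeholder path
links.new(tex.outputs['Color'], nodes["Principled BSDF"].inputs['Base Color'])
obj.data.materials.append(mat)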

OUTPUT:

TEXTURING :

TEXTURE PAINTING :

EXTRUDE :

UV UNWRAPPING:

RESULT:

Thus, modelling three dimensional objects using various modelling techniques and applying textures over them was completed successfully.

EX NO: 5
DATE:
Create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity

AIM:

To create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity.

ALGORITHM:
i. Step 1 : Model the scene
ii. Step 2 : Add texture to the scene
iii. Step 3 : Add Lighting and rendering to the scene
iv. Step 4 : Animate the scene
v. Step 5 : Export the Scene to VR
vi. Step 6 : Use Unity or Unreal for integration

PROGRAM:

import bpy

# Guard: this script is meant to be run from Blender's Text Editor.
if bpy.context.space_data is None or bpy.context.space_data.type != 'TEXT_EDITOR':
    raise Exception("This script is intended to be run in the Text Editor")

def vr_interaction(scene):
    controller = bpy.context.active_object
    if controller:
        # Custom properties on an object are tested with "in", not hasattr().
        if "button_pressed" in controller and controller["button_pressed"]:
            print("Button 0 is pressed!")
        position = controller.location
        print("Controller position:", position)

# Register the handler once so it runs before every frame change.
if vr_interaction not in bpy.app.handlers.frame_change_pre:
    bpy.app.handlers.frame_change_pre.append(vr_interaction)
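
Step 5 of the algorithm (exporting the scene to VR) is commonly done by exporting glTF and loading the file in a Unity or mobile VR project; a minimal sketch using Blender's bundled glTF exporter, where the output path is a placeholder assumption:

import bpy

# Export the whole scene as a single .glb (placeholder filename;
# use an absolute path if the "//" blend-relative prefix is not resolved).
bpy.ops.export_scene.gltf(filepath="//vr_scene.glb")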

OUTPUT:

RESULT:
Thus, three dimensional realistic scenes were created and a simple VR enabled application with limited interactivity was developed successfully.

EX NO: 6
DATE:
Add audio and text special effects to the developed application

AIM:
To add audio and text special effects to the developed application.

ALGORITHM:
i. Step 1 : Open Blender
ii. Step 2 : Switch to video editing layout
iii. Step 3 : Import Audio
iv. Step 4 : Add text special effect
v. Step 5 : Render the final result
vi. Step 6 : Stop the program

PROGRAM:
import bpy

# Start from an empty file, then add an extruded text object.
bpy.ops.wm.read_factory_settings(use_empty=True)
bpy.ops.object.text_add(location=(0, 0, 0))
txt = bpy.context.object
txt.data.body = "HELLO, WORLD"
txt.data.extrude = 0.1

# Give the text a red material.
mat = bpy.data.materials.new(name="TextMat")
mat.diffuse_color = (1, 0, 0, 1)
txt.data.materials.append(mat)

# Animate the text sliding from x = -5 to x = 5 over 50 frames.
txt.location.x = -5
txt.keyframe_insert(data_path="location", index=0, frame=1)
txt.location.x = 5
txt.keyframe_insert(data_path="location", index=0, frame=50)

# The empty startup file has no camera or light, so add both
# (positions are illustrative) and make the camera the render camera.
bpy.ops.object.camera_add(location=(0, -10, 2), rotation=(1.4, 0, 0))
bpy.context.scene.camera = bpy.context.object
bpy.ops.object.light_add(type='SUN', location=(0, -5, 5))

# Load the audio file into the sequencer (replace the path with your own).
snd = bpy.data.sounds.load("C:/path/to/audio.mp3", check_existing=True)
seq = bpy.context.scene.sequence_editor_create()
seq.sequences.new_sound("Audio", snd.filepath, channel=1, frame_start=1)

# Render the animation (with audio) to an MP4 next to the .blend file.
scene = bpy.context.scene
scene.frame_end = 50
scene.render.filepath = "//text_anim.mp4"
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.ffmpeg.audio_codec = 'AAC'  # needed so the sound strip is muxed in
bpy.ops.render.render(animation=True)

OUTPUT:

RESULT:
Thus, the audio and text special effects were added to the developed application successfully.

EX NO: 7
DATE:
Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity

AIM:
To develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity in Blender.

ALGORITHM:
i. Step 1 : Open Blender
ii. Step 2 : Setup VR Environment
iii. Step 3 : Use motion tracking and sensors in blender
iv. Step 4 : Implement haptic feedback through blender’s game engine
v. Step 5 : Develop interactive scenes by blender’s logic editor
vi. Step 6 : Create VR friendly UI elements for user interaction
vii. Step 7 : Test and deploy VR Application
viii. Step 8 : Stop the program

PROGRAM:
import bpy

# Stand-in for a real haptic/motion-tracker SDK binding.
class HapticDevice:
    def connect(self):
        print("Haptic device connected.")

    def get_position(self):
        # A real device would stream tracked coordinates here.
        return (1.0, 2.0, 3.0)

device = HapticDevice()
device.connect()
obj = bpy.context.active_object

def update_object_from_haptic(scene):
    # Drive the active object from the (simulated) tracker position.
    pos = device.get_position()
    obj.location = pos
    print(f"Object moved to: {pos}")

# Register the handler once so it runs before every frame change.
if update_object_from_haptic not in bpy.app.handlers.frame_change_pre:
    bpy.app.handlers.frame_change_pre.append(update_object_from_haptic)
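
To see the handler fire without a connected device, stepping the current frame from Blender's Python console is enough; a small usage sketch:

import bpy

# Changing the frame triggers the frame_change_pre handlers registered above.
bpy.context.scene.frame_set(2)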
OUTPUT:

RESULT:
Thus, the VR enabled application using motion trackers and sensors incorporating haptic interactivity was developed successfully.

EX NO: 8
DATE:
Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places

AIM:
To develop AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places.

PROCEDURE:
i. Step 1: Create an E-learning environment within Blender.
ii. Step 2: Use Blender, which is designed for 3D modeling and animation, to build the content for the environment.
iii. Step 3: Create a scene with educational content.
iv. Step 4: Write a Python script in Blender.
v. Step 5: In this script, a cube is created in the scene and a text caption is added to it.
vi. Step 6: The camera and lighting are set up for rendering, and the scene is rendered as a short MP4 animation.

PROGRAM:

import bpy

# Clear existing mesh objects, then build a simple walkthrough scene.
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()
bpy.ops.mesh.primitive_cube_add(size=2)

# Label the cube with a text object (the lesson caption).
bpy.ops.object.text_add(location=(-1.5, 0, 1.5))
bpy.context.object.data.body = "E-LEARNING DEMO"

# Light the scene and animate a camera fly-in over 50 frames.
bpy.ops.object.light_add(type='SUN', location=(5, -5, 8))
bpy.ops.object.camera_add(location=(0, -5, 2), rotation=(1.0472, 0, 0.7854))
cam = bpy.context.object
bpy.context.scene.camera = cam        # make the new camera the render camera
cam.keyframe_insert(data_path="location", frame=1)
cam.location = (0, -2, 3)
cam.keyframe_insert(data_path="location", frame=50)

scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.frame_end = 50
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.filepath = "//short_camera_anim.mp4"
bpy.ops.render.render(animation=True)
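
For AR-style delivery, the rendered content is usually composited over live camera footage of the real site; a small sketch using Blender's film transparency flag so the background stays clear:

import bpy

# Render with a transparent background so the 3D content can be
# overlaid on a live camera feed in the AR application.
bpy.context.scene.render.film_transparent = True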

OUTPUT:

RESULT:
Thus, the AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places were developed successfully.

EX NO: 9
DATE:
Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation

AIM:
To develop simple AR enabled applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation using Blender.

PROCEDURE:
i. Step 1: Gather anatomical reference images or diagrams.
ii. Step 2: Model the anatomical structures using Blender.
iii. Step 3: Use the sculpting tools for organic shapes and details.
iv. Step 4: Apply textures to simulate skin, muscles, and other tissues.
v. Step 5: Rig muscles for realistic deformation during motion.
vi. Step 6: Animate your model to demonstrate movement or physiological processes.
vii. Step 7: Render your scene to produce the final images or animations.
viii. Step 8: Use Blender’s compositing features for any necessary post-processing effects.
ix. Step 9: Document your process and include any relevant notes for educational purposes.

PROGRAM:
import bpy

# Clear existing meshes, then add a sphere (a placeholder anatomical
# structure), a sun light, and a camera.
bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()
bpy.ops.mesh.primitive_uv_sphere_add(radius=1, location=(0, 0, 0))
bpy.ops.object.light_add(type='SUN', radius=1, location=(5, 5, 5))
bpy.ops.object.camera_add(location=(0, -5, 2), rotation=(1.0472, 0, 0))
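
The same pattern extends to a DNA/RNA visualization; a hedged sketch that approximates a double helix by placing small spheres along two offset spirals (the counts and spacing are illustrative assumptions):

import bpy
import math

# Approximate a double helix: two strands of spheres, half a turn apart.
for i in range(40):
    angle = i * 0.4                      # twist per step (illustrative)
    z = i * 0.15                         # rise per step (illustrative)
    for phase in (0.0, math.pi):
        x = math.cos(angle + phase)
        y = math.sin(angle + phase)
        bpy.ops.mesh.primitive_uv_sphere_add(radius=0.12, location=(x, y, z))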

OUTPUT :

RESULT:
Thus, the simple AR enabled applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation were developed successfully.

EX NO: 10
DATE:
Develop simple MR enabled gaming applications

AIM:
To develop simple MR enabled gaming applications using Blender.

PROCEDURE:

i. Step 1: Clear the default scene of existing mesh objects.

ii. Step 2: Model the game objects: a cube as the player piece and a plane as the ground.
iii. Step 3: Create simple materials and assign them to the game objects.
iv. Step 4: Set up the world background lighting for the mixed reality backdrop.
v. Step 5: Add and position a camera to frame the play area.
vi. Step 6: Configure the render engine (Cycles) and its sampling settings.
vii. Step 7: Render the scene to verify the result.

PROGRAM:
import bpy

# Clear existing meshes, then build the game scene: a cube on a ground plane.
bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 1))
cube = bpy.context.object
bpy.ops.mesh.primitive_plane_add(size=10, location=(0, 0, 0))
plane = bpy.context.object

# Simple flat materials for the cube and the ground.
mat_cube = bpy.data.materials.new(name="CubeMat")
mat_cube.use_nodes = False
mat_cube.diffuse_color = (1, 0.5, 0.2, 1)
cube.data.materials.append(mat_cube)
mat_plane = bpy.data.materials.new(name="PlaneMat")
mat_plane.use_nodes = False
mat_plane.diffuse_color = (0.7, 0.7, 0.7, 1)
plane.data.materials.append(mat_plane)

# Dim grey world background standing in for the real-world backdrop.
world = bpy.context.scene.world
world.use_nodes = True
bg = world.node_tree.nodes["Background"]
bg.inputs[0].default_value = (0.2, 0.2, 0.2, 1.0)

# Frame the scene and render a still with Cycles.
bpy.ops.object.camera_add(location=(5, -5, 3), rotation=(1, 0, 0.8))
bpy.context.scene.camera = bpy.context.object
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.samples = 200
scene.render.filepath = "//mr_game.png"  # write the still next to the .blend file
bpy.ops.render.render(write_still=True)

OUTPUT:

RESULT:
Thus, developing a simple MR enabled gaming application was completed successfully.
