AR VR Lab Manual

Augmented Reality & Virtual Reality

CONTENTS

EXP. NO. | NAME OF THE EXPERIMENT | DATE | MARKS | STAFF SIGNATURE

1. Study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender
2. Use the primitive objects and apply various projection types by handling camera
3. Download objects from asset store and apply various lighting and shading effects
4. Model three-dimensional objects using various modelling techniques and apply textures over them
5. Create three-dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
6. Add audio and text special effects to the developed application
7. Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
8. Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
9. Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
10. Develop simple MR enabled gaming applications
EXNO: 1 Study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia
DATE: and Blender

AIM:

To study tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender.

PROCEDURE:

Blender offers a wide range of tools and features to help you create stunning 3D
projects.

1. VIEWPORT NAVIGATION:

Learn how to navigate in the 3D Viewport using tools like Pan (Shift + middle
mouse button), Zoom (scroll wheel) and Rotate (middle mouse button).

2. SELECT:

Contains tools for selecting objects.

SELECT MENU
All
Selects all selectable objects.
Reference
Mode: All modes
Menu: Select > All
Shortcut: A

3. DESELECT OBJECT:
Deselects all objects, but the active object stays the same.

Reference
Mode: All modes
Menu: Select > None
Shortcut: Alt + A

4. BOX SELECT:
Interactive box selection.

Reference
Mode: All modes
Menu: Select > Box Select
Shortcut: B

5. OBJECT:
Contains tools for operating on objects, such as duplicating them.
6. MOVE:
In Object Mode, the Move option lets you move objects. Translation means
changing the location of an object.
Reference
Mode: Object Mode, Edit Mode and Pose Mode
Menu: Object/Mesh/Curve/Surface > Transform > Move
Shortcut: G

7. ROTATE:
• Also known as a spin, twist, orbit or pivot.
• Changes the orientation of elements (vertices, edges, faces, objects).

Reference
Mode: Object and Edit Modes
Menu: Object/Mesh/Curve/Surface > Transform > Rotate
Shortcut: R

8. SCALE:
Scaling means changing the proportions of an object.

Reference
Mode: Object and Edit Modes
Menu: Object/Mesh/Curve/Surface > Transform > Scale
Shortcut: S
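The Move, Rotate and Scale tools above boil down to simple coordinate transforms. A minimal pure-Python sketch (independent of Blender, the function names are our own) of what each tool does to a single point:

```python
import math

def translate(p, d):
    # Move: add the offset to each coordinate (Blender's G key)
    return tuple(pi + di for pi, di in zip(p, d))

def rotate_z(p, angle_deg):
    # Rotate: spin the point about the Z axis (Blender's R key)
    a = math.radians(angle_deg)
    x, y, z = p
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def scale(p, s):
    # Scale: multiply every coordinate by the factor (Blender's S key)
    return tuple(pi * s for pi in p)

p = (1.0, 0.0, 0.0)
print(translate(p, (0, 2, 0)))   # (1.0, 2.0, 0.0)
print(scale(p, 2))               # (2.0, 0.0, 0.0)
```

In Blender itself the same operations are exposed as `bpy.ops.transform.translate`, `bpy.ops.transform.rotate` and `bpy.ops.transform.resize`, or by setting an object's `location`, `rotation_euler` and `scale` properties directly.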
9. TOOLBAR:
The Toolbar contains the list of tools.

OBJECT MODE:
Object Mode
EDIT MODE:
Mesh Edit Mode
Curve Edit Mode
Surface Edit Mode
Meta ball Edit Mode
PAINT & SCULPT MODES:
Sculpt Mode
Texture Paint Mode
Vertex Paint Mode
Weight Paint Mode
GREASE PENCIL:
Grease Pencil Edit
Grease Pencil Draw
Grease Pencil Scripting
Grease Pencil Weight Paint
OUTPUT
RESULT:
Thus, the study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender was
completed successfully.
EXNO: 2 Use the primitive objects and apply various projection types by
DATE: handling camera

AIM:

To use primitive objects and apply various projection types by handling the camera.

PROCEDURE:

STEP 1: CREATE PRIMITIVE OBJECTS

(i) Open Blender and start with a new project.

(ii) To add a primitive object like a cube, sphere, etc., press Shift + A (or)
click the 'Add' menu at the top-left of the 3D Viewport.
(iii) Select the type of primitive you want to add (e.g. cube, sphere,
cone).

STEP 2: MANIPULATE PRIMITIVE OBJECTS

(i) To move an object, select it and press G, then move your mouse (or)
type in specific coordinates.
(ii) To scale an object, select it and press S, then move your mouse (or)
type in specific scale values.
(iii) To rotate an object, select it and press R, then move your mouse (or)
type in a specific rotation angle.

STEP 3: CAMERA HANDLING

(i) Select the camera in the 3D Viewport (right-click on it).

(ii) To move the camera, press G, then move your mouse (or) type in
specific coordinates.
(iii) To rotate the camera, press R, then move your mouse (or) type in
specific rotation angles.
(iv) To adjust the camera focal length, select the camera and use the "Object
Data" tab.
STEP 4: APPLYING PROJECTION TYPES

(i) In the camera settings (under the "Object Data" tab for the camera), you
can switch between perspective and orthographic projection and adjust the
"Lens" value for different perspectives.

(ii) For orthographic projection, set the orthographic scale value to control the
size of objects, making them appear uniform in size regardless of distance.

Reference
Mode: All modes
Menu: View > View Perspective/Orthographic
Shortcut: Numpad 5
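The two projection types behave differently with depth: a perspective camera divides by distance, while an orthographic camera drops it. A small pure-Python sketch of the idea (the focal and scale values here are illustrative, not Blender's internals):

```python
def perspective_project(point, focal=2.0):
    # Pinhole camera at the origin looking down -Z:
    # distant points shrink because of the divide by depth.
    x, y, z = point
    return (focal * x / -z, focal * y / -z)

def orthographic_project(point, ortho_scale=1.0):
    # Orthographic camera: depth is simply dropped,
    # so apparent size does not change with distance.
    x, y, z = point
    return (x / ortho_scale, y / ortho_scale)

near = (1.0, 0.0, -2.0)   # 2 units in front of the camera
far  = (1.0, 0.0, -8.0)   # 8 units in front of the camera

print(perspective_project(near))   # (1.0, 0.0)  -> appears larger
print(perspective_project(far))    # (0.25, 0.0) -> appears smaller
print(orthographic_project(near))  # (1.0, 0.0)
print(orthographic_project(far))   # (1.0, 0.0)  -> same size
```

This is why the orthographic scale value, not distance, controls object size in orthographic views.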
OUTPUT
RESULT:
Thus the use of primitive objects and application of various projection types by handling
the camera was completed successfully.
EXNO: 3 Download objects from asset store and apply various lighting and
shading effects
DATE:

AIM:

To download objects from the asset store and apply various lighting and shading effects using
Blender.

PROCEDURE:

STEP 1: DOWNLOAD ASSETS FROM THE UNITY ASSET STORE

(i) Open Unity and navigate to the Unity Asset Store within the editor.
(ii) Search for and download the objects/assets you want to use.

STEP 2: EXPORT ASSETS FROM UNITY

(i) After downloading, import the assets into your Unity project.
(ii) Select the assets in the Project tab.
(iii) Right-click and choose "Export Package". This will create a Unity package
file (.unitypackage) that contains the assets.

STEP 3: IMPORT ASSETS INTO BLENDER

(i) Open Blender and ensure you have the necessary add-ons enabled, like an
"import-export Unity package" add-on.
(ii) Go to File > Import > Unity (.unitypackage) and select the package you
exported from Unity.
(iii) Choose the objects you want to import into Blender.

STEP 4: APPLY LIGHTING AND SHADING EFFECTS

(i) Once the objects are in Blender, you can apply various lighting and shading
effects.
(ii) Set up lighting by adding lamps (or other light sources) to your scene to
illuminate the objects.
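To see what a lighting setup actually computes at each surface point, here is a minimal diffuse (Lambert) shading sketch in pure Python; the function names are our own for illustration, not part of Blender's API:

```python
import math

def normalize(v):
    # Scale a vector to unit length
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def lambert_diffuse(normal, light_dir, intensity=1.0):
    # Diffuse (Lambert) shading: brightness is the cosine of the
    # angle between the surface normal and the light direction,
    # clamped so faces pointing away receive no light.
    n = normalize(normal)
    l = normalize(light_dir)
    dot = sum(a * b for a, b in zip(n, l))
    return intensity * max(0.0, dot)

print(lambert_diffuse((0, 0, 1), (0, 0, 1)))   # 1.0 (light directly overhead)
print(lambert_diffuse((0, 0, 1), (0, 0, -1)))  # 0.0 (light from below)
```

Blender's shading nodes combine this diffuse term with specular, roughness and other components, but the same normal-versus-light geometry underlies them.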

STEP 5: CONFIGURE RENDERING SETTINGS

(i) Adjust the rendering settings in blender, including resolution, output format
and sampling settings, to achieve desired quality.

STEP6 : RENDER THE SCENE


(i) Set up your camera and angle to frame your scene
(ii) Click the "Render" button to render the scene.

STEP 7: POST-PROCESSING


(i) You can further enhance your rendered image by applying post-processing
effects using the Compositor in Blender.

STEP 8: EXPORT THE RENDERED IMAGE (OR) ANIMATION

(i) After rendering is complete and you are satisfied with the result, export the
image (or) animation to your desired format.
OUTPUT
RESULT:

Thus downloading objects from the asset store and applying various lighting and shading
effects was completed successfully.
EXNO: 4 Model three dimensional objects using various modelling techniques and
DATE: apply textures over them

AIM:
To model three-dimensional objects using various modelling techniques and apply
textures over them.

PROCEDURE:

STEP 1: OBJECT CREATION

• Launch Blender and delete the default cube.

• Add a new mesh (e.g. a sphere, cube, or custom shape) using the "Add" menu (or)
Shift + A.

STEP 2: MODELLING TECHNIQUES

• Use various tools like extrude, scale, rotate and subdivide to shape your objects.

• You can also sculpt, use modifiers, or create objects from curves or text.

STEP 3: TEXTURING

• To apply textures, switch to the "Shading" workspace.

• Select your object in the 3D Viewport.

• In the Shader Editor, create (or) select a material for your object.

• Add a texture node (e.g. Image Texture) and connect it to the material's shader nodes.

• Load an image texture by clicking "Open" and selecting an image file.

STEP 4: UV UNWRAPPING

• For precise texture placement, UV unwrap your object.

• Go to Edit Mode, select all vertices and unwrap using the "UV" menu > "Unwrap".
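UV unwrapping assigns each vertex a 2D texture coordinate. For a sphere this can be sketched directly with the standard spherical mapping, a simplified stand-in for what an automatic sphere projection produces:

```python
import math

def spherical_uv(point):
    # Map a point on a unit sphere to (u, v) texture coordinates:
    # u comes from the angle around the vertical axis,
    # v from the height (latitude) of the point.
    x, y, z = point
    u = 0.5 + math.atan2(x, y) / (2 * math.pi)
    v = 0.5 + math.asin(z) / math.pi
    return (u, v)

print(spherical_uv((0.0, 1.0, 0.0)))  # equator, front -> (0.5, 0.5)
print(spherical_uv((0.0, 0.0, 1.0)))  # north pole  -> v = 1.0
```

Blender's interactive unwrap additionally relaxes these coordinates to reduce stretching, which is why marking seams matters on complex meshes.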

STEP 5: TEXTURE MAPPING

• In the Shader Editor, add a Texture Coordinate node and a Mapping node.

• Connect the UV output of the Texture Coordinate node to the Vector input of the
Image Texture node.

• Adjust the Mapping node settings (translation, rotation, scale) to control the texture
placement.
STEP 6: TEXTURE PAINTING

• For more advanced textures, you can use the Texture Paint workspace to paint directly
on your model.

STEP 7: MATERIALS AND SHADING

• Adjust material properties like roughness, specular and normal maps to achieve the
desired surface appearance.

STEP 8: RENDER

• Go to the "Render" tab to see how your object looks in different lighting conditions.

• Render the final image using the "Render" button.


OUTPUT

TEXTURING
TEXTURE PAINTING:

EXTRUDE:
UV UNWRAPPING:
RESULT:

Thus, modelling three-dimensional objects using various modelling techniques and applying
textures over them was completed successfully.
EXNO: 5 Create three dimensional realistic scenes and develop simple virtual
DATE: reality enabled mobile applications which have limited interactivity.

AIM:

To create three-dimensional realistic scenes and develop simple virtual reality enabled
mobile applications which have limited interactivity.

Procedure:
STEP 1: MODELLING:
• Use Blender to create 3D models of the objects, characters or
environments you want in your VR scene.
STEP 2: TEXTURING:
• Apply textures to your 3D models to make them look realistic.
STEP 3: LIGHTING AND RENDERING:
• Set up lighting in your scene.
• Use blender’s rendering capabilities to create high quality images or
video sequences.
STEP 4: ANIMATION:
• Use Blender's animation tools to bring your models to life, including
character animation and object movements.
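Keyframed animation is evaluated by interpolating between keys. A minimal linear-interpolation sketch in pure Python (Blender defaults to Bezier interpolation, so this is a simplification):

```python
def lerp_keyframes(frame, key_a, key_b):
    # Linear interpolation between two (frame, value) keyframes --
    # the simplest form of what an animation system evaluates
    # for every in-between frame.
    fa, va = key_a
    fb, vb = key_b
    t = (frame - fa) / (fb - fa)   # 0.0 at key_a, 1.0 at key_b
    return va + t * (vb - va)

# An object keyed at x = -5 on frame 1 and x = 5 on frame 50:
print(lerp_keyframes(1,  (1, -5.0), (50, 5.0)))   # -5.0
print(lerp_keyframes(50, (1, -5.0), (50, 5.0)))   # 5.0
```

In Blender the equivalent keys are created with `keyframe_insert(data_path="location", ...)` and the interpolation mode is edited in the Graph Editor.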
STEP 5: EXPORT FOR VR:
• Export your blender project to a format compatible with your chosen
VR platform.
STEP 6: UNITY OR UNREAL INTEGRATION:
• Import your blender assets into a game development platform like
unity.
• Set up VR controls and interactions using the platform scripting
capabilities.
Using Blender's Python API:
PROGRAM:
import bpy

# Check if running in Blender's Text Editor
if bpy.context.space_data is None or bpy.context.space_data.type != 'TEXT_EDITOR':
    raise Exception("This script is intended to be run in the Text Editor")

# Function to handle VR interaction logic
def vr_interaction(scene, depsgraph=None):
    # Get the active object (assuming it is the VR controller)
    controller = bpy.context.active_object
    if controller:
        # Example: check whether a controller button is pressed
        # (button state must come from your VR runtime; "button_pressed"
        # is a placeholder custom property, not part of bpy)
        if controller.get("button_pressed"):
            print("Button 0 is pressed!")
        # Example: get the position of the controller
        position = controller.location
        print("Controller position:", position)

# Register the function to be called before each frame update
bpy.app.handlers.frame_change_pre.append(vr_interaction)
OUTPUT:
RESULT:
Thus three-dimensional scenes were created and simple VR-enabled applications were developed successfully.
EXNO: 6
Add audio and text special effects to the developed application
DATE:

AIM:

To add audio and text special effects to the developed application.

PROCEDURE:
STEP 1: Open Blender.
STEP 2: Switch to the Video Editing layout.
STEP 3: Import the audio.
STEP 4: Add the text special effect.
STEP 5: Rendering
• Click 'Render Animation' to render the final result.
PROGRAM:
import bpy

# Clear existing data
bpy.ops.wm.read_factory_settings(use_empty=True)

# Create a text object
bpy.ops.object.text_add(enter_editmode=False, align='WORLD', location=(0, 0, 0))
text_object = bpy.context.object
text_object.data.body = "HELLO, WORLD"

# Add the audio as a sound strip in the sequencer (filepath is a placeholder)
bpy.context.scene.sequence_editor_create()
bpy.context.scene.sequence_editor.sequences.new_sound(
    name="Audio", filepath="path/to/audio/file.mp3", channel=1, frame_start=1)

# Create a material for the text
material = bpy.data.materials.new(name="TextMaterial")
text_object.data.materials.append(material)

# Set material properties (e.g. color)
material.diffuse_color = (1, 0, 0, 1)  # red

# Animate the text from left to right
text_object.location.x = -5  # initial position
text_object.keyframe_insert(data_path="location", index=0, frame=1)
text_object.location.x = 5   # final position
text_object.keyframe_insert(data_path="location", index=0, frame=50)

# Add a text effect (extrude gives the letters depth)
text_object.data.extrude = 0.1

# Render the animation
bpy.ops.render.render(animation=True)
OUTPUT:
RESULT:

Thus the audio and text special effects were added to the developed application
successfully.
EXNO: 7 Develop VR enabled applications using motion trackers and sensors
DATE: incorporating full haptic interactivity

AIM:
To develop VR enabled applications using motion trackers and sensors incorporating
full haptic interactivity using Unity.

PROCEDURE:

STEP 1: Open Blender.

STEP 2: Set up the VR environment.

STEP 3: Set up motion tracking and sensors in Blender.

STEP 4: Implement haptic feedback through Blender's game engine.

STEP 5: Develop interactive scenes using Blender's logic editor.

STEP 6: Create VR-friendly UI elements for user interaction.

STEP 7: Program the behaviour with Python.

STEP 8: Test the VR application.

STEP 9: Deploy on the VR platform.
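Raw motion-tracker readings are usually noisy, so positions are typically smoothed before they drive an object. A small pure-Python exponential-smoothing sketch (the sample data is made up for illustration):

```python
def smooth_positions(samples, alpha=0.5):
    # Exponential moving average over raw tracker samples --
    # a common way to reduce jitter before applying the
    # position to a scene object. Higher alpha tracks the
    # sensor more closely; lower alpha smooths more.
    smoothed = []
    current = samples[0]
    for s in samples:
        current = tuple(alpha * si + (1 - alpha) * ci
                        for si, ci in zip(s, current))
        smoothed.append(current)
    return smoothed

raw = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
print(smooth_positions(raw)[1])  # (0.5, 0.0, 0.0)
```

Each smoothed tuple could then be assigned to `obj.location` in a script like Program 1 below, instead of the raw sensor values.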

PROGRAM 1:

import bpy

# 'sensor_data' is a placeholder for your actual sensor readings
sensor_data = [0.1, 0.2, 0.3]

# Access the active object (assuming it is the object you want to animate)
obj = bpy.context.active_object

# Position the object based on the sensor data
obj.location = (sensor_data[0], sensor_data[1], sensor_data[2])
PROGRAM 2:

import bpy

# Hypothetical haptics library -- replace with your device's actual SDK
from obpython import HapticDevice

# Connect to the haptic device
device = HapticDevice()
device.connect()

# Access the active object
obj = bpy.context.active_object

# Main loop for haptic interaction
while True:
    # Get the haptic device position data
    position_data = device.get_position()

    # Update the object position from the haptic device data
    obj.location = (position_data[0], position_data[1], position_data[2])

    # Add more logic for interaction here, e.g. force feedback

    # A custom scene property can be used to stop the loop
    if bpy.context.scene.get("halt_haptic_interaction"):
        break
OUTPUT:
RESULT:

Thus the VR enabled applications using motion trackers and sensors incorporating haptic
interactivity were developed successfully.
EXNO: 8 Develop AR enabled applications with interactivity like E learning
DATE: environment, Virtual walkthroughs and visualization of historic places

AIM:

To develop AR enabled applications with interactivity like an e-learning environment,

virtual walkthroughs and visualization of historic places using Unity.

PROCEDURE:
STEP-1: Create an e-learning environment within Blender.
STEP-2: Use Blender's 3D modelling and animation tools to design the environment.
STEP-3: Create a scene with educational content.
STEP-4: Write a Python script in Blender.
STEP-5: In this script, a cube is created in the scene and a camera is added.
STEP-6: The camera and lighting are set up for rendering, and the scene is rendered
as an animation.
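Blender's Python API expects camera rotations in radians, which is why values like 1.0472 and 0.7854 (60 and 45 degrees) appear in `camera_add` calls. A small helper makes the conversion explicit:

```python
import math

def cam_rotation(rx_deg, ry_deg, rz_deg):
    # Convert a camera rotation given in degrees to the radians
    # that bpy.ops.object.camera_add(rotation=...) expects.
    return tuple(math.radians(d) for d in (rx_deg, ry_deg, rz_deg))

rot = cam_rotation(60, 0, 45)
print(rot)  # approximately (1.0472, 0.0, 0.7854)
```

In Blender the same call would then read `bpy.ops.object.camera_add(location=(0, -5, 2), rotation=cam_rotation(60, 0, 45))`.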

PROGRAM:

import bpy

# Set up the scene


bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# Add a cube as a placeholder object


bpy.ops.mesh.primitive_cube_add(size=2)

# Set camera location and rotation


bpy.ops.object.camera_add(location=(0, -5, 2), rotation=(1.0472, 0, 0.7854))
# Set the render resolution
bpy.context.scene.render.resolution_x = 1920
bpy.context.scene.render.resolution_y = 1080

# Create animation frames


bpy.context.scene.frame_start = 1
bpy.context.scene.frame_end = 250

# Define camera movement


bpy.ops.view3d.camera_to_view_selected()

# Create keyframes for camera movement


bpy.ops.anim.keyframe_insert_menu(type='Location')
bpy.ops.anim.keyframe_insert_menu(type='Rotation')

# Set up animation parameters


bpy.context.scene.render.image_settings.file_format = 'FFMPEG'
bpy.context.scene.render.ffmpeg.format = 'MPEG4'
bpy.context.scene.render.filepath = "//output_animation.mp4"

# Render animation
bpy.ops.render.render(animation=True)
OUTPUT:
RESULT:
Thus the AR enabled applications with interactivity like an e-learning environment,
virtual walkthroughs and visualization of historic places were developed successfully.
EXNO: 9 Develop AR enabled simple applications like human anatomy
DATE: visualization, DNA/RNA structure visualization and surgery simulation

AIM:

To develop AR enabled simple applications like human anatomy visualization,

DNA/RNA structure visualization and surgery simulation using Unity.

PROCEDURE:
STEP 1: REFERENCE GATHERING

● Collect anatomical reference images or diagrams to guide your modeling.


STEP 2: MODELING

● Use Blender’s modeling tools to create 3D representations of anatomical structures.

● Start with a basic mesh and gradually refine details.


STEP 3: SCULPTING

● Utilize Blender’s sculpting tools for organic shapes and fine details.

● Pay attention to proportions and anatomical accuracy.


STEP 4: TEXTURING

● Apply textures to simulate skin, muscles, and other tissues.

● UV unwrap your model for accurate texture placement.


STEP 5: RIGGING

● If animating, create an armature(skeleton) to enable movement.

● Rig muscles for realistic deformation during motion.


STEP 6: ANIMATION (OPTIONAL)

● Animate your model to demonstrate movement or physiological processes.


STEP 7: LIGHTING AND RENDERING

● Set up appropriate lighting to highlight features.

● Render your scene to produce the final images or animations.


STEP 8: POST-PROCESSING

● Use Blender’s compositing features for any necessary post-processing effects.


STEP 9: DOCUMENTATION

● Document your process and include any relevant notes for educational purposes.
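For DNA/RNA structure visualization, the residue positions can be generated parametrically before creating spheres for them in Blender. A pure-Python sketch using rough B-DNA proportions (about 10 residues per turn, 0.34 nm rise per residue; the numbers are illustrative only):

```python
import math

def helix_points(n, radius=1.0, rise=0.34, turn_deg=36.0, phase_deg=0.0):
    # Place n points along one strand of an idealized helix:
    # each residue advances turn_deg around the axis and rise along it.
    pts = []
    for i in range(n):
        a = math.radians(phase_deg + i * turn_deg)
        pts.append((radius * math.cos(a), radius * math.sin(a), i * rise))
    return pts

strand_a = helix_points(10)
strand_b = helix_points(10, phase_deg=180.0)  # the complementary strand
print(len(strand_a), strand_a[0])
```

Each returned point could then be passed to `bpy.ops.mesh.primitive_uv_sphere_add(location=...)` to build the double-helix model in the scene.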
PROGRAM:

import bpy

# Clear existing mesh objects


bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# Create a basic human figure (you may need to import a human model)
bpy.ops.mesh.primitive_uv_sphere_add(radius=1, location=(0, 0, 0))

# Customize and modify the figure as needed


# Example: bpy.ops.transform.resize(value=(1, 1, 2))

# Set up lighting for better visualization


bpy.ops.object.light_add(type='SUN', radius=1, location=(5, 5, 5))

# Set the camera location and orientation


bpy.ops.object.camera_add(location=(0, -5, 2), rotation=(1.0472, 0, 0))
OUTPUT:
RESULT:
Thus the AR enabled simple applications like human anatomy visualization, DNA/RNA
structure visualization and surgery simulation were developed successfully.
EXNO: 10
Develop simple MR enabled gaming applications
DATE:

AIM:
To develop simple MR enabled gaming applications using Unity.

PROCEDURE:

STEP 1: Open Blender and clear the existing mesh objects from the scene.

STEP 2: Create the game objects (e.g. a cube as the player object and a plane as the ground).

STEP 3: Create and assign materials to give each object a distinct colour.

STEP 4: Set up environment lighting for the scene.

STEP 5: Add and position a camera to frame the playing area.

STEP 6: Configure the render engine and sampling settings.

STEP 7: Render the scene and test the application on the MR device.
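A gaming application also needs some interaction logic, and the most basic building block is a collision test. A minimal axis-aligned bounding-box overlap check in pure Python (Blender objects expose comparable corner data through `object.bound_box`):

```python
def aabb_overlap(min_a, max_a, min_b, max_b):
    # Axis-aligned bounding-box test: two boxes overlap only if
    # their extents intersect on every axis simultaneously.
    return all(amin <= bmax and bmin <= amax
               for amin, amax, bmin, bmax in zip(min_a, max_a, min_b, max_b))

# Unit cube at the origin vs. a box shifted 0.5 along X: they overlap.
print(aabb_overlap((0, 0, 0), (1, 1, 1), (0.5, 0, 0), (1.5, 1, 1)))  # True
# A box far away on all axes: no overlap.
print(aabb_overlap((0, 0, 0), (1, 1, 1), (2, 2, 2), (3, 3, 3)))      # False
```

In a game loop this check would run each frame between the player object's box and each obstacle, triggering scoring or haptic events on contact.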

PROGRAM:

import bpy

# Clear existing mesh objects in the scene


bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# Create a cube
bpy.ops.mesh.primitive_cube_add(size=2, enter_editmode=False, align='WORLD',
location=(0, 0, 1))

# Create a plane
bpy.ops.mesh.primitive_plane_add(size=10, enter_editmode=False, align='WORLD',
location=(0, 0, 0))

# Create a material for the cube


material_cube = bpy.data.materials.new(name="MaterialCube")
material_cube.use_nodes = False # Disable node-based shading for simplicity
material_cube.diffuse_color = (1, 0.5, 0.2, 1) # RGB color values
bpy.context.object.data.materials.append(material_cube)

# Create a material for the plane


material_plane = bpy.data.materials.new(name="MaterialPlane")
material_plane.use_nodes = False
material_plane.diffuse_color = (0.7, 0.7, 0.7, 1)
bpy.context.object.data.materials.append(material_plane)
# Set up environment lighting
world = bpy.context.scene.world
world.use_nodes = True
bg_shader = world.node_tree.nodes["Background"]
bg_shader.inputs[0].default_value[:3] = (0.2, 0.2, 0.2) # RGB color values for background

# Set up camera
bpy.ops.object.camera_add(enter_editmode=False, align='WORLD', location=(5, -5, 3),
rotation=(1, 0, 0.8))
bpy.context.scene.camera = bpy.context.object

# Set render settings


bpy.context.scene.render.engine = 'CYCLES'
bpy.context.scene.cycles.samples = 200

# Render the scene


bpy.ops.render.render(write_still=True)
OUTPUT:
RESULT:

Thus the simple MR enabled gaming application was developed successfully.
