AR VR Lab Manual
EXP.NO.  NAME OF THE EXPERIMENT                                                                 DATE   MARKS   STAFF SIGNATURE
1.  Study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender
2.  Use the primitive objects and apply various projection types by handling camera
3.  Download objects from asset store and apply various lighting and shading effects
4.  Model three dimensional objects using various modelling techniques and apply textures over them
5.  Create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
6.  Add audio and text special effects to the developed application
7.  Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
8.  Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
9.  Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
10. Develop simple MR enabled gaming applications
EXNO: 1 Study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender
DATE:
AIM:
To study tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender.
PROCEDURE:
Blender offers a wide range of tools and features to help you create stunning 3D projects.
1. VIEWPOINT NAVIGATION:
2. SELECT:
SELECT MENU
All
Selects all selectable objects.
Reference
Mode: All modes
Menu: Select > All
Shortcut: A
3. DESELECT OBJECT:
Deselects all objects, but the active object stays the same.
Reference
Mode: All modes
Menu: Select > None
Shortcut: Alt + A
4. BOX SELECT:
Interactive box selection.
Reference
Mode: All modes
Menu: Select > Box Select
Shortcut: B
5. OBJECT:
Contains tools for operating on objects, such as duplicating them.
6. MOVE:
In Object Mode, the Move option lets you move objects. Translation means changing the location of an object.
Reference
Mode: Object Mode, Edit Mode and Pose Mode
Menu: Object/Mesh/Curve/Surface > Transform > Move
Shortcut: G
7. ROTATE:
• Rotation is also known as a spin, twist, orbit or pivot.
• It changes the orientation of elements (vertices, edges, faces, objects).
Reference
Mode: Object and Edit Modes
Menu: Object/Mesh/Curve/Surface > Transform > Rotate
Shortcut: R
8. SCALE:
Scaling means changing the proportions of an object.
Reference
Mode: Object and Edit Modes
Menu: Object/Mesh/Curve/Surface > Transform > Scale
Shortcut: S
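The Move, Rotate and Scale operations described above can also be driven from Blender's Python console. Below is a minimal sketch, assuming the default scene still contains an object named "Cube"; the object name and the transform values are illustrative assumptions.
import bpy

# Get the default cube (object name is an assumption; adjust to your scene)
cube = bpy.data.objects["Cube"]

# Move (G): change the object's location
cube.location = (1.0, 2.0, 0.5)

# Rotate (R): change the orientation (Euler angles, in radians)
cube.rotation_euler = (0.0, 0.0, 0.785)  # roughly 45 degrees around Z

# Scale (S): change the object's proportions
cube.scale = (2.0, 1.0, 1.0)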
9. TOOLBAR:
The Toolbar contains the list of tools for the current mode.
OBJECT MODE:
Object Mode
EDIT MODE:
Mesh Edit Mode
Curve Edit Mode
Surface Edit Mode
Metaball Edit Mode
PAINT MODES:
Sculpt Mode
Texture Paint Mode
Vertex Paint Mode
Weight Paint Mode
GREASE PENCIL:
Grease Pencil Edit
Grease Pencil Draw
Grease Pencil Sculpt
Grease Pencil Weight Paint
OUTPUT
RESULT:
Thus, the study of tools like Unity, Maya, 3DS MAX, AR Toolkit, Vuforia and Blender was completed successfully.
EXNO: 2 Use the primitive objects and apply various projection types by handling camera
DATE:
AIM:
To use primitive objects and apply various projection types by handling the camera.
PROCEDURE:
(i) To move an object, select it and press G, then move your mouse or type in specific coordinates.
(ii) To scale an object, select it and press S, then move your mouse or type in specific scale values.
(iii) To rotate an object, select it and press R, then move your mouse or type in specific rotation angles.
(iv) For orthographic projection, set the orthographic scale value to control the size of objects, making them appear uniform in size regardless of distance. A scripted sketch of both projection types follows the reference below.
Reference
Mode: All modes
Menu: View > View Perspective/Orthographic
Shortcut: Numpad 5
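The projection handling can also be scripted. Below is a minimal sketch assuming an otherwise empty scene; the object position, camera placement, focal length and orthographic scale are illustrative assumptions.
import bpy

# Add a primitive object and a camera (locations are illustrative)
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
bpy.ops.object.camera_add(location=(6, -6, 4), rotation=(1.1, 0, 0.8))
camera = bpy.context.object
bpy.context.scene.camera = camera

# Perspective projection: apparent size depends on distance from the camera
camera.data.type = 'PERSP'
camera.data.lens = 50  # focal length in millimetres

# Orthographic projection: apparent size is controlled by the orthographic scale
camera.data.type = 'ORTHO'
camera.data.ortho_scale = 7.0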
OUTPUT
RESULT:
Thus, primitive objects were used and various projection types were applied by handling the camera successfully.
EXNO: 3 Download objects from asset store and apply various lighting and shading effects
DATE:
AIM:
To download objects from the asset store and apply various lighting and shading effects using Blender.
PROCEDURE:
(i) Open Unity and navigate to the Unity Asset Store within the editor.
(ii) Search for and download the objects/assets you want to use.
(iii) After downloading, import the assets into your Unity project.
(iv) Select the assets in the Project tab.
(v) Right-click and choose "Export Package". This will create a Unity package file (.unitypackage) that contains the assets.
(vi) Open Blender and ensure you have the necessary add-ons enabled, such as an "Import-Export Unity Package" add-on.
(vii) Go to File > Import > Unity (.unitypackage) and select the package you exported from Unity.
(viii) Choose the objects you want to import into Blender.
(ix) Once the objects are in Blender, you can apply various lighting and shading effects (a scripted sketch follows this procedure).
(x) Set up lighting by adding lamps to your scene to illuminate the objects.
(xi) Adjust the rendering settings in Blender, including resolution, output format and sampling settings, to achieve the desired quality.
(xii) After rendering is complete and you are satisfied with the result, export the image or animation to your desired format.
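Below is a minimal scripted sketch of steps (ix) and (x): adding lights and a simple shaded material. The object name "ImportedAsset", the light energies and the colour values are illustrative assumptions, not values from the manual.
import bpy

# Add a point light and a sun light to illuminate the imported object
bpy.ops.object.light_add(type='POINT', location=(3, -3, 4))
bpy.context.object.data.energy = 500  # illustrative wattage

bpy.ops.object.light_add(type='SUN', location=(0, 0, 8))
bpy.context.object.data.energy = 3

# Apply a simple Principled BSDF material to an imported object
# ("ImportedAsset" is an assumed name; use the name shown in your Outliner)
obj = bpy.data.objects.get("ImportedAsset")
if obj is not None:
    mat = bpy.data.materials.new(name="ShadedMaterial")
    mat.use_nodes = True
    bsdf = mat.node_tree.nodes["Principled BSDF"]
    bsdf.inputs["Base Color"].default_value = (0.2, 0.5, 0.8, 1.0)
    bsdf.inputs["Roughness"].default_value = 0.3
    obj.data.materials.append(mat)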
OUTPUT
RESULT:
Thus, objects were downloaded from the asset store and various lighting and shading effects were applied successfully.
EXNO: 4 Model three dimensional objects using various modelling techniques and apply textures over them
DATE:
AIM:
To model three dimensional objects using various modelling techniques and apply textures over them.
PROCEDURE:
STEP 1: ADD A MESH
• Add a new mesh (e.g., a sphere, cube or custom shape) using the Add menu or Shift + A.
STEP 2: MODELLING
• Use various tools like extrude, scale, rotate and subdivide to shape your objects.
• You can also sculpt, use modifiers, or create objects from curves or text.
STEP 3: TEXTURING
• Add a texture node (e.g., an Image Texture) and connect it to the material's shader nodes.
STEP 4: UV UNWRAPPING
• Go to Edit Mode, select all vertices and unwrap using the UV menu > Unwrap.
STEP 5: TEXTURE MAPPING
• In the Shader Editor, add a Texture Coordinate node and a Mapping node.
• Connect the UV output of the Texture Coordinate node to the Vector input of the Image Texture node.
• Adjust the Mapping node settings (translation, rotation, scale) to control the texture placement.
STEP 6: TEXTURE PAINTING
• For more advanced textures, you can use the Texture Paint workspace to paint directly on your model.
STEP 7: MATERIAL PROPERTIES
• Adjust material properties like roughness, specular and normal maps to achieve the desired surface appearance.
STEP 8: RENDER
• Go to the Render tab to see how your object looks in different lighting conditions. A scripted sketch of the unwrap and texture-node setup follows this procedure.
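Below is a minimal sketch of steps 4 and 5 (UV unwrapping and the texture node setup), using a cube as a stand-in for your modelled object; the commented image path is an assumption.
import bpy

# Stand-in model: a cube (replace with your own mesh)
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))
obj = bpy.context.object

# UV unwrap in Edit Mode
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap()
bpy.ops.object.mode_set(mode='OBJECT')

# Build Texture Coordinate -> Mapping -> Image Texture -> Principled BSDF
mat = bpy.data.materials.new(name="TexturedMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex_coord = nodes.new(type='ShaderNodeTexCoord')
mapping = nodes.new(type='ShaderNodeMapping')
image_tex = nodes.new(type='ShaderNodeTexImage')
# image_tex.image = bpy.data.images.load("/path/to/texture.png")  # assumed path
bsdf = nodes["Principled BSDF"]

links.new(tex_coord.outputs["UV"], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], image_tex.inputs["Vector"])
links.new(image_tex.outputs["Color"], bsdf.inputs["Base Color"])

obj.data.materials.append(mat)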
OUTPUT
TEXTURING:
TEXTURE PAINTING:
EXTRUDE:
UV UNWRAPPING:
RESULT:
Thus, three dimensional objects were modelled using various modelling techniques and textures were applied over them successfully.
EXNO: 5 Create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
DATE:
AIM:
To create three dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity.
PROCEDURE:
STEP 1: MODELLING:
• Use Blender to create 3D models of the objects, characters or environments you want in your VR scene.
STEP 2: TEXTURING:
• Apply textures to your 3D models to make them look realistic.
STEP 3: LIGHTING AND RENDERING:
• Set up lighting in your scene.
• Use Blender's rendering capabilities to create high quality images or video sequences.
STEP 4: ANIMATION:
• Use Blender's animation tools to bring your models to life, including character animation and object movements.
STEP 5: EXPORT FOR VR:
• Export your Blender project to a format compatible with your chosen VR platform (see the export sketch after this list).
STEP 6: UNITY OR UNREAL INTEGRATION:
• Import your Blender assets into a game development platform like Unity.
• Set up VR controls and interactions using the platform's scripting capabilities.
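For STEP 5 (Export for VR), Blender's bundled exporters can be driven from Python. Below is a minimal sketch; the output paths are placeholder assumptions, and only one of the two exports is normally needed.
import bpy

# Export the scene to FBX for import into Unity (path is illustrative)
bpy.ops.export_scene.fbx(filepath="/path/to/vr_scene.fbx")

# Alternatively, export to glTF, which many VR runtimes support directly
bpy.ops.export_scene.gltf(filepath="/path/to/vr_scene.glb")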
Using Blender's Python API:
PROGRAM:
import bpy

# Check if running in Blender's Text Editor
if bpy.context.space_data is None or bpy.context.space_data.type != 'TEXT_EDITOR':
    raise Exception("This script is intended to be run in the Text Editor")

# Function to handle VR interaction logic
def vr_interaction(scene, depsgraph=None):
    # Get the active object (assuming it is the VR controller)
    controller = bpy.context.active_object
    if controller:
        # Example: check if a button is pressed (illustrative; depends on the VR add-on in use)
        if controller.data.buttons[0].is_pressed:
            print("Button 0 is pressed!")
        # Example: get the position of the controller
        position = controller.location
        print("Controller position:", position)

# Register the function to be called on each frame update
bpy.app.handlers.frame_change_pre.append(vr_interaction)
OUTPUT:
RESULT:
Thus, three dimensional realistic scenes were created and simple virtual reality enabled mobile applications with limited interactivity were developed successfully.
EXNO: 6 Add audio and text special effects to the developed application
DATE:
AIM:
To add audio and text special effects to the developed application using Blender.
PROCEDURE:
STEP 1: Open Blender
STEP 2: Switch to video editing layout
STEP 3: Import Audio
STEP 4: Add text special effect
STEP 5: Rendering
• Click 'Render Animation' to render the final result (a scripted sketch of steps 3 and 4 follows this procedure).
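Below is a minimal scripted sketch of steps 3 and 4 using the Video Sequence Editor. The audio path, strip names and frame ranges are assumptions, and the exact sequencer API may vary slightly between Blender versions.
import bpy

# Ensure the scene has a sequence editor (used by the Video Editing layout)
scene = bpy.context.scene
scene.sequence_editor_create()

# Add an audio strip; the file path is an assumed placeholder
scene.sequence_editor.sequences.new_sound(
    name="Soundtrack",
    filepath="/path/to/audio/file.mp3",
    channel=1,
    frame_start=1,
)

# Add a text strip as a simple text effect over the audio
text_strip = scene.sequence_editor.sequences.new_effect(
    name="Title",
    type='TEXT',
    channel=2,
    frame_start=1,
    frame_end=100,
)
text_strip.text = "HELLO, WORLD"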
PROGRAM:
import bpy

# Clear existing data
bpy.ops.wm.read_factory_settings(use_empty=True)

# Create a text object
bpy.ops.object.text_add(enter_editmode=False, align='WORLD', location=(0, 0, 0))
text_object = bpy.context.object
text_object.data.body = "HELLO, WORLD"

# Add audio (operator as in the original listing; the exact audio-bake operator may vary by Blender version)
bpy.ops.sound.bake(filepath="path/to/audio/file.mp3", length=100, bake_channel="NOISE")

# Create a material for the text
material = bpy.data.materials.new(name="Text material")
text_object.data.materials.append(material)

# Set material properties (e.g. colour)
material.diffuse_color = (1, 0, 0, 1)  # red colour

# Animate the text
text_object.location.x = -5  # initial position
text_object.keyframe_insert(data_path="location", index=0, frame=1)  # keyframe at frame 1
text_object.location.x = 5  # final position
text_object.keyframe_insert(data_path="location", index=0, frame=50)  # keyframe at frame 50

# Add a text effect (extrude)
text_object.data.extrude = 0.1

# Render the animation
bpy.ops.render.render(animation=True)
OUTPUT:
RESULT:
Thus, audio and text special effects were added to the developed application successfully.
EXNO: 7 Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
DATE:
AIM:
To develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity using Unity.
PROCEDURE:
PROGRAM 1:
import bpy

# Example motion-tracker/sensor reading (placeholder values)
sensor_data = [0.1, 0.2, 0.3]

# Access the active object (assuming it is the object you want to animate)
obj = bpy.context.active_object
obj.location = (sensor_data[0], sensor_data[1], sensor_data[2])
PROGRAM 2:
import bpy

# HapticDevice is a placeholder for your haptic/motion-tracking hardware's Python API
device = HapticDevice()
device.connect()

obj = bpy.context.active_object

while True:
    position_data = device.get_position()
    obj.location = (position_data[0], position_data[1], position_data[2])
    # halt_haptic_interaction is an assumed custom scene property used as a stop flag
    if bpy.context.scene.halt_haptic_interaction:
        break
OUTPUT:
RESULT:
Thus, VR enabled applications using motion trackers and sensors incorporating full haptic interactivity were developed successfully.
EXNO: 8 Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
DATE:
AIM:
To develop AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places.
PROCEDURE:
STEP-1: Create an E-learning environment within Blender.
STEP-2: Design the 3D modelling and animation content for the E-learning environment.
STEP-3: Create a scene with educational content.
STEP-4: Write a Python script in Blender.
STEP-5: In this script, a cube is created in the scene, and text is added to it.
STEP-6: The camera and lighting are set up for rendering, and the scene is then rendered as a PNG image.
PROGRAM:
import bpy
# Render animation
bpy.ops.render.render(animation=True)
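The listing above only triggers a render. A fuller sketch of steps 3 to 6 (cube, text, camera, lighting and a PNG render) might look like the following; the text content, object positions and output path are illustrative assumptions.
import bpy

# Create a cube to act as a simple display board
bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 0))

# Add educational text in front of the cube
bpy.ops.object.text_add(location=(-1.5, -1.2, 1.2))
text_obj = bpy.context.object
text_obj.data.body = "E-LEARNING DEMO"  # placeholder content

# Set up a camera
bpy.ops.object.camera_add(location=(5, -5, 3), rotation=(1.1, 0, 0.8))
bpy.context.scene.camera = bpy.context.object

# Set up a light
bpy.ops.object.light_add(type='SUN', location=(4, -4, 6))

# Render the scene as a PNG image
bpy.context.scene.render.image_settings.file_format = 'PNG'
bpy.context.scene.render.filepath = "//elearning_scene.png"
bpy.ops.render.render(write_still=True)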
OUTPUT:
RESULT:
Thus, AR enabled applications with interactivity like an E-learning environment, virtual walkthroughs and visualization of historic places were developed successfully.
EXNO: 9 Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
DATE:
AIM:
To develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation.
PROCEDURE:
STEP 1: REFERENCE GATHERING
● Utilize Blender's sculpting tools for organic shapes and fine details.
● Document your process and include any relevant notes for educational purposes (an illustrative DNA-helix sketch follows the program below).
PROGRAM:
import bpy
# Create a basic human figure (you may need to import a human model)
bpy.ops.mesh.primitive_uv_sphere_add(radius=1, location=(0, 0, 0))
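Building on the sphere primitive above, below is a minimal sketch that approximates a DNA double helix with small spheres; all counts, radii and spacing values are illustrative assumptions, not measurements from the manual.
import bpy
import math

# Approximate a DNA double helix with two strands of small spheres
turns = 4             # number of helix turns
points_per_turn = 10  # spheres per turn for each strand
radius = 1.0          # helix radius
pitch = 0.5           # vertical rise per point

for i in range(turns * points_per_turn):
    angle = 2 * math.pi * i / points_per_turn
    z = i * pitch
    # First strand
    bpy.ops.mesh.primitive_uv_sphere_add(
        radius=0.15,
        location=(radius * math.cos(angle), radius * math.sin(angle), z),
    )
    # Second strand, offset by half a turn
    bpy.ops.mesh.primitive_uv_sphere_add(
        radius=0.15,
        location=(radius * math.cos(angle + math.pi), radius * math.sin(angle + math.pi), z),
    )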
OUTPUT:
RESULT:
Thus, AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation were developed successfully.
EXNO: 10 Develop simple MR enabled gaming applications
DATE:
AIM:
To develop simple MR enabled gaming applications using Unity.
PROCEDURE:
● Use Blender's Python API to create a simple game scene (a cube on a plane with a basic material and camera) that can be brought into an MR-capable engine.
PROGRAM:
import bpy
# Create a cube
bpy.ops.mesh.primitive_cube_add(size=2, enter_editmode=False, align='WORLD', location=(0, 0, 1))
cube = bpy.context.object

# Create a plane
bpy.ops.mesh.primitive_plane_add(size=10, enter_editmode=False, align='WORLD', location=(0, 0, 0))

# Create a material for the cube
material_cube = bpy.data.materials.new(name="MaterialCube")
material_cube.use_nodes = False  # Disable node-based shading for simplicity
material_cube.diffuse_color = (1, 0.5, 0.2, 1)  # RGBA colour values
cube.data.materials.append(material_cube)

# Set up camera
bpy.ops.object.camera_add(enter_editmode=False, align='WORLD', location=(5, -5, 3), rotation=(1, 0, 0.8))
bpy.context.scene.camera = bpy.context.object
OUTPUT:
RESULT:
Thus, simple MR enabled gaming applications were developed successfully.