Design Document
Design Review
Team:
Aabhas Sharma (asharm36)
Michael Congdon (mcongdo2)
Siddharth Murali (smurali2)
TA:
Ankit Jain
1. Introduction:
● Statement:
There is already a great deal we can do with VR, but can we actually be in an entirely different place in a matter of seconds? We’re not quite there yet. The idea behind Project Periscope is to create a more engaging virtual reality experience.
Our plan, at its core, is quite simple: connect a camera to an Oculus Rift. We want to view the camera’s live feed on the Oculus display while physically steering the camera with the Oculus wearer’s head movements, giving the whole experience a more realistic feel. In essence, the camera will follow the wearer’s movements, allowing a user not just to see a live feed of someplace else, but also to freely navigate through the view with a broader field of vision, all in real time.
● Objective:
Consumer Benefits
● Physically navigate through their field of vision, simulating natural human vision
● Multiple applications:
○ Can be used as a remote surveillance system (with mobile capabilities as well)
○ Exploring the world: see places you cannot otherwise go
○ Can bridge telepresence with media events, such as football games and conventions or conferences like ComicCon, where users can explore the event with their own freedom and look wherever they find most interesting
○ Can be applied in the medical field, allowing doctors to see inside the human body like never before
● Product Features
● Wireless Streaming
2. Design:
Block Diagram
Figure 1: Top-level block diagram showing data and power connections between the various modules.
Figure 2: Software flow chart detailing process flow between the Camera Unit, Oculus Unit, and Swivel Unit.
Figure 3: Top-level schematic showing basic interconnections of the motor controller to the drivers and motors.
Figure 4: Detailed view of connections to the Raspberry Pi.
Figure 5: Detailed view of the motor driver circuit, including the peripheral components necessary to regulate voltage and boost switching capacity.
Figure 6: Buck converter providing 5 volts for the Raspberry Pi from the 12 volt battery.
Block Descriptions
Camera
The camera is the “eyes” of our project. It has its own WiFi hotspot, which the laptop connects to in order to stream the video. The video feed is passed from the camera to an image processing program, which stitches the two images together and then passes the result to the Oculus SDK.
Microcontroller 1
We will use this microcontroller to perform the image processing required to modify and stitch together the camera feed so that it aligns with the Oculus display’s expected format.
Oculus SDK
The Oculus SDK controls both what you see and which way you turn. The video feed is transmitted from the image processing software into the SDK, which then sends the feed to the Oculus display, showing you what the cameras see. The SDK also receives head movement data from the accelerometer/gyroscope and uses this data to update the view and to supply orientation data to the swivel unit.
Oculus Unit
This consists of three parts: display, power supply, and motion tracking. The display gets the video from the SDK and shows it to the user as a 360 degree view. These three components combined relay the headset orientation data to the SDK.
Swivel Unit
Power supply
Inputs: none
Components:
Buck converter
Steps the 12 volt battery down to the 5 volts required by the Raspberry Pi (see Figure 6).
Motor control
Inputs: orientation data from the Oculus SDK, encoder pulses from the motor encoders
Components:
Isolation circuitry
A logic level converter will protect the 3.3 volt logic of the Raspberry Pi, and 3.3 volt Zener diodes will be used as a last line of defense if one of the higher-voltage lines reaches a logic pin.
Motor drivers
L6203 DMOS full-bridge drivers receive PWM pulses from the Pi and drive the motors accordingly.
Motors and encoders
The motors each have 2-pin Hall-sensor quadrature encoders with a resolution of 64 counts per revolution of the motor shaft. The gearbox has a 70:1 gearing ratio, which gives a total of 4,480 counts per revolution of the output shaft.
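As a quick sanity check on these numbers, the angular resolution at the camera axis follows directly from the encoder and gearbox figures. The 64 counts/rev encoder value below is an assumption consistent with the 4,480 total (64 × 70), typical for this class of gearmotor:

```python
# Angular resolution of the swivel axis implied by the encoder and gearbox.
# Assumption: the 4,480 counts/rev figure comes from a 64 counts/rev
# quadrature encoder on the motor shaft behind the 70:1 gearbox.
COUNTS_PER_MOTOR_REV = 64
GEAR_RATIO = 70

counts_per_output_rev = COUNTS_PER_MOTOR_REV * GEAR_RATIO
degrees_per_count = 360.0 / counts_per_output_rev

print(counts_per_output_rev)        # 4480
print(round(degrees_per_count, 4))  # 0.0804 degrees per count
```

At roughly 0.08 degrees per encoder count, quantization of the measured angle is far finer than the camera-positioning tolerances discussed later, so it should not be the limiting factor in tracking accuracy.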
3. Requirements and Verifications
● Modular Functionality

Requirement: The SDK 3D position vectors (pitch, yaw, and roll) must correspond 1:1 with the user’s movements, within a 2 ms tolerance.
Verification: Record the SDK 3D position vectors (pitch, yaw, and roll) and confirm that they align exactly with the user’s movements.

Requirement: Camera unit … tolerance of ~3.65 mm.

Requirement: The image processor must barrel-distort the modified image with the exact formula 0.24r⁴ + 0.22r² + 1, to a tolerance of ~2 px. (20 points)
Verification: 1. Use test 2D images with available Oculus output. 2. Compare our processed output with the sample. 3. The output should match the sample image within the 2 px tolerance.

Requirement: The image processing must return the output image within 15 ± 5 ms.
Verification: When the program finishes, the timestamp at the end of the program must be within 15 ± 5 ms of the starting timestamp.

Verification (power supply): Measure the input and output voltage and current to confirm that the rating is met.

Requirement: The motors are able to produce the required torque. (10 points)
Verification: Attach a simulated load calculated to require 1.5 …

Requirement: The motors should be able to actuate the camera unit with less than 7% overshoot; at greater overshoot, the view will overcorrect and the video will oscillate. (20 points)
Verification: 1. Power on the system, with either the real camera unit or simulated loads attached to each motor. 2. Apply a step input (to simulate the worst-case movement). 3. Measure the encoder outputs and analyze the system response.

Total: 100 points
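The overshoot verification above can be scripted once encoder samples from a step response are logged. A minimal sketch (the step-response samples below are fabricated for illustration):

```python
# Compute percent overshoot from logged step-response angle samples and
# compare it against the <7% requirement from the table above.
def percent_overshoot(samples, target):
    """Percent by which the peak of the response exceeds the commanded target."""
    peak = max(samples)
    if peak <= target:
        return 0.0
    return 100.0 * (peak - target) / target

# Fabricated example: angles (degrees) sampled after a 100-degree step command.
step_response = [0, 40, 75, 96, 104, 101, 99, 100]
overshoot = percent_overshoot(step_response, target=100)

print(overshoot)        # 4.0
print(overshoot < 7.0)  # True: this run would meet the <7% requirement
```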
● Tolerance Analysis
A major challenge for us is mapping a 2D camera image feed onto the concave Oculus display. The Oculus offers the user a very immersive experience by presenting a virtual image that appears larger than it actually is. Although this is a key positive aspect that enables the complete VR experience, it could result in major health hazards if not done right. If the Oculus display is showing images that are distorted or have unexpected skips in motion, the user could become vulnerable to seizures or epileptic episodes.
Given this situation, it is important for us to be highly accurate in our image processing. We will run our camera feed through barrel distortion algorithms [2] that map our two-dimensional feed onto a concave screen using the following polynomial: 0.24r⁴ + 0.22r² + 1. In essence, for any pixel on the Oculus screen at distance r from the centre of the concave screen, the appropriate pixel to copy over from the 2D camera feed lies at radius r scaled by 0.24r⁴ + 0.22r² + 1. [2]
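The lookup described above can be sketched as follows. This is an illustrative implementation of the polynomial, not the actual Oculus SDK shader, and the function and parameter names are our own:

```python
# For a destination pixel at normalized radius r from the screen centre,
# the source position in the flat 2D feed is the same direction scaled by
# the polynomial 0.24*r^4 + 0.22*r^2 + 1.
def distortion_scale(r):
    r2 = r * r
    return 0.24 * r2 * r2 + 0.22 * r2 + 1.0

def source_coords(x, y, cx, cy):
    """Map a destination pixel (x, y) to the 2D-feed coordinates to copy from.

    (cx, cy) is the centre of the concave (half-)screen, in the same
    normalized units as r."""
    dx, dy = x - cx, y - cy
    r = (dx * dx + dy * dy) ** 0.5
    s = distortion_scale(r)
    return cx + dx * s, cy + dy * s

print(distortion_scale(0.0))            # 1.0: no distortion at the centre
print(round(distortion_scale(1.0), 2))  # 1.46: edge pixels pull from further out
```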
Our tolerance here has to be very small. As highlighted earlier, we cannot have a poorly distorted image. Therefore, we accept only a tight tolerance of ± 2 px offset.
The way we will ensure this tolerance is with a fundamental verification method used in most image comparison software: we will use sample Oculus images as the input for our image processor. We will then compare our output image to the Oculus version of the sample image. While comparing, we will not just check each 2D pixel against its corresponding barrel pixel location; we will check all barrel pixels within a 2-pixel radius of the expected barrel location. Our verification program will run through the entire output image and compare each pixel with its nearest neighbors (2 px) in the sample output. If every pixel has a match, we will consider this phase a success; otherwise we will revise our distortion step and repeat the test.
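A minimal sketch of this verification pass, assuming images are simple 2D arrays of pixel values (a real test harness would load the actual sample files):

```python
# Compare every pixel of our processed output against the sample Oculus image,
# accepting a match anywhere within a 2-pixel neighborhood of the expected
# location, per the tolerance above.
TOLERANCE_PX = 2

def pixel_matches(output, sample, x, y):
    """True if output pixel (x, y) equals some sample pixel within tolerance."""
    h, w = len(sample), len(sample[0])
    for dy in range(-TOLERANCE_PX, TOLERANCE_PX + 1):
        for dx in range(-TOLERANCE_PX, TOLERANCE_PX + 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and output[y][x] == sample[ny][nx]:
                return True
    return False

def images_match(output, sample):
    """The phase succeeds only if every output pixel finds a nearby match."""
    return all(
        pixel_matches(output, sample, x, y)
        for y in range(len(output))
        for x in range(len(output[0]))
    )
```

For example, an output identical to the sample but offset by up to 2 pixels still passes, while a 3-pixel offset of a distinctive pixel fails.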
4. Cost and Schedule:

Parts (microcontrollers, etc.)
Category         Part                  Unit Cost   Qty   Cost
Controllers      Raspberry Pi          $50.00      2     $100.00
Vision           Oculus Rift           $350.00     1     $350.00
Cameras          GoPro Hero3+ Silver   $300.00     2     $600.00
Motors           70:1 Metal Gearmotor  $40.00      3     $120.00
Battery          Sealed lead-acid      $20.00      1     $20.00
Battery charger  UPG D1724             $9.47       1     $9.47
Subtotal                                                 $1,199.47

Labor
Total                                                    $112,500.00

Section Totals
Parts   $1,246.91
Labor   $112,500.00
Total   $113,746.91
Schedule

21-Oct: Characterize system dynamics, write control logic for Simulink (Michael)
11-Nov: Final assembly of components (Michael)
2-Dec: Create final paper content for Swivel unit and ensure uniform formatting (Michael)
2-Dec: Create final paper content for Oculus unit and ensure uniform formatting (Siddharth)
2-Dec: Create final paper content for Camera unit and ensure uniform formatting (Aabhas)
5. Ethical Guidelines and Safety Requirements
a. Ethical Guidelines
We will adhere to the following guidelines from the IEEE Code of Ethics as they are relevant to our project:
1. To accept responsibility in making decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment;
We will apply high safety standards to ensure that our product is easy to use. We will include warnings that list the potential consequences of using our product for too long, along with safety guidelines to be followed so that the user has an enjoyable and safe experience.
2. To be honest and realistic in stating claims or estimates based on available data;
We will cite all sources used in formulating our claims, and we will mention the areas where we have chosen to make our own claims and give our reasoning behind them.
3. To maintain and improve our technical competence and to undertake technological tasks for others only if qualified by training or experience, or after full disclosure of pertinent limitations;
We will have taken all the required courses needed to complete this project, and we will disclose any pertinent limitations in our training or experience.
4. To seek, accept, and offer honest criticism of technical work, to acknowledge and correct errors, and to credit properly the contributions of others;
We are working closely with the TAs and professors in the Design Review and peer evaluations of our project, and we will work to incorporate their criticism and suggestions.
5. To treat fairly all persons and to not engage in acts of discrimination based on race, religion, gender, disability, age, national origin, sexual orientation, gender identity, or gender expression;
We are a team, and we will distribute work and give honest evaluations of team members without discriminating against anyone. If there is a problem, we will meet as a team to resolve it.
6. To avoid injuring others, their property, reputation, or employment by false or malicious action;
We do not wish to cause harm to any person or their property in any way. Our product is intended to broaden the scope of a particular area and is not intended to harm anyone. We will mark all dangers of our product with safety warnings where required.
7. To assist colleagues and co-workers in their professional development and to support them in following this code of ethics;
We will do our best to support other teams and projects in this class by offering help that is within our area of expertise. We will participate in peer design reviews in order to aid this process.
Safety Requirements
● There are a few common safety hazards associated with the use of an Oculus Rift. Before using this system, go through the safety documentation provided by Oculus.
● Pinching hazard: Our product has motors that rotate in response to user movement. If someone gets a finger caught, it could cause injury. Individuals need to be careful while working with the motors and the camera mount, as they may have pinch points. Also keep your hands away from the motors while the system is powered.
● Water hazard: Our product has many components that cannot be exposed to water. If they are, there is a chance of them shorting out and destroying the circuit or injuring the user.
Appendix: References
http://scholar.lib.vt.edu/theses/available/etd-08182005-222028/unrestricted/thesis.pdf
https://www.youtube.com/watch?v=B7qrgrrHry0&feature=youtu.be&t=11m26s
[2] Barrel distortion: http://stackoverflow.com/questions/28130618/what-ist-the-correct-oculus-rift-barrel-distortion-radius-function