
University Institute of Engineering

Department of Computer Science & Engineering

Experiment: 3.1

Student Name: Omika Gupta UID: 24BAI70535


Branch: Computer Science & Engineering    Section/Group: 406 A

Semester: 02

Subject Name: EMERGING & DISRUPTIVE TECHNOLOGIES WORKSHOP (IOT, AR-VR, ROBOTICS)

Subject Code: CONT_24ECP-103

1. Aim of the practical: Design a two-wheel line-following robot integrated with infrared sensors.

2. Tool Used: CoppeliaSim.

3. Basic Concept/ Command Description: A two-wheel line-following robot in CoppeliaSim uses infrared sensors to detect and follow a predefined path. Basic commands include:

sim.getObject(): retrieve handles to scene objects such as the wheel joints and sensors.

sim.readVisionSensor(): read the floor-facing IR/vision sensor values.

sim.readProximitySensor(): detect obstacles in front of the robot.

sim.setJointTargetVelocity(): control the wheel speed.

sysCall_actuation(): callback executed every simulation step to continuously adjust movement based on sensor input.
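To illustrate how these commands fit together, a minimal skeleton of a non-threaded child script is sketched below. This is only an illustration, not the experiment code; the object aliases (./leftMotor, ./rightMotor, ./leftSensor, ./rightSensor), the nominal speed and the 0.5 intensity threshold are assumptions chosen to resemble the BubbleRob scene used later.

-- Minimal line-following skeleton (sketch; aliases and values assumed)
function sysCall_init()
    leftMotor = sim.getObject('./leftMotor')     -- left wheel joint
    rightMotor = sim.getObject('./rightMotor')   -- right wheel joint
    leftSensor = sim.getObject('./leftSensor')   -- floor-facing vision sensor, left
    rightSensor = sim.getObject('./rightSensor') -- floor-facing vision sensor, right
    speed = 2                                    -- nominal wheel speed (rad/s)
end

function sysCall_actuation()
    -- A dark line under a sensor gives a low average image intensity (data[11])
    local _, lData = sim.readVisionSensor(leftSensor)
    local _, rData = sim.readVisionSensor(rightSensor)
    local leftOnLine = lData and (lData[11] < 0.5)
    local rightOnLine = rData and (rData[11] < 0.5)

    -- Slow the wheel on the side where the line is detected, steering back onto it
    local leftV, rightV = speed, speed
    if leftOnLine then leftV = 0.1 * speed end
    if rightOnLine then rightV = 0.1 * speed end
    sim.setJointTargetVelocity(leftMotor, leftV)
    sim.setJointTargetVelocity(rightMotor, rightV)
end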

4. Code:
CODE FOR BUBBLE ROB:

function sysCall_init()
    bubbleRobBase=sim.getObject('.')
    leftMotor=sim.getObject("./leftMotor")
    rightMotor=sim.getObject("./rightMotor")
    noseSensor=sim.getObject("./sensingNose")
    minMaxSpeed={50*math.pi/180,300*math.pi/180}
    backUntilTime=-1 -- Tells whether bubbleRob is in forward or backward mode
    floorSensorHandles={-1,-1,-1}
    floorSensorHandles[1]=sim.getObject("./leftSensor")
    floorSensorHandles[2]=sim.getObject("./middleSensor")
    floorSensorHandles[3]=sim.getObject("./rightSensor")
    robotTrace=sim.addDrawingObject(sim.drawing_linestrip+sim.drawing_cyclic,2,0,-1,200,{1,1,0},nil,nil,{1,1,0})

    -- Create the custom UI:
    xml = '<ui title="'..sim.getObjectAlias(bubbleRobBase,1)..' speed" closeable="false" resizeable="false" activate="false">'..[[
        <hslider minimum="0" maximum="100" on-change="speedChange_callback" id="1"/>
        <label text="" style="* {margin-left: 300px;}"/>
    </ui>
    ]]
    ui=simUI.create(xml)
    speed=(minMaxSpeed[1]+minMaxSpeed[2])*0.5
    simUI.setSliderValue(ui,1,100*(speed-minMaxSpeed[1])/(minMaxSpeed[2]-minMaxSpeed[1]))
end

function sysCall_sensing()
    local p=sim.getObjectPosition(bubbleRobBase,-1)
    sim.addDrawingObjectItem(robotTrace,p)
end

function speedChange_callback(ui,id,newVal)
    speed=minMaxSpeed[1]+(minMaxSpeed[2]-minMaxSpeed[1])*newVal/100
end

function sysCall_actuation()
    result=sim.readProximitySensor(noseSensor)
    if (result>0) then backUntilTime=sim.getSimulationTime()+4 end

    -- read the line detection sensors:
    sensorReading={false,false,false}
    for i=1,3,1 do
        result,data=sim.readVisionSensor(floorSensorHandles[i])
        if (result>=0) then
            sensorReading[i]=(data[11]<0.5) -- data[11] is the average intensity of the image
        end
    end

    -- compute left and right velocities to follow the detected line:
    rightV=speed
    leftV=speed
    if sensorReading[1] then
        leftV=0.03*speed
    end
    if sensorReading[3] then
        rightV=0.03*speed
    end
    if sensorReading[1] and sensorReading[3] then
        backUntilTime=sim.getSimulationTime()+2
    end

    if (backUntilTime<sim.getSimulationTime()) then
        -- When in forward mode, we simply move forward at the desired speed
        sim.setJointTargetVelocity(leftMotor,leftV)
        sim.setJointTargetVelocity(rightMotor,rightV)
    else
        -- When in backward mode, we simply backup in a curve at reduced speed
        sim.setJointTargetVelocity(leftMotor,-speed/2)
        sim.setJointTargetVelocity(rightMotor,-speed/8)
    end
end

function sysCall_cleanup()
    simUI.destroy(ui)
end

CODE FOR PATH:

path=require('path_customization')

function path.shaping(path,pathIsClosed,upVector)
    local section={-0.02,0.001,0.02,0.001}
    local color={0.3,0.3,0.3}
    local options=0
    if pathIsClosed then
        options=options|4
    end
    local shape=sim.generateShapeFromPath(path,section,options,upVector)
    sim.setShapeColor(shape,nil,sim.colorcomponent_ambient_diffuse,color)
    return shape
end

5. Observations, Simulation Screen Shots and Discussions:

Sensor Accuracy – The infrared sensors effectively detect line contrast but may struggle with reflective surfaces or dim lighting.

Control Precision – Fine-tuning motor speed improves smooth path tracking, preventing oscillations or overshooting curves.

Response Time – Faster sensor readings enhance real-time corrections, ensuring stable movement along the track.

Environmental Factors – External light sources or surface inconsistencies can impact sensor readings, affecting performance.

Algorithm Optimization – Implementing PID control or adaptive thresholding improves robot responsiveness and accuracy (a hedged proportional-control sketch is given below).
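As a possible refinement, not part of the tested code above, the on/off steering of the BubbleRob script could be replaced by a proportional-style correction computed from the outer floor sensors. The gain value and the helper function name below are illustrative assumptions; a full PID version would add integral and derivative terms.

-- Hypothetical sketch: proportional (P-only) line correction for the actuation step.
-- Assumes floorSensorHandles, leftMotor, rightMotor and speed are set up as in sysCall_init() above.
Kp = 0.6  -- illustrative proportional gain; would need tuning in the scene

function followLineProportional()
    -- Read the three floor sensors; a dark line gives a low average intensity (data[11])
    local reading = {0, 0, 0}
    for i = 1, 3 do
        local res, data = sim.readVisionSensor(floorSensorHandles[i])
        if res >= 0 and data[11] < 0.5 then reading[i] = 1 end
    end

    -- Signed error: line under the left sensor -> negative, under the right sensor -> positive
    local err = reading[3] - reading[1]

    -- Proportional steering: slow one wheel and speed up the other around the base speed
    local leftV  = speed * (1 + Kp * err)
    local rightV = speed * (1 - Kp * err)
    sim.setJointTargetVelocity(leftMotor, leftV)
    sim.setJointTargetVelocity(rightMotor, rightV)
end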

6. Result and Summary:

The two-wheel line-following robot successfully detects and follows a predefined path using
infrared sensors. Performance improves with optimized control algorithms, ensuring smooth
and accurate navigation.

7. Additional Creative Inputs (If Any):

Learning outcomes (What I have learnt):

1. Sensor Integration – Understanding how infrared sensors detect and respond to line contrast.

2. Robot Control – Gaining knowledge of motor speed adjustments for smooth navigation.

3. Algorithm Implementation – Applying logic like conditional statements and PID control for better path-following.

4. Simulation in CoppeliaSim – Learning to use commands for object handling, sensor reading, and movement control.

5. Problem-Solving – Identifying and addressing challenges like sensor errors, lighting effects, and path deviations (an adaptive-threshold sketch follows this list).
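As an additional creative input addressing the lighting effects mentioned above, an adaptive threshold could replace the fixed 0.5 intensity cutoff used for the floor sensors. The sketch below is an untested illustration that assumes the same floorSensorHandles as the BubbleRob script; the running min/max scheme is one simple way to track the darkest and brightest intensities seen.

-- Hypothetical sketch: adaptive threshold for the floor sensors instead of the fixed 0.5.
-- Keeps a running estimate of the darkest (line) and brightest (floor) intensities seen,
-- and thresholds halfway between them, which can help under uneven lighting.
minSeen, maxSeen = 1.0, 0.0

function readLineSensorsAdaptive()
    local onLine = {false, false, false}
    for i = 1, 3 do
        local res, data = sim.readVisionSensor(floorSensorHandles[i])
        if res >= 0 then
            local v = data[11]                   -- average image intensity
            minSeen = math.min(minSeen, v)       -- darkest value seen so far
            maxSeen = math.max(maxSeen, v)       -- brightest value seen so far
            local threshold = 0.5 * (minSeen + maxSeen)
            onLine[i] = (v < threshold)
        end
    end
    return onLine
end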

Evaluation Grid:

Sr. No.   Parameters                                     Marks Obtained   Maximum Marks
1.        Student Performance (Conduct of experiment)                     12
2.        Viva Voce                                                       10
3.        Submission of Work Sheet (Record)                               8
          Total Marks Obtained:                                           30

Signature of Faculty (with Date):
