
DOCUMENTATION

©2020 Jupiter Robot Technology Co., Ltd.


All rights reserved. Do not share externally without permission.

Technical Support: [email protected]
Jupiter Robot Software Module User Guide
V0.3.0

Jupiter Robot Technology Co., Ltd.

Email: [email protected]

Contents

INTRODUCTION
SPEECH RECOGNITION & SYNTHESIS
2.1. Speech Recognition
2.2. Control Gazebo for Jupiter Robot
ROBOT ARM CONTROL
3.1. Robot Arm Initial Position
3.2. Activate Robot Arm Debugger
3.3. Robot Arm Simulation
IMAGE PROCESSING
4.1. Object Tracking
4.2. Facial Detection
4.3. Facial Recognition
4.4. People Detection
4.5. Yolo Object Recognition
4.6. OpenPose Body Posture Recognition
4.7. Interactive-face Facial Recognition
4.8. MaskRCNN Area Segmentation
INDOOR NAVIGATION
5.1. Mapping
5.2. Rviz for Indoor Navigation
5.3. Coordinates to Locate Target Position
5.4. Edit Map Using Image Processing Software
MAPPING & NAVIGATION SIMULATION
6.1. Mapping Simulation
6.2. Navigation Simulation
6.3. Coordinates for Robot Positioning
REMOTE CONTROL
TeamViewer
F710 (GAMEPAD) CONFIGURATION


INTRODUCTION

This User Manual is intended for Jupiter Robot V0.3.0. The functional modules covered are Speech Recognition & Synthesis, Robot Arm Control, Image Processing, Indoor Navigation, and Mapping & Navigation Simulation.


SPEECH RECOGNITION & SYNTHESIS

2.1. Speech Recognition

Functional Description: Able to recognise robot movement instructions offline.

Step 1: Launch a new Terminal, activate the English keyword recognition module
roslaunch pocketsphinx kws.launch
dict:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.dic
kws:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.kwlist
Note: The above command is a single line; join its parts with spaces, not line breaks. Copy the command into the Terminal as follows:

Once executed:


Step 2: Launch a new Terminal to show the recognition result

rostopic echo /kws_data

The list of recognised words can be found in the voice_cmd.dic file under "catkin_ws/src/basic_function_packages/pocketsphinx/demo/", including "BACK, FORWARD, FULL, HALF, LEFT, MOVE, RIGHT, SPEED, STOP". Combined phrases such as "FULL SPEED" and "HALF SPEED" can be recognised too. Because recognition runs offline, the recognition rate is not high.
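Both files use the standard PocketSphinx formats; a minimal sketch of representative entries (the thresholds shown are illustrative, not necessarily the shipped values):

# voice_cmd.kwlist: one keyphrase per line with a detection threshold
FORWARD /1e-4/
FULL SPEED /1e-6/

# voice_cmd.dic: one word per line with its phonetic transcription
FORWARD F AO R W ER D
SPEED S P IY D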


2.2. Control Gazebo for Jupiter Robot

Functional Description: Able to move the Jupiter Robot in Gazebo using a few specific English voice commands.

Step 1: Launch a new Terminal, activate Jupiter Robot in Gazebo


roslaunch jupiterobot_gazebo jupiterobot_world.launch

There are four arguments for the robot under jupiterobot_world.launch:

"stacks": stack type, h = hexagon, c = circle; default is h
"lasers": laser radar, n = none, r = Rplidar-A1, h = Hokuyo; default is r
"arms": arm configuration, n = none, 5 = 5 degrees of freedom, 7 = 7 degrees of freedom; default is 5
"heads": upper camera configuration, n = none, 1 = half fixed, 2 = 2 degrees of freedom; default is 1
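For example, to start the simulation with a circle stack, a Hokuyo laser, a 7-degree-of-freedom arm and a 2-degree-of-freedom head, the arguments listed above can be combined on one line:

roslaunch jupiterobot_gazebo jupiterobot_world.launch stacks:=c lasers:=h arms:=7 heads:=2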

Note: There might be a short wait until the following image appears on the screen. If it fails to launch, try again several times.

Zoom in on the Jupiter Robot model as follows:


Step 2: Launch a new Terminal, activate the English keyword recognition module
roslaunch pocketsphinx kws.launch
dict:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.dic
kws:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.kwlist

Step 3: Launch a new Terminal, activate voice control command


rosrun pocketsphinx voice_control_example.py

Once activated:


Using the microphone, speak the 9 commands listed in section 2.1. The recognition results appear as follows:

The Jupiter Robot acts accordingly in Gazebo, though an action may not always match the spoken command exactly. As long as the robot moves, the voice control pipeline is working.
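Internally, such a voice control node simply maps recognised keywords to velocity commands. A minimal sketch of the idea in Python, assuming /kws_data carries std_msgs/String and the robot listens on /cmd_vel (the shipped voice_control_example.py differs in detail):

#!/usr/bin/env python
import rospy
from std_msgs.msg import String
from geometry_msgs.msg import Twist

class VoiceControlSketch:
    def __init__(self):
        rospy.init_node('voice_control_sketch')
        self.speed = 0.2  # linear speed in m/s (illustrative value)
        self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/kws_data', String, self.on_keyword)
        rospy.spin()

    def on_keyword(self, msg):
        word = msg.data.strip().upper()
        twist = Twist()  # all fields default to zero, i.e. stop
        if word == 'FORWARD':
            twist.linear.x = self.speed
        elif word == 'BACK':
            twist.linear.x = -self.speed
        elif word == 'LEFT':
            twist.angular.z = 0.5
        elif word == 'RIGHT':
            twist.angular.z = -0.5
        # 'STOP' and unrecognised words publish the zero Twist
        self.pub.publish(twist)

if __name__ == '__main__':
    VoiceControlSketch()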


ROBOT ARM CONTROL

3.1. Robot Arm Initial Position

To ensure safe operation of the robot arm, place the arm as shown below.
Note: Before positioning the arm manually, make sure that the arm power supply is switched off. Never manipulate the arm while the power is on, to avoid damage.

Check the servo positions at the base; the data cables should be tangle-free, as follows:


The position of the hand motors is as follows:


3.2. Activate Robot Arm Debugger

Functional Description: Able to control robot arm movement via commands.

Step 1: Launch a new Terminal, activate the robot arm launch file


roslaunch rchomeedu_arm arm.launch

Once activated, the results are as follows:

The 5 controllers (from bottom to top) are waist_controller, shoulder_controller, elbow_controller, wrist_controller and hand_controller, with corresponding motor IDs 1, 2, 3, 4 and 5.

Step 2: Launch a new Terminal, activate controller debugger

Use the following commands to rotate each motor slightly and change the position of the robot arm:
rostopic pub -1 /waist_controller/command std_msgs/Float64 -- 0.3

rostopic pub -1 /shoulder_controller/command std_msgs/Float64 -- 0.3

rostopic pub -1 /elbow_controller/command std_msgs/Float64 -- 0.3

rostopic pub -1 /wrist_controller/command std_msgs/Float64 -- 0.3


rostopic pub -1 /hand_controller/command std_msgs/Float64 -- 0.2

In the commands above, the trailing value (e.g. 0.3) is the controller's target position. When the motor reaches this position, it stops. Set the value to 0.0 to return the motor to its initial position; a negative value turns the motor in the opposite direction. Note that the speed of the robot arm is set low in the control parameters. Change the target position gradually to avoid the arm colliding with the robot itself, the screen, etc. When the movement is completed, the window appears as follows:
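The same motions can be scripted instead of typed. A minimal Python sketch, assuming the controller topics listed above (the target values follow the same convention as the commands in this section):

#!/usr/bin/env python
import rospy
from std_msgs.msg import Float64

rospy.init_node('arm_debug_sketch')
waist = rospy.Publisher('/waist_controller/command', Float64, queue_size=1)
rospy.sleep(1.0)             # allow the publisher to connect
waist.publish(Float64(0.3))  # move towards the target position
rospy.sleep(3.0)             # the arm moves slowly; wait for it to finish
waist.publish(Float64(0.0))  # return to the initial position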

The robot arm's position is as follows:


Note that if a large parameter is set, the motor may get stuck. A red warning appears on the Terminal of the activated robot arm as follows:


At the same time, the stuck motor flashes a red light as follows:

If this occurs, immediately press the emergency stop switch to cut off the power supply to the mobile platform and the robot arm. Manually move the arm back to a normal position, then switch the power back on and relaunch the robot arm Terminal.


3.3. Robot Arm Simulation

Functional Description: Able to control a simulated robot arm in Gazebo with MoveIt!
Note: The simulated robot arm has a different configuration from the one on the Jupiter Robot.
Step 1: Launch a new Terminal, activate robot arm simulation in Gazebo
roslaunch jupiterobot_arm_moveit_config demo_gazebo.launch

Once activated:


Two windows appear. One is the Gazebo simulation interface as follows:

The other is the MoveIt! interface in Rviz as follows:


Step 2: Select Goal State to plan robot arm movement

In Rviz, select "Planning" under the "MotionPlanning" panel. Under "Query", choose "Select Goal State". There are four options in the pull-down menu: "random valid" (any reachable random position), "random" (random position), "current" (current position) and "same as start" (the initial position), as well as three preset postures: "up", "rest" and "catch". Once a posture is selected, press the "Update" button to update the target posture.

Then press "Plan" under "Commands" and observe the planned path of the Jupiter Robot arm on the right. Once "Execute" is clicked and the view is switched to Gazebo, the simulated robot arm can be seen moving to the target position along the path planned by MoveIt!.

Step 3: Use the interactive marker to plan robot arm movement

Besides using a preset posture as the target for movement planning, the blue ball shown on the robot arm in Rviz can be dragged to indicate the target position, as follows:

In the picture above, the blue ball represents the end-effector position. The red arrow shows the X direction, the green arrow the Y direction and the blue arrow the Z direction. The red ring is roll, the green ring pitch and the blue ring yaw. Dragging the blue ball or the three coloured arrows with the mouse changes the position of the end effector; dragging the three rings changes its orientation. Once adjusted, click "Plan" and then "Execute" to plan and execute.
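The same planning can be scripted with moveit_commander. A minimal sketch, assuming the planning group is named "arm" (the actual group name in jupiterobot_arm_moveit_config may differ):

#!/usr/bin/env python
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('arm_moveit_sketch')
arm = moveit_commander.MoveGroupCommander('arm')  # group name is an assumption
arm.set_named_target('up')    # one of the preset postures: up, rest, catch
arm.go(wait=True)             # plan and execute in one call
arm.set_named_target('rest')
arm.go(wait=True)
moveit_commander.roscpp_shutdown()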


IMAGE PROCESSING

4.1. Object Tracking

Functional Description: Able to detect and track an object of a specific colour in an image.

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch
Once activated:

Step 2: Launch a new Terminal, activate object tracking command


roslaunch opencv_apps camshift.launch image:=/camera_top/rgb/image_raw
Once activated:


At the same time, two windows appear on the screen: one showing the image captured by the camera, and one showing a colour-distribution histogram of the selected region, as follows:


Choose a coloured object that contrasts clearly with the background. Use the mouse to select a rectangular region on the object. The rectangle should lie fully within the coloured object, as follows:


Once the selection is released, a red ellipse circles the object and follows its movement, as shown below (the red object was replaced with a yellow one for better contrast):


Note that the camera may wrongly recognise other objects or regions of similar colour once the coloured object leaves the camera's view. If the object cannot be tracked correctly when it returns to the view, reselect the rectangular region, or bring the object close to the wrongly recognised area so it can be detected and tracked again.


4.2. Facial Detection

Functional Description: Able to detect human faces.

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate face detection command


roslaunch opencv_apps face_detection.launch image:=/camera_top/rgb/image_raw

Once activated:

In the image window, a detected face is marked by an ellipse as follows:

Depending on facial expression, distance to the camera and eyelid position, the eyes may also be identified.


4.3. Facial Recognition

Functional Description: Able to recognise human faces based on learned samples.

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate face recognition command


roslaunch opencv_apps face_recognition.launch image:=/camera_top/rgb/image_raw

Once activated:

Two windows appear on screen: one showing the camera view, and a dialog window asking for a name, as follows:


Enter a name and press "Enter" to confirm. Keep the person in front of the camera with the face filling as much of the frame as possible, as follows:


After the first photograph, the dialog window asks whether further photographs should be taken. Select "y" and take 3 to 5 more photographs. Once done, choose "n" to finish.

Step 3: View the facial recognition result interface


The result interface is a small image window that is opened by default but hidden behind the photograph-taking window; move that window aside to see it. It shows the facial recognition result: a red rectangle surrounds the recognised face, with the name of the recognised person at the bottom left corner, as follows:


In the dialog window, photos for a second person can be taken once the first person is completed. When both are done, the result window displays the recognition of both persons at the same time, as follows:

Note that the algorithm saves the photograph samples in the hidden .ros folder under the "Home" folder. Use "Ctrl + H" to show hidden folders under "Home" and find .ros. Inside its "opencv_apps" folder there are individually named folders containing the photographs taken. For a completely fresh start, delete all named folders under "face_data" before taking new photos.
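For a fresh start from a Terminal, the saved samples can be removed directly (path as described above is an assumption about the default opencv_apps layout; double-check before deleting):

rm -rf ~/.ros/opencv_apps/face_data/*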


4.4. People Detection

Functional Description: Able to detect the number of people in an image.

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate people detection command


roslaunch opencv_apps people_detect.launch image:=/camera_top/rgb/image_raw
Once activated:

An image window appears at the same time. When a person appears fully in the image, a green rectangle encompasses the area surrounding the person, as follows:


If a person does not appear in full (e.g. the head or legs are cropped), the person may not be detected, as follows:


4.5. Yolo Object Recognition

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate Yolo command


roslaunch robot_vision_openvino yolo_ros.launch

Once activated:

The above window quickly switches to a result window, as follows:


"FPS" (frames per second) is the rate at which consecutive images (frames) appear on the display. "Object" lists the recognised object(s) and the confidence level.

The recognised object(s) are marked in the image window as follows:


4.6. OpenPose Body Posture Recognition

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate “OpenPose” command


roslaunch robot_vision_openvino openpose_ros.launch

Once activated:

The above window quickly switches to a result window, as follows:


The recognised body posture is marked in the image window as follows:


4.7. Interactive-face Facial Recognition

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate “Interactive-face” command


roslaunch robot_vision_openvino interactive_face_ros.launch

Once activated:

The above window quickly switches to a result window, as follows:


The recognised face is marked in the image window as follows:

In the picture above, the blue square marks the recognised face, and the text above it shows the analysed gender, age and facial expression. The yellow marks outline the facial frame and the positions of the eyes, mouth, nose, etc. The short red, green and blue lines represent the estimated facial direction.


4.8. MaskRCNN Area Segmentation

Step 1: Launch a new Terminal, activate camera launch command


roslaunch rchomeedu_vision multi_astra.launch

Step 2: Launch a new Terminal, activate “MaskRCNN” command


roslaunch robot_vision_openvino maskrcnn_ros.launch

Once activated:

The above window quickly switches to a result window, as follows:


The recognised segments are marked in the image window as follows:


INDOOR NAVIGATION

5.1. Mapping

Functional Description: Able to carry out indoor mapping.

Step 1: Launch a new Terminal, activate Jupiter Robot


roslaunch jupiterobot_bringup jupiterobot_bringup.launch

Once activated:

Sounds can be heard during start-up.

Step 2: Launch a new Terminal, activate navigation mapping command


Note: Only one sensor can be used at a time for mapping; select the command for either the 3D camera or the LIDAR.

Using 3D camera:
roslaunch jupiterobot_navigation gmapping_demo.launch


Using LIDAR:
roslaunch jupiterobot_navigation rplidar_gmapping_demo.launch

The result using 3D camera:

The result using LIDAR:

Step 3: Launch a new Terminal, activate Rviz environment
roslaunch turtlebot_rviz_launchers view_navigation.launch

As follows:

Using 3D camera:

Using LIDAR:


Step 4: Launch a new Terminal, activate Turtlebot teleoperation

Using keyboard:

roslaunch turtlebot_teleop keyboard_teleop.launch


As follows:

With:
"i": move forward
"u": forward anticlockwise arc (left wheel as circle centre)
"o": forward clockwise arc (right wheel as circle centre)
"k": stop immediately
"j": rotate anticlockwise in place (robot centre as circle centre)
"l": rotate clockwise in place (robot centre as circle centre)
",": move backward
"m": backward clockwise arc (left wheel as circle centre)
".": backward anticlockwise arc (right wheel as circle centre)

Using Gamepad:

roslaunch jupiterobot_teleop_move joy_move.launch


Operation guide:
The RB button (button no. 5) controls whether the robot executes commands from the gamepad.
The left joystick (axes no. 0, 1) controls the robot's normal-speed movement.
The right joystick (axes no. 2, 3) controls the robot's low-speed movement.
Holding the LB button (button no. 4) while using the left joystick increases the robot's movement speed.

Step 5: Use the keyboard to drive the robot slowly by remote control

Observe the Rviz interface as the map of the environment builds up gradually. The effect is as follows:

The Rviz view distinguishes: the unknown area; the position of the robot (the green line, with the thick end pointing towards the front of the robot); the area where mapping is ongoing; the area with obstacles detected; and the area without obstacles detected.


Note:
1) Use a low movement speed; the default speed is fine. If there are many corners or obstacles, point the robot at the target area and scan it directly several times. Do not reverse the robot during mapping, and do not stop it abruptly with the "k" key while it is moving quickly.

2) The detection height of the navigation camera is approximately 30-60mm; for the LIDAR it is about 30-40mm. Note that tables, chairs, etc. can have a hollow space up to 60mm, in which case the robot cannot detect the obstacle. Either place another obstacle manually at this height, or remember the position of the obstacle and mark it afterwards manually with the image processing software.

3) When the LIDAR is used, transparent glass, black objects and mirrors above the detection height should be avoided, as these objects can absorb or reflect the laser. If viable, frosted membrane film can be applied to produce diffuse reflection.

4) Try to map a closed region to ease path planning. If there is a large open region, path planning may fail. If a closed region cannot be formed, draw the missing borders manually afterwards with the image processing software.

5) Try to close the loop during mapping, i.e. the robot starts and ends at the same point. If the region to be covered is large, e.g. multiple rooms, try to close each region before entering the next. Once all regions are covered, drive the robot back to its origin point.

6) The mapping surface needs to be flat; raised or uneven areas can reduce mapping accuracy.

7) Keep the keyboard-teleoperation Terminal as the active window, otherwise keyboard commands cannot reach the robot. There is no such restriction when a gamepad is used.

8) Dragging with the left mouse button rotates the Rviz display area; dragging with the right button zooms; Shift + left drag translates the display area.

Step 6: Open a new Terminal, save the map


rosrun map_server map_saver -f /home/mustar/catkin_ws/maps/test1

As follows:


Above, "test1" is the (changeable) name of the map file. Maps are saved under "/home/mustar/catkin_ws/maps" in the ".pgm" and ".yaml" formats. ".pgm" is the image of the map, in which the overall map can be viewed; ".yaml" is the specification file recording map information such as the image path, resolution, origin pose, etc. If the map files are moved, change the first line of the ".yaml" (red box below) to the new absolute path of the ".pgm".

Note: only the first line needs to be changed. The other lines depend on the mapping run and may not coincide with the picture above.
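For reference, a typical ".yaml" written by map_server looks like the following (the values vary per map and are illustrative; only the image line needs editing after a move):

image: /home/mustar/catkin_ws/maps/test1.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196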


5.2. Rviz for Indoor Navigation

Functional Description: Able to navigate indoors using a built map.

Step 1: Open a new Terminal, activate Jupiter Robot


roslaunch jupiterobot_bringup jupiterobot_bringup.launch

Step 2: Open a new Terminal, import the built map


Note: Only one sensor can be used at a time for navigation; select the command for either the 3D camera or the LIDAR.

Using 3D camera:

roslaunch jupiterobot_navigation amcl_demo.launch map_file:=/home/mustar/catkin_ws/maps/test1.yaml

Using LIDAR:

roslaunch jupiterobot_navigation rplidar_amcl_demo.launch


map_file:=/home/mustar/catkin_ws/maps/test1.yaml

Note: The above command is a single line; join its parts with spaces, not line breaks. Copy the command into the Terminal as follows:

Using 3D Camera:

Using LIDAR:

Using 3D Camera, once activated:


Using LIDAR, once activated:

Step 3: Open a new Terminal, activate Rviz environment


roslaunch turtlebot_rviz_launchers view_navigation.launch

Using 3D camera, once activated:


Using LIDAR, once activated:

The black dot represents the robot. Its position is inaccurate at first, as the robot does not yet know its initial position. The green arrows surrounding it represent the possible positions and directions the robot thinks it could have.

Step 4: Indicate the robot's initial position in Rviz

Observe the robot's position in the real environment. Click the "2D Pose Estimate" button on the Rviz interface and click the corresponding point on the map, pointing the green arrow in the direction the robot is facing. The robot then repositions itself on the map as follows:
Using 3D Camera:

Using LIDAR:

Step 5: Give the robot a target position in Rviz

Observe the robot's position in the real environment. Click the "2D Nav Goal" button on the Rviz interface, click the desired target location on the map, and point the green arrow in the direction the robot should face at the target. Once the mouse is released, the robot plans its path and moves to the target position as follows:
Using 3D Camera:


Using LIDAR:

Using 3D Camera:


Using LIDAR:

Note: If the robot encounters obstacles while moving, it avoids them by itself and replans the path. If the target position is blocked by an obstacle, path planning is affected. If the robot gets stuck, or is about to collide with other objects, stop it by pressing the emergency stop button. After placing the robot in a safe location, select a safe target position before switching the robot on again. Once it is on, the robot rotates to determine its current position before resuming movement to the target.


5.3. Coordinates to Locate Target Position

Functional Description: Based on a built map, use coordinates to move the robot to a target position and back.

Step 1: Launch a new Terminal, activate Jupiter Robot


roslaunch jupiterobot_bringup jupiterobot_bringup.launch

Step 2: Launch a new Terminal, import built map


roslaunch jupiterobot_navigation amcl_demo.launch map_file:=/home/mustar/catkin_ws/maps/test1.yaml

Step 3: Launch a new Terminal, activate Rviz environment


roslaunch turtlebot_rviz_launchers view_navigation.launch

Step 4: Confirm coordinates of target position


Choose "Publish Point" on the Rviz interface. Move the mouse to the target position and observe the coordinates displayed at the bottom left corner, as follows:

In the coordinates, the first two values are the target X and Y coordinates.
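The clicked coordinates can also be read from a Terminal, since "Publish Point" publishes them on the standard Rviz topic:

rostopic echo /clicked_point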

Open the "navigation.py" file under "/home/mustar/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_navigation/scripts". Find lines 52 and 53. Change the number after "A_x=" to the X coordinate, e.g. "2.76" above, and the number after "A_y=" to the Y coordinate, e.g. "3.68" above, as shown below. Save the file.
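A sketch of how the two edited lines might look after the change (variable names as quoted above; the surrounding code is omitted):

A_x = 2.76
A_y = 3.68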


Step 5: Open a new Terminal, navigate robot to target position


roslaunch rchomeedu_navigation navigation.launch
Once activated:

Note: Observe the Terminal, click the "2D Pose Estimate" button on the Rviz interface, click the corresponding point on the map and point the green arrow in the direction the robot is facing. The robot then moves to the target coordinates and returns, as follows:


5.4. Edit Map Using Image Processing Software

Functional Description: Use image processing software to edit a map.

Step 1: Open the GNU image processing programme

Search for "GNU" in the applications menu and launch the software (GIMP, the GNU Image Manipulation Program).

Step 2: Open an existing map file

Go to "File" -> "Open" to open a map file as follows:

The image can be zoomed in or out with "+/-". Use the arrow keys to move the image.

Step 3: Choose pixels, edit the image

Using the selection tools on the left, select single or multiple contiguous pixels where editing is needed.


Use the "Bucket Fill" tool to fill the selected area with black.

Step 4: Save the image

The image cannot be saved in .pgm format directly. Instead, export the file: choose "File" -> "Export As" and save the edited file. When saving, choose the "Raw" format as follows:


MAPPING & NAVIGATION SIMULATION

6.1. Mapping Simulation

Functional Description: Use a simulated LIDAR sensor to drive the robot and build a map in simulation.

Step 1: Launch a new Terminal, load simulation


Use the command below to launch the simulation.

roslaunch jupiterobot_gazebo jupiterobot_world.launch


world_file:=/home/mustar/catkin_ws/worlds/Jupiter_Robot_Office.world

Note: The above command is a single line; join its parts with spaces, not line breaks.

Once activated:

The Gazebo simulation will be activated at the same time as following:


The LIDAR can be seen once the robot area is enlarged.

Step 2: Launch a new Terminal, activate mapping command


Use the command below to build a map with the simulated LIDAR.

roslaunch jupiterobot_gazebo rplidar_gazebo_gmapping_demo.launch


As follows:

Step 3: Launch a new Terminal, activate the Rviz environment

roslaunch turtlebot_rviz_launchers view_navigation.launch

As follows:


The right column can be hidden. Two topics on the left need to be changed: first, change "Local Map -> Costmap -> Topic" to "/map"; second, change "Global Map -> Costmap -> Topic" to "/map".

Step 4: Launch a new Terminal, activate teleoperation

Using keyboard:
roslaunch turtlebot_teleop keyboard_teleop.launch

As follows:

With:
"i": move forward
"u": forward anticlockwise arc (left wheel as circle centre)
"o": forward clockwise arc (right wheel as circle centre)
"k": stop immediately
"j": rotate anticlockwise in place (robot centre as circle centre)
"l": rotate clockwise in place (robot centre as circle centre)
",": move backward
"m": backward clockwise arc (left wheel as circle centre)
".": backward anticlockwise arc (right wheel as circle centre)

Step 5: Teleoperate the robot to build the map

Observe the map building up gradually in the Rviz interface, as follows:


The Rviz view shows: the unknown area; the area with obstacles detected; the area without obstacles; and the position of the robot (the green line, with the thick end pointing towards the front of the robot).

Keep both the Gazebo and Rviz windows visible at the same time to observe the robot operating in the simulation, as follows:

Keep the teleoperation window active and drive the robot around the environment until a map is built, as follows:


A few points to note for mapping:

1) The simulated robot must not collide with walls. Collisions cause the wheels to spin idly, which can produce large odometry errors and ruin the mapping.

2) The default robot movement speed is low. A higher-performing processor can support a higher robot speed, but the speed during mapping should not be too high. Increase the speed as appropriate; if the mapping errors become too large, lower the speed and restart.

3) Mapping should be completed in a single run; a finished map cannot be corrected by further mapping. However, image processing software can be used to edit the image file of the map.

4) The LIDAR covers a forward fan-shaped region of around 240 degrees with a range of about four meters.

5) Try to drive the robot back to the starting point of the mapping so that the whole mapping route forms a closed loop.

Step 6: Open a new Terminal, save the map


rosrun map_server map_saver -f /home/mustar/catkin_ws/maps/test2

As follows:


Above, "test2" is the (changeable) name of the map file. Maps are saved under "/home/mustar/catkin_ws/maps" in the ".pgm" and ".yaml" formats. ".pgm" is the image of the map, in which the overall map can be viewed; ".yaml" is the specification file recording map information such as the image path, resolution, origin pose, etc. If the map files are moved, change the first line of the ".yaml" (red box below) to the new absolute path of the ".pgm".

Note: only the first line needs to be changed. The other lines depend on the mapping run and may not coincide with the picture above.


6.2. Navigation Simulation

Functional Description: Able to navigate indoors in simulation using a built map.

Step 1: Open a new Terminal, activate Simulation


roslaunch jupiterobot_gazebo jupiterobot_world.launch
world_file:=/home/mustar/catkin_ws/worlds/Jupiter_Robot_Office.world

Note: The above command is a single line; join its parts with spaces, not line breaks.

Step 2: Open a new Terminal, import the built map


roslaunch jupiterobot_gazebo amcl_demo.launch
map_file:=/home/mustar/catkin_ws/maps/JupiterOfficeSim.yaml

Note: The above command is a single line; join its parts with spaces, not line breaks. Copy the command into the Terminal as follows:

Once activated:


Step 3: Open a new Terminal, activate Rviz environment


roslaunch turtlebot_rviz_launchers view_navigation.launch

Once activated:

The black dot represents the robot's position. As the robot does not know its initial position, its location on the map is inaccurate at first. The surrounding cluster of green arrows represents the possible positions and directions the robot thinks it could have.

Step 4: Indicate the initial position of the robot in Rviz

Observe the position of the robot in the simulation. Press the "2D Pose Estimate" button on the Rviz interface and click the corresponding location on the map, pointing the green arrow in the direction the robot is facing. The robot then readjusts its location on the map as follows:

Step 5: Indicate the target position of the robot in Rviz

During mapping, the origin of the map is the position where the robot was first switched on. Hence, in the simulation environment, the "2D Pose Estimate" step can be skipped. On a real robotic platform, however, the robot's starting position does not always correspond to the origin of the map, so "2D Pose Estimate" is needed to indicate the starting point of robot navigation.
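On a real platform the initial pose can also be published from a Terminal instead of clicking in Rviz; a sketch using the standard amcl topic (the pose values are illustrative):

rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped '{header: {frame_id: "map"}, pose: {pose: {position: {x: 0.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}}'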

Observe the simulated robot's position in the simulation environment. Click "2D Nav Goal" in Rviz, click the intended spot on the map, and point the green arrow in the direction the robot should face. Once the mouse is released, the robot plans its path automatically and moves to the target position as follows:


The green line in the picture above is the path planned by the robot. The purple areas on the room walls are obstacle locations detected by the robot from its current position. The robot keeps a certain margin from obstacles. If it encounters obstacles along its path, or gaps comparable to its own diameter, it recognises them and replans the path.

The robot moving in the simulation environment:


Once the robot is moving, there is no dedicated function to stop it. To stop the movement, indicate a target position near the robot's current position; a new target position can be given at any time while the robot is moving.
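A goal near the robot's current position can likewise be sent from a Terminal; a sketch using the standard move_base topic (the coordinates are illustrative):

rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.8, y: -3.26, z: 0.0}, orientation: {w: 1.0}}}'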

6.3. Coordinates for Robot Positioning

Functional Description: Use coordinates to send the robot to a target position and have it return to its origin position.

Step 1: Open a new terminal, activate Simulation


roslaunch jupiterobot_gazebo jupiterobot_world.launch
world_file:=/home/mustar/catkin_ws/worlds/Jupiter_Robot_Office.world

Step 2: Open a new Terminal, import the built map


roslaunch jupiterobot_gazebo amcl_demo.launch
map_file:=/home/mustar/catkin_ws/maps/JupiterOfficeSim.yaml
Note: The above command is a single line; join its parts with spaces, not line breaks.

Step 3: Open a new Terminal, activate Rviz environment


roslaunch turtlebot_rviz_launchers view_navigation.launch

Step 4: Verify the target position coordinates

At the Rviz interface, select the "Publish Point" button. Move the mouse to the desired target position for the robot and observe the coordinates shown at the bottom left, as follows:

In the coordinates, the first two values are the target X and Y coordinates.

Open the "navigation.py" file under "/home/mustar/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_navigation/scripts". Find lines 52 and 53. Change the number after "A_x=" to the X coordinate, e.g. "1.8" above, and the number after "A_y=" to the Y coordinate, e.g. "-3.26" above, as shown below. Save the file.


Step 5: Open a new terminal, navigate the robot to target position


roslaunch rchomeedu_navigation navigation.launch

Once activated:


Observe the control Terminal, click the "2D Pose Estimate" button on the Rviz interface, click the corresponding spot on the map and point the green arrow in the direction the robot is facing. The robot then moves to the entered coordinates before returning to its origin point:


REMOTE CONTROL

TeamViewer

Functional Description: Via the TeamViewer software, the robot's computer can be connected to any other computer, running any operating system, for remote control.

Link for download: https://www.teamviewer.com/en/features/latest-version/#gref

The software is preinstalled on the robot's computer and launches automatically when the computer starts. On any computer with a wireless network adapter, search for the WiFi signal whose SSID is "mustarxxxx" and enter the password to connect (the password is the same as the SSID); the digits after "mustar" are the last 4 digits of the robot's unit number.

Once TeamViewer is launched:

Under "Extras", choose "Options"; under "General", change the "Incoming LAN connections" setting to "accept" as follows:


Back on the main page, key in "10.42.0.1" under "Partner ID" and click "Connect". Use the password "123456" to connect.


F710 (GAMEPAD) CONFIGURATION


Step 1: Test the port for gamepad
Launch a new Terminal, run the following command and note the gamepad port number.
ls /dev/input/

The result is similar to the following:

Remove the gamepad's wireless receiver module and run the command again.

ls /dev/input/

Note the difference compared to the previous result.


Comparing the two results, it is apparent that "js1" is the current port number of the gamepad.
Note: Plugging the wireless receiver into a different USB port can change the "js" number, so keep it in the same USB port if possible.

Open the "joy_move.launch" file under "/home/mustar/catkin_ws/src/jupiterobot/jupiterobot_teleop/jupiterobot_teleop_move/launch" and change the number following "js" (inside the red rectangle below) to the port number found in the gamepad port test above.
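A sketch of what the relevant part of joy_move.launch might look like ("dev" is the standard parameter of the joy node; the surrounding file layout may differ):

<node pkg="joy" type="joy_node" name="joy_node">
  <param name="dev" value="/dev/input/js1" />
</node>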


Step 2: Test gamepad working condition


Launch a new Terminal and execute the following command (using the port number "js1" as an example).
sudo jstest /dev/input/js1

If the gamepad is working correctly, the following test result is obtained:


Axes 0-5 are the joystick outputs, while buttons 0-11 are the button outputs.

The gamepad is a Logitech F710 wireless gamepad ("Logitech Cordless RumblePad 2"), operating in DirectInput mode (the "D" switch position). Refer to the pictures below for the button layout (red numbers indicate joystick axis outputs, green numbers indicate button outputs).


[Gamepad diagrams: red numbers mark the joystick axes (0-5); green numbers mark the buttons (0-11).]

Note: Pressing the "MODE" button turns on the green indicator beside it; the left joystick and the left D-pad then exchange their functions.

