Jupiter Robot Software Module User Guide (V0.3.0) - 20210809
Jupiter Robot Technology Co., Ltd.
Technical Support: [email protected]
Contents
INTRODUCTION
SPEECH RECOGNITION & SYNTHESIS
2.1. Speech Recognition
2.2. Control Gazebo for Jupiter Robot
ROBOT ARM CONTROL
3.1. Robot Arm Initial Position
3.2. Activate Robot Arm Debugger
3.3. Robot Arm Simulation
IMAGE PROCESSING
4.1. Object Tracking
4.2. Facial Detection
4.3. Facial Recognition
4.4. People Detection
4.5. Yolo Object Recognition
4.6. OpenPose Body Posture Recognition
4.7. Interactive-face Facial Recognition
4.8. MaskRCNN Area Segmentation
INDOOR NAVIGATION
5.1. Mapping
5.2. Rviz for Indoor Navigation
5.3. Coordinates to Locate Target Position
5.4. Edit Map Using Image Processing Software
MAPPING & NAVIGATION SIMULATION
6.1. Mapping Simulation
6.2. Navigation Simulation
6.3. Coordinates for Robot Positioning
REMOTE CONTROL
TeamViewer
F710 (GAMEPAD) CONFIGURATION
INTRODUCTION
This User Manual is intended for Jupiter Robot V0.3.0. The functional modules covered are Speech Recognition & Synthesis, Robot Arm Control, Image Processing, Indoor Navigation, and Simulation.
SPEECH RECOGNITION & SYNTHESIS
2.1. Speech Recognition
Step 1: Launch a new Terminal and activate the English keyword recognition module:
roslaunch pocketsphinx kws.launch
dict:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.dic
kws:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.kwlist
Note: Join the parts of the above command with spaces, not carriage returns. Copy the command into the
Terminal as a single line, as following:
Once executed:
The list of recognised words can be found in the voice_cmd.dic file under
“catkin_ws/src/basic_function_packages/pocketsphinx/demo/”; it includes “BACK, FORWARD, FULL, HALF,
LEFT, MOVE, RIGHT, SPEED, STOP”. Combined phrases such as “FULL SPEED” and “HALF SPEED” can be
recognised too. Because recognition runs fully offline, the recognition rate is modest.
2.2. Control Gazebo for Jupiter Robot
Functional Description: Execute movements of the Jupiter Robot in Gazebo via a few specific spoken
English commands.
Note: There might be a short wait until the following image appears on the screen. If activation fails,
please try again several times.
Step 2: Launch a new Terminal and activate the English keyword recognition module:
roslaunch pocketsphinx kws.launch
dict:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.dic
kws:=/home/mustar/catkin_ws/src/basic_function_packages/pocketsphinx/demo/voice_cmd.kwlist
Once activated:
Using the microphone, speak the nine commands listed in 2.1. The recognition results appear as
following:
The Jupiter Robot then acts accordingly in Gazebo, although an action may not be triggered exactly by
the intended command. As long as the robot moves, the voice commands are working.
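The link between a recognised keyword and a robot movement can be sketched as a plain Python function. This is a simplified illustration, not the actual pocketsphinx/teleop node; the speed values are assumptions:

```python
# Sketch: map a recognised keyword to a (linear_x, angular_z) velocity pair,
# as a simple velocity-publisher node might do. Speeds are illustrative.

def command_to_twist(keyword, speed=0.2, turn=1.0):
    """Return (linear_x, angular_z) for a recognised keyword."""
    kw = keyword.upper()
    if kw == "FULL SPEED":
        return (speed * 2, 0.0)    # double the base speed
    if kw == "HALF SPEED":
        return (speed * 0.5, 0.0)  # half the base speed
    mapping = {
        "FORWARD": (speed, 0.0),
        "MOVE":    (speed, 0.0),
        "BACK":    (-speed, 0.0),
        "LEFT":    (0.0, turn),    # positive angular_z = anticlockwise
        "RIGHT":   (0.0, -turn),
        "STOP":    (0.0, 0.0),
    }
    return mapping.get(kw, (0.0, 0.0))  # unknown keyword: stay still
```

On the robot, the resulting pair would be published as a `geometry_msgs/Twist` on the velocity topic.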
ROBOT ARM CONTROL
3.1. Robot Arm Initial Position
To ensure safe operation of the robot arm, place the arm as shown below.
Note: Before positioning the arm manually, make sure that the arm power supply is switched off. Never
manipulate the arm while the power is on, to avoid damage.
Check the servo position at the bottom - the data cables should be tangle-free, as following:
3.2. Activate Robot Arm Debugger
Use the following command to rotate the motor slightly and change the position of the robot arm:
rostopic pub -1 /waist_controller/command std_msgs/Float64 -- 0.3
In the command above, the value after “std_msgs/Float64 --” (here 0.3) is the controller’s target position.
When the motor reaches this position, it stops. Change the number to 0.0 to return the motor to its initial
position; a negative number turns it in the opposite direction. Note that the speed of the robot arm is set
low in the control parameters. Change the target position gradually to avoid collision of the motor with
the robot itself, the screen, etc. When the movement is completed, the window appears as following:
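The "change the target gradually" advice can be automated by stepping towards a distant goal in small increments instead of publishing it in one jump. A minimal sketch (the 0.1 rad step size is an assumption; on the robot each intermediate value would be published to /waist_controller/command):

```python
def gradual_targets(current, goal, step=0.1):
    """Return intermediate target positions from `current` to `goal`,
    moving at most `step` radians per command."""
    targets = []
    pos = current
    while abs(goal - pos) > step:
        pos += step if goal > pos else -step  # move one step towards the goal
        targets.append(round(pos, 3))
    targets.append(goal)  # finish exactly on the goal
    return targets
```

For example, moving from 0.0 to 0.3 would publish 0.1, then 0.2, then 0.3, giving the motor time to settle between commands.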
Note that if too large a value is set, the motor may get stuck, and a red warning appears in the Terminal
of the activated robot arm, as following:
At the same time, the stuck motor flashes a red light, as following:
If this happens, immediately press the emergency stop switch to cut off the power supply to the mobile
platform and robot arm. Manually move the arm back to a normal position, then switch the power back
on and relaunch the robot arm Terminal.
3.3. Robot Arm Simulation
Functional Description: Control a simulated robot arm in Gazebo with MoveIt!
Note: The simulated robot arm has a different configuration from the one on the Jupiter Robot.
Step 1: Launch a new Terminal, activate robot arm simulation in Gazebo
roslaunch jupiterobot_arm_moveit_config demo_gazebo.launch
Once activated:
In Rviz, select “Planning” under “MotionPlanning”. Under “Query”, choose “Select Goal State”. The
pull-down menu offers four options: “random valid” (any reachable random position), “random” (a random
position, possibly unreachable), “current” (the current position) and “same as start” (the initial position),
as well as three preset postures: “up”, “rest” and “catch”. Once a posture is selected, press the “Update”
button to update the target posture.
Then press “Plan” under “Commands” and observe the planned path of the robot arm on the right. Once
“Execute” is clicked and the interface is switched to Gazebo, the simulated robot arm in Gazebo moves
to the target position along the path planned by MoveIt!.
Step 3: Select Goal State to plan robot arm movement
Besides using a preset position as the target for arm movement planning, one can drag the interactive
blue ball shown on the robot arm in Rviz to indicate the target position, as following:
In the picture above, the blue ball represents the end-effector position. The red arrow shows the X
direction, the green arrow the Y direction and the blue arrow the Z direction. The red ring is roll, the
green ring is pitch and the blue ring is yaw. Drag the blue ball (or the three coloured arrows) with the
mouse to change the position of the end effector; drag the three rings to change its orientation. Once
adjusted, click “Plan” and “Execute” to plan and execute the movement.
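The roll/pitch/yaw rings map directly onto the orientation part of a pose goal. Internally, ROS and MoveIt represent orientation as a quaternion; the standard roll-pitch-yaw conversion can be sketched in pure Python (shown here only to make the ring/arrow picture concrete):

```python
import math

def rpy_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw angles (radians) to a quaternion (x, y, z, w)."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )
```

A zero rotation gives (0, 0, 0, 1), and any output is a unit quaternion, which is what a pose goal's orientation field expects.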
IMAGE PROCESSING
4.1. Object Tracking
Functional Description: Detect and track an object of a specific colour in an image.
At the same time, two windows appear on the screen - one showing the image captured by the camera
and one showing a hue distribution histogram of the selected colour, as following:
Choose a coloured object that stands out clearly from the background. Use the mouse to select a
rectangular region on the object. The rectangle should lie entirely within the coloured object, as
following:
Once the mouse button is released, a red ellipse circles the object and follows its movement, as
following (the red object was replaced with a yellow one for better contrast):
Note that the camera may wrongly lock onto other objects or regions of similar colour once the tracked
object leaves the camera's view. If the object cannot be tracked correctly after it returns to view, either
reselect the rectangular region or bring the object close to the wrongly recognised area so that it is
detected and tracked again.
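This behaviour follows from how colour trackers of this kind (e.g. CamShift) work: a hue histogram is built from the selected rectangle and back-projected onto each frame, so any similarly coloured region scores highly. The core idea in miniature, on a toy "image" of hue values (pure Python; the bin count is an assumption):

```python
def hue_histogram(hues, bins=16, max_hue=180):
    """Build a normalised hue histogram from the pixels of the selected region."""
    hist = [0] * bins
    for h in hues:
        hist[h * bins // max_hue] += 1  # drop each hue into its bin
    peak = max(hist) or 1
    return [v / peak for v in hist]     # scale so the peak bin is 1.0

def back_project(image, hist, bins=16, max_hue=180):
    """Score every pixel by how well its hue matches the histogram."""
    return [[hist[h * bins // max_hue] for h in row] for row in image]
```

Pixels whose hue falls in a well-populated bin score near 1.0, and the tracker follows the densest high-scoring blob - which is why a second object of the same colour can steal the track.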
4.2. Facial Detection
Once activated:
Depending on facial expression, distance to the camera and eyelid position, the eyes can additionally
be identified.
4.3. Facial Recognition
Once activated:
Two windows appear on the screen - one showing the camera view, and one showing a dialogue window
asking for a name, as following:
Enter a name and press “Enter” to confirm. Keep the person in front of the camera with the face filling
as much of the frame as possible, as following:
After the first photograph, the dialogue window asks whether a further photograph should be taken -
select “y” and take 3-5 more photographs.
Once the photos for the first person are completed, photos for a second person can be taken from the
dialogue window. Once both are completed, the facial recognition result window displays the result of
recognising both persons at the same time, as following:
Note that the algorithm saves the photograph samples in the hidden .ros folder under the “Home” folder.
Use “Ctrl + H” to show hidden folders and locate it. Inside “.ros/opencv_apps” there are individually
named folders containing the photographs taken. For a completely fresh start, delete all named folders
under “face_data” before taking new photos.
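Clearing the stored samples can also be scripted. A small sketch using the folder layout described above (run it only when a full reset is intended; the default path mirrors the description, but verify it on your system first):

```python
import shutil
from pathlib import Path

def reset_face_data(base="~/.ros/opencv_apps/face_data"):
    """Delete every named person folder under face_data; return their names."""
    root = Path(base).expanduser()
    removed = []
    if root.is_dir():
        for person in sorted(p for p in root.iterdir() if p.is_dir()):
            shutil.rmtree(person)      # drop this person's photographs
            removed.append(person.name)
    return removed
```

If the folder does not exist (e.g. no photos have been taken yet), the function simply returns an empty list.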
4.4. People Detection
An image window appears at the same time. When a person appears fully in the image, a green
rectangle encompasses the area surrounding the person, as following:
If a person does not appear fully (e.g. the head or legs are cropped), the person may not be detected,
as following:
4.5. Yolo Object Recognition
Once activated:
“FPS” (frames per second) is the rate at which consecutive images (frames) appear on the display.
“Object” lists the recognised object(s) and the corresponding confidence level.
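An FPS figure like the one displayed is simply the reciprocal of the time between consecutive frames, usually smoothed over a small window. A minimal sketch (the window size is an assumption):

```python
def fps_from_timestamps(timestamps, window=10):
    """Estimate FPS from the last `window` frame timestamps (in seconds)."""
    recent = timestamps[-window:]
    if len(recent) < 2:
        return 0.0                       # not enough frames to measure
    elapsed = recent[-1] - recent[0]
    # (n - 1) frame intervals span `elapsed` seconds
    return (len(recent) - 1) / elapsed if elapsed > 0 else 0.0
```

Frames arriving every 1/30 s therefore report roughly 30 FPS; heavier recognition models lengthen the interval and lower the figure.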
4.6. OpenPose Body Posture Recognition
Once activated:
The recognised body posture will be marked in the image window as following:
4.7. Interactive-face Facial Recognition
Once activated:
In the picture above, the blue square marks the recognised face, and the words above it give the
analysed gender, age and facial expression. The yellow stars mark the facial landmarks - the positions
of the eyes, mouth, nose, etc. The short red, green and blue lines represent the estimated facial
direction.
4.8. MaskRCNN Area Segmentation
Once activated:
INDOOR NAVIGATION
5.1. Mapping
Once activated:
Using 3D camera:
roslaunch jupiterobot_navigation gmapping_demo.launch
Using LIDAR:
roslaunch jupiterobot_navigation rplidar_gmapping_demo.launch
The result using 3D camera:
Step 3: Launch a new Terminal and activate the Rviz environment:
roslaunch turtlebot_rviz_launchers view_navigation.launch
As following:
Using 3D camera:
Using LIDAR:
Using keyboard:
As following:
With:
“i” - move forward
“u” - arc forward anticlockwise, pivoting on the left wheel
“o” - arc forward clockwise, pivoting on the right wheel
“k” - stop immediately
“j” - rotate anticlockwise in place, about the robot centre
“l” - rotate clockwise in place, about the robot centre
“,” - move backward
“m” - arc backward clockwise, pivoting on the left wheel
“.” - arc backward anticlockwise, pivoting on the right wheel
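The key bindings above can be summarised as (linear, angular) direction pairs, in the same spirit as the teleop node's internal move bindings. A simplified sketch (the actual node also scales speed up and down with separate keys; the speed values here are assumptions):

```python
# (linear, angular): +linear = forward, +angular = anticlockwise
MOVE_BINDINGS = {
    "i": (1, 0),    # forward
    "u": (1, 1),    # forward, arcing anticlockwise
    "o": (1, -1),   # forward, arcing clockwise
    "k": (0, 0),    # stop
    "j": (0, 1),    # rotate anticlockwise in place
    "l": (0, -1),   # rotate clockwise in place
    ",": (-1, 0),   # backward
    "m": (-1, -1),  # backward, arcing clockwise
    ".": (-1, 1),   # backward, arcing anticlockwise
}

def velocity_for(key, speed=0.2, turn=1.0):
    """Turn a pressed key into a (linear_x, angular_z) velocity command."""
    lin, ang = MOVE_BINDINGS.get(key, (0, 0))  # unknown key: stop
    return (lin * speed, ang * turn)
```

Each returned pair would be published as a velocity command; any key outside the table stops the robot, matching the safe default of keyboard teleoperation.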
Using Gamepad:
Operation guide:
The RB button (button 5) enables execution of gamepad commands by the robot.
The left joystick (axes 0, 1) controls normal-speed movement.
The right joystick (axes 2, 3) controls low-speed movement.
Holding the LB button (button 4) while using the left joystick increases the movement speed.
Map legend:
- Unknown area
- Position of robot (the green line's thick end points towards the front of the robot)
- Area with mapping ongoing
- Area with obstacles detected
- Area without obstacles detected
Note:
1) Use a low movement speed; the default speed is suitable. If there are many corners or obstacles,
position the robot to scan the target area directly, several times. Do not reverse the robot during
mapping, and do not stop it abruptly with the “k” key while it is moving fast.
2) The detection height of the 3D navigation camera is approximately 30-60mm; for the LIDAR it is about
30-40mm. Note that tables, chairs, etc. can have a hollow space up to 60mm, in which case the robot
cannot detect the obstacle. Either place another obstacle at this height manually, or remember the
obstacle's position and mark it on the map afterwards with image processing software.
3) When the LIDAR is used, avoid transparent glass, black objects and mirrors above the detection
height, as these can absorb or reflect the laser. If viable, a frosted membrane film can be applied to
produce diffuse reflection.
4) Try to map a closed region, which eases path planning. In a large open region, path planning may
fail. If a closed region cannot be formed, draw the borders manually afterwards with image processing
software.
5) Try to form a closed loop during mapping, i.e. the robot starts and ends at the same point. If the
region is large, e.g. multiple rooms, try to close each region before entering the next. Once all regions
are covered, drive the robot back to its origin point.
6) The mapping surface needs to be flat. Uneven areas can affect the accuracy of the mapping process.
7) Keep the keyboard-teleoperation Terminal as the active window; otherwise keyboard commands are
not transmitted to the robot. There is no such restriction when a gamepad is used.
8) In Rviz, left mouse drag rotates the display area, right drag zooms, and Shift + left drag translates
the display area.
As following:
In the above, “test1” is the name of the saved (and later editable) map file. Maps are saved under
“/home/mustar/catkin_ws/maps” in “.pgm” and “.yaml” formats. “.pgm” is the image of the map, where
the overall map can be viewed, whereas “.yaml” is the specification file recording map information such
as the image path, resolution, origin pose, etc. If the map files are moved, change the first line of the
“.yaml” file (red box below) to the new absolute path of the “.pgm” file.
Note: only the first line needs to be changed. The other lines depend on the algorithm and may not
coincide with the ones in the picture above.
5.2. Rviz for Indoor Navigation
Using 3D camera:
Using LIDAR:
Note: Join the parts of the above command with spaces, not carriage returns. Copy the command into
the Terminal as following:
Using 3D Camera:
Using LIDAR:
The black dot represents the robot. Its position is initially inaccurate, as the robot does not know its
starting position. The green arrows surrounding it represent the poses (position and direction) the robot
thinks it could have.
Observe the actual position of the robot, click the “2D Pose Estimate” button in Rviz, click the
corresponding location on the map and point the green arrow in the direction the robot is facing. The
robot then relocates itself on the map as following:
Using 3D Camera:
Using LIDAR:
Using LIDAR:
Using 3D Camera:
Using LIDAR:
Note: If the robot encounters obstacles during movement, it avoids them by itself and replans the path.
If the target position itself is blocked by an obstacle, path planning is affected. If the robot gets stuck, or
is about to collide with other objects, stop it by pressing the emergency stop button. After moving the
robot to a safe location, select a safe target position before switching the robot on again. Once it is on,
the robot rotates to determine its current position before resuming movement to the target position.
5.3. Coordinates to Locate Target Position
Functional Description: Based on a built map, use coordinates to move the robot to a target position
and back.
Of the displayed coordinates, the first two values are the target X and Y coordinates.
Note: Observe the Terminal, then click the “2D Pose Estimate” button on the Rviz interface. Click
the corresponding spot on the map and point the green arrow in the robot's direction. The robot then
moves to the target coordinates and returns, as following:
5.4. Edit Map Using Image Processing Software
The image can be zoomed in or out via “+/-”. The arrow keys move the image.
MAPPING & NAVIGATION SIMULATION
6.1. Mapping Simulation
Functional Description: Use a simulated LIDAR sensor to operate the robot and build a map in simulation.
Once activated:
As following:
As following:
The right column can be hidden. Two topics on the left need to be changed: set “Local Map -> Costmap
-> Topic” to “/map”, and set “Global Map -> Costmap -> Topic” to “/map”.
Using keyboard:
roslaunch turtlebot_teleop keyboard_teleop.launch
As following:
With:
“i” - move forward
“u” - arc forward anticlockwise, pivoting on the left wheel
“o” - arc forward clockwise, pivoting on the right wheel
“k” - stop immediately
“j” - rotate anticlockwise in place, about the robot centre
“l” - rotate clockwise in place, about the robot centre
“,” - move backward
“m” - arc backward clockwise, pivoting on the left wheel
“.” - arc backward anticlockwise, pivoting on the right wheel
Map legend:
- Unknown area
- Area with obstacles detected
- Area without obstacles
- Position of robot (the green line's thick end points towards the front of the robot)
Arrange the Gazebo and Rviz windows so both are visible at the same time, in order to observe the
robot operating in simulation, as following:
Keep the teleoperation window active and remote-control the robot around the environment until a map
is built, as following:
1) The simulated robot must not collide with walls. A collision with obstacles causes the wheels to spin
idly, which can produce large odometry errors and make the mapping fail.
2) The default robot movement speed is low. Depending on the processor, a higher-performing machine
can support a higher robot speed. However, the speed during mapping should not be too high; it can be
increased as appropriate, but if the mapping errors become too large, lower the speed and restart.
3) The mapping should be completed in one run. A finished map cannot be corrected from within the
mapping process; however, image processing software can be used to edit the image file of the map.
4) The simulated LIDAR covers a forward fan-shaped region of around 240 degrees with a range of
four metres.
5) Try to drive the robot back to the starting point of the mapping so that the whole mapping route forms
a closed loop.
As following:
In the above, “test2” is the name of the saved (and later editable) map file. Maps are saved under
“/home/mustar/catkin_ws/maps” in “.pgm” and “.yaml” formats. “.pgm” is the image of the map, where
the overall map can be viewed, whereas “.yaml” is the specification file recording map information such
as the image path, resolution, origin pose, etc. If the map files are moved, change the first line of the
“.yaml” file (red box below) to the new absolute path of the “.pgm” file.
Note: only the first line needs to be changed. The other lines depend on the algorithm and may not
coincide with the ones in the picture above.
6.2. Navigation Simulation
Note: Join the parts of the above command with spaces, not carriage returns. Copy the command into
the Terminal as following:
Once activated:
Once activated:
The black dot represents the robot's position. As the robot does not know its initial position, the location
on the map is initially inaccurate. The surrounding cluster of green arrows represents the possible poses
(position and direction) the robot thinks it may have.
Observe the position of robot in the simulation. Press the “2D Pose Estimate” button on the Rviz
interface and click on the corresponding location on the map. Point the green arrow to the direction
where the robot is facing. The robot will then readjust its location on the map as following:
During mapping, the initial position of the robot is the position where the robot was first switched on.
Hence, in a simulation environment, the “2D Pose Estimate” step can be skipped. On a real robotic
platform, however, the initial position of the robot does not always correspond to the origin of the map,
so “2D Pose Estimate” is required to indicate the robot's starting point for navigation.
Observe the simulated robot's position in the simulation environment. Click “2D Nav Goal” in Rviz, click
the intended spot on the map and point the green arrow in the desired direction for the robot to face.
Once the mouse button is released, the robot plans its path automatically and moves to the target
position, as following:
The green line in the picture above shows the path planned by the robot. The purple areas along the
room walls are the obstacle locations detected by the robot from its current position. The robot keeps a
certain clearance from obstacles. If the robot encounters obstacles along its path, or gaps comparable
to the robot's diameter, it recognises them and replans the path.
Once the robot is moving, there is no dedicated function to stop it. To stop the movement, indicate a
target position near the robot's current position; the target position can be changed at any time during
the movement.
6.3. Coordinates for Robot Positioning
Functional Description: Use coordinates to indicate a target position for the robot, and have it return to
the origin position.
At the Rviz interface, click the “Publish Point” button. Move the mouse to the desired target position for
the robot and observe the coordinates shown at the bottom left, as following:
Of the displayed coordinates, the first two values are the target X and Y coordinates.
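The go-and-return behaviour can be thought of as a two-goal plan: the point picked with “Publish Point”, then the origin. A pure-Python sketch (in the real script each goal would be sent to the navigation stack one after the other; the distance field is only illustrative):

```python
import math

def go_and_return(target, origin=(0.0, 0.0)):
    """Build the goal sequence 'drive to target, then back to origin',
    annotating each leg with its straight-line distance."""
    legs = [(origin, target), (target, origin)]
    plan = []
    for start, goal in legs:
        dist = math.hypot(goal[0] - start[0], goal[1] - start[1])
        plan.append({"goal": goal, "distance": round(dist, 3)})
    return plan
```

For a target at (3.0, 4.0), the plan is two legs of 5.0 m each: out to the target, then back to the origin.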
Once activated:
Observe the control Terminal, click the “2D Pose Estimate” button on the Rviz interface, click the
intended spot on the map and point the green arrow in the desired direction for the robot to face. The
robot then moves towards the entered coordinates, before returning to its origin point:
REMOTE CONTROL
TeamViewer
Functional Description: Via the software TeamViewer, the robot's computer can be connected to any
other computer running any operating system, to achieve remote control.
The software is preinstalled on the robot's computer and launches automatically on start-up. On any
computer with a wireless network adapter, search for a Wi-Fi signal with an SSID of the form
“mustarxxxx” and enter the password to connect (the password is the same as the SSID); the digits
after “mustar” are the last 4 digits of the robot's unit number.
In TeamViewer, under “Extras”, choose “Options”, and under “General” change the “Incoming LAN
connections” setting to “accept”, as following:
Back on the main page, key in “10.42.0.1” under “Partner ID” and click “Connect”. Use the password
“123456” to connect.
F710 (GAMEPAD) CONFIGURATION
Launch a Terminal and list the input devices (run it before and after connecting the gamepad receiver,
so the two listings can be compared):
ls /dev/input/
Comparing the two listings shows that “js1” is the current device number of the gamepad.
Note: Plugging the gamepad receiver into a different USB port can change the “js” number. Hence, keep
the same USB port if possible.
If the gamepad is working correctly, the following test result can be obtained:
The gamepad is a Logitech F710 wireless gamepad (reported as “Logitech Cordless RumblePad 2”),
operating in DirectInput mode (the “D” switch position). Refer to the pictures below for the button layout
(red numbers indicate joystick axis outputs, green numbers indicate button outputs).
Note: Pressing the “MODE” button turns on the green indicator beside it; the left joystick and the D-pad
then exchange their functions.