INTRODUCTION
This introduction explains the motivation and background for
developing a last-mile delivery robot. Here's a simplified explanation:
1. Last-Mile Delivery Challenge: The "last mile" in logistics refers to the final
step of delivering a package to the customer. It's expensive and inefficient,
making up about 75% of total supply chain costs. This is especially true in urban
areas due to increased online shopping and e-commerce.
2. Rising Demand: More people are moving to cities, leading to more
deliveries. For example, in Germany, the number of deliveries is expected to
increase significantly by 2025 due to urbanization and online shopping growth.
3. Problems with Traditional Delivery: More delivery trucks in cities create
issues like traffic congestion, pollution, and safety concerns. This has led to
pressure from customers and governments to find sustainable solutions.
4. Goal of the Project: To address these problems, the researchers developed
an autonomous robot for last-mile delivery. It is designed to navigate city
environments, aiming to reduce costs and environmental impact . The main idea
is to solve problems like traffic delays, high costs, and customer dissatisfaction.
The system uses advanced planning to decide the best routes, ensures safety for
pedestrians, and protects packages during the journey. Robots deliver multiple
parcels without needing human help, and everything works automatically. The
robot was tested on the campus of Johannes Kepler University in Linz, Austria.
This research particularly focuses on the interaction between autonomous
navigation and object detection. In such a scenario, the robot has
to detect and recognize objects, estimate their positions, and avoid
them.
Although object detection and recognition are widely used in recent studies,
most of them typically assume that the object is either already segmented from
the background or that it occupies a large portion of the image. Locating an
object in an arbitrary environment is difficult because the distance to the
object, and therefore its size in the image, can vary significantly. The robot
therefore has to detect objects while moving, at varying frame rates, and
recognize the same trained object from different angles and at different sizes.
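One common way to handle this scale and frame-rate problem is to run a single-shot detector on every incoming camera frame. Below is a minimal sketch of that idea in Python, assuming a YOLOv4 model in Darknet format (consistent with the YOLO CNN V4 detector mentioned later in this report) and OpenCV's DNN module; the file paths, input size, and thresholds are illustrative placeholders, not the configuration of the actual robot.

```python
# Minimal sketch: multi-scale object detection on a moving camera feed.
# "yolov4.cfg" / "yolov4.weights" are placeholder paths; thresholds and the
# 416x416 input size are illustrative assumptions.
import cv2
import numpy as np

CFG, WEIGHTS = "yolov4.cfg", "yolov4.weights"
CONF_THRESH, NMS_THRESH = 0.5, 0.4

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_names = net.getUnconnectedOutLayersNames()

def detect(frame):
    """Return (class_id, confidence, box) tuples for one camera frame."""
    h, w = frame.shape[:2]
    # The frame is resized to a fixed network input; YOLO's multi-scale
    # heads then handle objects of very different apparent sizes.
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, confidences, class_ids = [], [], []
    for output in net.forward(out_names):
        for det in output:
            scores = det[5:]
            class_id = int(np.argmax(scores))
            conf = float(scores[class_id])
            if conf < CONF_THRESH:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(conf)
            class_ids.append(class_id)
    keep = cv2.dnn.NMSBoxes(boxes, confidences, CONF_THRESH, NMS_THRESH)
    return [(class_ids[i], confidences[i], boxes[i]) for i in np.array(keep).flatten()]

cap = cv2.VideoCapture(0)                        # RGB stream from the depth camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for class_id, conf, (x, y, bw, bh) in detect(frame):
        cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
```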
Other capabilities can be added through LIMMS (Latching Intelligent Modular
Mobility System), a compact, versatile robot designed for both moving and
delivering packages. Here's how it works: LIMMS can act like a robotic arm to
pick up and organize packages inside a delivery vehicle. Multiple LIMMS units can
join together to move as a robot with legs, using the package as their body.
When not in use, LIMMS folds up compactly, saving space in delivery trucks.
This approach could reduce logistics costs by around 53%.
CONSTRUCTION AND THE HARDWARE REQUIRED
**For LMDBots
1. Base Platform:
*The robot was built on a "Spider ILD01" slope lawn mower.
*This platform was modified to carry packages. A wooden parcel station was
added to store deliveries.
*Its steering system was changed to a quasi-Ackermann drive:
*Two wheels steer instead of all four.
2. Structure:
*Material: The body is made of Poly(methyl methacrylate) (Acrylic) and
reinforced with aluminum L-angles for durability.
3. Sensors (Robot's Eyes and Ears):
*These sensors allow the robot to perceive and understand its environment:
*3D LIDAR (Ouster OS1):
*Mounted on the roof to detect obstacles and create a 3D map.
*Can measure distances up to 120 meters with a 360° view.
*Depth Camera (Intel RealSense D435):
*Captures detailed depth data of nearby objects to avoid collisions.
*Helps identify objects like pedestrians, bikes, or scooters.
*GNSS and IMU:
*Combines GPS (for global position) and an inertial measurement unit (IMU) to
track the robot’s movement.
*Encoders:
*Monitor the speed and steering angles for accurate movement control.
4. Processing Units (Robot's Brain):
*Two processors handle different tasks:
*Main Computer (Nvidia Jetson AGX Xavier):
*Processes mapping, localization, and path planning.
*Runs deep learning algorithms for object detection.
*Low-Level Control Computer (Raspberry Pi 3B+):
*Controls motors and executes commands from the main computer.
5. Communication System:
*Uses a 1 Gbps router to transfer data between sensors, processors, and other
components via the TCP/IP protocol.
*Sensors like the depth camera and GPS connect to the main computer through
USB for fast data transfer.
6. Power System:
*The robot is powered by two 24V lead-acid batteries.
*Lithium iron phosphate (LiFePO4) batteries can be used instead, offering up to 12 hours of battery life.
*Safety features include:
*A fuse box to protect circuits.
*A circuit breaker to prevent damage from power surges.
*A kill switch to stop the robot immediately if needed.
*Navigation: Uses sensor data for smooth and obstacle-free movement.
*Vibration Control: Monitors rough surfaces and adjusts speed to reduce package
damage.
7. AI Features
*AI features can be added that provide:
1. Route Optimization:
*AI uses algorithms to find the shortest, fastest, and safest paths for deliveries.
*It adapts routes in real-time based on traffic, road conditions, and unexpected
obstacles.
*This minimizes delivery time and operational costs (a minimal route-planning sketch follows this list).
2. Obstacle Detection and Avoidance:
*AI processes data from LIDARs, cameras, and sensors to detect obstacles like
pedestrians, vehicles, or objects.
*It enables the robot to navigate safely and autonomously without collisions.
3. Pedestrian Safety:
*AI predicts pedestrian movements using models like Recurrent Neural Networks
(RNNs) or Bi-directional LSTMs.
*It sends warnings or adjusts the robot's path if a potential collision is detected,
ensuring human safety.
4. Vibration Monitoring for Package Safety:
*AI analyzes data from sensors to measure surface roughness and adjusts the
robot’s speed accordingly.
*This protects fragile packages from damage during transit (a minimal vibration-control sketch also follows this list).
5. Autonomous Navigation:
*AI integrates multiple tools like Simultaneous Localization and Mapping (SLAM)
for accurate positioning.
*It creates global and local maps to guide the robot effectively in dynamic
environments.
6. Customer Interaction:
*AI systems manage customer preferences, such as specific delivery times or
contactless delivery.
*It enables features like smart notifications to alert customers about the robot’s
arrival.
7. Predictive Maintenance:
*AI monitors the robot's health (e.g., battery life, motor performance).
*It predicts potential issues before they cause breakdowns, ensuring smooth
operations.
8. Learning and Adaptation:
*AI allows robots to learn from past deliveries, improving their performance over
time.
*Machine learning helps the robot adapt to different environments, weather
conditions, and customer behaviors.
9. Multi-robot Coordination:
*AI coordinates multiple robots working together in a fleet to divide tasks
efficiently.
*It ensures that all robots collaborate to cover the maximum number of
deliveries in the least time.
10. Security and Privacy:
*AI-powered surveillance features protect the robot and packages from theft or
tampering.
*Algorithms ensure secure handling of customer data and delivery details.
*AI enables LMD robots to operate efficiently, autonomously, and safely, making
them a viable solution for urban delivery challenges.
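As a concrete illustration of the route-optimization point in item 1, the sketch below runs Dijkstra's algorithm over a small road graph whose edge weights stand for travel times. The graph, node names, and weights are invented for illustration; the report does not specify the actual planner at this level of detail.

```python
# Minimal sketch of route optimization (item 1 above): Dijkstra's algorithm
# over a small road graph. Node names and travel times are invented.
import heapq

def shortest_route(graph, start, goal):
    """graph: {node: [(neighbor, travel_time), ...]} -> (cost, path)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road graph; edge weights are travel times in minutes.
roads = {
    "depot":    [("crossing", 4), ("park", 7)],
    "crossing": [("park", 2), ("customer", 6)],
    "park":     [("customer", 3)],
    "customer": [],
}
print(shortest_route(roads, "depot", "customer"))
# (9.0, ['depot', 'crossing', 'park', 'customer'])
```

In a deployed planner the edge weights would be updated in real time from traffic and obstacle data, which is what lets the route adapt as described in item 1.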
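The vibration-monitoring behaviour in item 4 can likewise be approximated by computing the RMS of recent vertical-acceleration samples and scaling the commanded speed down on rough surfaces. The thresholds and speed limits below are illustrative assumptions, not measured values from the robot.

```python
# Minimal sketch of vibration-based speed adjustment (item 4 above).
# Thresholds and speed limits are illustrative assumptions.
from collections import deque
import math

WINDOW = 50          # number of recent accelerometer samples considered
MAX_SPEED = 1.5      # m/s on smooth ground (assumed)
MIN_SPEED = 0.3      # m/s on very rough ground (assumed)
ROUGH_RMS = 3.0      # vertical-acceleration RMS (m/s^2) treated as "rough"

samples = deque(maxlen=WINDOW)

def target_speed(vertical_accel):
    """Feed one vertical-acceleration sample, get the speed command back."""
    samples.append(vertical_accel)
    rms = math.sqrt(sum(a * a for a in samples) / len(samples))
    # Linearly reduce speed as surface roughness (RMS vibration) grows.
    roughness = min(rms / ROUGH_RMS, 1.0)
    return MAX_SPEED - roughness * (MAX_SPEED - MIN_SPEED)

# Example: a smooth surface followed by a rough patch.
for a in [0.2, 0.3, 0.2, 4.5, 5.1, 4.8]:
    print(round(target_speed(a), 2))
```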
**LIMMS
The hardware and construction of the LIMMS robot are designed to address the dual needs of
manipulation and locomotion in last-mile delivery. Here's a simplified breakdown:
1. Modular Design
Each LIMMS unit is a 6-degree-of-freedom (DoF) robot, meaning it can move and rotate in multiple
directions.
Both ends of the module are equipped with:
*Wheels for movement on flat surfaces.
*Latching mechanisms for anchoring onto surfaces or connecting to other modules.
2. Compact and Lightweight Construction
*When not in use, the LIMMS unit folds into a compact size (0.43m x 0.22m x 0.18m) to save space in
delivery vehicles.
*The unit weighs 4.14 kg, making it portable and energy-efficient.
3. Joint Configuration
*LIMMS has a symmetric 6-DoF joint setup:
*Two joints near each end.
*Two joints in the middle (elbow region).
This symmetry allows the robot to function equally well when anchored at either end.
4. Latching Mechanism
*Purpose: Enables LIMMS to attach itself to boxes, vehicle interiors, or other LIMMS units.
**Design:
*A rotating mechanism with three blades that fit into triangular holes on the surface it attaches to.
*The blades self-align to ensure a secure fit, even if the initial position is slightly misaligned.
*Future iterations will include locking pins to secure the latch in place.
5. Prototype Materials
*Carbon Fiber Tubes: House the battery and control components, keeping the design lightweight.
*Nylon 12 Parts: Manufactured using Selective Laser Sintering (SLS) for the structural joints.
*Actuators: Each joint is powered by a DYNAMIXEL XM540-W150-R motor, with additional planetary
gearboxes to enhance torque.
6. Actuation and Motion
*Motors are configured to provide:
*Up to 31 Nm of torque.
*A maximum speed of 2 rad/s.
*The wheels or latches can be swapped on the ends, allowing flexibility in tasks like wheeled motion
or latching.
7. Modular Versatility
*LIMMS can switch between roles:
*Act as a robotic arm to move packages within a vehicle.
*Combine with other units to form a quadruped for terrain traversal.
*Use its wheels for self-balancing, Segway-like movement.
This hardware setup allows LIMMS to perform multiple roles efficiently while maintaining a small
spatial footprint, making it ideal for scalable and adaptable last-mile delivery solutions.
8. Build Recommendations
*Use CAD software like SolidWorks or Fusion 360 to design joints and linkages.
*Attach wheels directly to actuators for efficient speed and torque management.
*Use bearings such as the IKO CRBT505A to improve joint rigidity and load-bearing capacity.
*Use DYNAMIXEL XM540-W150-R motors for their high torque output.
*Add a planetary gearbox with a 3.5:1 reduction ratio to increase torque while maintaining an output speed of 2 rad/s.
*Use a microcontroller or single-board computer (e.g., Arduino or Raspberry Pi) to manage motor control and sensor integration.
*Add sensors such as a gyroscope and accelerometer for balance control.
*Incorporate cameras or LIDAR for navigation and object detection.
*For simulation, use software like Gazebo or MATLAB.
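A quick back-of-the-envelope check shows how the 3.5:1 planetary stage trades speed for torque. The joint-side figures (31 Nm, 2 rad/s) and the ratio come from the text; the motor-side values are derived under an assumed ideal (lossless) gearbox, so treat them as rough estimates only.

```python
# Back-of-the-envelope check of the gearbox trade-off described above.
# Output figures (31 N*m, 2 rad/s) and the 3.5:1 ratio come from the text;
# the gearbox is assumed 100% efficient, which is optimistic.
GEAR_RATIO = 3.5          # reduction ratio (motor turns 3.5x per output turn)
OUTPUT_TORQUE = 31.0      # N*m at the joint (from the text)
OUTPUT_SPEED = 2.0        # rad/s at the joint (from the text)

motor_torque = OUTPUT_TORQUE / GEAR_RATIO      # torque demanded from the motor
motor_speed = OUTPUT_SPEED * GEAR_RATIO        # speed at the motor shaft

print(f"motor-side torque ~ {motor_torque:.1f} N*m")   # ~8.9 N*m
print(f"motor-side speed  = {motor_speed:.1f} rad/s")  # 7.0 rad/s
```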
WORKING & APPLICATIONS
The LMDBot works by combining advanced sensors, mapping,
and control systems to perform autonomous last-mile
deliveries. It first uses sensors like 3D LIDAR, a depth camera,
GPS, and an IMU to perceive and understand its environment,
creating 2D and 3D maps for navigation. Localization
algorithms, such as Adaptive Monte Carlo Localization (AMCL),
help the robot determine its exact position. The robot plans its
path to the destination, avoiding obstacles in real-time using
data from its sensors. A low-level controller manages the
motors, ensuring smooth movement based on speed and
steering commands from the main computer. Communication
between components is handled via a high-speed router, and a
robust power system ensures uninterrupted operation. This
seamless integration allows the LMDBot to navigate
autonomously and deliver packages efficiently and safely. After
completing a delivery, the robot returns to its base for recharging
and its next assignment. The delivery bots are monitored remotely by
operators, who can intervene if required.
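The localization step mentioned above (AMCL) is, at heart, a particle filter. The sketch below shows the core loop on a toy 2D robot: particles are propagated with a noisy odometry model, re-weighted against a GPS-like position fix, and resampled. All noise values are illustrative assumptions, and a full AMCL implementation weights particles against LIDAR scans and a prebuilt map rather than a single position measurement.

```python
# Minimal sketch of Monte Carlo localization, the idea behind AMCL.
# All noise values and inputs are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                     # number of particles
particles = rng.uniform([-5, -5, -np.pi], [5, 5, np.pi], size=(N, 3))  # x, y, heading

def motion_update(particles, v, omega, dt):
    """Propagate each particle with a noisy unicycle (odometry) model."""
    noise = rng.normal(0, [0.05, 0.05, 0.02], size=particles.shape)
    x, y, th = particles.T
    x = x + v * dt * np.cos(th)
    y = y + v * dt * np.sin(th)
    th = th + omega * dt
    return np.column_stack([x, y, th]) + noise

def measurement_update(particles, gps_xy, sigma=0.5):
    """Weight particles by how well they explain a noisy position fix, then resample."""
    d2 = np.sum((particles[:, :2] - gps_xy) ** 2, axis=1)
    weights = np.exp(-d2 / (2 * sigma ** 2))
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# One cycle of the localization loop (commands and fix are made up).
particles = motion_update(particles, v=0.8, omega=0.1, dt=0.1)
particles = measurement_update(particles, gps_xy=np.array([1.0, 0.5]))
estimate = particles[:, :2].mean(axis=0)
print("estimated position (x, y):", np.round(estimate, 2))
```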
**The LMDBot is time-efficient as well as cost-effective
*Time Efficiency
Autonomous Navigation: The robot leverages technologies like
3D LIDAR, depth cameras, and advanced mapping methods to
efficiently navigate urban environments, reducing the time
needed for human-driven deliveries.
Optimized Route Planning: Integration of localization and
mapping (AMCL and EKF) ensures that the robot follows optimal
delivery routes, minimizing delays and congestion (a minimal EKF
fusion sketch is given at the end of this section).
*Cost Effectiveness
Reduced Labor Costs: By automating the delivery process, the
need for human drivers and personnel for the "last mile" is
significantly decreased, which traditionally accounts for up to
75% of supply chain costs.
Lower Operational Costs: The use of energy-efficient
components like Nvidia Jetson processors and Raspberry Pi,
combined with robust power management systems, reduces
energy consumption and maintenance costs.
*Reduction in Other Factors
Traffic and Environmental Impact: Autonomous delivery
reduces the number of delivery trucks in urban areas,
alleviating traffic congestion and decreasing emissions.
Safety Enhancements: Object detection (via YOLO CNN V4) and
collision avoidance systems ensure safe operation, reducing
risks associated with human error.
Improved Scalability: Autonomous robots can be deployed in
larger numbers to handle increased demand, especially in
urbanized and e-commerce-driven environments.
Overall, the LMDBot’s design streamlines delivery operations
by integrating advanced robotics and sustainable practices,
addressing both economic and environmental challenges.
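The AMCL/EKF fusion referenced in the route-planning point above can be illustrated with one predict/update cycle of an extended Kalman filter: the pose is predicted from wheel odometry (a unicycle motion model) and then corrected with a GNSS position fix. The covariances, inputs, and motion model below are illustrative assumptions, not the robot's actual tuning.

```python
# Minimal sketch of EKF sensor fusion: predict the pose from wheel odometry,
# then correct it with a GNSS position fix. Covariances and inputs are
# illustrative assumptions.
import numpy as np

def ekf_predict(x, P, v, omega, dt, Q):
    """Propagate state [x, y, heading] with a unicycle motion model."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + omega * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0,  1]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the pose with a GNSS position measurement z = [x, y]."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])          # GNSS observes position only
    y = z - H @ x                             # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# One predict/update cycle with made-up numbers.
x = np.array([0.0, 0.0, 0.0])                 # initial pose estimate
P = np.eye(3) * 0.1                           # initial uncertainty
Q = np.diag([0.01, 0.01, 0.005])              # process noise (assumed)
R = np.diag([1.0, 1.0])                       # GNSS noise, ~1 m std (assumed)

x, P = ekf_predict(x, P, v=1.0, omega=0.05, dt=0.1, Q=Q)
x, P = ekf_update(x, P, z=np.array([0.12, 0.01]), R=R)
print("fused pose estimate:", np.round(x, 3))
```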
**IT HELPS:
1. SOLVING LAST-MILE DELIVERY CHALLENGES
2. ENVIRONMENTAL SUSTAINABILITY
3. ECONOMIC GROWTH AND SCALABILITY
4. ENHANCING SAFETY
5. IMPROVED CUSTOMER EXPERIENCE
6. SUPPORTING URBANIZATION
7. MAXIMIZING SPACE EFFICIENCY
FUTURE SCOPE AND INNOVATION
1. Expanded Use Cases
*Last-Mile Delivery: Continued focus on last-mile logistics for e-commerce, groceries, and
pharmaceuticals.
*Warehouse and Factory Automation: Robots will streamline inventory movement, picking, and
packing within industrial settings.
*Healthcare Applications: Use in hospitals to deliver medications, lab samples, or medical supplies.
*Rural and Remote Access: Delivery robots adapted to navigate rugged terrains and serve isolated
areas.
2. Advanced Navigation and Autonomy
*5G Integration: Faster communication for real-time updates, remote monitoring, and seamless
operation in urban environments.
*Edge Computing: On-device data processing for faster decision-making and reduced reliance on
cloud connectivity.
*Enhanced SLAM and AI Algorithms: Improved mapping, obstacle avoidance, and path planning,
enabling robots to handle complex environments like crowded streets and adverse weather.
3. Multi-modal Delivery Systems
*Drone-robot Hybrids: Integration of ground-based robots with aerial drones for seamless delivery in
urban and rural areas.
*Underwater Delivery: Specialized robots for transporting goods underwater, such as in island
communities or coastal logistics.
*Autonomous Vehicle Coordination: Fleet coordination between delivery bots and self-driving
vehicles for integrated logistics.
4. Human-Robot Interaction
*Personalized Service: Robots equipped with voice assistants or displays for direct interaction with
customers.
*Accessibility Enhancements: Features like audio prompts, tactile feedback, or braille for disabled
users.
*Smart Lock Systems: Innovations in secure package compartments using biometric or app-based
access.
5. Energy Efficiency and Sustainability
*Solar-Powered Robots: Use of solar panels for extended operational range, especially in sunny
regions.
*Swappable Batteries: Modular battery systems for faster turnaround and reduced downtime.
*Recyclable Materials: Robots made with eco-friendly, recyclable materials to minimize environmental
impact.
6. Regulatory Frameworks and Infrastructure
*Urban Infrastructure: Development of dedicated robot lanes, smart sidewalks, and hubs for
charging and package loading.
*Regulatory Adaptation: Policies to ensure safe integration of robots in cities, addressing concerns
about privacy, safety, and job displacement.
7. Innovations in Payload and Design
*Temperature-Controlled Compartments: Essential for transporting perishable goods, like food or
vaccines.
*Modular Designs: Robots with interchangeable compartments for diverse delivery needs.
*Enhanced Terrain Adaptability: Robots with all-terrain wheels, legs, or hybrid locomotion systems to
handle diverse environments.
8. Collaboration with Other Technologies
*Internet of Things (IoT): Integration with smart cities and devices for optimized routing and delivery
scheduling.
*Blockchain Technology: Ensuring secure, transparent, and tamper-proof delivery records.
*AI and Machine Learning: Predictive analytics for demand forecasting, route optimization, and
maintenance.
9. Global Accessibility
*Localization: Robots tailored for regional needs, languages, and terrains.
*Cost-Effective Models: Affordable robots for small businesses and startups, enabling wider adoption.
10. Social and Ethical Impact
*Job Creation: New roles in robot maintenance, programming, and monitoring.
*Increased Accessibility: Enhanced services for the elderly and disabled communities.
*Reducing Urban Congestion: Lower reliance on traditional delivery vehicles.
Innovations
1. AI-Driven Decision Making:
*Implementing artificial intelligence for
autonomous decision-making in complex
scenarios, such as rerouting due to obstacles or
prioritizing deliveries based on urgency.
2. Multi-Modal Navigation Systems:
*Combining vision-based navigation with other
sensors like LiDAR, ultrasonic, and infrared for
enhanced accuracy.
*Leveraging GPS and RFID for seamless indoor-
outdoor transitions.
3. Dynamic Obstacle Avoidance:
*Enhancing real-time obstacle detection and
avoidance using deep learning models like YOLO
or Transformers.
*Incorporating predictive modeling to anticipate
obstacle movements.
4. Self-Learning Capabilities:
*Developing self-learning robots that can adapt
to new environments and tasks through
reinforcement learning.
*Using feedback loops to improve performance
over time.
5. Human-Centric Design:
*Adding features like voice commands, gesture
recognition, and adaptive interfaces for better
interaction with users.
*Ensuring safety measures to operate alongside
humans in public spaces.
6. Real-Time Data Analytics:
*Collecting and analyzing data during operations
to improve performance, predict maintenance
needs, and optimize workflows.
*By addressing these areas, autonomous delivery robots (ADRs) can
evolve into a cutting-edge solution for diverse
industries, revolutionizing logistics, robotics, and
automation globally.