Multi-Sensor Obstacle Detection System via Model-Based State-Feedback Control in Smart Cane Design for the Visually Challenged
ABSTRACT Smart canes are usually developed to alert visually challenged users of any obstacles beyond
the canes’ physical lengths. The accuracy of the sensors and of their actuators’ positions is equally crucial
to estimate the locations of the obstacles with respect to the users, so as to ensure that only correct signals
are sent through the associated audio or tactile feedbacks. For implementations with low-cost sensors,
however, the users are very likely to receive false alerts due to the effects of noise and erratic readings,
and the performance degradation will be more noticeable when the positional fluctuations of the actuators
get amplified. In this paper, a multi-sensor obstacle detection system for a smart cane is proposed via a
model-based state-feedback control strategy to regulate the detection angle of the sensors and minimize the
false alerts to the user. In this approach, the overall system is first restructured into a suitable state-space
model, and a linear quadratic regulator (LQR)-based controller is then synthesized to further optimize
the actuator’s control actions while ensuring its position tracking. We also integrate dynamic feedback
compensators into the design to increase the accuracy of the user alerts. The performance of the resulting
feedback system was evaluated via a series of real-time experiments, and we showed that the proposed
method provides significant improvements over conventional methods in terms of error reductions.
INDEX TERMS Multi-sensor, visually challenged, model-based control, state-feedback, obstacle detections.
however is not an issue to the sonar and laser range finders, which share a similar working principle commonly known as Time-of-Flight (ToF). This principle leads to a simple mathematical calculation of the distance from the sensor to the object: a beam of energy waves is directed towards an object, and the time it takes for the beam to travel back to its source after being reflected is used to estimate the distance traveled. As the laser beam offers the distinct advantage of travelling many times faster than the waves emitted by sonar sensors, it is often used to detect both static and moving objects [7]. In [8], a fusion of a laser sensor and a camera for an electronic virtual white cane implementation was proposed, where the distance calculation was based on the laser’s position and the image captured. Nevertheless, for simple design goals, some preferred to avoid this kind of technology as laser light is known to be harmful to humans, and extra precautions are required when using it [9]. The sonar technology, on the other hand, despite its low resolution and slower response compared with IR and laser, is still preferred by many developers and has become one of the most common sensing techniques for mobility aids. This is mainly due to its low cost and broad beam-width, which allows for a wide detection range [1], [10].

As the sensor technology is rapidly evolving in parallel with the emerging trends in Internet-of-Things (IoT) and embedded systems, many refinements of the early ETAs with new innovative technologies have been developed in modern assistive devices [11]. Integration of multiple sensors on a single platform to overcome the limitations of individual sensors has become increasingly popular in recent years [4], [12]–[17]. Combinations of multiple ultrasonic and other sensors have been reported in a series of papers [13], [15], [17], [18] to accommodate a wider range of obstacles and a larger sensing area. There is also a growing number of recent studies on microcontroller-based assistive devices which allow faster user alerts via various types of actuators and wireless feedbacks. Vibrators, for instance, are extensively used to provide haptic or vibrotactile feedbacks with different intensities [14], [19], [20]. Another interesting approach was proposed in [21] and [22], where steering actions of a mini wheeled mobile robot attached to a white cane were introduced to provide vibrotactile feedback with a sense of direction. Apart from that, audio voice/text or acoustic feedbacks have also been considered by many researchers, which can alert the users wirelessly through smartphones and/or headsets [16], [18], [20], [21], [23]–[26].

Despite the technological revolution, the assistive devices have not been successfully adopted and used by a large number of people with visual impairments, and many still prefer to use the white canes [27], [28]. Several research findings have shown issues related to the limited use of these smart devices, which include high prices, safety, orientations, speed, mobility, portability and optimizations of techniques [28], [29]. With combinations of many different sensors, for instance, although faster feedback and a wider detection range can be achieved, the size and power consumption of the device will also increase, which directly affects its price, portability and mobility. In some cases, a lot of variations in the user alerts may be confusing and less intuitive, and users usually prefer to receive simplified signals without having to process a lot of raw data from the feedbacks. These issues have sparked a growing interest among researchers in studying the design guidelines and improvements that can be introduced to increase the usability and marketability of the devices [10], [11], [28].

Inspired by a number of recent smart cane configurations [6], [14], [15], a simple design integrating ultrasonic and IR sensors for obstacle detection, and vibrotactile and audio techniques for the feedbacks, has been implemented for the prototype smart cane in this work. The sensors are positioned in such a way that the sensing range of the ultrasonic sensors covers the left and right front of the user, from the ground level to the head level, while the IR sensor is used to detect uneven ground surfaces such as holes and descending stairs. Since the device involves low-cost sensors, the signals sent to the user are prone to noise and erratic readings which may lead to false alarms, and the performance becomes worse when the sensors’ positions oscillate with the user’s hand movements. As suggested in [28], the horizontal orientation of the cane can be fixed by including a mark or indicator on the handlebar to ensure the sensors are always facing forward. While this can be controlled by the user, the sensors’ vertical detection angle is bound to fluctuate since the user usually has to tilt the cane back and forth while walking. Motivated by these issues, the focus of this work is on improving the obstacle detection system by means of a model-based state-feedback technique. In this approach, the overall system is first restructured into a suitable state-space model which also includes an accelerometer to sense the tilt angle. A motorized actuator is used to control the vertical detection angle of the ultrasonic sensors, and a linear quadratic regulator (LQR)-based controller is synthesized to further optimize the actuator’s control actions while ensuring its position tracking. We also integrate dynamic feedback compensators into the design which additionally act as noise filters to increase the accuracy of the user alerts. The performance of the resulting feedback system was evaluated via a series of real-time experiments, and we show that the proposed method provides significant improvements over the conventional methods in terms of error reductions.

II. METHODOLOGY AND MAIN RESULTS
The overall view of the smart cane system configuration is depicted in Figure 1, where five sensors (one accelerometer Sa, three ultrasonic sensors Sh, Smr, Sml, and one infrared sensor Sg) serve as the interface for input signals to an ATmega328p microcontroller, a servo motor to control the sensors’ positions, a vibration motor for the vibrotactile alert, and a Bluetooth module for wireless audio feedback. The focus of this work is on improving the performance of the user alerts, which rely on the accuracy of the sensors’ positions.
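Each ultrasonic sensor reports a round-trip echo time that maps to an obstacle distance through the ToF relation described in the introduction. The conversion is a one-line computation; the sketch below is illustrative only (the speed-of-sound constant and the function name are assumptions, not taken from the paper):

# Convert an ultrasonic time-of-flight reading into a distance (illustrative sketch).
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed, not from the paper)

def tof_to_distance(echo_time_s: float) -> float:
    """The echo travels to the obstacle and back, so only half of the
    round-trip time contributes to the one-way distance."""
    return 0.5 * SPEED_OF_SOUND * echo_time_s

# Example: a 5.8 ms round-trip echo corresponds to roughly 1 m.
print(tof_to_distance(5.8e-3))  # ~0.99 m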
FIGURE 4. Close-up on the side view (top) and top view (bottom) of the smart cane system with the sketch of its detection range.

C = [C1 0 0; 0 C2 0; 0 0 C3; Cd 0 0], D = 0_{4×4}, yo = [y1 y2 y3 yd]^T, uo = [u1 u′2 u′3 ud]^T, (3)

with A1 ∈ R^{2×2}, B1 ∈ R^{2×1}, C1, Cd ∈ R^{1×2}, Bd, C2 ∈ R^{1×3} and C3 ∈ R. The matrices A1, B1, C1, C2 and C3 depend on the model of the main actuator and the sensors, while Bd and Cd rely on the user’s hand movement. It is also straightforward that Cd = [0 γ], γ ∈ R, as the orientation of Sa is parallel with the SC-board’s.

Although servo motors generally provide perfect steady-state tracking, particularly for step responses, with the ultrasonic sensors and other load attached their dynamics will be slightly affected. Moreover, as the nature of the inputs is always uncertain and highly depends on the user’s movement, it is useful to take the actuator’s dynamics into account in order to ensure the tracking behaviour stays within the desired specifications. From the system’s architecture, we can write the main actuator as a subsystem Ga, i.e.

Ga ∼ (A1, B1, C1, 0); A1 = [A11 0; 0 0], B1 = [B11; 0], C1 = [1 −1]. (4)

Without any state feedback, the first state cannot be controlled at all. Assuming x11 = α, where α ∈ R is a constant, the uncontrolled output y1 then reduces to y1 = α − x12. If α = 0, for instance, the output will be the inverse of the accelerometer’s angle from the xs-axis. In order to estimate the state-space model of the main actuator, three different control input profiles which represent the responses from the servo motor based on three different user movements were fed into the system, and the outputs were then compared with the inputs as shown in Figure 5. Via the open-loop model, the user can also be alerted of the obstacle’s position and drop-off through x2 and x3. For instance, setting C3 = 1, a warning to the user for the drop-off can be delivered via y3, which then activates the vibrator. As for the obstacle positions, the alerts via y2 will be sent to the user via wireless serial transmission, hence discretized signals for the audio feedbacks are more suitable. This can be achieved by assigning

x2i = 1 when u2i ≠ 0, and x2i = 0 when u2i = 0, (5)

for i = 1, 2, 3 and C2 = [1 2 4]. The relationship between x2 and y2 along with the user alert is summarized in Table 2.

TABLE 2. User alert of the obstacle’s position via open-loop control.

C̃ = [C1 0_{1×2}; 0_{3×2} 0_{3×2}], D̃ = [0 0 0 0; 0 D2 0 0; 0 0 D3 0; 0 0 0 D4]. (6)
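Because C2 = [1 2 4] assigns each ultrasonic channel a distinct power of two, the output y2 = C2 x2 obtained from the block structure in (3) together with (5) is an integer between 0 and 7 that identifies exactly which of the three sensors currently detect an obstacle, which is the mapping summarized in Table 2. A minimal sketch of this encoding (variable and function names are illustrative, not from the paper):

import numpy as np

C2 = np.array([1, 2, 4])  # weights from the paper: one power of two per ultrasonic channel

def encode_alert(u2) -> int:
    """Implements (5): x2i = 1 when the i-th channel u2i is nonzero, 0 otherwise,
    then returns y2 = C2 @ x2, a code between 0 (no obstacle) and 7 (all three)."""
    x2 = (np.asarray(u2, dtype=float) != 0).astype(int)
    return int(C2 @ x2)

# Example: obstacle seen by the first and third sensors only -> y2 = 1 + 4 = 5.
print(encode_alert([0.8, 0.0, 1.2]))  # 5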
solving (11) [30]. The perfect reference tracking can then be achieved by including K1 ∈ R+ where

TABLE 3. Accuracy (%) of the estimated model based on three different movements. M1, M2 and M3 denote Movements 1, 2 and 3, respectively.
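Equation (11) and the expression for K1 are not reproduced in this excerpt; assuming (11) is the standard LQR algebraic Riccati equation for the actuator subsystem (as in [30]), the state-feedback gain can be computed numerically as sketched below. The model matrices and weights here are placeholders, not the paper’s identified values:

import numpy as np
from scipy.linalg import solve_continuous_are

# Placeholder actuator model and LQR weights (illustrative values only).
A = np.array([[0.0, 1.0],
              [0.0, -10.0]])   # assumed second-order servo dynamics
B = np.array([[0.0],
              [20.0]])
Q = np.diag([10.0, 1.0])       # state weighting
R = np.array([[0.1]])          # control-effort weighting

# Solve the continuous-time algebraic Riccati equation and form K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print(K)  # feedback gain for u = -K x; a scalar reference gain K1 scales the command

A common way to obtain the scalar K1 (again an assumption here, following standard practice in [30]) is to choose it so that the closed-loop DC gain from the reference to the tracked output equals one.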
ub2 = −Φ2(β2), ub3 = −Φ3(β3) and ub4 = −Φ4(β2), (14)

−Φ2(β2) =
0 if β2i ≤ ε ∀i = 1, 2, 3,
7 if −β2i ≤ −ε ∀i = 1, 2, 3,
6 if (β21, −β22, −β23) ≤ (ε, −ε, −ε),
5 if (−β21, β22, −β23) ≤ (−ε, ε, −ε),
4 if (β21, β22, −β23) ≤ (ε, ε, −ε),
3 if (−β21, −β22, β23) ≤ (−ε, −ε, ε),
2 if (β21, −β22, β23) ≤ (ε, −ε, ε),
1 if (−β21, β22, β23) ≤ (−ε, ε, ε), (15)

−Φ3(β3) = 1 if β3 > 0, and 0 otherwise, (16)

−Φ4(β2) = 1 if 50 − εl < β2i < 80 − εu for any i, and 0 otherwise, (17)

where ε ∈ [25, 40] and εl, εu ∈ [10, 30], the outputs y2, y3 and y4 can be controlled to meet the design requirements as described in (A1)-(A4).
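Read as decision logic, (15)-(17) threshold the components of β2 and β3: under the reconstruction above, a channel with β2i ≥ ε counts as a detection and contributes its binary weight (matching C2 = [1 2 4]), which reproduces the eight cases of (15). The sketch below transcribes this reading; the specific threshold values are merely picked from the stated ranges and the function names are illustrative:

import numpy as np

# Thresholds chosen from the stated ranges: eps in [25, 40], eps_l, eps_u in [10, 30].
EPS, EPS_L, EPS_U = 30.0, 20.0, 20.0

def u_b2(beta2) -> int:
    """-Phi2(beta2) in (15): every channel with beta2i >= EPS contributes its
    binary weight (1, 2 or 4, matching C2 = [1 2 4]), reproducing cases 0..7."""
    detected = (np.asarray(beta2, dtype=float) >= EPS).astype(int)
    return int(np.dot([1, 2, 4], detected))

def u_b3(beta3) -> int:
    """-Phi3(beta3) in (16): drop-off warning flag."""
    return 1 if beta3 > 0 else 0

def u_b4(beta2) -> int:
    """-Phi4(beta2) in (17): flag raised when any channel falls inside the
    band (50 - eps_l, 80 - eps_u)."""
    return 1 if any(50 - EPS_L < b < 80 - EPS_U for b in np.asarray(beta2, dtype=float)) else 0

# Example: only the second and third channels exceed the threshold -> code 6.
print(u_b2([10.0, 42.0, 55.0]), u_b3(0.3), u_b4([10.0, 42.0, 55.0]))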
Proof: From the parameters of D̃ and algorithms for ub2, ub3 and ub4, the requirements (A1) and (A2) can be clearly met. In order to satisfy (A3) and (A4), F2 and F3 are designed such that the output maintains its stability (via eig(Afi) ∈ R− and Fi(0) = 1), and the false alerts due to the

FIGURE 7. Comparison of the output y1 in open-loop via simulation and experiment.
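The two conditions quoted in the proof, eig(Afi) ∈ R− and Fi(0) = 1, are satisfied by any stable filter with unity DC gain; a first-order low-pass compensator Fi(s) = a/(s + a), a > 0, is one such choice. The sketch below verifies both conditions for an illustrative cut-off value (the value of a is an assumption, not taken from the paper):

import numpy as np

a = 8.0  # illustrative filter cut-off in rad/s (assumed, not the paper's value)

# State-space form of F(s) = a / (s + a): Af = [-a], Bf = [a], Cf = [1], Df = [0].
Af, Bf, Cf, Df = np.array([[-a]]), np.array([[a]]), np.array([[1.0]]), np.array([[0.0]])

# eig(Af) = -a < 0 (stable) and F(0) = Cf (-Af)^-1 Bf + Df = 1 (unity DC gain),
# which are exactly the two conditions cited in the proof.
dc_gain = (Cf @ np.linalg.inv(-Af) @ Bf + Df).item()
print(np.linalg.eigvals(Af), dc_gain)  # [-8.] 1.0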
FIGURE 8. Output y1 from the experiment with corresponding duty cycles for Movement 1. Both methods show a large overshoot due to unstable movement of the user at the beginning (at t ≤ 3 s). Slightly larger overshoots are seen from the response of FKn at t ≈ 7.5 s, t ≈ 15.5 s and t ≈ 18 s.

FIGURE 10. Output y1 from the experiment with corresponding duty cycles for Movement 3. Slightly larger overshoots are seen from the response of FKn at t ∈ (0, 1) s, t ∈ (7, 8) s and t ∈ (13, 14) s.
FIGURE 12. Side views for experiments with obstacle on the ground (top)
and drop-off (bottom).
FIGURE 13. The responses of x2, β2 and y2 for Experiments 1, 2 and 3 are represented by the left, middle and right subfigures, respectively.
FIGURE 14. The responses of x2, β2 and y2 for Experiments 4, 5 and 6 are represented by the left, middle and right subfigures, respectively.
y1 and r1 replaced by y2 and y2r. For y3 and y4, the TEs were evaluated slightly differently than for y2 to accommodate the nature of ub3 and ub4 and the suitability of the alerts to the user. To this end, let Ne be the number of false readings,

FIGURE 15. The responses of y3 and y4 for performance evaluations of requirement (A2) (via Experiments 1, 2, 3 and 5) and drop-off detection (via Experiments 4 and 6).
[8] S. V. F. Barreto, R. E. Sant’Anna, and M. A. F. Feitosa, "A method for image processing and distance measuring based on laser distance triangulation," in Proc. IEEE 20th Int. Conf. Electron., Circuits, Syst. (ICECS), Dec. 2013, pp. 695–698.
[9] R. Henderson and K. Schulmeister, Laser Safety. New York, NY, USA: Taylor & Francis, 2004.
[10] P. Chanana, R. Paul, M. Balakrishnan, and P. Rao, "Assistive technology solutions for aiding travel of pedestrians with visual impairment," J. Rehabil. Assistive Technol. Eng., vol. 4, pp. 1–16, Aug. 2017.
[11] E. M. Ball, Electronic Travel Aids: An Assessment. London, U.K.: Springer, 2008, pp. 289–321.
[12] J. Liu, J. Liu, L. Xu, and W. Jin, "Electronic travel aids for the blind based on sensory substitution," in Proc. 5th Int. Conf. Comput. Sci. Edu. (ICCSE), Aug. 2010, pp. 1328–1331.
[13] S. Kumpakeaw, "Twin low-cost infrared range finders for detecting obstacles using in mobile platforms," in Proc. IEEE Int. Conf. Robot. Biomimetics (ROBIO), Dec. 2012, pp. 1996–1999.
[14] G.-Y. Jeong and K.-H. Yu, "Multi-section sensing and vibrotactile perception for walking guide of visually impaired person," Sensors, vol. 16, no. 7, p. 1070, 2016.
[15] F. Prattico, C. Cera, and F. Petroni, "A new hybrid infrared-ultrasonic electronic travel aids for blind people," Sens. Actuators A, Phys., vol. 201, pp. 363–370, Oct. 2013.
[16] D. N. Hung, V. Minh-Thanh, N. Minh-Triet, Q. L. Huy, and V. T. Cuong, "Design and implementation of smart cane for visually impaired people," in Proc. 6th Int. Conf. Develop. Biomed. Eng. Vietnam (BME6), T. V. Van, T. A. N. Le, and T. N. Duc, Eds. Singapore: Springer, 2018, pp. 249–254.
[17] S. Chaurasia and K. V. N. Kavitha, "An electronic walking stick for blinds," in Proc. Int. Conf. Inf. Commun. Embedded Syst. (ICICES), Feb. 2014, pp. 1–5.
[18] R. K. Megalingam, A. Nambissan, A. Thambi, A. Gopinath, and M. Nandakumar, "Sound and touch based smart cane: Better walking experience for visually challenged," in Proc. IEEE Canada Int. Humanitarian Technol. Conf. (IHTC), Jun. 2014, pp. 1–4.
[19] D. Kim, K. Kim, and S. Lee, "Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind," Sensors, vol. 14, no. 6, pp. 10412–10431, 2014.
[20] D. R. Chebat, S. Maidenbaum, and A. Amedi, "Navigation using sensory substitution in real and virtual mazes," PLoS ONE, vol. 10, no. 6, pp. 1–18, Jun. 2015.
[21] S. Shoval, J. Borenstein, and Y. Koren, "Mobile robot obstacle avoidance in a computerized travel aid for the blind," in Proc. IEEE Int. Conf. Robot. Autom., vol. 3, May 1994, pp. 2023–2028.
[22] I. Ulrich and J. Borenstein, "The GuideCane-applying mobile robot technologies to assist the visually impaired," IEEE Trans. Syst., Man, Cybern. A, Syst., Humans, vol. 31, no. 2, pp. 131–136, Mar. 2001.
[23] A. Rodríguez, J. J. Yebes, P. F. Alcantarilla, L. M. Bergasa, J. Almazán, and A. Cela, "Assisting the visually impaired: Obstacle detection and warning system by acoustic feedback," Sensors, vol. 12, no. 12, pp. 17476–17496, 2012.
[24] A. S. Martinez-Sala, F. Losilla, J. C. Sánchez-Aarnoutse, and J. Garcia-Haro, "Design, implementation and evaluation of an indoor navigation system for visually impaired people," Sensors, vol. 15, no. 12, pp. 32168–32187, 2015.
[25] D. Nakamura, H. Takizawa, M. Aoyagi, N. Ezaki, and S. Mizuno, "Smartphone-based escalator recognition for the visually impaired," Sensors, vol. 17, no. 5, p. 1057, 2017.
[26] B.-S. Lin, C.-C. Lee, and P.-Y. Chiang, "Simple smartphone-based guiding system for visually impaired people," Sensors, vol. 17, no. 6, p. 1371, 2017.
[27] U. R. Roentgen, G. J. Gelderblom, M. Soede, and L. P. de Witte, "Inventory of electronic mobility aids for persons with visual impairments: A literature review," J. Vis. Impairment Blindness, vol. 102, no. 11, pp. 702–724, 2008.
[28] S. Y. Kim and K. Cho, "Usability and design guidelines of smart canes for users with visual impairments," Int. J. Des., vol. 7, no. 1, pp. 99–110, 2013.
[29] A. G. Abdel-Wahab and A. A. A. El-Masry, Mobile Information Communication Technologies Adoption in Developing Countries: Effects and Implications. Hershey, PA, USA: IGI Global, 2011.
[30] K. Ogata, Modern Control Engineering, 4th ed. Upper Saddle River, NJ, USA: Prentice-Hall, 2001.

NUR SYAZREEN AHMAD (M’14) was born in Kuala Lumpur, Malaysia. She received the B.Eng. degree (Hons.) in electrical and electronic engineering and the Ph.D. degree in control systems from The University of Manchester, U.K., in 2009 and 2012, respectively. She became a member of the IEEE Control Systems Society in 2014.
Since 2013, she has been with the School of Electrical and Electronic Engineering, Universiti Sains Malaysia. Her research work centers around motion control, robust stability and performance analysis of constrained and nonlinear systems, and optimization-based controller synthesis with linear matrix inequality searches. Her current research interests include phase-locked loops, embedded control systems, and robust control analysis and design of autonomous mobile systems.

NG LAI BOON was born in Butterworth, Malaysia, in 1992. He received the B.Eng. degree (Hons.) in electronic engineering from Universiti Sains Malaysia in 2017. He is currently with Intel, Penang, Malaysia, and pursuing a part-time M.Sc. degree in electrical engineering (computer and microelectronic systems) with Universiti Teknologi Malaysia.

PATRICK GOH received the B.S., M.S., and Ph.D. degrees in electrical engineering from the University of Illinois at Urbana-Champaign, Champaign, IL, USA, in 2007, 2009, and 2012, respectively.
Since 2012, he has been with the School of Electrical and Electronic Engineering, Universiti Sains Malaysia, where he currently specializes in the study of signal integrity for high-speed digital designs. His research interest includes the development of circuit simulation algorithms for computer-aided design tools. He was a recipient of the Raj Mittra Award in 2012 and the Harold L. Olesen Award in 2010, and has served on the technical program committee and international program committee in various IEEE and non-IEEE conferences around the world.