
2019 Fifth International Conference on Image Information Processing (ICIIP)

Human Gait Analysis Using OpenPose


Aditya Viswakumar, Venkateswaran Rajagopalan, Tathagata Ray, Chandu Parimi
BITS Pilani, Hyderabad, India

Abstract—Gait analysis refers to the scientific study of the body movements that are responsible for locomotion in human beings. Gait parameters are known to be reliable indicators of neuromuscular and skeletal health. Gait analysis is used in the design of therapeutic paradigms for stroke and spinal cord injury rehabilitation. In this study, we introduce a marker-less, cost-effective, and user-friendly approach to human gait analysis. We use a simple mobile phone camera and a 2D pose estimation system to obtain important anatomical landmarks. From these landmarks, we calculate the knee flexion/extension angle. Further, we analyze the effects of ambient lighting and the subject's attire on the measured knee angle. We also test the efficacy of our approach by comparing the measured knee angles with a normative gait database. Finally, we compare our results with the knee angle readings obtained using MS Kinect (our previous study).

Keywords—Gait Analysis, OpenPose, Kinect, Neural Networks.

I. INTRODUCTION

The gait cycle comprises a series of body movements leading to locomotion in human beings. A gait cycle is the duration between successive heel strikes of the same foot. The scientific study of the gait cycle is referred to as gait analysis. One of the earliest and most exhaustive works on human motion was conducted by W. E. Weber [1]. He was successful in marking the position of the limbs for distinct phases of the gait cycle. The invention of photographic cameras in the early 19th century meant that finer details of human motion could now be recorded and studied. E. Muybridge pioneered the chronophotographic study of human motion analysis [2]. He invented the zoopraxiscope, a device that could play back the captured phases of a gait cycle. The subsequent breakthrough in gait analysis came with the advent of powerful digital computers and image processing techniques. The first video processing system for human gait analysis was developed by R. B. Davis [3]. He used passive reflective markers and image processing algorithms to map human joint motion. A major limitation of reflective markers is that the accuracy of their detection depends on the ambient lighting conditions [4]. Also, markers placed on the body are susceptible to choppy movements resulting from the sliding of skin over the bones [5]. This introduces noise in the measured gait parameters.

With the advancements in integrated circuits and microcontrollers, it became possible to manufacture wearable devices capable of measuring gait metrics [6]. Wearable devices are intrusive and can interfere with the habitual walking of the subject. This is especially true if the devices are heavy, bulky, and connected by wires to a power source. Non-invasive techniques for the measurement of gait parameters are also available. They make use of floor sensors, inertial sensors, goniometers, accelerometers, and thermal images [7]. These systems are expensive, require extensive calibration, and need a controlled environment.

Gait analysis using Microsoft Kinect is a viable non-invasive and cost-effective measurement technique [8]. It makes use of coded infrared grids to develop a 3D depth image of real-world objects. Studies on Kinect have shown that the accuracy of the depth image decreases beyond just 4 cm [9]. Moreover, under bright ambient light, Kinect fails to produce a depth image [9-10]. Also, in our previous study [12], we had shown that knee joint angles measured using Kinect had appreciable accuracy only when the knees were visible, i.e., not covered by clothing. As a result, knee angle measurement using Kinect is difficult for commonly worn Indian attires like the dhoti or saree.

Recent developments in the field of computer vision have attracted much attention from computer scientists owing to their potential to provide elegant solutions to complex image processing problems. Computer vision aims to develop efficient systems that can assist in the understanding of digital images and videos. These techniques are mostly based on Artificial Neural Networks (ANNs). ANNs are computing models inspired by the functioning of the human brain. Neural networks consist of nodes interconnected by weighted paths. There exist several types of neural networks, depending on their functionality. Convolutional Neural Networks (CNNs) are explicitly designed for the analysis and study of digital images [13]. They are employed for a wide variety of applications like image classification, facial recognition, edge detection, scene labeling, semantic segmentation, and human pose estimation [14].

In this work, we suggest an alternative to MS Kinect for gait analysis. We utilize OpenPose, a marker-less 2D human pose estimation system based on CNNs, to measure knee flexion/extension angles. We test the efficacy of our approach by comparing the obtained knee angles with an open gait database [15]. We also show its superiority over Kinect for knee-covering attires worn by the subject.

II. OBTAINING ANATOMICAL LANDMARKS USING OPENPOSE

Human pose estimation is the process of inferring human poses from a digital image. Pose estimation requires highly accurate detection and identification of human joints. Pose estimation algorithms follow a top-down or a bottom-up approach.

In the top-down approach [16], the first step is to find possible regions of interest in an image. This is followed by the extraction of joints from these regions. Poses are then inferred from the extracted joints. This approach can be computationally very expensive, as each region is processed independently of the others.
The bottom-up approach [17] begins with the processing
of the entire image to obtain possible joint locations. These
joints are then connected to generate a pose model. In contrast
to the top-down approach, the bottom-up approach is
computationally less expensive. However, it can produce faulty
pose models due to the erroneous connection of joints.
OpenPose is a bottom-up multi-person pose detection system [18]. It can detect a total of 135 vital body points (no fiducial markers needed) from a digital image. A single CNN is used for both key-point detection and association. Each key-point is detected with a score (a numerical value between 0 and 1) that is a measure of the overall confidence in the estimated key-point. Key-point association is done using part affinity fields (PAFs) [19]. These are two-dimensional vector fields that encode the position and orientation of the human limbs.

OpenPose has been trained to produce three distinct pose models, which differ from one another in the number of estimated key-points: (a) MPI, the most basic model, estimates a total of 15 key-points; (b) the COCO model is a collection of 18 points; (c) the BODY_25 pose consists of 25 points. It can be seen from Fig. 1 that BODY_25 is the most exhaustive pose model. In addition to the key-points estimated by the MPI and COCO models, it contains descriptors for the feet and the pelvic center. We use the BODY_25 model in our analysis.
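As a concrete illustration of how BODY_25 key-points can be consumed, the sketch below reads the per-frame JSON files that OpenPose can optionally write (via its --write_json option) and extracts the hip, knee, and ankle together with their confidence scores. The field name pose_keypoints_2d and the BODY_25 index layout (9, 10, and 11 for the right hip, knee, and ankle) follow the OpenPose repository documentation [18]; this is our own illustrative Python, not the authors' code, and exact field names may vary between OpenPose versions.

    import json
    import numpy as np

    # BODY_25 indices of the right-side landmarks (per the OpenPose docs).
    # The paper does not state which side was analyzed; the right side is an
    # illustrative assumption.
    R_HIP, R_KNEE, R_ANKLE = 9, 10, 11

    def load_landmarks(json_path, min_confidence=0.3):
        """Return a (3, 2) array of (x, y) pixel coordinates for the right hip,
        knee, and ankle of the first detected person, or None if any of the
        three key-points falls below the confidence threshold."""
        with open(json_path) as f:
            frame = json.load(f)
        if not frame["people"]:
            return None
        # Flat list [x0, y0, c0, x1, y1, c1, ...] covering the 25 key-points.
        keypoints = np.array(frame["people"][0]["pose_keypoints_2d"]).reshape(-1, 3)
        hip_knee_ankle = keypoints[[R_HIP, R_KNEE, R_ANKLE]]
        if (hip_knee_ankle[:, 2] < min_confidence).any():
            return None
        return hip_knee_ankle[:, :2]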

Fig. 2. BODY_25 pose estimation results for randomly captured images.

III. EXPERIMENTAL SETUP


The gait cycles were captured using a mobile phone mounted on top of a tripod stand. The phone captures 800 × 600 pixel resolution images at an average rate of 30 frames per second. A total of 10 healthy volunteers participated in this experiment. The average weight and height of the participants were 70.3 kg and 170.5 cm, with standard deviations of 12.3 kg and 8.39 cm, respectively. All the participants were dressed in shirts and pants. To study the effect of ambient lighting on knee angle measurement, the subjects were asked to walk parallel to the camera under the following lighting conditions: (1) dim light at an average illuminance of 40 lux and (2) bright light at an average illuminance of 950 lux. Further, to analyze the effects of the knee being covered by clothing, the subjects were made to walk wearing a dhoti (a single-piece, loose-fitting traditional Indian attire that covers the lower part of the body) under normal lighting (200-300 lux).

Videos of the walking subjects were downloaded onto a computer through a wireless network. For each frame of the video, BODY_25 pose data was generated using OpenPose. From the pose data, the knee joint angle was calculated.
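For reference, one way this step can be reproduced is sketched below: the OpenPose demo binary is run on a walking video with the BODY_25 model and asked to write one JSON file per frame, and the load_landmarks helper sketched in Section II then collects the hip, knee, and ankle coordinates for every usable frame. The binary path, file names, and flag values are assumptions based on the public OpenPose demo [18], not details reported in the paper.

    import glob
    import subprocess

    # Paths and flags are illustrative; check them against the installed OpenPose build.
    subprocess.run([
        "./build/examples/openpose/openpose.bin",
        "--video", "subject01_walk.mp4",   # hypothetical input clip
        "--model_pose", "BODY_25",
        "--write_json", "pose_json/",      # one JSON file per frame
        "--display", "0",
        "--render_pose", "0",
    ], check=True)

    # Keep only frames in which the hip, knee, and ankle were all detected confidently.
    frames = [load_landmarks(path) for path in sorted(glob.glob("pose_json/*.json"))]
    frames = [f for f in frames if f is not None]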
Fig. 1. Illustrations of the pose models generated by OpenPose: (a) the input image, (b) output pose estimated using the MPI model, (c) output pose estimated using the COCO model, (d) output pose estimated using the BODY_25 model.

IV. CALCULATION OF KNEE ANGLE

The knee angle was calculated using the vector dot product. From the hip, knee, and ankle coordinates obtained from the pose data, two vectors were constructed. The first vector begins at the hip and ends at the knee, while the second one begins at the knee and ends at the ankle.


The knee angle (θ) for the frame shown in Fig. 3 is given by

    θ = cos⁻¹( (v1 · v2) / (|v1| |v2|) )    (1)

where v1 is the vector from the hip to the knee and v2 is the vector from the knee to the ankle.
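A direct implementation of Eq. (1) is sketched below. The function and variable names are our own choices rather than the paper's code, and the clip call guards against floating-point round-off pushing the cosine slightly outside [-1, 1].

    import numpy as np

    def knee_angle(hip, knee, ankle):
        """Knee angle (degrees) from 2D joint coordinates via Eq. (1):
        the angle between the hip-to-knee and knee-to-ankle vectors."""
        v1 = np.asarray(knee, dtype=float) - np.asarray(hip, dtype=float)
        v2 = np.asarray(ankle, dtype=float) - np.asarray(knee, dtype=float)
        cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    # Example: per-frame knee angles for the frames retained earlier.
    # angles = [knee_angle(hip, knee, ankle) for hip, knee, ankle in frames]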
Fig. 3. Knee angle measurement from a video frame.

V. RESULTS

For each of the clothing and lighting conditions, three trials were conducted and the average of the knee angles was calculated.

A. Effects of Ambient Lighting

The percentage errors in the measured knee angles for dim and bright ambient lighting conditions were 16.78% and 13.43%, respectively.

TABLE I. ERROR DUE TO VARIATION IN AMBIENT LIGHTING

Lighting Condition          Dim Lighting    Bright Lighting
Error (%)                   16.78           13.43
Mean Absolute Deviation     5.85            4.24
Standard Deviation          7.43            5.67

Fig. 4. Average (of 10 subjects) knee flexion/extension angle measured under dim ambient lighting, plotted against the normative curve (knee angle in degrees vs. % gait).

Fig. 5. Average (of 10 subjects) knee flexion/extension angle measured under bright ambient lighting, plotted against the normative curve (knee angle in degrees vs. % gait).
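The paper does not spell out how the percentage error, mean absolute deviation, and standard deviation reported in Tables I and II were computed against the normative database [15]. The sketch below shows one plausible way to do it after resampling each gait cycle onto a common 0-100% gait axis; the resampling step and the specific metric definitions are our assumptions, not the authors' published formulas.

    import numpy as np

    def resample_to_percent_gait(angles, n_points=51):
        """Resample one gait cycle's knee-angle sequence onto an evenly spaced
        0-100% gait axis so cycles of different durations can be averaged."""
        src = np.linspace(0.0, 100.0, num=len(angles))
        dst = np.linspace(0.0, 100.0, num=n_points)
        return np.interp(dst, src, np.asarray(angles, dtype=float))

    def compare_to_normative(measured, normative):
        """Percentage error, mean absolute deviation, and standard deviation of a
        measured knee-angle curve relative to a normative curve sampled at the
        same % gait points (assumed definitions)."""
        measured = np.asarray(measured, dtype=float)
        normative = np.asarray(normative, dtype=float)
        diff = measured - normative
        pct_error = 100.0 * np.mean(np.abs(diff)) / np.mean(np.abs(normative))
        mean_abs_dev = np.mean(np.abs(diff))
        std_dev = np.std(diff)
        return pct_error, mean_abs_dev, std_dev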
B. Effects of Attire

The error in the measured knee angle (under normal lighting) with the participants wearing a dhoti was 18.29%, and that for pants was 17.8%.

TABLE II. ERROR DUE TO ATTIRE

Attire                      Dhoti    Pants
Error (%)                   18.29    17.8
Mean Absolute Deviation     6.40     6.20
Standard Deviation          8.22     7.80

Fig. 6. Average (of 10 subjects) knee flexion/extension angle measured with the subjects wearing a dhoti, plotted against the normative curve (knee angle in degrees vs. % gait).

VI. DISCUSSION

We observe that there is no appreciable difference (16.78% error under dim lighting and 13.43% error under bright lighting) in the measured knee angle with changes in lighting conditions (an illuminance variation of approximately 910 lux). We further test the tolerance of OpenPose to extreme ambient lighting by altering the image brightness (Fig. 7a, Fig. 7b, Fig. 7c).


These images are then processed to extract BODY_25 pose data. It is observed that pose and key-point estimation is possible even for extremely whitewashed or darkened images (Fig. 7d, Fig. 7e).
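The brightness manipulation used to produce the test images of Fig. 7 is not detailed in the paper; one simple way to generate comparable whitewashed and darkened frames is a linear gain/offset adjustment with OpenCV, as sketched below. The gain and offset values and the file names are illustrative assumptions.

    import cv2

    frame = cv2.imread("original_frame.png")

    # Linear brightness/contrast adjustment: output = alpha * input + beta, clipped to 0-255.
    whitewashed = cv2.convertScaleAbs(frame, alpha=1.0, beta=120)  # push pixels toward white
    darkened = cv2.convertScaleAbs(frame, alpha=0.3, beta=0)       # suppress overall brightness

    cv2.imwrite("whitewashed_frame.png", whitewashed)
    cv2.imwrite("darkened_frame.png", darkened)
    # Both altered frames are then passed through OpenPose exactly as before.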
TABLE III. COMPARISON WITH MS KINECT (KNEE ANGLE MEASUREMENT FOR DHOTI)

                            Kinect               OpenPose
Error (%)                   Could not measure    18.29
Mean Absolute Deviation     Could not measure    6.40
Standard Deviation          Could not measure    8.22

In our previous work [12], we observed that MS Kinect could not detect knee angles for dhotis and other attires that cover the knee. In contrast to Kinect, OpenPose was able to detect knee coordinates with considerable accuracy even when the knees were not visible. A possible reason for this superior performance is the CNN employed by OpenPose, which has been designed to estimate key anatomical landmarks/coordinates from images taken under a wide range of conditions. One reason for the failure of Kinect could be its depth imaging technique, which can only map objects directly in front of it.

TABLE IV. COMPARISON WITH MS KINECT (KNEE ANGLE MEASUREMENT FOR PANTS)

                            Kinect    OpenPose
Error (%)                   27.28     17.8
Mean Absolute Deviation     6.40      6.20
Standard Deviation          8.22      7.80

Fig. 7. Body pose estimation for extreme lighting conditions: (a) original image, (b) whitewashed image, (c) darkened image, (d) BODY_25 pose estimation for the whitewashed image, (e) BODY_25 pose estimation for the darkened image.

VII. CONCLUSION

We have introduced a user-friendly, cost-effective, and marker-less approach to human gait analysis using OpenPose. We were able to measure knee flexion/extension angles using a simple phone camera and a personal computer. Our approach is tolerant to large variations in ambient lighting, and our results concur well with the normative gait database. We also demonstrated that OpenPose-based knee angle measurement is superior to Kinect when the subject is wearing day-to-day Indian attires like the dhoti (this also applies to the saree, a common dress worn by women).

REFERENCES

[1] R. Baker, "The history of gait analysis before the advent of modern computers," Gait and Posture, vol. 26, no. 3, pp. 331-342, Sep. 2007.
[2] R. Baker, "The history of gait analysis before the advent of modern computers," Gait Posture, vol. 26, no. 3, pp. 331-342, Sep. 2007.
[3] R. B. Davis, S. Õunpuu, D. Tyburski, and J. R. Gage, "A gait analysis data collection and reduction technique," Hum. Mov. Sci., vol. 10, no. 5, pp. 575-587, 1991.
[4] H. M. Clayton, "Instrumentation and techniques in locomotion and lameness," Vet. Clin. North Am. Equine Pract., vol. 12, no. 2, pp. 337-350, Aug. 1996.
[5] C. Reinschmidt, A. J. Van Den Bogert, B. M. Nigg, A. Lundberg, and N. Murphy, "Effect of skin movement on the analysis of skeletal knee joint motion during running," J. Biomech., vol. 30, no. 7, pp. 729-732, Jul. 1997.
[6] W. Tao et al., "Gait analysis using wearable sensors," Sensors, vol. 12, no. 2, pp. 2255-2283, Feb. 2012.
[7] A. Muro-de-la-Herran, B. Garcia-Zapirain, and A. Mendez-Zorrilla, "Gait analysis methods: An overview of wearable and non-wearable systems, highlighting clinical applications," Sensors, vol. 14, no. 2, pp. 3362-3394, Feb. 2014.
[8] M. Gabel, R. Gilad-Bachrach, E. Renshaw, and A. Schuster, "Full body gait analysis with Kinect," in 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2012, pp. 1964-1967.
[9] J. Han, L. Shao, D. Xu, and J. Shotton, "Enhanced computer vision with Microsoft Kinect sensor: A review," IEEE Trans. Cybern., vol. 43, no. 5, pp. 1318-1334, Oct. 2013.
[10] M. A. Livingston, J. Sebastian, Z. Ai, and J. W. Decker, "Performance measurements for the Microsoft Kinect skeleton," 2012, pp. 119-120.
[11] R. A. El-Laithy, J. Huang, and M. Yeh, "Study on the use of Microsoft Kinect for robotics applications," in Record - IEEE PLANS, Position Location and Navigation Symposium, 2012, pp. 1280-1288.
[12] T. Ray, C. Parimi, V. Rajagopalan, S. V. Sai, V. B. Athreya, and P. Shrivastava, "MS Kinect a potential gait analysis system: Clothing effect on the gait kinematics," in International Society of Biomechanics, 2015, vol. 79, no. 21, p. 2147.
[13] W. Rawat and Z. Wang, "Deep convolutional neural networks for image classification: A comprehensive review," Neural Computation, vol. 29, no. 9, pp. 2352-2449, Sep. 2017.
[14] M. Egmont-Petersen, D. de Ridder, and H. Handels, "Image processing with neural networks—a review," Pattern Recognit., vol. 35, no. 10, pp. 2279-2301, Oct. 2002.
[15] C. Kirtley, "CGA Normative Gait Database," Clinical Gait Analysis, 2006. [Online]. Available: http://www.clinicalgaitanalysis.com/data/.
[16] G. Ning, P. Liu, X. Fan, and C. Zhang, "A Top-down Approach to Articulated Human Pose Estimation and Tracking," 2018.
[17] M. Li, Z. Zhou, J. Li, and X. Liu, "Bottom-up Pose Estimation of Multiple Person with Bounding Box Constraint," in Proceedings - International Conference on Pattern Recognition, 2018, pp. 115-120.
[18] "OpenPose GitHub repository," 2018. [Online]. Available: https://github.com/CMU-Perceptual-Computing-Lab/openpose.
[19] Z. Cao, T. Simon, S. E. Wei, and Y. Sheikh, "Realtime multi-person 2D pose estimation using part affinity fields," in Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017, 2017, pp. 1302-1310.
