Research Article

Combining low sampling frequency smartphone sensors and video for a Wearable Mobility Monitoring System

[version 1; peer review: 1 approved, 1 approved with reservations]
PUBLISHED 25 Jul 2014

Abstract

A proof-of-concept Wearable Mobility Monitoring System (WMMS) was developed to identify daily activities and provide environmental context, using integrated BlackBerry Smartphone low sampling frequency sensor and video data. Integrated accelerometer data were used to identify mobility changes-of-state (CoS) in real-time, trigger BlackBerry video capture at each CoS, and save activity outcomes on the Smartphone. System evaluation involved collecting WMMS output and (separate) camcorder video under realistic conditions for five able-bodied subjects. Each subject performed a consecutive series of mobility tasks, including walking, sitting, lying, stairs, ramps, an elevator, bathroom activities, kitchen activities, dining activities, and outdoor walking. Activity, timing, and contextual information were obtained from the camcorder for comparison. Sensitivity results for sensor-based CoS identification were 97–100% for standing, sitting, lying, and taking an elevator; 67–73% for walking-related CoS (stairs, ramps); 40–93% between walking and small movements (brushing teeth, etc.); and below 27% for daily living activities. False positives occurred in less than 12% of all activities, with less than 5% false positives for half the measures. Better classification results were achieved when using both acceleration features and Smartphone-integrated video for all activities except sitting. The evaluation demonstrated that the WMMS algorithm and BlackBerry platform were effective for detecting mobility activities, even with low sampling rate sensors. The combined sensor and video analysis enhanced mobility task identification and contextual information.

Keywords

Wearable, activity monitoring, mobility, change of state, smartphone, accelerometer, video

Introduction

Understanding how people move within their daily lives is important for healthcare decision-making. Typically, a person’s mobility status is self-reported in the clinic, thereby introducing error and increasing the potential for biased information. Functional scales can be administered to help gain an understanding of a person’s mobility status [1–4]. However, these tests do not measure how a person moves when leaving the clinic. A quantitative method of characterizing mobility in the community would provide valid and useful information for healthcare providers and researchers.

Wearable systems have been developed to evaluate mobility in any environment or location. These portable systems are worn on the body and collect mobility information within the home and community. Examples include accelerometer-based systems [5–8], portable electromyographs (EMG) [9], and foot pressure devices [10]. Some wearable systems can be used for long-term, low-effort monitoring.

Smartphones provide an ideal interface for mobility assessment in the community since they are small, light, easily worn, and easy to use for most consumers. These phones are multitasking computing platforms that can incorporate accelerometers, GPS, video cameras, light sensors, temperature sensors, gyroscopes, and magnetometers [11,12].

As shown in Table 1, cell phones have been used to recognize multiple activities by analyzing the phone’s accelerometer or external sensor data. However, most systems only used the phone as a wireless data transfer device, not as a wearable computing platform. These systems used an external computer for analysis of the data.

Table 1. Studies that use a tri-axial accelerometer for mobility monitoring.

| Author | Location¹ | Phone based | Features | Freq² (Hz) | Accuracy | Activities | Method |
| --- | --- | --- | --- | --- | --- | --- | --- |
| He [7] | Pocket | N | Discrete cosine transform, principal component analysis | 100 | 97% | Still, walk, run, jump | Support vector machine |
| Wang [6] | Waist | N | Linear sum, duration, mean, median, STD, zero-crossing count, correlation, spectrum amplitude: mean ratio, empirical mode decomposition | 50 | 91% | Walk up and down slope | Gaussian mixture model |
| Hache [5] | Waist | N | STD-Y, magnitude area, skewness Y, inclination angle | 50 | 88% | Stand, sit, lie, walk, stairs, ramp, elevator | Hierarchical decision tree |
| Khan [8] | Chest | N | Autoregressive coefficients, signal magnitude area, tilt angle to 3D feature plot | 20 | 98% | Sit, stand, lie, walk, stairs, run | Artificial neural net, hierarchical LDA features |
| Kwapisz [13] | Front pant pocket | Y | Average, STD, average absolute difference, average resultant acceleration, signal magnitude area, time between peaks, and binned distribution | 20 | 50% stairs, others 91–95% | Walk, jog, walk on stairs, sit, and stand | Multilayer perceptron |
| Shumei [14] | Waist | Y | Change, max, min, and velocity | 1 | 81.9% for motion, 83.7% for motionless | Walk, posture transition, gentle motion, stand, sit, lie | Hierarchical classification with multiclass SVM |
| Ganti [15] | Waist | Y | Energy expenditure, body angle, 3D acceleration skewness, acceleration entropy | 7 | Eat, meeting (14%); drive, watch TV, aerobic (82%); desk work (50%); hygiene, cook (99%) | Sit, stand, aerobic, brush teeth, work, lie | Hidden Markov model |
| Wu [19] | Waist | Y | STD-X, STD-Y, STD-Z, mean-Y, sum of ranges, range-Y, signal magnitude area’s sum of ranges, difference of sum of ranges, range-XZ, sum of GPS speed, and GPS speed | 8 | 100% for static activities, 98% for walking, 66–77% for stairs, 33% for ramp | Stand, sit, lie, walk, going stairs, going ramp, elevator, and car ride | Hierarchical decision tree |

¹Location = sensor location on the person

²Freq = data acquisition frequency

Wearable video-sensor systems have also been developed to log digital memories. Video, GPS, electrocardiogram, and acceleration were used to record the location and context of a person’s daily life [16–18]. These Lifelogs had several limitations, including information retrieval, synchronization, light quality, low visual resolution, artifacts, and automatic annotation.

A previous project showed that sensors located in a “smart-holster” could be combined with a BlackBerry Smartphone to provide image-assisted mobility assessment [5]. An external accelerometer, temperature sensor, humidity sensor, light sensor, and Bluetooth were integrated into the Smartphone holster. Software was written for the BlackBerry 9000, which had an embedded camera and GPS. A hierarchical decision tree combined with a double-threshold algorithm classified signals to recognize changes-of-state (CoS) and activities. The BlackBerry then automatically took a digital picture at each CoS. This preliminary work confirmed that a wearable mobility monitoring system (WMMS) could use acceleration and pictures to identify walking movements, standing, sitting, and lying down, and additionally provide context for these activities.

Subsequent research [19] used the BlackBerry 9550 Smartphone, with an integrated accelerometer, and revised algorithms to detect changes of state (CoS) and classify activities of daily living. The low sampling rate of the BlackBerry 9550 required some of the algorithms from the first WMMS to be revised. Having all sensors and computing power within the Smartphone provides a broadly accessible platform for wearable activity monitoring. A single-subject case study resulted in an average sensitivity of 89.7% and a specificity of 99.5% for walking-related activities, a sensitivity of 72.2% for stair navigation, and a sensitivity of 33.3% for ramp recognition. Ramp results were poorer since accelerations for ramp gait were similar to those for level walking.

Since it was demonstrated that newer BlackBerry Smartphones can identify CoS in real-time [19], and Smartphone video capture is available on these devices, a preliminary evaluation of the usefulness of wearable video for improving activity classification and context identification is needed. The present study builds on this prior work and presents a proof-of-concept evaluation of a new BlackBerry-based WMMS that uses internal sensors and cell phone video to identify a range of mobility and daily living activities, using the CoS to trigger the acquisition of short video clips. A proof-of-concept evaluation is a necessary step before a large-scale evaluation with people with disabilities. This study is the first to integrate sensor and (non-automated) video analysis for wearable mobility monitoring.

Materials and methods

WMMS system

The prototype WMMS was developed for BlackBerry OS 5.0 on the Storm2 9550 Smartphone. The WMMS used the BlackBerry’s integrated accelerometer (3-axis, ±2 g), GPS, camera, and SD memory card for mobility data collection. The 3.15 MP camera had a maximum video resolution of 480×352 pixels and a video frame rate of 30 frames per second [20]. During video capture, no accelerometer or GPS data were available. With video control active, the accelerometer sampling rate was approximately 8 Hz [21]. For the WMMS, GPS data were sampled at 1 Hz, but GPS data were not used in the algorithm presented in this paper. The WMMS was worn in a holster on the right front pelvis.

The new WMMS software, developed in [22], sampled time, acceleration, and GPS location and then saved the raw data to a 16 GB SD card. Accelerations were processed to calibrate the axis orientation, using the gravity rotation method [4]. The processed acceleration was input into a feature extraction algorithm [19] for analysis within 1-second data windows. Following feature extraction, the features were analyzed in a decision tree to determine if a CoS had occurred. If a CoS was identified, a three-second video clip was captured. A second decision tree was used to categorize the activity [19]. Time, features, and activity classification were saved to an output file on the SD card.
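To make the processing order concrete, the sketch below outlines this per-window loop: extract features, check for a CoS, trigger a clip, classify, and log. It is a minimal illustration of the pipeline as described above, not the authors' BlackBerry source; all class and method names are hypothetical, and the stubs stand in for device-specific camera and file I/O code.

```java
// Minimal sketch of the per-window WMMS loop (hypothetical names, not the
// authors' BlackBerry source). Stubs stand in for device-specific code.
public class WmmsPipelineSketch {

    static final int CLIP_MS = 3000;   // 3-second video clip per detected CoS
    long videoBusyUntilMs = 0;         // accelerometer is unavailable while recording

    /** Process one 1-second window of axis-calibrated accelerations. */
    void onWindow(double[] ax, double[] ay, double[] az, long nowMs) {
        double[] features = extractFeatures(ax, ay, az);   // Eqs. (1) to (7)
        boolean cos = isChangeOfState(features);           // threshold decision tree

        // Trigger a clip only when no capture is in progress, since no sensor
        // data are available during video recording.
        if (cos && nowMs >= videoBusyUntilMs) {
            startVideoClip(CLIP_MS);
            videoBusyUntilMs = nowMs + CLIP_MS;
        }
        String activity = classifyActivity(features);      // second decision tree
        appendToSdCardLog(nowMs, features, activity);      // output file on the SD card
    }

    // Stubs standing in for the real feature, decision-tree, camera, and file I/O code.
    double[] extractFeatures(double[] x, double[] y, double[] z) { return new double[7]; }
    boolean isChangeOfState(double[] f) { return false; }
    String classifyActivity(double[] f) { return "standing"; }
    void startVideoClip(int ms) { /* device camera API */ }
    void appendToSdCardLog(long t, double[] f, String a) { /* file I/O */ }

    public static void main(String[] args) {
        new WmmsPipelineSketch().onWindow(new double[8], new double[8],
                new double[8], System.currentTimeMillis());
    }
}
```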

Accelerometer features

Acceleration features were identified that were sensitive to changes in mobility status [19]. These included mean Y-axis acceleration, standard deviation (STD) in the X, Y, and Z axes (1), Y-axis range (Range_Y) (2), Sum of Ranges (SR) (3), Signal Magnitude Area (SMA) of SR (4), difference of Sum of Ranges (DiffSR) (5), and range of X and Z (Rxz) (6). The inclination angle (7) could distinguish sitting, lying, and standing for the classification of static activities. While the GPS provides useful location information, GPS was not required in this study because accelerometer-based recognition of vehicle riding showed good results [19].

        STD_Y = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(y_i - m_y)^2}        (1)

        Range_Y = Max_Y - Min_Y                                      (2)

        SR = Range_X + Range_Y + Range_Z                             (3)

        SMA_{SR} = \sum_{i=1}^{N} SR_i                               (4)

        DiffSR = SR_2 - SR_1                                         (5)

        R_{xz} = Range_X + Range_Z                                   (6)

        Inclination\;angle = \arctan\!\left(\frac{m_z}{m_y}\right)\;(^\circ)    (7)

In equations (1) to (7), m_y is the mean Y-axis acceleration, y_i is an individual acceleration value, N is the number of samples per data window, SR_2 is the sum of ranges in the current window, SR_1 is the sum of ranges in the previous window, and m_z is the mean Z-axis acceleration.
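A compact implementation of equations (1) to (7) might look like the following sketch. The method names are illustrative assumptions, Math.atan2 is used as a robust stand-in for the arctan in equation (7), and the main method simply exercises the features on a toy three-sample window.

```java
/** Sketch of the window features in Eqs. (1) to (7); names are illustrative. */
public class FeatureSketch {

    static double mean(double[] v) {
        double s = 0;
        for (double e : v) s += e;
        return s / v.length;
    }

    static double std(double[] v) {                  // Eq. (1), applied per axis
        double m = mean(v), s = 0;
        for (double e : v) s += (e - m) * (e - m);
        return Math.sqrt(s / v.length);
    }

    static double range(double[] v) {                // Eq. (2): max minus min
        double max = v[0], min = v[0];
        for (double e : v) { max = Math.max(max, e); min = Math.min(min, e); }
        return max - min;
    }

    static double sumOfRanges(double[] x, double[] y, double[] z) {   // Eq. (3)
        return range(x) + range(y) + range(z);
    }

    static double smaSr(double[] srValues) {         // Eq. (4): sum of SR values
        double s = 0;
        for (double sr : srValues) s += sr;
        return s;
    }

    static double diffSr(double srCurrent, double srPrevious) {       // Eq. (5)
        return srCurrent - srPrevious;
    }

    static double rangeXz(double[] x, double[] z) {  // Eq. (6)
        return range(x) + range(z);
    }

    static double inclinationDeg(double[] y, double[] z) {            // Eq. (7)
        return Math.toDegrees(Math.atan2(mean(z), mean(y)));
    }

    public static void main(String[] args) {
        double[] x = {0.1, 0.2, 0.1}, y = {9.7, 9.8, 9.6}, z = {0.3, 0.2, 0.4};
        System.out.println("STD_Y = " + std(y) + ", SR = " + sumOfRanges(x, y, z)
                + ", angle = " + inclinationDeg(y, z));
    }
}
```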

Change of state/classification

The CoS algorithm described in [19] used pre-set thresholds for feature analysis. Each feature was independently analyzed using single or double thresholds and scored as true or false. These Boolean values were combined to recognize the activities reported in [19] (static states, taking an elevator, walking-related movements) and also to identify changes of state for small movements during activities of daily living (ADL) (meals, hygienic activities, working in the kitchen). CoS classification ran in real-time to enable accurately timed Smartphone video recording.
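The following sketch illustrates single- and double-threshold scoring and the Boolean combination step. The threshold values and the combination rule shown here are placeholders for illustration, not the calibrated values or tree structure from [19].

```java
/** Sketch of threshold-based CoS scoring; all thresholds are placeholders. */
public class ThresholdSketch {

    /** Single threshold: true when the feature exceeds a preset level. */
    static boolean above(double feature, double threshold) {
        return feature > threshold;
    }

    /** Double threshold: true only when the feature falls inside a band. */
    static boolean within(double feature, double low, double high) {
        return feature > low && feature < high;
    }

    /** Combine Boolean feature scores into one CoS decision (illustrative rule). */
    static boolean changeOfState(double smaSr, double diffSr, double stdY) {
        boolean movementBurst = above(Math.abs(diffSr), 1.5);   // placeholder threshold
        boolean activeWindow  = within(smaSr, 2.0, 40.0);       // placeholder band
        boolean gaitLike      = above(stdY, 0.8);               // placeholder threshold
        return movementBurst && (activeWindow || gaitLike);
    }

    public static void main(String[] args) {
        System.out.println(changeOfState(12.0, 3.2, 1.1));  // true with these placeholders
    }
}
```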

Activity classification

The decision tree for activity classification, described in [19], used the same features and thresholds as the CoS algorithm to classify eight activities: sitting, standing, lying, riding an elevator, small stand-movements, small sit-movements, small lie-movements, and walking. The small movement categories included ADL and other movements made while sitting or standing. Following data collection, the video clips were manually reviewed by a human operator to help classify activities. Clips were played back on a BlackBerry 9700 phone, with video and audio available for qualitative assessment of the current activity.
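As an illustration of how such a tree can branch on the extracted features, the sketch below separates walking from static states with SMA-SR and distinguishes lying, sitting, and standing with the inclination angle. The structure and thresholds are assumptions for illustration only, not the published tree from [19].

```java
/** Illustrative decision tree over the eight activity classes; thresholds
 *  and structure are assumptions, not the tree from [19]. */
public class ActivityTreeSketch {

    static String classify(double smaSr, double inclinationDeg, boolean elevatorSignature) {
        if (smaSr > 25.0) return "walking";                  // placeholder threshold
        if (elevatorSignature) return "riding an elevator";

        // Static postures from the inclination angle, Eq. (7); placeholder bands.
        String posture;
        if (inclinationDeg > 60.0) posture = "lying";
        else if (inclinationDeg > 30.0) posture = "sitting";
        else posture = "standing";

        // Low-but-nonzero activity becomes a small movement in that posture.
        if (smaSr > 5.0) return "small movements while " + posture;
        return posture;
    }

    public static void main(String[] args) {
        System.out.println(classify(30.0, 10.0, false));  // walking
        System.out.println(classify(8.0, 70.0, false));   // small movements while lying
    }
}
```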

Test procedure

A convenience sample of five able-bodied subjects (four males and one female; age: 35.2 ± 8.72 years, height: 174.58 ± 10.75 cm, weight: 66.46 ± 9.7 kg) was recruited from The Ottawa Hospital Rehabilitation Centre (TOHRC, Ottawa, Canada) for the proof-of-concept evaluation. A signed form that included informed consent, an information sheet, and a media agreement was obtained from each subject. Ethical approval for the study was obtained from the Office of Research Ethics and Integrity, University of Ottawa (File Number: A08-12-01). Subjects who had injuries or gait deficits were excluded.

Data collection took place within the TOHRC (hallways, elevator, one-bedroom apartment, stairs, Rehabilitation Technology Laboratory) and outside on a paved pathway. Participants wore the WMMS on their right-front waist, in a regular BlackBerry holster attached to a belt, with the camera pointing forward. No additional instructions were given for WMMS positioning. The participants were asked to follow a predetermined path and perform a series of mobility tasks, including standing, walking, sitting, riding an elevator, brushing teeth, combing hair, washing hands, drying hands, setting dishes, filling the kettle with water, toasting bread, a simulated meal at a dining table, washing dishes, walking on stairs, lying on a bed, walking on a ramp, and walking outdoors. Each activity in the sequence took approximately 10 to 20 seconds to complete. Each trial had 41 changes of state. Three trials were captured on the Smartphone and on a digital video camcorder.

Activity timing was obtained from the camcorder recording for comparison with the WMMS output. Start and end points of each trial were identified by shaking the Smartphone for 2 seconds. The camcorder also provided contextual information for analysis.

Data were imported from the Smartphone SD card into Microsoft Excel for statistical analysis. The Smartphone video clips were reviewed by two independent evaluators to qualitatively classify activities and assess video quality. The evaluators had access to the WMMS activity classification results during video classification.

Results

The results were analyzed in terms of CoS and activity classification. For CoS identification at the transition point between activities (Table 2), sensitivities for standing, sitting, lying, and taking an elevator were between 97% and 100%. Walking-related CoS, such as stairs and ramps, had 67% to 73% sensitivity. Sensitivity [5] for CoS between walking and small movements, such as brushing teeth, was between 40% and 93%. The CoS results for daily living activities were poorer (below 27%) since the continuous series of small movements produced similar acceleration features.
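For reference, the sensitivity and specificity values reported in Table 2 to Table 4 follow the standard definitions, computed from the true positive (TP), false negative (FN), false positive (FP), and true negative (TN) counts:

        Sensitivity = \frac{TP}{TP + FN} \qquad\qquad Specificity = \frac{TN}{TN + FP}

For example, the 324 false-positive walking windows out of 2700 reported with Table 3 correspond to a specificity of 2376/2700 = 88%.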

Table 2. CoS sensitivity during the transition between activities.

| Category | Change of state | Subject 1 (TP/FN)¹ | Subject 2 | Subject 3 | Subject 4 | Subject 5 | Sensitivity (%) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Static | Stand ↔ Walk | 14/0 | 15/0 | 13/0 | 10/0 | 15/0 | 100.00 |
| Static | Lie ↔ Walk | 6/0 | 6/0 | 6/0 | 6/0 | 6/0 | 100.00 |
| Static | Elevator ↔ Walk | 12/0 | 11/0 | 12/0 | 11/0 | 11/1 | 98.30 |
| Static | Sit ↔ Walk | 6/0 | 6/0 | 5/1 | 6/0 | 6/0 | 96.70 |
| Walking | Walk ↔ Stairs | 8/4 | 11/1 | 8/4 | 12/0 | 5/7 | 73.30 |
| Walking | Walk ↔ Ramp | 2/4 | 4/2 | 4/2 | 6/0 | 4/2 | 66.70 |
| Daily living with walking | Toast bread → Walk | 3/0 | 3/0 | 2/1 | 3/0 | 3/0 | 93.30 |
| Daily living with walking | Dry hands → Walk | 3/0 | 2/1 | 2/1 | 3/0 | 3/0 | 83.30 |
| Daily living with walking | Prepare meal ↔ Walk | 5/1 | 5/1 | 5/1 | 6/0 | 4/2 | 86.70 |
| Daily living with walking | Wash dishes ↔ Walk | 4/2 | 4/2 | 4/2 | 5/1 | 3/3 | 66.70 |
| Daily living with walking | Walk → Brush teeth | 3/0 | 2/1 | 2/1 | 3/0 | 2/1 | 80.00 |
| Daily living with walking | Dishes → Kettle | 3/0 | 2/1 | 1/2 | 2/1 | 3/0 | 73.30 |
| Daily living with walking | Kettle → Toast | 3/0 | 0/3 | 2/1 | 0/3 | 3/0 | 73.30 |
| Daily living with walking | Walk → Set dishes | 2/1 | 2/1 | 1/2 | 1/2 | 0/3 | 40.00 |
| Daily living | Wash → Dry hands | 0/3 | 0/3 | 2/1 | 1/2 | 1/2 | 26.70 |
| Daily living | Brush teeth → Comb hair | 0/3 | 0/3 | 1/2 | 0/3 | 1/2 | 13.30 |
| Daily living | Comb hair → Wash hands | 0/3 | 0/3 | 1/2 | 1/2 | 0/3 | 13.30 |

¹Entries are TP/FN, where TP = average true positives and FN = average false negatives.

Table 3 shows the specificity [5] results for activity classification using the CoS algorithm [19], computed across the data windows used for CoS identification. The number of false positives was less than 12% for all measures, with half the measures reporting less than 5% false positives. Walking produced the most false positives, identifying a walking CoS when no CoS occurred, in 324 out of 2700 cases (12%). Large standard deviations were found for some measures due to timing differences (i.e., people who walked slower took more time and had more data points for analysis, leading to more true negatives).

Table 3. Activity classification specificity.

| Changes of state | Subject 1 (FP/TN)¹ | Subject 2 | Subject 3 | Subject 4 | Subject 5 | Specificity (%) |
| --- | --- | --- | --- | --- | --- | --- |
| Lie | 0/57 | 0/48 | 0/40 | 0/41 | 0/80 | 100 |
| Sit | 1/61 | 0/47 | 0/34 | 0/23 | 0/73 | 99.6 |
| Dry hands | 0/15 | 1/12 | 0/17 | 0/15 | 0/17 | 98.7 |
| Stand | 0/137 | 5/171 | 8/147 | 5/65 | 1/297 | 97.7 |
| Move dishes | 0/22 | 1/14 | 0/9 | 1/9 | 0/6 | 96.8 |
| Take an elevator | 7/89 | 6/101 | 5/107 | 9/99 | 0/180 | 95.5 |
| Wash dishes | 3/37 | 1/28 | 1/27 | 0/48 | 3/24 | 95.4 |
| Comb hair | 0/28 | 1/38 | 2/28 | 1/23 | 3/23 | 95.2 |
| Toast bread | 3/31 | 0/11 | 0/15 | 1/22 | 1/12 | 94.8 |
| Wash hands | 0/20 | 2/36 | 1/12 | 1/13 | 2/17 | 94.2 |
| Ramp | 1/29 | 3/19 | 0/40 | 3/12 | 1/15 | 93.5 |
| Brush teeth | 2/34 | 4/36 | 3/32 | 1/25 | 3/18 | 91.8 |
| Stairs | 3/61 | 12/42 | 5/98 | 10/57 | 4/61 | 90.4 |
| Prepare a meal | 4/53 | 3/30 | 7/48 | 4/35 | 5/34 | 89.7 |
| Move a kettle | 1/24 | 2/14 | 5/24 | 4/17 | 0/11 | 88.2 |
| Walk | 74/499 | 75/563 | 84/667 | 55/434 | 36/213 | 88 |
| Meal setup-Walk | 3/0 | 3/0 | 3/0 | 3/0 | 2/1 | 93.3 |

¹Entries are FP/TN, where FP = average false positives and TN = average true negatives.

Better activity classification results were achieved when using both acceleration features and video clips for all activities except sitting, as compared to using the accelerometer only (Table 4). Acceleration-only analysis with the activity classification algorithm [19] had a greater sensitivity than acceleration-and-video for sitting because one CoS was missed in one trial, resulting in no associated video data. Using the accelerometer only, none of the ADLs were identified; however, no false positives were reported, resulting in a high specificity.

Table 4. Activity classification sensitivity and specificity for accelerometer only and accelerometer with Smartphone video.

| Category | Activity | Acc.¹ only SE² (%) | Acc. only SP³ (%) | Reviewer 1: Acc. + Video SE (%) | Reviewer 1: Acc. + Video SP (%) | Reviewer 2: Acc. + Video SE (%) | Reviewer 2: Acc. + Video SP (%) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Static | Lie | 96 | 100 | 100 | 100 | 100 | 100 |
| Static | Stand | 98 | 99 | 100 | 100 | 98 | 100 |
| Static | Sit | 96 | 99 | 92 | 100 | 79 | 100 |
| Static | Elevator | 2 | 100 | 100 | 100 | 92 | 100 |
| Walking-related | Walk | 95 | 92 | 96 | 93 | 95 | 93 |
| Walking-related | Stairs | 45 | 73 | 86 | 99 | 64 | 99 |
| Walking-related | Ramp | 22 | 98 | 50 | 100 | 37 | 100 |
| Daily living | Dining activity | 0 | 100 | 94 | 100 | 75 | 100 |
| Daily living | Bathroom activity | 0 | 100 | 84 | 100 | 75 | 100 |
| Daily living | Kitchen activity | 0 | 100 | 70 | 100 | 63 | 100 |
| Daily living | Move a kettle | 0 | 100 | 37 | 100 | 37 | 100 |
| Daily living | Wash dishes | 0 | 100 | 32 | 100 | 32 | 100 |
| Daily living | Make toast | 0 | 100 | 31 | 100 | 31 | 100 |
| Daily living | Meal setup | 0 | 100 | 21 | 100 | 1 | 100 |
| Daily living | Dry hands | 0 | 100 | 15 | 100 | 1 | 100 |
| Daily living | Wash hands | 0 | 100 | 5 | 100 | 31 | 100 |
| Daily living | Move dishes | 0 | 100 | 7 | 100 | 0 | 100 |
| Daily living | Brush teeth | 0 | 100 | 0 | 100 | 0 | 100 |
| Daily living | Comb hair | 0 | 100 | 0 | 100 | 0 | 100 |

¹Acc. = accelerometer

²SE = sensitivity

³SP = specificity

Since accelerometer data were unavailable during video recording, no acceleration could be recorded for a minimum of 2.9 seconds after a CoS was identified. For example, if a person sits in a chair, stands up, and sits down again within 4 seconds (3 seconds of video capture and 1-second data window), the CoS will not be detected or identified. However, digital video would be available during this period for activity classification, enabling the evaluator to see the additional movements during post-processing. The average accelerometer sampling frequency was 7.88 ± 1.39 Hz. Video analyses became difficult in some circumstances, in particular when video images were dark.

The BlackBerry 9550 included a light sensor, but the sensor output was unavailable with Java API 5.0. Light intensity level could be beneficial for recognizing indoor and outdoor environments.

As shown in Figure 1, the extracted features were used to recognize small movements, identified using a combination of SR and STD_Y. Discrete small movements were easily distinguished; however, a CoS could be missed for a continuous series of small movements that have similar accelerometer features (for example, brushing teeth and then combing hair).

Figure 1. Sum of Ranges to distinguish small movements: brushing teeth (BT), combing hair (CH), washing hands (WH), drying hands (DH), moving dishes (MD), moving a kettle (MK), toasting bread (TB), preparing a meal (PM), and washing dishes (WD).

The sum of ranges was more sensitive than STD_Y for distinguishing mobility states (Figure 2). Further, the SMA-SR curves were smoother than the sum of ranges curves, which made them better for defining classification thresholds for climbing stairs and walking.

Figure 2. Top: Sum of ranges (SR) and STD_Y to distinguish walking and static states (BT = brushing teeth, CH = combing hair, WH = washing hands, DH = drying hands). Bottom: SMA-SR to identify climbing stairs and walking.

Video was a valuable medium for evaluating mobility activities since audio information was available and multiple images could be used in the analysis (Table 5). The sense of motion also provided useful information for categorizing the activity and context.

Table 5. Video-clip analysis.

| Activity | Context | Shaking images | Image direction | Sound |
| --- | --- | --- | --- | --- |
| Stand | Wall, ground | N | Forward | Quiet |
| Sit | Ceiling, wall | N | Forward, upward | Quiet |
| Lie | Ceiling | N | Upward | Quiet |
| Ride an elevator | Elevator | N | Forward | Elevator |
| Walk on level ground | Ground | Y | Forward | Walking |
| Climb upstairs | Stairs, hand rail | Y | Forward, upward | Climbing stairs |
| Climb downstairs | Wall | Y | Forward, downward | Climbing stairs |
| Up ramp | Wall, ceiling, hand rail | Y | Forward, upward | Walking |
| Down ramp | Wall, ceiling, hand rail | Y | Forward, downward | Walking |
| Brush teeth | Washbasin | Y | Forward | Flushing |
| Comb hair | Washbasin | N | Forward | Quiet |
| Wash hands | Washbasin | Y | Downward | Flushing |
| Dry hands | Towel rack, towel | Y | Forward | Quiet |
| Set dishes | Table | Y | Forward | Dish collisions |
| Move kettle | Kettle, kitchen background | Y | Forward | Quiet or filling water |
| Toast bread | Toaster, table | N | Forward | Quiet |
| Meal setup | Table | N | Forward & upward | Quiet or filling water |
| Wash dishes | Sink, dishes | Y | Forward | Washing, flushing, dish collisions |
Activity recognition for static and walking states was confirmed by video-clip evaluation. Occasionally, the video showed a moving hand during walking, as the arm swung during locomotion. Since the phone was located on the right waist, the video usually showed more of the right side than the left. For ramp navigation, the Smartphone video was not able to show the ramp; however, the video did allow the evaluator to determine whether the person was ascending or descending. Similarly, the video was unable to show the stairs when descending, but the downward movement and the sound of footfalls on the stairs allowed the evaluator to categorize the stair descent activity.

The elevator CoS was usually triggered by the transition from walking to standing, rather than by elevator movement. The activity was classified by seeing the elevator door or the inside of the elevator in the video clip. The inside of the elevator was unclear and dark, but the elevator sound was clearly defined. Further, the bathroom, kitchen, and dining room areas were clearly identified from video analysis.

Small movements, such as brushing teeth, combing hair, and moving plates, could not be identified because these activities were out of the Smartphone video field. Some clips clearly identified moving a kettle, toasting bread, meal preparation, and washing dishes (Figure 3). Furthermore, drying hands and meal preparation needed continuous video images to identify the activities (i.e., single images would not be useful). Moreover, video-based activity classification was hampered for people with shorter lower bodies when a tabletop or kitchen counter obstructed the camera's view. For example, moving a kettle and toasting bread were not visible in the video field for these shorter participants.

Figure 3. Images from BlackBerry video for small movements: a) kitchen cabinet and oven, b) bathroom sink, c) dining table and meals, d) moving a kettle, e) toasting bread, f) meal preparation, g) washing dishes, h) towel and towel rack, i) drying hands.

Dataset 1. Data for combination of smartphone sensors and video for a wearable mobility monitoring system.
The raw-acceleration file contains the raw accelerations output by the BlackBerry smartphone during all trials. The raw-data files (subjects 1–5) contain the signal features calculated from the accelerations, plus the ‘real status’ of the subject (what they were doing at that instant) during each sample of each trial. The real status is required to verify the accuracy of the wearable mobility monitoring system (WMMS) prediction.

Discussion

This study demonstrated that BlackBerry accelerometer signal analysis and cell phone video assessment can be combined to identify many mobility activities and the context of these activities. Since a readily available Smartphone was worn in a typical manner, at the waist in a holster, and no external sensors or hardware were required, the WMMS could be easily implemented in the community.

The ability to appropriately identify a CoS is critical for a WMMS that uses video, since accurate, real-time CoS identification is needed to trigger the acquisition of video clips at the appropriate time for post-processing. The WMMS successfully recognized CoS for both static and walking activities. By taking the initial activity classification from the sensor data and refining the classification decision using the BlackBerry video, superior activity recognition results were obtained.

Static activity (sitting, standing, lying) classification accuracy improved from outcomes in the literature (below 94.6% [5,13,23]) to 97.3%. The results would have been 100%, except that one CoS was missed during a walk-to-sit transition in one of the trials. More thorough biomechanical analysis is required to identify the movement pattern that created this outlying error. Standing and lying down were recognized with 100% accuracy.

The WMMS produced better sensitivity results for walking than the previous Smartphone-smart-holster system [5]. While various studies in the literature reported better walking accuracy (above 96% [7,23]), these studies only distinguished differences between very distinct mobility states, such as running, jumping, and no movement. The accuracy of these systems would likely decrease if they were categorizing similar activities, such as running on level ground and running up an incline.

Stair navigation identification improved from 60% in the literature [5,13] to 75% with the new WMMS. However, the result did not reach the 95% target. Acceleration-based categorization remains difficult for stairs since the signal patterns between walking on level ground and stairs are similar, and the stair slope may not greatly change accelerations at the waist for able-bodied people.

CoS identification for ramp walking was enhanced from 43.3% for the previous system [5] to 66% because the new WMMS used a specific ramp judgment within the decision tree. Ramp classification also improved, from 16.6% [5] to 43.5%. Accelerometer signals from able-bodied people do not always change when moving from level ground to an incline; however, accelerometer signals from people with mobility disabilities may be sufficiently distinct to enable consistent ramp CoS triggering. During ramp descent, classification errors may occur if a walking CoS is triggered before the person finishes the descent and the waist-mounted video does not show the ramp bottom. An altitude sensor may help detect changes in vertical body position during ramp and stair navigation, thereby providing more accurate CoS triggering and sensor-based classification.

Specificity is also important for a WMMS that incorporates video, since identifying a CoS when no change actually occurs (i.e., a false positive) results in inappropriate video capture that affects storage capacity and battery life, and increases the workload for video post-processing. Most prior research did not report specificity or false-positive results [13–15]. For the two studies that reported false positives, the maximum false-positive rate was above 12% for walking-related activities [5,24]. Improved threshold calibration methods that are specific to the individual may help to decrease false positives and false negatives.

The BlackBerry device was able to process the features and algorithms in real-time and save outcome data for each 1-second window. Most mobility research used cell phones to collect raw data and then analyzed features and algorithms offline [13–15,25]. For the new WMMS, threshold settings, decision trees, feature extraction, and timing were executed on the Smartphone in real time.

The combination of accelerometer and video camera was superior to using the accelerometer data alone to provide mobility information for activities of daily living. Bathroom, kitchen, and dining room activities were recognized by videos in 63% to 94% of the cases. Although the combination of accelerometer and video could not consistently identify small ADL movements, this method had better accuracy than the accelerometer only, which could not categorize any of these ADLs. Further, CoS for small movements could not be clearly identified. A higher accelerometer sampling rate may allow for more complex signal analysis that could improve small movement CoS identification. Additional features, such as signal magnitude area, skewness [5], energy, correlation [26], and kurtosis [27], might help to recognize the small movement CoS, but these features need greater sampling rates and/or greater accelerometer range.

The video was better than still images for refining activity classification and recognizing context, such as flooring type, bathroom/kitchen, and outdoors. From previous research [5], still images had limitations in dark areas such as elevators, but video clips with audio were helpful for identifying an elevator and other states. Since videos had continuous images to distinguish upward and downward movement, these clips were useful for stairs and ramp navigation. Although video cameras could potentially recognize small ADL movements, the waist-mounted Smartphone location limited the camera to activities occurring about waist height.

While BlackBerry video proved useful, the current implementation requires human time to review and classify the activities. Manual video assessment could be used for research and specific clinical applications; however, automated video analysis would make the system more efficient and promote widespread use of video data for activity classification. Further research into automated video analysis is required to achieve this goal.

Conclusions

Building on previous work that demonstrated the BlackBerry’s ability to identify changes of state in real-time, a new WMMS was presented that uses only the Smartphone’s accelerometer and video for unsupervised and ubiquitous mobility analysis for research and healthcare applications. This study was the first to integrate sensor and video analysis for wearable mobility monitoring. The sampling rate for the phone sensors was lower than that used in previous studies, demonstrating that a standalone WMMS can be implemented even on older-hardware phones. This standalone WMMS was designed for independent community ambulation and has minimal space requirements and setup time. Furthermore, the context of a person’s mobility activities can be identified, including the environment in which mobility takes place.

While monitoring mobility, the new system saves raw sensor data and features. The raw data could be analyzed by other researchers or used in post-processing to improve activity classification. Novel multi-feature algorithms were developed to recognize activities under lower accelerometer sampling rates and range. By combining and weighting sum, range, and covariance statistics, the WMMS was able to recognize standing, sitting, lying, riding an elevator, walking on level ground, ramps, stairs, and ADL (washing hands, drying hands, setting dishes, moving a kettle, toasting bread, preparing a meal, and washing dishes). Static activities, walking on level ground, walking on stairs, walking on a ramp, and riding an elevator had higher sensitivities than previous studies, but the overall CoS identification and activity classification performance could be improved by further research.

Additional sensors, an increased accelerometer sampling rate and sensitivity, and new user-specific threshold calibration methods can be considered in future work. The classification of other small-movement ADL activities also requires further research to increase sensitivity and specificity. While evaluation with able-bodied participants was warranted for this proof-of-concept study, further evaluation with various patient populations is required before this WMMS can be accepted for use by people with mobility disabilities.

Consent

Informed written consent to publish the results of this study was obtained from each participant.

Data availability

F1000Research: Dataset 1. Data for combination of smartphone sensors and video for a wearable mobility monitoring system, 10.5256/f1000research.4790.d32538 [28]
