
CS-541 Wireless Sensor Networks


Lecture 13: Machine Learning Applications on Body Area Networks

Spring Semester 2016-2017

M.Sc. candidate: Katerina Karagiannaki

Outline
• Introduction: Body Area Networks & Human Activity Recognition
• Feature Selection in BANs
  • Motivation
  • Activity Recognition Pipeline
  • Data collection
  • Case Study 1: Benchmark Study
  • Case Study 2: Online Feature Selection
• Demo: Shimmer Sensing Platform
  • Air Drums
  • Media Player

Body Area Networks (BAN) → Human Activity Recognition (HAR)

BAN: networks of wearable computing devices, including sensors
HAR: BAN & Machine Learning → model Activities of Daily Living

BAN types
• Implants
• Wearable devices
• Mobile phones

HAR fields
• Health
  • Health Monitoring
  • Rehabilitation
  • Chronic conditions management
• Well-being
  • Elderly Assistance
  • Act upon alerting events
  • Fitness coaching
• Entertainment
  • Gaming
Characteristics of BANs
• Broad range of sensing devices

(e.g., sensors that sense body temperature; trackers for distance, pace, and calories burned during a run; 3-axial accelerometer, gyroscope, and magnetometer nodes; smartphones that interface with heart rate monitors)
• Heterogeneous data
• Accelerometers: Linear acceleration
• Gyroscopes: Angular velocity
• Magnetometers: Direction of the magnetic field
• Heart Rate
• Temperature
• Various data rates
• Various everyday activities



Feature Selection in Body Area Networks

Motivation
Context Awareness
Networks sense and react based on their environment

• Adverse event detection and prevention


• Detection of sensor errors
• Reduction of energy consumption
• Activity recognition, tracking daily activities

Real world domains


• Industry
• Sports
• Entertainment
• Healthcare
Activity Recognition Pipeline

The pipeline: data acquisition → preprocessing → segmentation → feature extraction → feature selection → classification. We will emphasize the feature selection methods!

Data Acquisition

Collect data from sensor nodes


Notation of collected data:

k = # sensors, d_i = multiple values at a time t

Raw data from sensors:


• Sensor sampling rate:
• Number of samples per second taken from a continuous signal
to make a discrete or digital signal (measured in Hz)
• Each sensor samples at a different rate
• Signals suffer from artifacts
• Corruption
• Interference
• Noise

Data Preprocessing
Preprocessing: synchronize signals and remove outliers
• Calibration
• Unit conversion
• Normalization
• Resampling
• Synchronization

Preprocessed time series represent the data:

d'_i = a dimension,
n = # dimensions,
t = # samples
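
A minimal preprocessing sketch in Python (NumPy assumed available), illustrating two of the steps listed above: resampling two sensor streams with different rates onto a common grid via linear interpolation, and z-score normalization. The sensor names, rates, and the 50 Hz target are hypothetical.

    import numpy as np

    def resample(t, x, target_rate_hz):
        # Linearly interpolate a signal sampled at times t onto a uniform time grid
        t_uniform = np.arange(t[0], t[-1], 1.0 / target_rate_hz)
        return t_uniform, np.interp(t_uniform, t, x)

    def zscore(x):
        # Normalize a signal to zero mean and unit variance
        return (x - x.mean()) / (x.std() + 1e-12)

    # Two hypothetical sensor streams sampled at different rates
    t_acc = np.arange(0.0, 10.0, 1 / 100.0)        # accelerometer axis at 100 Hz
    acc_x = np.sin(2 * np.pi * 1.0 * t_acc)
    t_gyro = np.arange(0.0, 10.0, 1 / 40.0)        # gyroscope axis at 40 Hz
    gyro_x = np.cos(2 * np.pi * 0.5 * t_gyro)

    # Resample both to a common 50 Hz grid (synchronization) and normalize
    _, acc_50 = resample(t_acc, acc_x, 50)
    _, gyro_50 = resample(t_gyro, gyro_x, 50)
    acc_50, gyro_50 = zscore(acc_50), zscore(gyro_50)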

Data Segmentation

Set of segments
Each segment → w_i = (t_1, t_2), t_1: start time, t_2: end time

Methods
• Sliding window
• Overlapping
• Non overlapping

• Energy based
  • Intensity of activities (e.g., standing vs. walking vs. running)
  • $E = \int_{-\infty}^{\infty} s^2(t)\,dt$
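
A minimal sliding-window segmentation sketch in Python (NumPy assumed available). The window length and overlap are illustrative, and the energy-based variant simply keeps the windows whose discrete energy, computed as in the formula above, exceeds a threshold.

    import numpy as np

    def sliding_windows(n_samples, win_len, overlap=0.5):
        # Yield (start, end) index pairs of possibly overlapping windows
        step = max(1, int(win_len * (1.0 - overlap)))
        for start in range(0, n_samples - win_len + 1, step):
            yield start, start + win_len

    def window_energy(signal, start, end):
        # Discrete approximation of E = integral of s^2(t) dt over one window
        return float(np.sum(signal[start:end] ** 2))

    signal = np.random.randn(1000)   # stand-in for one preprocessed sensor axis
    segments = list(sliding_windows(len(signal), win_len=100, overlap=0.5))
    # Energy-based selection: keep windows with above-average energy (intense activity)
    energies = [window_energy(signal, s, e) for s, e in segments]
    active = [w for w, en in zip(segments, energies) if en > np.mean(energies)]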

Feature Extraction

Motivation
• Reduce data dimensionality: turn signals into features
• Features are discriminative for the activities

Feature vectors: X_i = F(D', w_i), where F is the feature extraction function
Feature types
• Time domain: waveform characteristics & statistics
  • Slopes, amplitude, mean, standard deviation, kurtosis, etc.
• Frequency domain: periodic structures
  • Spectral features, Fourier coefficients
• Hybrid: both time and frequency information
  • Wavelet representation
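
A minimal per-window feature extraction sketch in Python (NumPy and SciPy assumed available), computing a few of the time-domain statistics listed above for every window of every sensor axis; the windows are assumed to come from the segmentation step.

    import numpy as np
    from scipy.stats import kurtosis, skew

    def window_features(x):
        # Time-domain features of one window of one sensor axis
        return [x.mean(), np.median(x), x.std(), x.var(), skew(x), kurtosis(x)]

    def feature_matrix(streams, windows):
        # Rows = windows, columns = features of every sensor axis
        rows = []
        for start, end in windows:
            row = []
            for x in streams:                       # one entry per sensor axis
                row.extend(window_features(x[start:end]))
            rows.append(row)
        return np.asarray(rows)

    streams = [np.random.randn(1000) for _ in range(3)]   # e.g. 3 accelerometer axes
    windows = [(i, i + 100) for i in range(0, 900, 50)]
    X = feature_matrix(streams, windows)                  # shape: (num_windows, 3 * 6)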
Feature Extraction

Extract features for each
• Sensor modality
• Window of the sensor stream

Example
• Y features
• S sensors
• W windows
Feature matrix: [W × (S × Y)]

Feature Selection
Motivation → accurate prediction of human activity
• Select a small subset of features
  • Having N features → select M << N features
• Eliminate irrelevant and redundant features → Why?
  • They degrade learning quality
  • They cost memory and computational time in the learning process

Algorithms
• Supervised → labelled data, training, offline
• Unsupervised → unlabeled data, no training, real time
Original feature set: F1, F2, …, Fn (N features) → Reduced feature set: F1, …, Fm (M features)
Feature Selection Methods

Filter model
• Analyze intrinsic properties of the data
• Ignore the classifier
• Consider each feature separately
1. Score features using a statistical measure
2. Rank features by score

Popular filter metrics
• Correlation
• Mutual Information
• Euclidean Distance
• Class Separability
E.g., Relief-F

+ Fast computation time and low complexity!
− Lower classification accuracy! The subset is not optimal.
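
A minimal filter-model sketch in Python (NumPy assumed available): each feature is scored with a statistical measure (here the absolute Pearson correlation with the class label, one of the metrics listed above), features are ranked by score, and the top M are kept. No classifier is involved; this is a generic illustration, not the Relief-F algorithm itself.

    import numpy as np

    def filter_select(X, y, m):
        # Score each feature by |correlation with the label| and keep the top m
        scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
        ranking = np.argsort(scores)[::-1]        # best-scoring features first
        return ranking[:m], scores

    X = np.random.randn(200, 30)                  # 200 windows x 30 features (synthetic)
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int) # labels driven by features 0 and 3
    selected, scores = filter_select(X, y, m=5)   # cheap, but ignores feature interactions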
Feature Selection Methods

Wrapper model
• Utilize a classifier to evaluate the quality of the selected features
1. Find a subset of features
2. Evaluate the classification quality using the selected subset
3. Repeat 1. and 2. until the desired quality is found

Feature Selection requires
• A search strategy
• An objective function to evaluate the feature subset

+ Better classification accuracy!
− High computation time and high complexity! Why? → Classifier evaluation
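
A minimal wrapper-model sketch in Python (scikit-learn assumed available): a greedy forward search in which every candidate subset is scored by the cross-validated accuracy of a k-NN classifier. The repeated classifier evaluation is exactly what makes wrappers accurate but expensive.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    def wrapper_forward_select(X, y, m):
        # Greedily add the feature whose inclusion maximizes cross-validated accuracy
        clf = KNeighborsClassifier(n_neighbors=5)
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < m:
            scores = {j: cross_val_score(clf, X[:, selected + [j]], y, cv=5).mean()
                      for j in remaining}
            best = max(scores, key=scores.get)
            selected.append(best)
            remaining.remove(best)
        return selected

    X = np.random.randn(200, 20)
    y = (X[:, 2] - X[:, 7] > 0).astype(int)
    subset = wrapper_forward_select(X, y, m=4)    # many subset evaluations -> slow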
Feature Selection Methods

Embedded model
• Combination of the filter and wrapper models
• Takes advantage of both approaches
• Embedded variable selection, e.g., Decision Trees

1. Utilize filtering criteria to select different candidate subsets


2. Evaluate the classification quality on each candidate subset
3. Select subset with the highest classification quality

+ Better classification accuracy than filters!
− Less efficient than filters, more efficient than wrappers!
Feature Selection Methods

A different approach to the Feature Selection problem
• Graph-based Feature Selection
  • Graph G(V, E)
    • V: vertices → features
    • E: edges → similarity measure (e.g., correlation)
  • Detect clusters in the graph
  • Discard the features from each cluster that do not satisfy a criterion
• E.g., the GCNC and NCRE algorithms
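
A minimal graph-based sketch in Python (NumPy and networkx assumed available), in the spirit described above but not the GCNC or NCRE algorithm itself: features become vertices, edges connect strongly correlated features, clusters are taken as the connected components of the thresholded graph (standing in for a proper community detection step), and one representative feature is kept per cluster; the variance criterion is an arbitrary example.

    import numpy as np
    import networkx as nx

    def graph_feature_select(X, corr_threshold=0.7):
        # Keep one representative feature per cluster of mutually correlated features
        corr = np.abs(np.corrcoef(X, rowvar=False))        # feature-to-feature |correlation|
        G = nx.Graph()
        G.add_nodes_from(range(X.shape[1]))
        for i in range(X.shape[1]):
            for j in range(i + 1, X.shape[1]):
                if corr[i, j] > corr_threshold:
                    G.add_edge(i, j, weight=corr[i, j])
        clusters = nx.connected_components(G)              # each cluster is a set of features
        # Criterion (illustrative): keep the highest-variance feature of each cluster
        return sorted(max(c, key=lambda j: X[:, j].var()) for c in clusters)

    X = np.random.randn(200, 12)
    X[:, 5] = X[:, 0] + 0.05 * np.random.randn(200)        # make feature 5 redundant with 0
    kept = graph_feature_select(X)                         # feature 0 or 5 is discarded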

Classification of Human Activity

Choice of classifier depends on application and needs


• Supervised/Unsupervised Learning
• Computational Complexity
• Execution time

Data Collection - FORTH-TRACE Dataset
Data Collection in Signal Processing Lab (ICS-FORTH)
FORTH-TRACE Dataset
• 5 Shimmer sensor nodes
  • 3-axial accelerometer
  • 3-axial gyroscope
  • 3-axial magnetometer
• 16 activities
• 15 participants

Activities
• Stand
• Sit
• Walk
• Climb stairs
• Talk
• Combinations
• Postural transitions between activities

Locations: on-body sensor placements (shown in a figure not reproduced here)

Case Study 1 : Benchmark Study

Experimental Setup and Parameters

Java pool of Feature Selection Algorithms

Experiments: 10 independent runs of all combinations of FS algorithms and window sizes

Pipeline:
• Acquisition & Preprocessing: FORTH-TRACE dataset, HAPT dataset
• Segmentation: window size = {2, 5, 10, 20} seconds
• Feature Extraction – statistical features: Mean, Median, Std, Var, Skewness, Kurtosis, Zero Crossing Rate, Pairwise Correlation, …
• Feature Selection Algorithms:
  • SFFS – supervised
  • Relief-F – ranker
  • FSSA – unsupervised
  • GCNC – unsupervised, graph-based
Feature Subset Evaluation

• Post-Processing Evaluation Metrics


• Representation Entropy
  • $H_r = -\sum_{j=1}^{d} \lambda_j \log \lambda_j$, where $\lambda_j$ are the eigenvalues of the $d \times d$ covariance matrix of a feature set of size $d$
  • Measures the amount of redundancy in the feature set

• SVM Classification Accuracy
  • Split the feature set into Train (90%) and Test (10%) sets
  • A Gaussian kernel SVM classifier evaluates the effectiveness of the feature set for classification (a sketch of both metrics follows this list)
• How does window size affect … ?
• Hr, SVM accuracy
• Execution time
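
A minimal sketch of both post-processing metrics in Python (NumPy and scikit-learn assumed available): representation entropy from the eigenvalues of the covariance matrix of the selected feature set (the eigenvalues are normalized to sum to one so the entropy is well defined), and the accuracy of a Gaussian (RBF) kernel SVM on a 90%/10% split; the data are synthetic.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    def representation_entropy(X_sel):
        # H_r = -sum_j lambda_j log lambda_j over normalized covariance eigenvalues
        eigvals = np.linalg.eigvalsh(np.cov(X_sel, rowvar=False))
        lam = eigvals / eigvals.sum()
        lam = lam[lam > 1e-12]                    # drop numerically zero eigenvalues
        return float(-np.sum(lam * np.log(lam)))

    def svm_accuracy(X_sel, y):
        # Gaussian-kernel SVM accuracy on a 90% train / 10% test split
        X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.1, random_state=0)
        return SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te)

    X_sel = np.random.randn(300, 8)               # a selected feature subset (synthetic)
    y = (X_sel[:, 0] + X_sel[:, 1] > 0).astype(int)
    print(representation_entropy(X_sel), svm_accuracy(X_sel, y))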

Representation Entropy vs window size

Representation Entropy improves with small windows

Small windows → discriminative features

Large windows → extracted features represent more than one activity

SVM Classification Accuracy vs window size

Classification Accuracy improves with small windows

Small window size → fewer activity labels per window – accurate matching of features to labels
Large windows → extracted features represent more than one activity

Execution time vs window size

Small window size → more windows → larger feature vectors → more time to execute the algorithms

Conclusions
• Window size affects Hr, SVM accuracy, execution time
• Hr and SVM accuracy improve with small windows
• Small window size – fewer activity labels – accurate matching of features to labels
• Execution time tradeoff
• Small window size – more windows – larger feature vectors –
more time to execute algorithms
• Feature subset quality
• Supervised SFFS : little redundancy in feature subsets (Hr)
• Unsupervised FSSA : compensates by good classification accuracy
• Ranker Relief-F: good classification quality (supervised)
• Graph based GCNC : good classification quality and little redundancy

Case Study 2: Feature Selection in an Online Environment

Feature Selection in an Online Environment

• Extension of FORTH-TRACE library in Android Devices


• Dynamic data processing and storage
• Database integration to load and store data
• Dynamic data gathering
• Temporal windows of size W to simulate online scenario
• Online Feature Extraction and Feature Selection

Impact of temporal windows on FSA performance?

Impact of activity labels on FSA performance?

Data Processing in online environments

Experimental Setup and Parameters

Java/Android pool of Feature Selection Algorithms

Experiments: 10 independent runs of all combinations of FS algorithms and window sizes

Pipeline:
• Acquisition & Preprocessing: FORTH-TRACE dataset → W = {2, 3, 4, 5} minutes – P = {9, 6, 5, 4} data partitions
• Segmentation: window size = 2 seconds
• Feature Extraction – statistical features: Mean, Median, Std, Var, Skewness, Kurtosis, Zero Crossing Rate, Pairwise Correlation, …
• Feature Selection Algorithms:
  • Relief-F – ranker
  • FSSA – unsupervised
  • GCNC – unsupervised, graph-based

Dynamic performance of Feature Selection Algorithms

Inconsistent FSA performance over different temporal windows W
→ due to the variation of activity labels within the different partitions P

Online HAR applications involve short data chunks!

Impact of Activity Labels on the Classification Accuracy

Distribution of Activity Labels and Classification Accuracy (CA)

• A single primary activity → optimal CA
• More activities → less accurate predictions

Online performance of library modules

Most time consuming module: Feature Extraction


FSA execution:
• Relief-F (supervised): fast, yet not suitable for online realizations since it requires labelled data
• GCNC (graph-based): adequate time and performance for integration with online architectures

Energy Requirement: ~ 20 Joule

Conclusions

• Online execution of Machine Learning pipeline for HAR


• Size of temporal window W affects Feature Selection performance
• Short W → fewer activities in the data – performance variations

• Activity labels in different partitions affect Classification Accuracy


• Percentage of activities in a single partition
• Types of activities

• Online performance of Machine Learning components


• Feature Extraction and Feature Selection → time consuming

Demo

• Shimmer Air Drums


https://www.youtube.com/watch?v=a1F9HwzaYoY

• Media Player
• Play, Stop, and Next Track using an Android application and a Shimmer node

References and Material for Reading
• Bulling, Andreas, Ulf Blanke, and Bernt Schiele. "A tutorial on human activity recognition using body-worn
inertial sensors." ACM Computing Surveys (CSUR) 46.3 (2014): 33.
• O.D.Lara, and M.A.Labrador, "A survey on human activity recognition using wearable sensors",
Communications Surveys & Tutorials, IEEE, 15(3), pp. 1192-1209, 2013.
• Alelyani, Salem, Jiliang Tang, and Huan Liu. "Feature Selection for Clustering: A Review." Data Clustering:
Algorithms and Applications 29 (2013).
• Guyon, Isabelle, and André Elisseeff. "An introduction to variable and feature selection." The Journal of
Machine Learning Research 3 (2003): 1157-1182.
• Slides on Feature Selection http://research.cs.tamu.edu/prism/lectures/pr/pr_l11.pdf .
• Wang, Aiguo, et al. "Accelerating wrapper-based feature selection with K-nearest-neighbor." Knowledge-
Based Systems 83 (2015): 81-91.
• Blondel, Vincent D., Jean-Loup Guillaume, Renaud Lambiotte, and Etienne Lefebvre. "Fast unfolding of communities in large networks." (2008). http://arxiv.org/pdf/0803.0476.pdf
• Karagiannaki Katerina, Panousopoulou Athanasia and Tsakalides Panagiotis. ”An Online Feature Selection
Architecture for Human Activity Recognition.” Proceedings of IEEE International Conference on Acoustics,
Speech and Signal Processing 2017 (ICASSP).
• Karagiannaki Katerina, Panousopoulou Athanasia and Tsakalides Panagiotis. ”A Benchmark Study on Feature
Selection for Human Activity Recognition.” Proceedings of the 2016 ACM International Joint Conference on
Pervasive and Ubiquitous Computing (UbiComp): Adjunct. ACM, 2016

Thanks!

Backup slides

Feature Selection
Search Strategies

• Exhaustive Search
  • $\binom{n}{m}$ combinations for a fixed value of m
  • $2^n$ combinations for an optimized m
  • Unfeasible, even if m and n are small
    • Exhaustive evaluation of 10 out of 20 features involves 184,756 feature subsets!!!

• Greedy Search

Taxonomy of search algorithms:
• Exponential: Exhaustive Search, Branch and Bound
• Sequential: Sequential Forward Selection, Sequential Backward Selection, Sequential Floating Selection
• Randomized: Simulated Annealing, Genetic Algorithms
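
A quick check of the counts quoted above (a minimal Python snippet; math.comb needs Python 3.8+).

    from math import comb

    print(comb(20, 10))   # 184756 subsets when choosing exactly 10 features out of 20
    print(2 ** 20)        # 1048576 subsets when the subset size m is also optimized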

Sequential Feature Selection Algorithms

Sequential Algorithms

Sequential Forward Selection
1. Start from the empty set
2. Add the feature that improves performance the most
3. Until no improvement

Sequential Backward Selection
1. Start from the full set of features
2. Remove any variable that does not deteriorate performance
3. Until no such feature exists

Sequential Floating Selection
• Forward: 1. Do a forward search resulting in subset S; 2. Then a backward search starting with S
• Backward: 1. Do a backward search resulting in subset S; 2. Then a forward search starting with S
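
For reference, scikit-learn provides a greedy sequential selector; a minimal usage sketch, assuming scikit-learn >= 0.24 and synthetic data. direction="forward" corresponds to SFS and direction="backward" to SBS; the floating variants are not covered by this class.

    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.neighbors import KNeighborsClassifier

    X = np.random.randn(200, 15)
    y = (X[:, 1] + X[:, 4] > 0).astype(int)

    # Greedy forward selection of 4 features, scored by 5-fold cross-validation
    sfs = SequentialFeatureSelector(KNeighborsClassifier(n_neighbors=5),
                                    n_features_to_select=4, direction="forward", cv=5)
    sfs.fit(X, y)
    print(sfs.get_support(indices=True))          # indices of the selected features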

Feature Selection Methods

• Graph Clustering

Louvain Community Detection [Blondel et al.]


Unsupervised Algorithm
• Optimize Modularity → the density of links inside clusters compared to links between clusters
• Each pass = 2 steps; passes are repeated:
  1. Detect clusters in the graph
  2. Create a new graph using the clusters from step 1
  3. Repeat 1. and 2. until no gain in Modularity
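
A minimal usage sketch of Louvain community detection via networkx (louvain_communities is assumed available, which requires networkx 2.8 or later); the weighted graph below is a hypothetical feature-similarity graph with two densely connected groups.

    import networkx as nx

    # Hypothetical feature-similarity graph: two tight groups joined by a weak edge
    G = nx.Graph()
    G.add_weighted_edges_from([(0, 1, 0.9), (1, 2, 0.8), (0, 2, 0.7),
                               (3, 4, 0.9), (4, 5, 0.85), (3, 5, 0.8),
                               (2, 3, 0.1)])

    # Each pass moves nodes between communities to increase modularity, then
    # contracts the communities into a new graph; passes repeat until no gain.
    communities = nx.algorithms.community.louvain_communities(G, weight="weight", seed=0)
    print(communities)    # expected to resemble [{0, 1, 2}, {3, 4, 5}]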

Classification algorithms
Support Vector Machines – SVMs (1/2)
SVMs are linear classifiers that find a hyperplane to separate two classes of data, positive and negative.
Training examples D are pairs (x_i, y_i):
o Input vector: x_i
o Class label: y_i ∈ {+1, −1}

• SVM finds a linear function $f(x) = \langle w, x \rangle + b$ to maximize the margin around the separating hyperplane

Classification algorithms
Support Vector Machines – SVMs (2/2)
• The linearly separable case → not the typical situation in practice

• Map the data from the input space X to a feature space F via a non-linear mapping φ → the kernel trick

• Our experimental parameters


o Gaussian Kernel
o Polynomial of order 2 Kernel
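
A minimal sketch of the two kernel configurations above using scikit-learn's SVC (assumed available); the labels are deliberately non-linearly separable so the kernel trick matters.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    X = np.random.randn(300, 6)
    y = (np.sum(X[:, :2] ** 2, axis=1) > 2).astype(int)   # non-linear decision boundary

    gaussian_svm = SVC(kernel="rbf")                      # Gaussian (RBF) kernel
    poly2_svm = SVC(kernel="poly", degree=2)              # polynomial kernel of order 2

    print(cross_val_score(gaussian_svm, X, y, cv=5).mean())
    print(cross_val_score(poly2_svm, X, y, cv=5).mean())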
Classification algorithms
k Nearest neighbors – kNN
The k nearest neighbors of a record x are the data points that have the k smallest distances to x.
• Classification with majority voting among the nearest
neighbors

• Our experimental parameters


• k = 10 Neighbors
• Distance metrics: Cosine and Euclidean
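
A minimal sketch of the k-NN configuration above with scikit-learn (assumed available): k = 10 neighbours, majority voting, and either the Euclidean or the cosine distance metric; the data are synthetic.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    X = np.random.randn(300, 6)
    y = (X[:, 0] > 0).astype(int)

    for metric in ("euclidean", "cosine"):
        # Majority vote among the 10 nearest neighbours under the chosen metric
        knn = KNeighborsClassifier(n_neighbors=10, metric=metric)
        print(metric, cross_val_score(knn, X, y, cv=5).mean())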
Classification algorithms
Decision Trees
Features are mapped to classes using a decision tree
as a predictive model.

• Leaves: class labels


• Branches: conjunctions of features leading to class labels

• Moves from top to bottom


• Chooses a variable at each step that best splits the set of items → maximization of information gain
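
A minimal decision-tree sketch with scikit-learn (assumed available); criterion="entropy" makes each top-down split the one that maximizes information gain, as described above; the data are synthetic.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    X = np.random.randn(300, 5)
    y = (X[:, 0] + X[:, 2] > 0).astype(int)

    # Splits are chosen from the root downwards; "entropy" selects the split
    # with the largest information gain at each step
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
    print(tree.score(X, y))                  # training accuracy
    print(tree.feature_importances_)         # which features drive the splits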
