
IJEEEMI, Vol. 2, No. 1, February 2020, pp. 48-59
DOI: 10.35882/ijeeemi.v2i1.10    ISSN: 2656-8624

Fast Algorithm to Measure the Types of Foot Postures with Anthropometric Tests Using Image Processing
Husneni Mukhtar#, Dien Rahmawati, Desri Kristina Silalahi, Ledya Novamizanti, Muhammad Rayhan Ghifari,
Ahmad Alfi Adz Dzikri, Faris Fadhlur Rachman, Ahmad Akbar Khatami
Faculty of Electrical Engineering of Telkom University
Jl. Telekomunikasi, Terusan Buah Batu, Bandung, 40257, Indonesia
# [email protected], [email protected], [email protected]

Article Info

Article History:
Received Feb 2, 2020
Revised Feb 10, 2020
Accepted Feb 20, 2020

Keywords:
Foot posture
Pronation
RFA
MLAA
AHI
Image processing

Abstract

There are two types of tools for measuring foot posture: uniplanar tools (anthropometric and radiographic types) and multiplanar tools (such as the Foot Posture Index-6 and -8). With both kinds of tools, foot posture measurement is commonly performed by a doctor using manual equipment such as a ruler, arc, goniometer, and marker, combined with observation by eye. Each foot takes time to measure. For research purposes, a doctor has to provide a large number of samples for statistical analysis, which consumes much more time and causes fatigue from the workload of the measurement process. Hence, the aim of this study is to significantly decrease the measurement time and minimize human error by developing software for anthropometric measurement of foot posture based on digital image processing (DIP). The anthropometric tests used in this study consist of the Rearfoot Angle (RFA), Medial Longitudinal Arch Angle (MLAA), and Arch Height Index (AHI). Instead of using equipment with a series of manual measurements to determine the foot posture, the DIP system only needs two pictures of the foot as input. The methods involved are performed as a series of digital image processing steps: image pre-processing, noise filtering, Sobel edge detection, feature extraction, calculation, and classification. The result of the image processing determines the foot posture type for all tests based on the measured angles and lengths of the foot variables. The measurement errors of length and angle are 6.22% and 0.26-1.74%, respectively. This study has demonstrated an algorithm developed in MATLAB to measure foot posture, named the Anthro-Posture v1.0 software. The software offers an efficient alternative for measuring and classifying foot posture in a shorter time while minimizing human error in the measurement process. In the future, this study can be extended for use by doctors in obtaining large amounts of data for research needs.

Corresponding Author:
Husneni Mukhtar
Faculty of Electrical Engineering of Telkom University
Jl. Telekomunikasi, Terusan Buah Batu, Bandung, 40257, Indonesia
Email: [email protected]

This work is an open-access article and licensed under a Creative Commons Attribution ShareAlike 4.0 International (CC BY-SA 4.0) license.

I. INTRODUCTION

The assessment of foot posture plays a notable role in observing its correlation with other fields, such as finding out the effect of foot posture in gait analysis [1], where different foot postures may result in distinct foot pressures; in the diagnosis of leg muscle activity and in leg physical therapy [2], [3]; in foot kinematics during walking [4]; and in musculoskeletal examination in clinical practice and research [5], [6], for example the foot alignment examination as a clinical assessment of patients with pain and lower-limb injury [7]. However, the assessment of foot alignment often meets difficulties for doctors and researchers because the validity, reliability, and usefulness of all measures are a concern due to a variety of confounding factors [8].

Nevertheless, a static assessment of foot posture is generally carried out to categorize the foot type based on its anatomical characteristics.

In the last three decades, many foot posture classification tools have been developed, divided into uniplanar and multiplanar tools. Examples of multiplanar tools are the Foot Posture Index (FPI)-6 and FPI-8, which combine sagittal, frontal, and transversal assessments of the feet [9], [10]. Meanwhile, many uniplanar tools have also been developed and employed. The most frequently used are the anthropometric and radiographic types. The rearfoot angle (RFA), Medial Longitudinal Arch Angle (MLAA), navicular drop (ND), footprint (arch index), and malleolar valgus index are included in the anthropometric measurements [8], [11].

Research on foot pronation has been carried out by previous researchers such as Langley et al. [9], Bailey et al. [12], and James et al. [13]. In general, their research was conducted by marking the tibia and calcaneus lines of the subject's feet using markers, then measuring the angle between the midlines of the tibia and calcaneus using a goniometer. Another foot pronation study was done by Lin et al. [14], [15] using image processing to determine the angle between the tibia and calcaneus midlines, but that study did not directly classify the results of the angular measurements to determine the type of foot pronation.

For doctors or clinicians, the anthropometric measurements have been widely performed by direct measurement [7], [13], [16], [17], such as evaluating the curvature and harmony of the foot [8] using a goniometer to measure the RFA [9], or by using a scanner device and other technologies [5], [15], [18] such as RFID sensors and data visualization with computer modeling [12]. Studies that use image processing to measure either the angles or the lengths of the foot variables are rare. Furthermore, when large amounts of medical data are required, the use of devices such as the goniometer and arc takes more time or needs more people to perform the measurements. Therefore, the purpose of this study is to provide and develop a fast algorithm for measuring and classifying foot posture types based on the anthropometric tests, namely RFA, MLAA, and AHI, using a set of DIP techniques.

II. MATERIALS AND METHODS

A. Experimental Setup
This study used ten participants with the following age, body weight, height, and body mass index: 19.3 ± 1.76 years, 57.3 ± 10.39 kg, 166.8 ± 7.64 cm, and 20.50 ± 2.78 kg/m². All photographs of the participants' feet were taken in a standard studio setting.

1) Materials and Tools
A camera, a Canon 200D with a 5.76 focus setting and a 25 mm lens, was used to take images of each participant's standing foot from both the rear and the inner side of each leg. During the photographing process, the participant stood on a platform block. An adequate lighting system was provided to acquire good contrast between the object and the background (described in Fig. 1).

Fig. 1. Illustration of the system arrangement for taking pictures (black open-box and black screen on the standing platform, with the lighting system and camera).

2) Experiment
After the photo session was completed, the digital images were processed using a graphical user interface (GUI) developed in MATLAB. In this tool, each image was processed and measured using the RFA, MLAA, and AHI techniques. The classification of the foot type was then produced for each technique.

B. The Block Diagram
The digital image processing was performed in several steps, as shown in Fig. 2. The acquired image was first pre-processed by converting the image format and reducing noise in order to obtain a better feature extraction result.

Fig. 2. Block diagram of the digital image processing of the anthropometric tests (image acquisition, image pre-processing, feature extraction, classification).

Fig. 3. Block diagram of image pre-processing (real image, RGB-to-grayscale conversion, noise filter, edge detection, normalization, threshold, median filter).

C. Image Pre-processing
Image pre-processing aims to improve the image data by suppressing unwanted distortions through color transformation, filtering, segmentation, and scaling in order to prepare the image for the next process. This process (Fig. 3) was carried out in six sequences.
i. Converting the RGB image to a grayscale image by eliminating the hue and saturation information while retaining the luminance.


ii. Decreasing the noise effect using a smoothing filter that replaces each pixel with the average of its 3×3 neighborhood.
iii. Undertaking edge detection to identify the edges in the image using a Sobel edge detector, so that the resulting image becomes clearer.
iv. Performing a normalization process on the image by scaling it to the range of 0 to 1.
v. Converting the grayscale image to a binary image using the threshold method, replacing each pixel with a value of 0 (typical black intensity) or 1 (typical white intensity). The profile of the foot image and the background are shown in white intensity and black intensity, respectively.
vi. Applying a non-linear filter, namely the median filter, to remove residual noise without reducing the image sharpness, by replacing the gray level of each pixel with the median of the gray levels in its neighborhood [20]. A compact sketch of these six steps, using standard toolbox functions, is given below.
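For illustration, the six pre-processing steps can be outlined with standard MATLAB Image Processing Toolbox functions. This is only a compact sketch under the assumption that the 3×3 averaging kernel and the 0.06 binarization threshold of Listing Program 1 are reused; the function name preprocessSketch is illustrative and not part of the Anthro-Posture v1.0 code.

<code>
% Sketch of the six pre-processing steps (toolbox equivalents; illustrative only).
function bw = preprocessSketch(rgbImage)
    gray       = rgb2gray(rgbImage);                          % i.   RGB to grayscale
    smoothed   = imfilter(gray, fspecial('average', [3 3]));  % ii.  3x3 smoothing (average) filter
    gmag       = imgradient(smoothed, 'sobel');               % iii. Sobel gradient magnitude (edge strength)
    normalized = mat2gray(gmag);                              % iv.  normalize to the range [0, 1]
    bw         = imbinarize(normalized, 0.06);                % v.   threshold to a binary image
    bw         = medfilt2(bw, [3 3]);                         % vi.  median filter to remove residual noise
end
<\code>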
The Sobel operator, written in Equation (1), uses a derivative approximation to find or detect the edges. This operator applies a pair of horizontal ($G_x$) and vertical ($G_y$) gradient matrices (commonly of 3×3 dimension) weighted at the central pixels [21]. It works by calculating the gradient of the image intensity at each pixel within the image, computing the magnitude (Equation 2) and finding the direction of the largest increase from light to dark together with the rate of change in that direction (Equation 3).

$$G_y = \begin{bmatrix} +1 & +2 & +1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix} * I, \qquad G_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} * I \qquad (1)$$

$$G = \sqrt{G_x^{2} + G_y^{2}} \qquad (2)$$

$$\Theta = \arctan\!\left(\frac{G_y}{G_x}\right) \qquad (3)$$

The variable $I$ is the original source image, $G$ is the gradient magnitude, and $\Theta$ is the gradient direction. The two 3×3 Sobel kernels, one for changes in the horizontal direction and one for the vertical direction, were convolved with the original image to calculate the approximations of the derivatives. To compute $G_x$ and $G_y$, the appropriate kernel (window) was moved over the input image, computing the value for one pixel and then shifting one pixel to the right; once the end of a row was reached, it moved down to the beginning of the next row. The results represent an edge by showing how abruptly or smoothly the image changes at each pixel and, hence, how the edge is likely to be oriented. A gradient value of 0 indicates a vertical edge that is much darker on its left side.
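As a brief worked example of Equations (1)-(3), the same gradients can be obtained with 2-D convolution; the conv2 calls below are an assumed, vectorized stand-in for the pixel-by-pixel loop of Listing Program 1, and the test image is any grayscale image.

<code>
% Sobel gradients, magnitude, and direction of a grayscale image I (Equations 1-3).
I  = double(imread('cameraman.tif'));      % built-in grayscale test image
Ky = [ 1  2  1;  0  0  0; -1 -2 -1];       % kernel for vertical changes (G_y)
Kx = [-1  0  1; -2  0  2; -1  0  1];       % kernel for horizontal changes (G_x)
Gx = conv2(I, Kx, 'same');                 % Equation (1)
Gy = conv2(I, Ky, 'same');
G     = sqrt(Gx.^2 + Gy.^2);               % Equation (2): gradient magnitude
Theta = atan2(Gy, Gx);                     % Equation (3): gradient direction (radians)
imshow(mat2gray(G));                       % normalized edge map
<\code>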
D. Feature Extraction
A process or method that reduces the dimensionality of the initial raw data while still accurately and completely depicting the initial data set is called feature extraction. In this study, feature extraction was performed by image processing using algorithms that detect features such as points in a digital image. Feature detection was used to calculate the RFA, MLAA, and AHI, which are explained separately as follows.

1) Rearfoot Angle (RFA)
The RFA was measured as the acute angle between the projections of two lines, as shown in Fig. 4(a). The four marked points [21] were the base of the calcaneus, the Achilles tendon attachment, the Achilles tendon center at the height of the medial malleoli, and the center of the posterior shank, 15 cm above the Achilles tendon center. The calculation process of the RFA is described in the following steps.
i. It requires a binary image; the user places 4 points on each edge of the calcaneus bone using the marker tools of the developed algorithm.
ii. The algorithm calculates and marks the center coordinate of the distance between each pair of marker points.
iii. The algorithm calculates the gradient value using a numerical gradient (Equation 4) to estimate the partial derivatives in each dimension from the known function values of step ii.

$$\nabla F = \frac{\partial F}{\partial x}\hat{i} + \frac{\partial F}{\partial y}\hat{j} \qquad (4)$$

iv. The angle (in radians) is calculated and converted to degrees. A short sketch of these steps is given after this list.
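As a sketch of steps i-iv, the calcaneus midline angle can be obtained from the two pairs of marked edge points through their midpoints and the gradient of the resulting midline, the same idea used in Listing Program 2 for the calcaneus line (the tibia midline is treated analogously, and the RFA follows from the two line angles). The coordinates and variable names below are illustrative only.

<code>
% Four user-marked points: rows 1-2 are the upper pair, rows 3-4 the lower pair ([x y], pixels).
pts  = [120 340; 160 338; 125 420; 158 423];        % illustrative coordinates
mid1 = (pts(1,:) + pts(2,:)) / 2;                   % ii. midpoint of the upper pair
mid2 = (pts(3,:) + pts(4,:)) / 2;                   %     midpoint of the lower pair
slope = (mid2(1) - mid1(1)) / (mid2(2) - mid1(2));  % iii. numerical gradient dx/dy of the midline
lineAngleDeg = atand(slope);                        % iv. line angle converted to degrees
disp(lineAngleDeg);
<\code>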
2) Medial Longitudinal Arch Angle (MLAA)
The MLAA is the angle formed by the midpoint of the medial malleolus (MM), the most prominent aspect of the navicular tuberosity (NT), and the most medial prominence of the first metatarsal head (MH) [5], measured at NT, as shown in Fig. 5. The calculation process of the MLAA is described in the following steps.
i. It requires a grayscale image; the user places the 3 points of MH, NT, and MM using the marker tools of the graphical user interface in MATLAB.
ii. The algorithm calculates the distances MM-NT, NT-MH, and MH-MM using the Euclidean distance, i.e., the distance between two points in Euclidean space (the relationship between the angle and the distances). The angle at NT (in radians) is calculated using Equation (5) and then converted to degrees; a sketch is given after this list. The descriptive classification of foot posture for the RFA and MLAA calculations is presented in Table I.

$$\cos(\mathrm{NT}) = \frac{a^{2} + b^{2} - c^{2}}{2ab}, \qquad (5)$$

where $a$ is the MM-NT distance, $b$ is the NT-MH distance, and $c$ is the MH-MM distance.
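A sketch of Equation (5) is given below: with the three marked points, the MLAA is the angle at NT obtained from the law of cosines, the same relation implemented in Listing Program 3. The coordinates are illustrative values only.

<code>
% MLAA at the navicular tuberosity (NT) from three marked points (law of cosines).
MH = [410 505];  NT = [298 472];  MM = [215 408];   % illustrative [x y] coordinates (pixels)
a = norm(MM - NT);                                  % distance MM-NT
b = norm(NT - MH);                                  % distance NT-MH
c = norm(MH - MM);                                  % distance MH-MM
cosNT   = (a^2 + b^2 - c^2) / (2*a*b);              % Equation (5)
MLAAdeg = acosd(cosNT);                             % angle at NT in degrees
disp(MLAAdeg);
<\code>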


3) Arch Height Index (AHI)
The AHI, a clinical measure used to assess static foot posture and arch height [22], is the ratio of the dorsum height at 50% of the foot length to the truncated foot length [23], [24]. The calculation process of the AHI is described in the following steps.
i. It requires a binary image; the user places 4 points using the marker tools to calculate AHI_total and AHI_instep (on the dorsum, the base, the most posterior point of the calcaneus, and the first metatarsophalangeal joint or front end of the foot).
ii. The algorithm determines the center coordinate of the total foot length and calculates AHI_total (Equation 6) and AHI_instep (Equation 7); a sketch is given after this list.

$$\mathrm{AHI_{total}} = \frac{\text{dorsum height}}{\text{total foot length}} \qquad (6)$$

$$\mathrm{AHI_{instep}} = \frac{\text{dorsum height}}{\text{truncated (instep) foot length}} \qquad (7)$$
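A minimal sketch of Equations (6) and (7) is shown below. The variable names and the numerical values are illustrative; the lengths are assumed to already be calibrated to centimeters, as produced by Listing Program 4, and the arch-group bounds quoted in the comment are those of [29] discussed in Section IV.

<code>
% Arch Height Index from three calibrated lengths (all in cm; illustrative values).
dorsumHeightCm  = 6.1;                            % dorsum (instep) height at 50% of the foot length
totalFootLenCm  = 25.4;                           % total foot length
instepFootLenCm = 18.6;                           % truncated (instep) foot length
AHI_total  = dorsumHeightCm / totalFootLenCm;     % Equation (6)
AHI_instep = dorsumHeightCm / instepFootLenCm;    % Equation (7)
% Arch groups reported in [29]: high-arched if AHI_instep >= 0.388,
% low-arched if AHI_instep <= 0.262, otherwise normal.
fprintf('AHI_total = %.3f, AHI_instep = %.3f\n', AHI_total, AHI_instep);
<\code>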
E. Classification
The types of foot posture (supination, neutral, or pronation) were determined from the DIP results by the RFA or MLAA rules presented in Table I; a small helper expressing the MLAA rule is sketched after the table.

TABLE I. CLASSIFICATION OF FOOT POSTURE (SUPINATION, NEUTRAL, PRONATION).

Type of test | Supination      | Neutral               | Pronation
RFA [24]     | RFA ≥ 5° varus  | 4° valgus to 4° varus | RFA ≥ 5° valgus
MLAA [27]    | MLAA > 150°     | 130° to 150°          | MLAA < 130°
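For example, the MLAA row of Table I maps directly onto a small helper function; the function name is illustrative and the bounds are exactly those of the table (the RFA row additionally needs the varus/valgus sign convention, which is omitted here).

<code>
% Classification of foot posture from an MLAA value (Table I, MLAA row).
function posture = classifyMLAA(mlaaDeg)
    if mlaaDeg < 130
        posture = 'pronation';     % MLAA < 130 degrees
    elseif mlaaDeg <= 150
        posture = 'neutral';       % 130 to 150 degrees
    else
        posture = 'supination';    % MLAA > 150 degrees
    end
end
<\code>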
Fig. 4. Illustration of (a) calculating the RFA with the developed algorithm in MATLAB and (b) the RFA from the anatomy of the right rear foot (tibia, talus, calcaneus) [25].

Fig. 5. Illustration of determining (a) the MLAA (MM, NT, MH) and (b) the AHI (instep height, instep/truncated foot length, total foot length).
III. RESULTS

As shown in Fig. 6, each participant stood on a black open-box placed on the platform. The black open-box has three sides: the base, the left side, and the right side. A black background was placed as a screen in front of the legs (for taking pictures of the rear foot) and next to the legs (for taking pictures of the inner-side foot) to obtain a foot image that contrasts with the background color. The lighting system was positioned to provide sufficient luminance for the camera.

Fig. 6. Photograph of a participant's feet on the standing platform, taken using the camera and lighting set.

The resulting pictures of the rear and inner-side foot are shown in Fig. 7(a). Each image was then processed by the pre-processing technique (the steps in Fig. 3) in order to obtain grayscale and binary images for the measurements of angle and length based on the RFA, MLAA, and AHI tests. Fig. 7(b) shows the image pre-processing using the developed algorithm.

1) Anthro-Posture v1.0 software
An algorithm was created in MATLAB to determine the foot posture from the RFA and/or MLAA tests. The AHI was used as additional information to support the results of the foot posture classification. The developed algorithm, named the Anthro-Posture v1.0 software, was built with a Graphical User Interface (GUI) and designed to be user-friendly. Fig. 8 shows the main menu of the software, where the user can choose one of the tests by clicking the corresponding button.

2) Validity Test
The angle and length measurements of the RFA, MLAA, and AHI variables were validated by specialists and by a comparison method. The specialists, doctors of medical rehabilitation in the Department of Medical Rehabilitation of Hasan Sadikin Hospital (RSHS) Bandung, validated the length and angle measurement results of this study using a ruler and arc, as shown in Table II. The mean errors of length and angle from those comparisons were 6.22% and 0.26%, respectively.

Another validation technique was comparing the angle results of the DIP with the Kinovea software. This software, a valid and reliable tool [25], [26], is one of the test methods to accurately measure an angle range. Table III shows the comparison results of Kinovea and the Anthro-Posture v1.0 software. The differences in angle measurements were caused by the resolution of Kinovea (its resolution is only 1 degree); hence, its mean error can be considered to be 1.74%.


Fig. 7. (a) Photographs of the rear and inner side of the standing foot; (b) results of each step of the image pre-processing (real image, RGB-to-grayscale conversion, noise filter, edge detection, normalization, threshold, median filter).

Fig. 8. The main menu of the Anthro-Posture v1.0 software.

TABLE II. ERRORS OF THE LENGTH AND ANGLE VALIDATIONS BETWEEN THE SPECIALISTS AND THE ANTHRO-POSTURE V1.0 SOFTWARE.

N  | Ruler (cm) | Software (cm) | Error (cm) | Arc (°) | Software (°) | Error (%)
1  | 6.1 | 6.1797 | 0.0797 | 140 | 140.7728 | 0.5520
2  | 5.1 | 5.2035 | 0.1035 | 125 | 125.5943 | 0.4754
3  | 5.9 | 5.9628 | 0.0628 | 152 | 152.1459 | 0.0960
4  | 6.0 | 6.0266 | 0.0266 | 128 | 128.3519 | 0.2749
5  | 4.8 | 4.9103 | 0.1103 | 138 | 138.9661 | 0.7001
6  | 4.9 | 5.0157 | 0.1157 | 146 | 146.1195 | 0.0818
7  | 5.2 | 5.2356 | 0.0356 | 142 | 142.1341 | 0.0944
8  | 4.9 | 4.9511 | 0.0511 | 139 | 139.2135 | 0.1536
9  | 4.2 | 4.2298 | 0.0298 | 145 | 145.2784 | 0.1920
10 | 4.6 | 4.6077 | 0.0077 | 157 | 157.0745 | 0.0475

TABLE III. COMPARISON RESULT OF THE ANGLE MEASUREMENTS OF THE KINOVEA SOFTWARE AND THE ANTHRO-POSTURE V1.0 SOFTWARE.

Tested angle | Anthro-Posture v1.0 result (°) | Kinovea measurement (°) | Error (°)
1 | 19.76 | 20 | 0.24
2 | 27.82 | 27 | 0.82
3 | 40.79 | 40 | 0.79
4 | 42.30 | 42 | 0.30
3) The Listing Program of Anthro-Posture v1.0 for the Anthropometric Tests
The listing program of the image pre-processing is shown in Listing Program 1, while the listing programs for each test chosen in the main menu, namely RFA, MLAA, or AHI, are shown in Listing Programs 2 to 4.

Listing program 1. Program of image pre-processing.

%import and resize image file
<code>
[FileName,Pathname]=uigetfile('*.jpg',sprintf('Pilih sampel untuk scan'));
if FileName==0
    return
end

fullName=fullfile(Pathname,FileName);
disp(fullName);
imdat=imread(fullName);
imdat=imresize(imdat,[842 1500]);
imdat=rgb2gray(imdat);
preProcessing(hObject, eventdata, handles,imdat);
<\code>

%Pre-processing
<code>


%function filter binary
function gambar=filterBinary(gambar)
gambar=bwareaopen(gambar,10);

%function HPF
function gambar=doHPF(gambar)
kernelFilter=[0 -1/4 0;-1/4 2 -1/4;0 -1/4 0];
gambar=imfilter(gambar,kernelFilter,'conv');

%function LPF
function gambar=doLPF(gambar)
kernelFilter=[1/9 1/9 1/9;1/9 1/9 1/9;1/9 1/9 1/9];
gambar=imfilter(gambar,kernelFilter,'conv');

function preProcessing(hObject, eventdata, handles,gambar)
% hObject    handle to selectFile (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% gambar     the image in matrix form

% Noise reduction
G = fspecial('gaussian',[5 5],5);
gambar=imfilter(gambar,G,'same');

% do High pass filter
gambar=doHPF(gambar);
C=double(gambar);

%Sobel Masking ( Edge Detection )
for i=1:size(C,1)-2
    for j=1:size(C,2)-2
        %Sobel mask for x-direction:
        Gx=((2*C(i+2,j+1)+C(i+2,j)+C(i+2,j+2))-(2*C(i,j+1)+C(i,j)+C(i,j+2)));
        %Sobel mask for y-direction:
        Gy=((2*C(i+1,j+2)+C(i,j+2)+C(i+2,j+2))-(2*C(i+1,j)+C(i,j)+C(i+2,j)));
        %The gradient of the image
        %B(i,j)=abs(Gx)+abs(Gy);
        gambar(i,j)=sqrt(Gx.^2+Gy.^2);
    end
end

% filtering: implement low pass filter
gambar=doLPF(gambar);

%Convert image to binary and LPF
gambar=im2bw(gambar,0.0600);
gambar=doLPF(gambar);

% noise reduction
gambar=filterBinary(gambar);
axes(handles.edgeDisplay);
imshow(gambar);
<\code>

Listing program 2. Program to measure the angle in RFA test.

%Plotmarker
<code>
% --- Executes on button press in plotPoint_Calc.
function plotPoint_Calc_Callback(hObject, eventdata, handles)
% hObject    handle to plotPoint_Calc (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.edgeDisplay);
hold on;
axis on;

counter1=1;
xSTJ1=zeros(1,2);
ySTJ1=zeros(1,2);

counter2=1;
xSTJ2=zeros(1,2);
ySTJ2=zeros(1,2);

for c=1:4
    [x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
    if mod(c,2)==1
        plot(x,y, 'g.', 'MarkerSize', 15, 'LineWidth', 1);
        xSTJ1(counter1)=x;
        ySTJ1(counter1)=y;
        counter1=counter1+1;
    elseif mod(c,2)==0
        plot(x,y, 'r.', 'MarkerSize', 15, 'LineWidth', 1);
        xSTJ2(1,counter2)=x;
        ySTJ2(1,counter2)=y;
        counter2=counter2+1;
    end
end

% middle point
global Xstjleg;
global Ystjleg;


Xstjleg=times(plus(xSTJ1,xSTJ2),1/2);
Ystjleg=times(plus(ySTJ1,ySTJ2),1/2);

plot(Xstjleg,Ystjleg, 'b.', 'MarkerSize', 15, 'LineWidth', 1);
<\code>

% --------
%Calculating RFA Angle
% ----------
<code>
% --- Executes on button press in getangle.
function getangle_Callback(hObject, eventdata, handles)
% hObject    handle to getangle (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global Xstjleg;
global Ystjleg;
disp('====RFA===');
disp(Xstjleg);
disp(Ystjleg);

%Calculating gradient of each marker point then take the average value
linemidcalc=line(Xstjleg,Ystjleg,'LineWidth',2);
meangradient_calca=mean(gradient([Xstjleg],[Ystjleg]));

%Convert gradient to degree
calca_angle=atan(meangradient_calca)*57.2957795131;

set(handles.meangradient_calca,'String',sprintf(num2str(meangradient_calca)));
set(handles.calca_angle,'String',sprintf(num2str(calca_angle)));
<\code>

Listing Program 3. Program to measure the angle in MLAA test.

%Plot marker for medial malleolus (MM), navicular
%tuberosity (NT), and metatarsal head (MH)
<code>
% --- Executes on button press in MH_PLOT.
function MH_PLOT_Callback(hObject, eventdata, handles)
% hObject    handle to MH_PLOT (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.display_image);
hold on;
axis on;
global XMH;
global YMH;

[x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
XMH=x;
YMH=y;

plot(x,y, 'b.', 'MarkerSize', 15, 'LineWidth', 1);

% --- Executes on button press in NT_PLOT.
function NT_PLOT_Callback(hObject, eventdata, handles)
% hObject    handle to NT_PLOT (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.display_image);
hold on;
axis on;
global XNT;
global YNT;

[x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
XNT=x;
YNT=y;

plot(x,y, 'r.', 'MarkerSize', 15, 'LineWidth', 1);

% --- Executes on button press in MM_PLOT.
function MM_PLOT_Callback(hObject, eventdata, handles)
% hObject    handle to MM_PLOT (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.display_image);
hold on;
axis on;
global XNM;
global YNM;

[x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
XNM=x;
YNM=y;
plot(x,y, 'g.', 'MarkerSize', 15, 'LineWidth', 1);
<\code>

%Calculating MLAA
<code>


% --- Executes on button press in getAngle.
function getAngle_Callback(hObject, eventdata, handles)
% hObject    handle to getAngle (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global XNM;
global YNM;
global XNT;
global YNT;
global XMH;
global YMH;

%variable relation between points
MH_COORDINATE=[XMH YMH];
NT_COORDINATE=[XNT YNT];
MM_COORDINATE=[XNM YNM];

J=[MM_COORDINATE;NT_COORDINATE];
K=[NT_COORDINATE;MH_COORDINATE];
L=[MH_COORDINATE;MM_COORDINATE];

%Finding Euclidean distance of each pair of points
M=pdist(J,'euclidean');
N=pdist(K,'euclidean');
O=pdist(L,'euclidean');

disp(M);
disp(N);
disp(O);

%Calculating cos value of NT
cos_NT=((M^2)+(N^2)-(O^2))/(2*M*N);
disp(cos_NT);

%Calculating MLAA degree
NT_Rad=acos(cos_NT);
NT_Deg=NT_Rad*57.2958;
<\code>

Listing Program 4. Program to measure AHI test.

%Calculating Height of Foot
<code>
% --- Executes on button press in VerDist.
function VerDist_Callback(hObject, eventdata, handles)
% hObject    handle to VerDist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.ImOut);
hold on;
axis on;

%Plot marker for start point and end point of foot height
counter1=1;
xSTJ1=zeros(1,5);
ySTJ1=zeros(1,5);

counter2=1;
xSTJ2=zeros(1,5);
ySTJ2=zeros(1,5);

for c=1:2
    [x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
    if mod(c,2)==1
        plot(x,y, 'g.', 'MarkerSize', 15, 'LineWidth', 1);
        xSTJ1(counter1)=x;
        ySTJ1(counter1)=y;
        counter1=counter1+1;
    elseif mod(c,2)==0
        plot(x,y, 'r.', 'MarkerSize', 15, 'LineWidth', 1);
        xSTJ2(1,counter2)=x;
        ySTJ2(1,counter2)=y;
        counter2=counter2+1;
    end
end

% middle point
global XVer;
global YVer;

XVer=times(plus(xSTJ1,xSTJ2),1/2);
YVer=times(plus(ySTJ1,ySTJ2),1/2);

plot(XVer,YVer, 'b.', 'MarkerSize', 15, 'LineWidth', 1);

plot1VerCor=[xSTJ1 ySTJ1];
plot2VerCor=[xSTJ2 ySTJ2];

%Find Euclidean distance of start point and end point
VerDist=[plot1VerCor;plot2VerCor];
VerDistPix=pdist(VerDist,'euclidean');

%Calibrating distance
global VerDistCm;


b = VerDistPix * 0.026458333;
VerDistCm = b-2.29938;

set(handles.VerDistCm,'string',num2str(VerDistCm))
<\code>

%Calculating Length (Total) of Foot
<code>
% --- Executes on button press in HorDist.
function HorDist_Callback(hObject, eventdata, handles)
% hObject    handle to HorDist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.ImOut);
hold on;
axis on;

%Plot marker for start point and end point of foot length
counter1=1;
xleg1=zeros(1,5);
yleg1=zeros(1,5);

counter2=1;
xleg2=zeros(1,5);
yleg2=zeros(1,5);

for c=1:2
    [x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
    if mod(c,2)==1
        plot(x,y, 'g.', 'MarkerSize', 15, 'LineWidth', 1);
        xleg1(1,counter1)=x;
        yleg1(1,counter1)=y;
        counter1=counter1+1;
    elseif mod(c,2)==0
        plot(x,y, 'r.', 'MarkerSize', 15, 'LineWidth', 1);
        xleg2(1,counter2)=x;
        yleg2(1,counter2)=y;
        counter2=counter2+1;
    end
end

% middle point
global Xhor;
global Yhor;

Xhor=times(plus(xleg1,xleg2),1/2);
Yhor=times(plus(yleg1,yleg2),1/2);
plot(Xhor,Yhor, 'b.', 'MarkerSize', 15,'LineWidth', 1 );

plot1HorCor=[xleg1 yleg1];
plot2HorCor=[xleg2 yleg2];

%Find Euclidean distance of start point and end point
HorDist=[plot1HorCor;plot2HorCor];
HorDistPix=pdist(HorDist,'euclidean');

%Calibrating distance
global HorDistCm;

a=HorDistPix*0.026458333;
HorDistCm = a-2.90794;

set(handles.HorDistCm,'string',num2str(HorDistCm))
<\code>

%Calculating Length (Instep) of Foot
<code>
% --- Executes on button press in InsDist.
function InsDist_Callback(hObject, eventdata, handles)
% hObject    handle to InsDist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.ImOut);
hold on;
axis on;

%Plot marker for start point and end point of foot instep length
counter1=1;
xins1=zeros(1,5);
yins1=zeros(1,5);

counter2=1;
xins2=zeros(1,5);
yins2=zeros(1,5);

for c=1:2
    [x,y]=ginputc(1, 'Color', 'r', 'LineWidth', 0.5);
    if mod(c,2)==1
        plot(x,y, 'g.', 'MarkerSize', 15, 'LineWidth', 1);
        xins1(1,counter1)=x;
        yins1(1,counter1)=y;
        counter1=counter1+1;
    elseif mod(c,2)==0
        plot(x,y, 'r.', 'MarkerSize', 15, 'LineWidth', 1);
        xins2(1,counter2)=x;


        yins2(1,counter2)=y;
        counter2=counter2+1;
    end
end

% middle point
global XIns;
global YIns;

XIns=times(plus(xins1,xins2),1/2);
YIns=times(plus(yins1,yins2),1/2);

plot(XIns,YIns, 'b.', 'MarkerSize', 15,'LineWidth', 1 );

plot1InsCor=[xins1 yins1];
plot2InsCor=[xins2 yins2];

%Find Euclidean distance of start point and end point
InsDist=[plot1InsCor;plot2InsCor];
InsDistPix=pdist(InsDist,'euclidean');

%Calibrating distance
global InsDistCm;

a=InsDistPix*0.026458333;
InsDistCm = a-2.90794;

set(handles.InsDistCm,'string',num2str(InsDistCm))
<\code>
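Listing Program 4 ends with the calibrated lengths. The final AHI ratios of Equations (6) and (7) can then be formed from the same global variables, as sketched below; this last step is not part of the listing as printed.

<code>
%Calculating AHI from the calibrated distances (sketch; not in the printed listing)
global VerDistCm;   % dorsum (instep) height in cm, from VerDist_Callback
global HorDistCm;   % total foot length in cm, from HorDist_Callback
global InsDistCm;   % truncated (instep) foot length in cm, from InsDist_Callback

AHI_total  = VerDistCm / HorDistCm;    % Equation (6)
AHI_instep = VerDistCm / InsDistCm;    % Equation (7)
disp([AHI_total AHI_instep]);
<\code>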

4) Anthropometric Test with RFA
The rearfoot angle was calculated by uploading the binary image into the Anthro-Posture v1.0 software and then clicking the <plotPoint Calcaneus> button to plot the four points at the upper and lower edges of the calcaneus line. The resulting gradient and angle of the tibia and calcaneus were displayed on the screen (Fig. 9). The classification of the foot posture of all participants was determined based on the angle, as presented in Table IV (p for pronation, n for neutral, and s for supination).

Fig. 9. The RFA test in the Anthro-Posture v1.0 software.

5) Anthropometric Test with MLAA
The medial longitudinal arch angle was calculated by uploading the grayscale image into the Anthro-Posture v1.0 software and then marking the three points of MH, NT, and MM on the image by clicking the <MH_PLOT>, <NT_PLOT>, and <MM_PLOT> buttons, respectively. The resulting MLAA was displayed on the screen after clicking the <GetAngle> button (Fig. 10). The classification of the foot posture of all participants was determined based on the angle, as presented in Table V. All participants were classified as having pronated or neutral feet based on the MLAA measurement. From the results, 70% of the participants had a neutral right foot and 90% had a neutral left foot.

Fig. 10. The MLAA test in the Anthro-Posture v1.0 software.

6) Anthropometric Test with AHI
The arch height index was calculated by uploading the binary image into the Anthro-Posture v1.0 software and then clicking the <HorDist> and <VerDist> buttons to mark the two pairs of points of the total foot length and instep length. Meanwhile, the instep arch height index used the instep (truncated) foot length instead of the total foot length [28].


The comparison between the instep height and the instep foot length is displayed in Fig. 11, while the calculated AHI of all participants is presented in Table VI (arch group: n for normal, l for lower).

Fig. 11. The AHI test in the Anthro-Posture v1.0 software.

TABLE IV. THE CLASSIFICATION RESULT OF RFA.

Subject-n | Rearfoot angle (°) Right | Left | Classification Right | Left
1 | 1.65  | 2.91  | n | n
2 | 6.00  | 1.36  | p | p
3 | 2.81  | 4.66  | n | n
4 | 15.43 | 8.05  | p | p
5 | 9.05  | 6.11  | p | p
6 | 3.93  | 10.07 | n | p
7 | 5.41  | 3.20  | p | n
8 | 3.73  | 2.38  | n | n
9 | 13.34 | 16.13 | p | p

TABLE V. THE CLASSIFICATION RESULT OF MLAA.

Subject-n | MLAA (°) Right | Left | Classification Right | Left
1  | 140.77 | 134.17 | n | n
2  | 128.75 | 131.89 | p | n
3  | 150.04 | 150.01 | n | n
4  | 125.59 | 131.82 | p | n
5  | 145.27 | 149.81 | n | n
6  | 157.07 | 151.86 | p | p
7  | 137.36 | 150.03 | n | n
8  | 146.11 | 152.35 | n | n
9  | 142.63 | 143.70 | n | n
10 | 139.23 | 141.02 | n | n

TABLE VI. THE CALCULATED AHI_TOTAL AND AHI_INSTEP OF THE INNER-SIDE FOOT.

Subject-n | Right foot: AHI_total | AHI_instep | Arch group | Left foot: AHI_total | AHI_instep | Arch group
1  | 0.18 | 0.27 | n | 0.21 | 0.27 | n
2  | 0.15 | 0.22 | l | 0.18 | 0.22 | l
3  | 0.23 | 0.32 | n | 0.24 | 0.34 | n
4  | 0.18 | 0.25 | l | 0.24 | 0.26 | l
5  | 0.17 | 0.24 | l | 0.17 | 0.24 | l
6  | 0.23 | 0.34 | n | 0.22 | 0.32 | n
7  | 0.21 | 0.29 | n | 0.19 | 0.28 | n
8  | 0.21 | 0.29 | n | 0.24 | 0.36 | n
9  | 0.24 | 0.33 | n | 0.22 | 0.30 | n
10 | 0.19 | 0.28 | n | 0.19 | 0.27 | n

IV. DISCUSSION

The Anthro-Posture v1.0 software has been created, validated, and tested completely in this study. The classification results of all the anthropometric tests can be studied further by comparing them with other works.

Based on the RFA test, 50% of all participants had a neutral right foot and 60% had a neutral left foot. According to the findings for RFA obtained with the goniometer device [5], the consistency of this method is lower compared to MLAA and other tools.

The MLAA test in this study resulted in more participants having a neutral foot. Hence, the difference in classification between the two tools was around 20-30%. Furthermore, of the 50% of participants having a pronated right foot with RFA, 40% had a neutral right foot with MLAA. On the other side, for the left feet, of the 40% of participants having a pronated left foot with RFA, 75% had a neutral left foot with MLAA. These results reinforce the findings of previous research [5], which stated that MLAA was the strongest uniplanar tool due to its higher reliability, good agreement on the steps for foot classification, and wider foot classification limits.

The AHI test is often used as supporting data for further observation. A foot is categorized in the high-arched group when the instep arch height index is at least 0.388 and in the low-arched group when the AHI is equal to or less than 0.262 [29]. Of all the participants, only 30% were included in the low-arched group. Associated with MLAA, 66% of the low-arched participants had pronated feet; associated with RFA, 100% of them had pronated feet. All of these AHI results are quite relevant because, based on the findings of [22], a low arch may result in a pronated foot, but not all pronated feet are of the low-arched type.

The test results of RFA, MLAA, and AHI have been validated by the specialists and by other comparison tools, using a ruler, arc, and the Kinovea software, in order to improve the accuracy. The mean errors of length and angle in this software were 6.22% and 0.26-1.74%, respectively.

Determination of the tibial and calcaneus points in this first algorithm was still carried out manually, the same as is done by doctors in their manual measurements using markers, goniometers, and arc [9], [13]. In the case of DIP, however, this manual step is a limitation that should be eliminated in a future version of the algorithm. Compared with other DIP research conducted by Lin et al. [14], the Anthro-Posture v1.0 software is not only capable of measuring the angle and length in images but also provides the classification results of the foot posture.

V. CONCLUSION

The anthropometric techniques are commonly used in the classification of foot types, but the shortcomings of carrying out the assessment with these uniplanar tools should be addressed in many ways. This study has demonstrated an algorithm developed in MATLAB to measure foot posture, named the Anthro-Posture v1.0 software. The advantages of this technique are providing statistical medical data in a shorter time and minimizing human error in measurement. In the future, this study can be improved to be used by doctors in obtaining large amounts of data for research needs.

REFERENCES

[1] M. N. Anas, "An instrumented insole system for gait monitoring and analysis," Int. J. Interact. Mob. Technol., vol. 10, no. 6, pp. 30–34, 2014.
[2] S. Morioka, M. Okita, Y. Takata, S. Miyamoto, and H. Itaba, "Effects of changes of foot position on Romberg's quotient of postural sway and leg muscles electromyographic activities in standing," J. Japanese Phys. Ther. Assoc., vol. 3, no. 1, pp. 17–20, 2000.
[3] J. Kongcharoen, S. Pruitikanee, S. Puttinaovarat, Y. Tubtiang, and P. Chankeaw, "Gamification smartphone application for leg physical therapy," Int. J. Online Biomed. Eng., vol. 15, no. 8, pp. 31–41, 2019.
[4] A. K. Buldt, G. S. Murley, P. Levinger, H. B. Menz, C. J. Nester, and K. B. Landorf, "Are clinical measures of foot posture and mobility associated with foot kinematics when walking?," J. Foot Ankle Res., vol. 8, no. 1, pp. 1–12, 2015.
[5] H. Menz and S. E. Munteanu, "Validity of 3 clinical techniques for the measurement of static foot posture in older people," J. Orthop. Sports Phys. Ther., vol. 3, no. 8, 2005.
[6] A. Keenan et al., "The Foot Posture Index: Rasch analysis of a novel, foot-specific outcome measure," vol. 88, pp. 88–93, Jan. 2007.
[7] J. Burns, A. Keenan, and A. Redmond, "Foot type and overuse injury in triathletes," vol. 95, no. 3, pp. 235–241, 2005.
[8] M. Razeghi and M. E. Batt, "Foot type classification: a critical review of current methods," Gait Posture, vol. 15, no. 3, pp. 282–291, 2002.
[9] B. Langley, M. Cramp, and S. C. Morrison, "Clinical measures of static foot posture do not agree," J. Foot Ankle Res., pp. 1–6, 2016.
[10] A. C. Redmond, J. Crosbie, and R. A. Ouvrier, "Development and validation of a novel rating system for scoring standing foot posture: The Foot Posture Index," Clin. Biomech., vol. 21, no. 1, pp. 89–98, Jan. 2006.
[11] G. S. Murley, H. B. Menz, and K. B. Landorf, "A protocol for classifying normal- and flat-arched foot posture for research studies using clinical and radiographic measurements," J. Foot Ankle Res., vol. 13, pp. 1–13, 2009.
[12] V. Erickson, A. U. Kamthe, and A. E. Cerpa, "Measuring foot pronation using RFID sensor networks," in Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, 2009, pp. 325–326.
[13] J. S. Lee, K. B. Kim, J. O. Jeong, N. Y. Kwon, and S. M. Jeong, "Correlation of Foot Posture Index with plantar pressure and radiographic measurements in pediatric flatfoot," Ann. Rehabil. Med., vol. 39, no. 1, pp. 10–17, 2015.
[14] C. H. Lin, C. C. Yeh, and Z. H. Qiu, "Assessment of subtalar joint neutral position: Study of image processing for rear foot image," vol. II, pp. 3–7, 2017.
[15] C. Lin, Z. Qiu, and C. Yeh, "Image processing for rear foot image evaluating leg and foot angles," Measurement, vol. 126, pp. 168–183, 2018.
[16] A. C. Redmond, J. Crosbie, and R. A. Ouvrier, "Development and validation of a novel rating system for scoring standing foot posture: The Foot Posture Index," vol. 21, pp. 89–98, 2006.
[17] T. McPoil and M. W. Cornwall, "Relationship between neutral subtalar joint position and pattern of rearfoot motion during walking," Foot Ankle, vol. 15, no. 3, 1994.
[18] M. Hill, R. Naemi, H. Branthwaite, and N. Chockalingam, "The relationship between arch height and foot length: Implications for size grading," Appl. Ergon., vol. 59, pp. 243–250, 2017.
[19] C. Lin, Z. Qiu, and C. Yeh, "Image processing for rear foot image evaluating leg and foot angles," Measurement, vol. 126, pp. 168–183, 2018.
[20] L. Tan and J. Jiang, "Image processing basics," in Digital Signal Processing (Second Edition), 2013.
[21] S. Gupta and S. G. Mazumdar, "Sobel edge detection algorithm," International Journal of Computer Science and Management Research, vol. 2, no. 2, pp. 1578–1583, 2013.
[22] M. T. Gross, "Intraexaminer reliability, interexaminer reliability, and mean values for nine lower skeletal measures in healthy naval midshipmen."
[23] X. Zhao, T. Tsujimoto, B. Kim, and K. Tanaka, "Association of arch height with ankle muscle strength and physical performance in adult men," vol. 34, no. 2, pp. 119–126, 2017.
[24] T. G. McPoil et al., "Effect of using truncated versus total foot length to calculate the arch height ratio," vol. 18, pp. 220–227, 2008.
[25] M. Saghazadeh, K. Tsunoda, and T. Okura, "Foot arch height and rigidity index associated with balance and postural sway in elderly women using a 3D foot scanner," Foot Ankle Online J., vol. 7, no. 4, 2014.
[26] C. M. Norris, "Lower limb motion during walking, running and jumping," in Managing Sports Injuries (Fourth Edition), 2011, doi:10.1016/B978-0-7020-3473-2.00011-3.
[27] A. Puig-Diví and C. Escalona-Marfil, "Validity and reliability of the Kinovea program in obtaining angles and distances using coordinates in 4 perspectives," 2019.
[28] C. Balsalobre-Fernández, "The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps," J. Strength Cond. Res., vol. 28, no. 2, pp. 528–533, 2014.
[29] R. J. Butler, H. Hillstrom, J. Song, C. J. Richards, and I. S. Davis, "Arch Height Index measurement system," J. Am. Podiatr. Med. Assoc., vol. 98, no. 2, pp. 102–106, Mar. 2008.
[30] M. T. H. Parash, H. Naushaba, A. Rahman, and S. C. Shimmi, "Types of foot arch of adult Bangladeshi male," vol. 1, no. 4, pp. 52–54, 2013.
