
Ergonomic Posture Analysis of Workers Using Artificial Intelligence and Computer Vision

Session 2017-2021

Group Members

Muhammad Jawad 17-IE-07

Usama Masood 17-IE-53

Advisor
Dr. Salman Hussain

Assistant Professor
Department of Industrial Engineering
University of Engineering and Technology Taxila
March 2021
Ergonomic Posture Analysis of Workers Using Artificial Intelligence and Computer Vision

Undertaking
We certify that the project work titled “Analysis of Ergonomic Posture of Worker Using the Applications of Artificial Intelligence and Computer Vision” is our own work. No portion of the work presented in this project has been submitted in support of another award or qualification, either at this institution or elsewhere. It is based purely on our own research, and where material has been used from other sources it has been properly acknowledged and referenced.

Muhammad Jawad 17-IE-07

Usama Masood 17-IE-53


Abstract

Human recognition based on image processing can be performed using several techniques, such as digital image processing, computer vision and artificial intelligence. This study focuses on tracking the human body and objects using the Microsoft Kinect and applying this to different applications. The degrees of freedom of the body joints are calculated using the sensor. The main objective is to perform an ergonomic analysis of an industrial worker, observing the working posture to check whether the worker is performing the task in a correct posture or not. For that purpose, complete image tracking of the worker is performed using the Kinect sensor, and the decision about correct posture is made using artificial intelligence. Real-time observations of the worker are taken in which straight, twisted, bent and walking/moving positions are recorded, and the risk values associated with these postures are calculated. The analysis also helps to determine the fatigue level of workers.

Acknowledgment
Alhamdulillah, thanks to Allah (SWT), who by His will gave us the opportunity to complete this Final Year Project, “Analysis of Ergonomic Posture of Worker Using the Applications of Artificial Intelligence and Computer Vision”. Firstly, we would like to express our deepest thanks to Dr. Salman Hussain, Assistant Professor at the Department of Industrial Engineering, UET Taxila, assigned as our project supervisor, who guided us throughout this project. We would also like to extend our thanks to all the professors and staff of the Industrial Engineering Department, UET Taxila.
Table of Contents
1 Understanding of the Applied Concept
1.1 Introduction
1.2 Problem Statement
1.3 Aims and Objectives
1.3.1 Aims
1.3.2 Objectives
1.4 Mapping of Project with Sustainable Development Goals (SDGs)
1.4.1 Good Health and Well-Being (SDG 3)
1.4.2 Industry, Innovation, and Infrastructure (SDG 9)
1.5 Summary
2 RELATED WORK
2.1 Literature Review
2.1.1 Kinect Sensor
2.1.2 Ergonomic Assessment using Kinect Sensor
2.2 Research Gap
2.3 Project Management
2.3.1 Division of Work
2.3.2 Work Schedule Plan
2.3.3 Gantt Chart
2.3.4 Estimated Project Cost
2.4 List of Software and Hardware
2.4.1 Hardware
2.4.2 Software
2.5 Modern Tools Used
2.6 Summary
3 METHODOLOGY
3.1 Experimental Design
3.2 Data Collection
3.3 Methodology Framework
3.4 Body Tracking using Kinect
4 ANALYSIS
4.1 Result and Discussion
4.1.1 Percent of Time in Posture
4.1.2 Percent of Risk in Posture
4.1.3 Observed Postures
4.1.4 Depth Image Tracking
4.2 Environment and Sustainability
4.3 Lifelong Learning
4.3.1 Process Before Applying Artificial Intelligence
4.3.2 Process After Applying Artificial Intelligence
4.3.3 Technology Improvement in Our Project
4.3.4 Image Processing
4.3.5 Other Applications
4.3.6 Engineer and Society
4.4 Recommendations and Conclusion
Appendices
Appendix A
Appendix B
Appendix C
LIST OF FIGURES

Figure 2-6 Scheduling with Gantt chart
Figure 3-1 Kinect device performing tracking
Figure 3-2 Skeleton image formed using Kinect
Figure 3-3 Depth image formed using Kinect
Figure 3-4 Face Recognition
Figure 4-1 Recording by Kinect Studio
Figure 4-2 Skeleton image of worker under examination
Figure 4-3 Depth image tracking using Processing

LIST OF TABLES

Table 2-1 Summary of different sensors used for motion analysis
Table 2-2 Summary of ergonomic assessment using Kinect sensor
Table 2-3 Research gap analysis table (ergonomics)
Table 2-4 Research gap on sensors
Table 2-5 Work distribution table
Table 2-6 Work schedule
Table 2-7 Estimated project cost
Table 2-8 List of hardware and their usage
Table 2-9 List of software and usage
Table 2-10 Modern tools used
Table 4-1 Percent time of legs, trunk and arms
Table 4-2 Percentage of postures in each risk category
Table 4-3 Observed values of postures
1 Understanding of the Applied Concept
1.1 Introduction

This chapter gives an insight into the project and provides an introduction to the Microsoft Kinect and to image tracking for posture analysis. In this project, a computer-vision-based application for tracking and monitoring human body movements is developed, aimed at ergonomic analysis of workers and object recognition using the Kinect sensor.
Instead of attaching different sensors to various parts of the body to track a person's posture, the Kinect sensor is more appropriate for this purpose, as it tracks the posture directly. It originally ships with the Xbox for gaming, but it can be used for various other applications, of which ergonomics is one. It works on the principle of computer vision, in which a stereo matching algorithm is used for depth image tracking. It consists of three cameras or sensors: a depth sensor for measuring the depth, i.e. the distance to the imaged object, plus RGB and infrared sensors.

Figure 1-1 Microsoft Kinect Sensor: Kinect 1
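The project's own tracking was done with the Microsoft Kinect SDK and Processing 3 (see Chapters 3 and 4). Purely as a minimal illustration of reading the Kinect 1 streams from Python, the sketch below assumes the open-source libfreenect wrapper (`freenect`) and NumPy are installed; this wrapper is not the toolchain used in the project.

```python
# Minimal sketch (not the project's actual pipeline): reading one depth frame
# and one RGB frame from a Kinect v1, assuming the open-source libfreenect
# Python wrapper `freenect` and NumPy are installed.
import freenect
import numpy as np

depth, _ = freenect.sync_get_depth()   # 480x640 array of raw 11-bit depth readings
rgb, _ = freenect.sync_get_video()     # 480x640x3 RGB image

# Raw depth values of 2047 mark pixels the sensor could not resolve.
valid = depth[depth < 2047]
print("Depth frame:", depth.shape, "closest raw reading:", valid.min())
print("RGB frame:", rgb.shape)
```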

1.2 Problem Statement

The applications of artificial intelligence and computer vision are used to perform an ergonomic analysis of the worker's body posture. The Kinect sensor, which is mostly used for gaming, can be used for tracking postures and objects. Using the Python language, we will read different body joint angles and then apply them to different applications such as posture analysis, object identification and gait analysis.
Depth of Knowledge Required (Attribute of CEP-4)

Knowledge of sensors is required for tracking objects, along with the ability to perform Kinect sensing using each camera, familiarity with Python programming and Processing 3 programming (Java), and detailed knowledge of computer vision and artificial intelligence.

1.3 Aims and Objectives

1.3.1 Aims

Our aim is to apply the concepts of artificial intelligence and computer vision to different applications by tracking with the Kinect sensor, particularly the ergonomic analysis of workers to reduce their fatigue level.

1.3.2 Objectives

The following objectives drive us towards achieving the project aim:

 Ergonomic analysis of the worker
 Object tracking using the Kinect sensor
 Finding the different angles of body joints using Python code
 Applying the concepts of computer vision and artificial intelligence to implement the system in different applications

1.4 Mapping of Project with Sustainable Development Goals (SDGs)

1.4.1 Good Health and Well-Being (SDG 3)

Our project, titled “Ergonomic Posture Analysis of Workers using Artificial Intelligence and Computer Vision”, is directly related to improving worker health by reducing the risk of injuries due to bad postures. In this project we apply ergonomic techniques to check whether the worker's posture is risk-free or not. If we find any risk of injury, we will improve the worker's workspace or correct the posture. This goal is studied further in Section 1.4.3 (workplace safety training), Section 2.1.2 (ergonomic assessment using Kinect sensor), Section 4.1.2 (percent of risk in posture) and Section 4.3.6 (engineer and society).

1.4.2 Industry, Innovation, and Infrastructure (SDG 9)

We are using artificial intelligence to obtain better results for ergonomic posture analysis, which is a technological modification in the field. We can obtain better angle measurements, and finding the angles of the worker's posture at different positions helps us assess the risk factor more accurately. Moreover, the other applications of this system, such as fatigue analysis and gait analysis, are innovations in the medical field. Using AI techniques improves the results, so it is also an innovation in the field of ergonomics. Details are discussed in Section 2.1.1 (Kinect sensor), Section 2.5 (modern tools used), Section 4.3.2 (process after applying AI) and Section 4.3.3 (technology improvement in our project).

1.5 Summary

This introductory chapter deals with the concept of human body tracking and object recognition using the Kinect. It provides an introduction to the Microsoft Kinect along with its different applications, and describes how different images can be formed using the different cameras of the Kinect: the infrared, depth and RGB cameras. The aim is to apply this to posture analysis for ergonomic assessment.

2 RELATED WORK
2.1 Literature Review

2.1.1 Kinect Sensor

A research paper named “Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications” was published in 2012 by Kourosh Khoshelham and Sander Oude Elberink. It describes the usage of the Kinect sensor for measuring the depth of a given object. In 2013, Enrique J. Fernandez-Sanchez, Javier Diaz and Eduardo Ros presented their research on the topic “Background Subtraction Based on Color and Depth Using Active Sensors”. In this paper, tracking using algorithms and a depth measurement sensor is presented: different depth-measuring sensors are taken and algorithms are developed to calculate the depth of an object. Developing an algorithm takes a lot of time, while using a sensor is much easier and more accurate.

Table 2-1 Summary of different sensors used for motion analysis

Author: Wenjun Zeng, University of Missouri (2012)
Title: Microsoft Kinect Sensor and Its Effect
Main Outcomes: Better analysis of the Kinect sensor and issues that occur when using the infrared sensor

Author: Kourosh Khoshelham and Sander Oude Elberink (2012)
Title: Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications
Main Outcomes: Usage and analysis of depth measurements of data within a closed workplace

Author: Enrique J. Fernandez-Sanchez, Javier Diaz and Eduardo Ros (2013)
Title: Background Subtraction Based on Color and Depth Using Active Sensors
Main Outcomes: Tracking using algorithms and a depth measurement sensor

Author: Alexandra Pfister, Alexandre M West, Shaw Bronner, Jack Adam Noah (2014)
Title: Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait analysis
Main Outcomes: The Kinect sensor can perform gait analysis better and more easily than the Vicon 3D apparatus, and is also cheaper

Author: Alfredo Patrizi, Ettore Pennestrì & Pier Paolo Valentini
Title: Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics
Main Outcomes: Cost benefits of using the Kinect sensor instead of the infrared sensor

Solution:

The Kinect is the most suitable sensor available; we can use it with ease and accuracy to obtain body angles and to track voices, colors and body movements. It is also less costly than other sensors and systems, so the Kinect sensor should be preferred.
2.1.2 Ergonomic Assessment using Kinect Sensor:

In 2013 a research paper named “Using Kinect sensor in observational methods for assessing postures at work” was published by Jorge Alcaide-Marzal, which examines the potential use of the Kinect range sensor in observational methods for assessing postural loads. The study shows that the sensor can detect the position of the joints at high sampling rates without attaching sensors or markers directly to the subject under study.

Summary

Table 2-2 Summary of Ergonomic Assessment using Kinect Sensor

Author: Martin Kampel and Rainer Planinc (2014)
Title: Ergonomic-Monitoring of Office Workplaces Using Kinect
Main Outcomes: Ergonomic analysis of workers doing work in a sitting posture

Author: Micheal Otto, Eva Lampen and Felix Auris (2019)
Title: Applicability Evaluation of Kinect for EAWS Ergonomic Assessment
Main Outcomes: Ergonomic study of a moving person using some standard postures

Author: Janaka Ruwanpura and Ayman Habib (2012)
Title: Application of Microsoft Kinect Sensor for Tracking Construction Worker
Main Outcomes: Tracking of workers according to their job by tracking their cap colors

Author: Catalina Mocanu and Irina Mocanu (2013)
Title: Human Body Posture Recognition Using a Kinect Sensor
Main Outcomes: Tracking 2-D motion of human postures

Author: Pierre Plantard, Hubert Shum and Frank Multon (2016)
Title: Validation of an Ergonomic Assessment Method Using Kinect Data in Real Workplace Conditions
Main Outcomes: Tracking human posture and implementing the RULA method on it

Table 2-2 shows a summary of work done by researchers on ergonomic assessment using the Kinect sensor, including tracking the 2-D motion of human postures. The main outcomes of each researcher are summarized to allow comparison and identification of the research gap.

Solution:

The Kinect is the most suitable sensor available; we can use it with ease and accuracy to obtain body angles and to track voices, colors and body movements. It is also less costly than other sensors and systems, so the Kinect sensor should be preferred. By doing fatigue analysis and gait analysis, the injury factor is reduced and fatigue in the muscles can be reduced. Any posture mistake can be observed from the recorded data and corrected, which in turn improves the posture.

2.2 Research Gap:

Table 2-3 Research Gap Analysis Table ( Ergonomics )

Author: Martin Kampel and Rainer Planinc (2014)
Title: Ergonomic-Monitoring of Office Workplaces Using Kinect
Main Outcomes: Ergonomic analysis of workers doing work in a sitting posture
Research Gap: Does not explain other postures

Author: Micheal Otto, Eva Lampen and Felix Auris (2019)
Title: Applicability Evaluation of Kinect for EAWS Ergonomic Assessment
Main Outcomes: Ergonomic study of a moving person using some standard postures
Research Gap: Does not explain all joints and angles

Author: Janaka Ruwanpura and Ayman Habib (2012)
Title: Application of Microsoft Kinect Sensor for Tracking Construction Worker
Main Outcomes: Tracking of workers according to their job by tracking their cap colors
Research Gap: Does not track postures of workers

Author: Catalina Mocanu and Irina Mocanu (2013)
Title: Human Body Posture Recognition Using a Kinect Sensor
Main Outcomes: Tracking 2-D motion of human postures
Research Gap: Does not explain the 3-D motion of human posture

Author: Pierre Plantard, Hubert Shum and Frank Multon (2016)
Title: Validation of an Ergonomic Assessment Method Using Kinect Data in Real Workplace Conditions
Main Outcomes: Tracking human posture and implementing the RULA method on it
Research Gap: Does not have any specific ideal position; it just relates the total obtained score with the total RULA score of the ideal case

Table 2-3 shows the research gap analysis based on ergonomics and posture analysis. It lists the main outcomes of each researcher who has worked on ergonomics, and the research gap is identified by noting how their work differs from tracking human posture using the Kinect by calculating the degrees of freedom of the different links of the body, and by comparing how many links theory says can be measured using the Kinect against the actual number obtained while tracking.

Table 2-4 Research gap on sensors

Author Title Main Outcomes Research Gap


Wenjun Zeng Microsoft Kinect Better analysis of Kinect sensor has
University of Sensor and Its Effect Kinect sensor and many diverse
Missouri (2012) issues occurred in applications i.e; 3D
tracking, 2D tracking,
using infrared sensor
Gait analysis
Kourosh Khoshelham Accuracy and Usage and Analysis of Using 2 Kinect sensor
and Sander Oude Resolution of Kinect depth measurements gives accurate and 3
Elberink (2012) Depth Data for Indoor of data within a closed dimensional results
Mapping Applications workplace

Enrique J. Fernandez- Background Tracking using Kinect sensor can also


Sanchez , Javier Diaz Subtraction Based on algorithms and depth do voice recognition
and Eduardo Ros Color and Depth measurement sensor which can also be
helpful in
(2013) Using Active Sensors
identification of
workers
Alexandra Comparative abilities Kinect sensor can Using 2 Kinect
Pfister, Alexandre M of Microsoft Kinect perform Gait analysis sensors can perform
West, Shaw and Vicon 3D motion better and in an easier more accurate Gait
analysis
Bronner, Jack Adam capture for gate way than Vicon 3D
Noah (2014) analysis apparatus and is cheap
also
Alfredo Patrizi, Ettore Comparison between Cost Benefits of using It is a multi-tasking
Pennestrì & Pier low-cost marker-less Kinect sensor instead sensor and can have
Paolo Valentini and high-end marker- of using infrared other applications like
biomechanics lab etc.
(2016) based motion capture sensor
systems for the
computer-aided
assessment of
working ergonomics

Table 2-4 shows the research gap analysis based on sensors that can be used for tracking both the human body and objects. It shows how the sensors differ in cost with respect to functionality. Here the major comparison is between the infrared and the Kinect sensor.

2.3 Project Management

2.3.1 Division of work

The project tasks are distributed between both group members to utilize their abilities efficiently and effectively.
Team members

1. Muhammad Jawad
2. Usama Masood

Table 2-5 Work Distribution Table

Sr# | Tasks performed by Muhammad Jawad | Tasks performed by Usama Masood | Performed in group

1 | Internet searching about Kinect sensor and AI | Internet searching about Kinect sensor and AI | Project selection
2 | Found and studied related research papers | Found and studied related research papers | Literature review and research gap
3 | Tracking of different angles and joints of the human body | Tracking of different angles and joints of the human body | Human tracking
4 | Tracking of machines and objects using the Kinect sensor | Tracking of machines and objects using the Kinect sensor | Object tracking
5 | Half report writing | Half report writing | Presentation slides

Table 2-5 shows the work distribution for the project. It summarizes which tasks were performed by each group member individually and which were performed by both members together; in short, it shows the division of work.

2.3.2 Work Schedule Plan

Table 2-6 shows the work schedule, listing the different activities and their durations. The expected time for each activity is given in weeks, i.e. the time required to complete that activity.

Table 2-6 work schedule

Activity Time
Project selection 1 week
Discussion with supervisor 1 week
Literature review 3 weeks
Research gap 2 weeks
Analysis of research papers 1 week
Methodology / framework 2 weeks
Selection of sensors 1 week
Hardware design 2 weeks
Experimentation 2 weeks
Data collection 2 weeks
Coding with Python 1 week
Tracking 2 weeks
Analysis by applying it to different applications 2 weeks
Applying artificial intelligence techniques 2 weeks
Report writing 2 weeks
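If the activities in Table 2-6 are carried out one after another, their durations add up to the 26-week project completion time quoted in Section 2.6. The short sketch below simply performs that arithmetic check (the dictionary keys are the activity names from the table).

```python
# Arithmetic check for Table 2-6: sequential durations sum to 26 weeks,
# matching the completion time stated in Section 2.6.
durations = {
    "Project selection": 1, "Discussion with supervisor": 1,
    "Literature review": 3, "Research gap": 2,
    "Analysis of research papers": 1, "Methodology / framework": 2,
    "Selection of sensors": 1, "Hardware design": 2,
    "Experimentation": 2, "Data collection": 2,
    "Coding with Python": 1, "Tracking": 2,
    "Analysis on different applications": 2,
    "Applying AI techniques": 2, "Report writing": 2,
}
print("Total project duration:", sum(durations.values()), "weeks")  # 26 weeks
```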
Extent of Stakeholder Involvement and Level of Conflicting Requirements (Attribute of CEP-6)

The stakeholders for the final product will be industrial or construction workers who work in specific postures. They also include the organization's top management.

2.3.3 Gantt Chart

Figure 2-6 shows the Gantt chart of all the activities along with their durations. It is helpful because it gives an overall picture of the project with its start and end dates. Using this chart we can find the current position of the project, i.e. how much work is done and what remains to be done, and track how closely we are following the schedule.

Figure 2-6 Scheduling with Gantt chart


2.3.4 Estimated Project Cost

Using all the possible sources, an approximate cost analysis of the project is highlighted below:

Table 2-7 Estimated Project Cost

Material Cost in PKR

Kinect Sensor 5,000
Adapter and attachment cables 500
Total project cost 5,500

2.4 List of Software and Hardware

2.4.1 Hardware

Table 2-8 List of hardware and their usage

Sr. No | Name | Use

1 | Kinect Sensor | Image tracking, face recognition, voice tracking, skeleton tracking, video gaming
2 | Power adapter | Used as a connection between the laptop and the Kinect sensor to process the data taken through the sensor
3 | Laptop | Used to run the different programs

2.4.2 Software

Table 2-9 List of software and usage

Sr. No | Name | Use

1 | Python | Used to apply data mining and cloud computing techniques through coding
2 | Microsoft Kinect SDK | Software used to gather the information taken from the Kinect sensor

2.5 Modern Tools Used

Modern tools can improve every area of practice, and the best part is that they are easily accessible. Modern practice management streamlines processes, saving a great deal of time and therefore money. The following modern tools helped us in this project:

Table 2-10 Modern tools used

Purpose | Technique

Mechanical design and analysis | Computer Aided Design
Programming for image tracking | Python, Processing 3
Product inspection | Computer Vision
Product inspection | Digital Image Processing
Equipment selection and comparisons | Analytic Hierarchy Process (AHP)
Artificial-intelligence-based product classification | Deep Learning
2.6 Summary

This chapter gives an insight into the related work done by eminent researchers in the fields of ergonomics, computer vision and artificial intelligence. It presents the research done on the Kinect sensor for image tracking and its applications, especially tracking the human body. The project completion time, calculated using the Gantt chart and PERT, is 26 weeks.

Consequences (Attribute of CEP-7)

This project, which analyzes the ergonomic posture of workers using computer vision, will result in finding the angles, or degrees of freedom, of the different body links and joints. It will be used for ergonomic assessment to reduce the fatigue level of workers, and for learning the skills of computer vision and artificial intelligence and applying them to a wide variety of applications.

3 METHODOLOGY
3.1 Experimental Design

To observe body posture, we will take different samples by visiting different industries and departments and identifying workers working in awkward positions. Image tracking of each worker will then be performed using the Kinect sensor, with Python code as the input to the sensor, which will operate according to the given instructions. To make better use of the system, we will also apply it to other applications, such as object tracking, checking the bowling action of a bowler, and measuring the fatigue level of a player.

3.2 Data Collection

Data will be collected by tracking the posture of workers using the different cameras of the Kinect sensor. Python code will be used to feed input to the sensor, so tracking will be performed according to the application and the data collected. Using the sensor, the different angles of the body joints will be calculated and recorded, and analysis will be performed to apply the results according to the application.
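As a minimal sketch of the joint-angle calculation mentioned above, assuming the 3-D joint-centre coordinates have already been read from the Kinect skeleton stream (the joint names and coordinates below are illustrative, not recorded project data):

```python
# Angle at a joint from three 3-D joint centres, e.g. the elbow angle formed
# by the shoulder-elbow and elbow-wrist segments.
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c."""
    a, b, c = map(np.asarray, (a, b, c))
    u, v = a - b, c - b
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example with made-up coordinates (metres, Kinect camera space):
shoulder, elbow, wrist = (0.2, 0.5, 2.0), (0.25, 0.25, 2.0), (0.45, 0.20, 2.0)
print("Elbow angle:", round(joint_angle(shoulder, elbow, wrist), 1), "degrees")
```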

3.3 Methodology Framework

After the literature review, the project was carefully divided into three parts with five phases:

Phase 1: Development of the structure for the overall project
Phase 2: Selection of an appropriate sensor
Phase 3: Writing Python code for posture detection
Phase 4: Experimentation and tracking using the Kinect sensor
Phase 5: Implementation of computer vision for postural analysis

[Methodology framework flowchart, showing: conceptual design; selection of infrared or Kinect sensor; selection of programming language; image tracking by Kinect; angle reading of body joints; tracking using computer vision; data collection; image manipulation, preparation and augmentation; integration of an AI model (CNN model training); implementation with Kinect; motion detection; tracking to find degrees of freedom; ergonomic analysis using Python; improvement in the posture.]

1. The Kinect is a game-controller technology introduced by Microsoft in November 2010. This automated real-time worker tracking system provides an opportunity to track a construction worker's location and movements in a specified indoor work area.
2. We will use the Kinect sensor for the postural analysis of workers using computer vision techniques.
3. The main focus is to read the different angles of the body joints with the help of the Kinect sensor camera. The object is placed in front of the sensor, which detects the motion of the object or body.
4. The programming language we will use for input to the Kinect sensor is Python, chosen because it is easier than other programming languages.
5. Once we are able to read the angles of the body joints using Python code, we can apply the system to many applications other than postural analysis.
6. We can use it to test the fatigue of a football player, and we can also use it for object tracking.
7. We can also use it to examine the sitting posture of an interviewee during an interview to check their confidence level.
8. Another application is gait analysis; the bowling action of a bowler can also be checked.
9. We are therefore not limited to one application; the system can be applied to many applications.

3.4 Body Tracking using Kinect

In the initial phase, tracking of the human body is performed using the Kinect. For this purpose the Kinect device is placed at an appropriate height and distance from the object or person. Different types of images and postures can be analyzed using each camera of the Kinect, i.e. depth image, infrared image, face recognition, voice recognition and skeleton image formation. Tracking can be performed in both seated and standing positions by moving the body parts in front of the camera.

Figure 3-1 Kinect device performing tracking


The Kinect has two cables: a USB cable attached to the computer and another attached to the power supply. Drivers first have to be installed to connect the Kinect to the PC. Once the drivers are installed successfully, the yellow light of the sensor starts blinking and the red light of the camera turns on, as shown in Figure 3-1.

Figure 3-2 Skeleton image formed using Kinect

For skeleton image formation you need to stand or sit at an appropriate distance from the sensor. Its main purpose is to show the movement of the different body joints, and how the degrees of freedom change as movements are performed in front of the sensor.

Figure 3-3 Depth image formed using Kinect

Figure 3-3 shows a depth image formed with the observer in a seated position. A depth image shows how far away the object is by measuring its distance using the depth camera. As the object moves closer to the sensor the image turns darker, and as the object moves away from the sensor the image becomes brighter.
Figure 3-4 Face Recognition

Figure 3-4 shows face tracking using the Kinect. Its main purpose is to detect the face of the person in front of the sensor. It is mainly used for security purposes and other face recognition applications.

4 ANALYSIS
4.1 Result and Discussion:

Postural analysis of a worker is performed as a sample to collect data on different working postures. The Kinect sensor is adjusted to an appropriate angle and distance so that it can easily capture the data. The worker in front of the sensor performs his task, which can include twisting, bending, standing on one leg, moving/walking and other postures, and the Kinect records all of these postures. The observation frequency of the sensor is 5 per second, meaning it records 5 observations every second. In this sample analysis, the worker works with varying postures and readings are taken over an observation time of 36 seconds; in total, 182 postures were observed. The worker is carrying no or only a small load, so the load is taken as below 10 kg.

Observed postures: 182

Observation frequency: 5 per second

Observation time: 36 seconds

Load: below 10 kg

4.1.1 Percent of time in posture

Table 4-1 Percent time of legs, trunk and arms

LEGS
On both legs straight 39.56%
On one straight leg 27.47%
On one knee bent 9.89%
On two knees bent 20.33%
Kneeling on one or both legs 1.1%
Walking or moving 1.1%

TRUNK
Straight 19.23%
Bent 26.92%
Twisted 0%
Bent/twisted 53.3%

ARMS
Two over shoulder 75.82%
One over shoulder 22.53%
Below shoulder 1.1%

Table 4-1 shows the percentage of time spent in the different postures recorded by the sensor, divided into three body regions: legs, trunk and arms. For the legs the maximum percentage is 39.56%, showing that the worker spends most of the time standing straight on both legs and the least time walking or kneeling on one or both legs. For the trunk the maximum percentage is 53.3%, showing that the worker works mostly with a bent/twisted trunk. For the arms, the maximum percentage, 75.82%, corresponds to both arms over the shoulder.
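To illustrate how percentages like those in Table 4-1 follow from the per-frame posture codes, the sketch below counts the share of observations in each code value. The code labels are OWAS-style assumptions, and the sample list repeats only the ten rows shown later in Table 4-3, so the printed shares will not match the full 182-observation dataset.

```python
# Minimal sketch: time-in-posture percentages from per-frame posture codes.
from collections import Counter

# Each observation is (trunk_code, arms_code, legs_code, load_code);
# these ten rows are the sample shown in Table 4-3.
observations = [
    (4, 1, 5, 1), (4, 1, 5, 1), (4, 1, 5, 1), (4, 1, 5, 1), (4, 1, 5, 1),
    (4, 1, 5, 1), (2, 2, 4, 1), (2, 1, 4, 1), (4, 2, 5, 1), (4, 2, 5, 1),
]

def percent_time(codes):
    """Percentage of observations spent in each code value."""
    counts = Counter(codes)
    total = len(codes)
    return {code: round(100 * n / total, 2) for code, n in counts.items()}

trunk_codes = [obs[0] for obs in observations]
legs_codes = [obs[2] for obs in observations]
print("Trunk code share (%):", percent_time(trunk_codes))
print("Legs code share (%):", percent_time(legs_codes))
```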
4.1.2 Percent of risk in posture

Table 4-2 Percentage of postures in each risk category

Risk category  Number of postures  Percentage of postures

Risk 1  24  13.19%
Risk 2  100  54.95%
Risk 3  23  12.64%
Risk 4  35  19.23%

Table 4-2 shows the percentage of postures in each risk category. The total number of postures in Risk 1 is 24, i.e. 13.19% of the worker's postures are normal and involve no risk. Most of the time is spent in Risk 2: 54.95% of the postures are slightly harmful, meaning corrective action should be taken during the next regular review of working methods. Risk 3, which is distinctly harmful, accounts for 12.64% of the postures, and corrective action should be taken for these as soon as possible. The extremely harmful postures (Risk 4) account for 19.23%, and immediate corrective action must be applied to them.

Global risk (min 1, max 4): 2.38 (slightly harmful)
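The figures in Table 4-2 are internally consistent: the category percentages and the global risk score of 2.38 both follow from the posture counts, as the short arithmetic check below shows.

```python
# Arithmetic check for Table 4-2: percentages and the global risk score
# derived from the number of postures per risk category.
counts = {1: 24, 2: 100, 3: 23, 4: 35}
total = sum(counts.values())                      # 182 observed postures

for risk, n in counts.items():
    print(f"Risk {risk}: {n} postures = {100 * n / total:.2f}%")

global_risk = sum(risk * n for risk, n in counts.items()) / total
print(f"Global risk: {global_risk:.2f}")          # ~2.38 -> slightly harmful
```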

4.1.3 Observed postures

Table 4-3 Observed values of postures

Code Number  Trunk  Arms  Legs  Load  Risk
1 4 1 5 1 4
2 4 1 5 1 4
3 4 1 5 1 4
4 4 1 5 1 4
5 4 1 5 1 4
6 4 1 5 1 4
7 2 2 4 1 4
8 2 1 4 1 4
9 4 2 5 1 4
10 4 2 5 1 4
Table 4-3 shows the observed postures; only 10 of the 182 observations are listed. It gives the risk code for the different parts of the body. A legs value of 5 indicates that the worker spends most of the working time standing with one knee bent, which is an extremely risky posture. A trunk value of 4 indicates a bent and twisted posture involving high risk.

Figure 4-1 Recording by Kinect Studio

Figure 4-1 shows Kinect Studio, which is used for recording. It allows developers to record and play back Kinect data. We can record both the color and depth data, which can be used for further analysis in the future. While the worker's posture is being tracked, Kinect Studio can be used to record the posture and all of the tracking that is performed.
Figure 4-2 Skeleton image of worker under examination

Figure 4-2 shows the skeleton image of a person in a sitting position. The skeleton image is formed using 20 body joints, labeled with joint names and numbers, and the coordinates of the joint centers are recorded. For analysis, the number of joints included in the tracking can be reduced according to our requirements; for example, if we want to focus on specific joints, only those joints are considered in the tracking.
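The 20 joints referred to above correspond to the skeleton joints exposed by the Kinect v1 SDK. The sketch below lists those standard joint names and shows how a subset could be selected for a focused analysis; the particular selection is illustrative, not the project's actual choice.

```python
# The 20 skeleton joints tracked by the Kinect v1 SDK (as in Figure 4-2).
KINECT_V1_JOINTS = [
    "HipCenter", "Spine", "ShoulderCenter", "Head",
    "ShoulderLeft", "ElbowLeft", "WristLeft", "HandLeft",
    "ShoulderRight", "ElbowRight", "WristRight", "HandRight",
    "HipLeft", "KneeLeft", "AnkleLeft", "FootLeft",
    "HipRight", "KneeRight", "AnkleRight", "FootRight",
]

# Illustrative subset for a trunk/right-arm posture analysis.
SELECTED = {"ShoulderCenter", "Spine", "HipCenter",
            "ShoulderRight", "ElbowRight", "WristRight"}

tracked = [j for j in KINECT_V1_JOINTS if j in SELECTED]
print(len(KINECT_V1_JOINTS), "joints available;", len(tracked), "kept:", tracked)
```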
Depth of Analysis Required (Attribute of CEP-3)

Depth of analysis is required to analyze the tracking of human bodies and objects using the Kinect sensor: tracking with the infrared, depth and RGB cameras of the sensor, computer and machine vision for extracting information for real-time inspection, and finding the degrees of freedom of the body joints using the sensor.

4.1.4 Depth Image Tracking:

Depth is a representation of a surface and free space: it shows how close objects are. In the case of human body recognition, if a person stands in front of the Kinect sensor, the depth image of the person becomes brighter as he moves closer to the sensor and darker as he moves away from the camera. It gives a dense representation of the imaged objects. A depth image is therefore an image channel in which each pixel encodes the distance between the image plane and the corresponding object in the RGB image. We used the Processing 3 programming language, which is used to create images, animations and interactions, and ran the code to obtain the depth image shown in Figure 4-3. The code is given in the Appendix and is taken from Shiffman's openKinect for Processing examples. The library used is SimpleOpenNI, which has features for skeleton tracking and gesture recognition.

Figure 4-3 Depth image tracking using Processing
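The project's depth-image rendering was done with the Processing sketch given in the Appendix. As a rough Python analogue (an assumption, not the project's code), the following renders a Kinect v1 depth frame as a grayscale image in which closer objects appear brighter, assuming the `freenect` wrapper, NumPy and OpenCV are installed.

```python
# Rough Python analogue of the Processing depth-image sketch: map raw Kinect
# depth readings to grayscale so that closer pixels are brighter.
import freenect
import numpy as np
import cv2

depth, _ = freenect.sync_get_depth()        # raw 11-bit depth values (0-2047)
depth = depth.astype(np.float32)
depth[depth >= 2047] = 2047                 # 2047 marks unresolved pixels

# Invert and scale so that small distances map to bright pixels.
gray = (255 * (1.0 - depth / 2047.0)).astype(np.uint8)

cv2.imshow("Kinect depth (brighter = closer)", gray)
cv2.waitKey(0)
cv2.destroyAllWindows()
```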

4.2 Environment and Sustainability

A brief discussion is given here of the societal and environmental impacts of this project. In industry, many workers perform specific manual work for seven to eight hours in a particular posture. If a worker performs the work in a wrong posture for a prolonged period of time, it may cause serious health problems such as musculoskeletal disorders or back and neck injuries, and fatigue also builds up in a shorter period of time. As worker performance is directly related to the productivity of the organization, the work environment and conditions should be made suitable for the workforce.

In this project we focus mainly on posture analysis of the worker by detecting his motion using the sensor and then comparing it with standard posture values; the worker's fatigue can thus be reduced by correcting the posture. Similarly, the system can be used for security purposes such as face detection and object identification.

Consequences for Society and the Environment (Attribute of CEA-5)

Our project has a major impact on society and the environment, as it benefits the health of industrial workers by reducing the fatigue level during work through an improved working posture.

4.3 Lifelong Learning

4.3.1 Process Before Applying Artificial Intelligence:

 Analyzing worker’s posture


 Giving values according to posture
 Calculating RULA/REBA scores
 Concluding results

4.3.2 Process After Applying Artificial Intelligence:

 Sensing the worker’s posture


 Values obtained with final RULA/REBA score
 Concluding results

The difference between doing the ergonomic study with the sensor and without it shows that, by using the sensor and artificial intelligence techniques, we performed the work more precisely and in less time. The other innovation in our project is real-time image processing using the Microsoft Kinect software. Our project uses artificial intelligence for analyzing the worker's posture.
4.3.3 Technology Improvement in Our Project:

In our project:
 We used the Kinect sensor, which merges the results of the depth sensor, motion sensor and voice recognition and presents the processed images directly on the computer screen. The nodes are shown in a skeletal view and the motion can easily be seen on screen.
 The results obtained are then used to determine the angle of each joint and to find the scores assigned.
 After obtaining the angles, the values are copied to MS Excel. These values are then used to find the RULA/REBA scores (a simplified sketch of this step is shown after this list).
 These scores are then used to conclude whether the worker has a bad working posture or whether it is acceptable to work in that posture.
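As a simplified illustration of the angle-to-score step above, the sketch below maps an upper-arm flexion angle to the basic RULA upper-arm score. The thresholds follow the standard RULA ranges for the upper arm, adjustments (raised shoulder, abduction, arm support) are omitted, and the example angles are made up rather than taken from the project's Excel sheet.

```python
# Simplified angle-to-score step: basic RULA upper-arm score from the
# flexion angle in degrees (negative values = extension).
def rula_upper_arm_score(flexion_deg):
    """Basic RULA upper-arm score; posture adjustments are omitted."""
    if -20 <= flexion_deg <= 20:
        return 1          # 20 deg extension to 20 deg flexion
    if flexion_deg < -20 or flexion_deg <= 45:
        return 2          # >20 deg extension, or 20-45 deg flexion
    if flexion_deg <= 90:
        return 3          # 45-90 deg flexion
    return 4              # >90 deg flexion

# Example angles (degrees), as they might be exported from the tracking step:
for angle in [10, 30, 60, 95, -25]:
    print(angle, "->", rula_upper_arm_score(angle))
```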

4.3.4 Image Processing:

Image processing is a method of performing operations on an image in order to obtain an enhanced image or to extract useful information from it. It is a type of signal processing in which the input is an image and the output may be an image or the characteristics/features associated with that image. The purpose of early image processing was to improve the quality of the image; it was aimed at improving the visual effect for human viewers. In such image processing, the input is a low-quality image and the output is an image with improved quality.

4.3.5 Other Applications:

There are many other applications we can address by slightly changing our program, for example:
 Fatigue analysis of a worker, by checking his posture and relating it to the posture values obtained before performing the work
 Confidence level of an interviewee, by checking his way of sitting and comparing it with an ideal case
 Biomechanics of a player, analyzed by comparing it with an ideal situation
4.3.6 Engineer and society:

Artificial intelligence is one of the most advanced fields today, and our project, an ergonomic study using artificial intelligence, is part of it. Ergonomics is a field in which we design tasks to reduce the risk of injury. Our project can be applied at any workplace to find wrong working postures. This technique is used to improve workers' health and reduce their injury risk, so it is highly applicable in industry.

4.4 Recommendations and Conclusion

In this project, tracking of the human body is performed using the Kinect sensor, a motion-sensing input device that works on a stereo matching algorithm and is used in applications such as object detection, robotics and remote sensing. The goal was to locate the positions of the major body joints, shown in the skeleton image generated by the Kinect sensor, and then to calculate the degrees of freedom of the body joints as the worker changes posture while working. The Microsoft Kinect SDK and programming languages such as Python and Processing 3 are used to perform the basic tracking. The main objective was to apply this to industrial workers to check whether the working posture is correct or not. As it is necessary for workers to work in a safe environment, the posture must be correct to reduce the risk level and fatigue. The worker's risk score is calculated by observing his posture in front of the sensor while he performs the task in different postures, with observations recorded by the sensor over time. The percentage of risk in each posture is then calculated for the legs, trunk and arms, and an analysis is performed to check which posture contributes most to the risk factor.

Skeleton tracking with the Kinect is performed using Python and shows 20 body joints; the depth image is formed using the Processing 3 language. As our project is related to health and safety, it is recommended to propose solutions to the problems identified for the safety of industrial workers. Once the risk score is calculated and the affected body joints are detected, a correct posture is recommended to the workers. Industries should implement this by changing the working posture, providing suitable rests and setting work at an appropriate height. Beyond industrial workers, this project can be applied to other applications in future research, such as object detection, postural analysis of interview candidates by observing their posture, and observing the bowling action of a player. In summary, computer vision is used in the form of Kinect tracking, and the analysis is performed using artificial intelligence.

Appendices
Appendix A

OBE CEP Attributes

Attribute | Complex Problem

Preamble
 Tracking of human body and objects using the Kinect sensor

Range of Conflicting Requirements
 Analysis of the ergonomic posture of the worker
 Analysis using each camera of the Kinect sensor
 Coding using different programming languages, particularly Python

Depth of Analysis
 Analysis to be performed for tracking the human body and objects using the Kinect sensor
 Tracking using the infrared, depth and RGB cameras of the sensor
 Computer and machine vision for extracting information for real-time inspection
 Finding the degrees of freedom of body joints using the sensor

Depth of Knowledge Required
 Sensors for tracking objects
 Kinect sensing using each camera
 Python programming
 Processing 3 programming (Java)
 Computer vision
 Artificial intelligence

Familiarity of Issues
 Object tracking using Python
 Getting familiar with certain programming languages for coding purposes

Extent of Stakeholder Involvement and Level of Conflicting Requirements
 Stakeholders for the final product will be industrial or construction workers working in specific postures; they also include the organization's top management

Consequences
 Ergonomic assessment to reduce the fatigue level of workers
 Learning the skills of computer vision and artificial intelligence
 Applying them to a wide variety of applications

Interdependence
 The project constitutes 4 phases, each of which has several activities, and each stage is linked with the others
Appendix B

Range of Complex Engineering Activities (CEA)

Attribute | Complex Activities

Preamble
 Tracking of human body and objects using the Kinect sensor

Range of Resources
 The resources required for the project include the Kinect sensor and connecting wires to connect it with a computer, money for buying the Kinect sensor, and the availability of people (workers) to perform the tracking experiments

Level of Interaction
 Tracking using the infrared, depth and RGB cameras of the sensor
 Computer and machine vision for extracting information for real-time inspection
 Finding the degrees of freedom of body joints using the sensor

Innovation
 Kinect sensing using each camera
 Python programming for object tracking
 Processing 3 programming (Java)
 Computer vision
 Artificial intelligence

Consequences for society and the environment
 Our project has a major impact on society and the environment, as it benefits the health of industrial workers by reducing the fatigue level during work through an improved working posture

Familiarity
 Previously, work was done on tracking using the Kinect, but only for recognition purposes; we will use the Kinect not only to track the human body but also objects

Appendix C

Program Learning Outcomes (Mapping)

PLOs | FYP

Engineering Knowledge: An ability to apply knowledge of mathematics, science, engineering fundamentals and an engineering specialization to the solution of complex engineering problems.
Chapter 1:
• From the problem statement, aims and objectives, most likely the scope of the project.
• How the project chapters progress.

Problem Analysis: An ability to identify, formulate, research literature, and analyze complex engineering problems, reaching sustained conclusions using first principles of mathematics, natural science, and engineering science.
Chapter 2:
• About the basics of the process/problem in question, the scope of the work, and what mathematical/simulation/statistical model or related theory is used.
• Ways to tackle the project and relevant theory related to the problem.
• Conclusions linked with the objectives, whether met or not.

Environment and Sustainability: An ability to understand the impact of professional engineering solutions in societal and environmental contexts and demonstrate knowledge of and need for sustainable development.
Chapter 3 (last paragraph):
• Understanding the impact of standards of quality, health and environment included in this chapter, and how the design/model/framework/project work fits the existing standards related to the environment.
• Whether the project can successfully address the societal arena, and whether there are conflicts of interest, discussed in relation to the project.

Individual and Teamwork: An ability to work effectively, as an individual or in a team, in multifaceted and/or multi-disciplinary settings.
During the project phase:
• From the portfolio.
• Viva and presentation.
• From the Gantt chart or Microsoft Access.

Environment and Society: An ability to apply reasoning informed by contextual knowledge to assess societal, health, safety, legal and cultural issues, and the consequent responsibilities relevant to professional engineering and the solution of complex engineering problems.
Chapter 3:
• How the project is linked with legal and cultural issues, i.e. whether any codes or standards related to the project are available which have been used or cited.
• What safety and health measures were taken when experiments were performed (with details in the project) and implemented at the workplace (with pictures).

Project Management: An ability to demonstrate management skill and apply engineering principles to one's own work, as a member and/or leader in a team, to manage projects in a multidisciplinary environment.
At various locations:
• Table of contents.
• Methodology of work, Gantt chart and timeline.
• From the portfolio, events assigned, and marks deductions if any event is delayed.
• Presentation and viva as a group member and as team leader.

Communication: An ability to communicate effectively, orally as well as in writing, on a complex engineering activity with the engineering community and with society at large, such as being able to comprehend and write effective reports and design documentation, make effective presentations, and give and receive clear instructions.
In all chapters:
• Use of professional styles, formats, language and milestones as per the methodology, with the various items reflected in their proper locations.
• A proper table of contents, clear pictures, and color pictures where required.

Lifelong Learning: An ability to recognize the importance of and purpose of lifelong learning in the broader context of innovation and technological developments.
At various locations:
• Description of the modern tools used, their importance, and how they help lifelong learning.
• Future work listed in points: what else needs to be done to improve the project.
• How this project helps in learning beyond the courses studied during the entire degree program.
