Occupational Ergonomics 10 (2011/2012) 83–101

DOI 10.3233/OER-2012-0194
IOS Press

The Rapid Office Strain Assessment (ROSA):


Validity of online worker self-assessments
and the relationship to worker discomfort
Michael Sonnea,b,∗ and David M. Andrewsc
a Department of Kinesiology, McMaster University, Hamilton, Ontario, Canada
b LeadErgonomics Consulting Services, Windsor, Ontario, Canada
c Department of Kinesiology, University of Windsor, Windsor, Ontario, Canada

Abstract. The purpose of this study was to determine if office workers were capable of using an online version of the Rapid
Office Strain Assessment (ROSA) tool to accurately assess musculoskeletal disorder risk factors in their own offices, and to see
whether online training can reduce worker-reported discomfort. Fifty-five participants completed a four-week program in which they
assessed their own office simultaneously with a trained observer, and either received or did not receive feedback on their
performance. Significant differences were found between worker- and observer-reported ROSA final scores, and for the mouse
and keyboard section, with workers underestimating these risk factors on average, compared to the trained observer. Worker and
observer assessments of the chair, monitor and telephone were not significantly different but were significantly correlated (R
values of 0.60 and 0.48). There were a greater number of significant correlations between worker-reported ROSA final scores
and total body discomfort (3 instances) compared to observer-reported relationships (1 instance). Feedback appeared to have a
detrimental effect on worker-assessment accuracy, and the relationship between discomfort and ROSA scores. Mean discomfort
decreased across the four weeks of the study (up to a 51.6% decrease), as did ROSA final scores (3.9 to 3.5). Additional work is
required to improve the validity of worker-reported scores in all sections of ROSA, but self-assessments of office workstations
using the current ROSA online application do show promise in terms of assisting workers to decrease risk factors related to
musculoskeletal disorders, and decrease discomfort levels.

Keywords: Office, computer, checklist, worker-assessment, online training, feedback

1. Introduction

Musculoskeletal disorders (MSD) are the number one source of lost time injuries in Ontario, and
contribute to over $12 billion in indirect and direct costs to Ontario employers per year [1]. Risk factors
related to musculoskeletal disorders in office work include sustained non-neutral postures of the upper
limbs [2], prolonged static sitting while using the computer [3], awkward postures of the head and
neck [4], and increased muscular activity in the upper back and shoulders [5]. These risk factors have a
large effect on the number of musculoskeletal disorders reported every year, as over 60% of Canadian
workers require the use of a computer to perform their job tasks [6].
Attempts to proactively control these risk factors in the office have primarily come in the form of
training and ergonomic assessments [7]. The most effective methods of office ergonomics training have


Address for correspondence: Michael Sonne, Department of Kinesiology, McMaster University, 1280 Main Street West,
Hamilton, Ontario, L8S 4L8, Canada. Tel.: +1 519 996 3746; E-mail: [email protected].

1359-9364/11/12/$27.50  2011/2012 – IOS Press and the authors. All rights reserved
84 M. Sonne and D.M. Andrews / Validity of online worker self-assessments and the relationship to worker discomfort

involved the participant as an active member in the training, thereby allowing him/her to make their
own workstation modifications [8]. Training and additional assessment recommendations in ergonomics
can be made by using initial risk factor screening tools, such as RULA [9] and REBA [10]. However,
these tools are designed to be general enough to apply to multiple tasks, and do not necessarily apply
to the specific risk factors found in an office environment. While these tools have been extensively
validated, not all of the validation was conducted on computer workstations, leaving questions about the
applicability of the action levels proposed in these posture assessment tools as they pertain to computer
work.
The Rapid Office Strain Assessment (ROSA) [11] is a pen and paper checklist that was developed to
quickly determine if an office workstation requires additional assessment or intervention. ROSA is based
on the CSA standards for Office Ergonomics (CSA-Z412), and highlights MSD risk factors identified
through extensive research specific to office and computer work. The risk factors incorporated into the
tool are organized into three subsections: chair, monitor and telephone, and mouse and keyboard.
These subsections highlight the risk factors unique to each component of the office workstation, and
weight risk scores based on the CSA-Z412, as well as previous research (Fig. 1). The scores recorded
in each subsection are then combined to achieve a ROSA final score, indicative of the overall risk of
musculoskeletal discomfort, as a result of the configuration of the office. Initial research by Sonne et
al. [11] found a significant relationship between worker-reported discomfort and ROSA final scores.
Further analysis revealed that office workstations which were assessed to have a ROSA final score of 5
or higher were associated with increased worker-reported discomfort. Based on this work, a ROSA final
score of 5 was proposed to be a reasonable action level to indicate that further evaluation or intervention
is needed for office workstations.
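
To make the scoring structure concrete, the sketch below (Python) illustrates how subsection scores feed into a final score and how the proposed action level of 5 would be applied. It is a minimal sketch only: the chart lookup is a placeholder standing in for the published ROSA scoring charts (Fig. 1), and the function names are hypothetical.

```python
# Illustrative sketch of the ROSA scoring structure. The lookup logic below is
# a placeholder, NOT the published ROSA scoring charts (see Fig. 1).

ACTION_LEVEL = 5  # proposed cut-off: a final score of 5 or higher warrants follow-up [11]


def rosa_final_score(chair: int, monitor_phone: int, mouse_keyboard: int) -> int:
    """Hypothetical stand-in for the ROSA scoring charts.

    In the published tool the subsection scores are combined through scoring
    charts; a simple max() is used here only to reflect that the final score
    is highly reflective of the highest subsection score.
    """
    peripherals = max(monitor_phone, mouse_keyboard)  # placeholder for chart lookup
    return max(chair, peripherals)                    # placeholder for chart lookup


def needs_followup(final_score: int) -> bool:
    """True if the workstation should receive further evaluation or intervention."""
    return final_score >= ACTION_LEVEL


if __name__ == "__main__":
    score = rosa_final_score(chair=4, monitor_phone=3, mouse_keyboard=6)
    print(score, needs_followup(score))  # -> 6 True
```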
A limitation of ROSA is that experts are still required to complete the initial screening assessments,
which adds costs to the workplace through the hiring of ergonomic consultants.
Additionally, ROSA may act as an effective screening tool, indicating which office workstations are in
need of immediate changes in order to reduce worker discomfort. However, relying solely on expert
consultants to provide this information would be very time consuming and costly for many workplaces.
The effectiveness of training on risk factor reduction in office ergonomics has previously been consid-
ered in the literature. Bohr [8] found that a participatory approach, in which workers were instructed how
to adjust aspects of their own office, was the most effective in improving the workstation, compared to a
traditional lecture-based training approach. Participatory approaches allow workers to receive feedback
from instructors and have been shown to be beneficial in improving training performance [12]. If the
right type of feedback is administered in the right way and at the right time, workers can then apply this
feedback to the task in question in order to improve their performance the next time the task is performed.
If workers could be trained to perform their own ROSA assessments in an online training module, then
the initial screening process would be much quicker and less expensive. Therefore, the purposes of this
study were to determine if office workers were capable of using an online version of the Rapid Office
Strain Assessment (ROSA) tool to accurately assess musculoskeletal disorder risk factors in their own
offices, and to see whether online training can reduce worker-reported discomfort associated with office work.
The following four research questions were of interest and guided this investigation:
1. Are ROSA subsection and final scores reported by office workers using the online version of the
tool comparable to those determined by a trained observer for the same workstations?
2. What is the impact of directed expert feedback and one month of weekly assessments on the
agreement between trained observer- and worker-reported ROSA scores?
3. What are the relationships between worker- and trained observer-reported ROSA scores and worker-
reported discomfort scores?
4. Is an office ergonomic training protocol, using ROSA online, effective in reducing musculoskeletal
discomfort in office workers?

Table 1
Mean (SE), maximum and minimum anthropometric and demographic information for participants in the
feedback (FB) and no feedback (NoFB) Groups

                                          Feedback (n = 27)            No Feedback (n = 28)
                                          Mean (SE)    Max     Min     Mean (SE)    Max     Min
Age (years)                               37.7 (2.1)   55      23      39.4 (2.1)   59      23
Males (n)                                 6                            9
Females (n)                               21                           19
Height (cm)                               166.0 (0.8)  187.9   157.5   167.4 (2.7)  188.0   150.0
Body mass (kg)                            71.3 (8.7)   118.2   50      73.1 (4.3)   100.9   45.9
Years at company (years)                  9.7 (1.8)    25      0.8     9.8 (2.5)    43      0.5
Years at job (years)                      8.8 (2.1)    25      0.8     7.3 (1.6)    43      0.5
Initial whole body discomfort (/1620)*    57.9 (13.5)  270     0       44.2 (13.4)  265     0
University of Windsor (n)                 11                           11
Private construction company (n)          9                            11
School board (n)                          4                            1
Not-for-profit organization (n)           3                            5

*Note: the maximum score on the Cornell University Discomfort Questionnaire [14] is 1620.

2. Methods

2.1. Participants

Participants were recruited from the administrative staff at a private construction company, a school
board’s administrative office, a University of Windsor office, and the regional office of a national not-for-
profit organization (Table 1). To be included in this study, workers had to use a computer workstation
for at least 50% of their normal workday, use the same computer workstation during every workday, and
not have received ergonomic training within the previous year. Fifty-nine office workers were initially
recruited for the study and consented to participate. The procedures were approved by the Research
Ethics Board at the University of Windsor. During the course of the experiment, 4 participants dropped
out due to vacations, illness or prior commitments. Thus, 55 participants completed all 4 weeks of the
study. Participants reported their height, body mass, age, time at company, time at job, and initial level
of discomfort one week prior to the start of data collection. Participants were then assigned to one of
two groups (those who would receive feedback, and those who would not) so that they would be evenly
distributed between the groups based on these variables (Table 1). Finally, participants were asked to
refrain from buying new office equipment or replacing any of their existing furniture during the course
of the four weeks of the study.

2.2. Procedures

Training was performed using an online version of the Rapid Office Strain Assessment (outlined in
Section 2.3.1). This training consisted of two primary components – an assessment module, and an
adjustment module. The goal of this training was to give participants access to resources on how they
could assess and make adjustments to their existing furniture that they felt were necessary throughout
the course of the study.
As this was an initial examination into this type of online training, a training protocol of 4 weeks was
chosen to see if the results warranted further study. Weekly data collection was conducted to assess the
feasibility of such a training and assessment program. Participants received an initial training session
where they were instructed on how to use the ROSA online application. Participants registered their
username and account within the ROSA application, and completed an online form on initial discomfort
levels and biographical data (age, sex, height, body mass, years at current job, and years at the current
company). The workers in each experimental group assessed their own workstation, had a trained
observer assess their workstation, and filled out a discomfort questionnaire. The Feedback (FB) Group
received feedback on their performance from the trained observer, while the No Feedback (NoFB) Group
did not.

2.3. Worker assessment

The worker assessments were conducted once per week using the online ROSA application. Times
for the weekly assessments were scheduled either through personal contact onsite or through email, so
that the trained observer and participants did their assessments at the same time.

2.3.1. ROSA online training module


The ROSA online application contains the same risk factor identification information found in the
original ROSA tool [11]. The chair, monitor and telephone, and mouse and keyboard subsections from
ROSA were duplicated in the online version so that the participants and the trained observers used
equivalent risk factor diagrams during their assessments (Fig. 1). A sample screenshot of ROSA online
is provided in Fig. 1A for the chair subsection that assesses height. A copy of the corresponding pen and
paper checklist form for the chair height subsection is included in Fig. 1B. For all workstation subsections
covered in ROSA, risk factors in the online version were presented as text, graphics and live action in
video, with an audio narrative.
Workers navigated through the online ROSA training, selecting the risk factors that most accurately
applied to their current workstation set up. Selections were made by clicking the buttons and check
boxes corresponding to the levels of the risk factors, and the ROSA scores were automatically calculated
by the program. Workers were shown their ROSA scores, as well as information related to interpreting
the score, after the completion of their assessment. Upon logging in or completing an assessment, the
workers could also view the results from their previous assessments, thereby showing them if their scores
had increased or decreased throughout the assessment process.
The online version of ROSA was written in the PHP hypertext processor language (www.php.net),
integrating a MySQL database to track and display user information. The online training module can be
found at http://www.leadergonomics.com/rosa.
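
The application itself was written in PHP with a MySQL backend, as noted above. Purely as an illustration of the kind of record keeping this implies (storing each weekly assessment and letting a worker review their score history), a minimal Python/sqlite3 sketch is given below; the table layout, column names, and function names are hypothetical and are not taken from the actual application.

```python
# Hypothetical sketch of assessment tracking, loosely analogous to the PHP/MySQL
# implementation described above. Table and column names are invented for
# illustration only.
import sqlite3

conn = sqlite3.connect("rosa_demo.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS assessments (
        user_id        TEXT,
        week           INTEGER,
        chair          INTEGER,
        monitor_phone  INTEGER,
        mouse_keyboard INTEGER,
        final_score    INTEGER,
        completed_at   TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")


def save_assessment(user_id, week, chair, monitor_phone, mouse_keyboard, final_score):
    """Store one completed weekly assessment."""
    conn.execute(
        "INSERT INTO assessments (user_id, week, chair, monitor_phone, mouse_keyboard, final_score) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (user_id, week, chair, monitor_phone, mouse_keyboard, final_score),
    )
    conn.commit()


def score_history(user_id):
    """Return (week, final_score) pairs so a worker can see whether scores rose or fell."""
    rows = conn.execute(
        "SELECT week, final_score FROM assessments WHERE user_id = ? ORDER BY week",
        (user_id,),
    )
    return list(rows)
```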

2.4. Trained observer assessment

A trained observer performed an assessment of the office workstation at the same time as the worker
assessments each week. The two trained observers who performed the assessments were graduate
students in the field of ergonomics and biomechanics, who had previously provided ergonomic training
and assessments in a consulting role to various private and public companies. Instead of using the online
version of ROSA, the trained observers completed a paper or spreadsheet-based version of ROSA (as
in [11]). During the course of the study, a subset of workstations (n = 14) were assessed simultaneously
by the two observers, and Intra-Class Correlation Coefficients (ICCs) were calculated to determine inter-
rater reliability. ICCs of 0.69 (chair), 0.91 (monitor and telephone), 0.87 (mouse and keyboard) and 0.87
(final score) were high in magnitude [13] and were comparable to previous results using ROSA [11].
This indicated that the use of ROSA by two observers for this study was statistically appropriate.

Fig. 1. Screenshot of the ROSA online application – chair height subsection (A). Risk factors are presented as text, graphics
and live action in video with an audio narrative. The tracking menu is to the left of the risk factors, and allows the participant
to view their progress through the assessment. This can be compared to what an ergonomist would use in the pen and paper
ROSA checklist (B) [11].
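
As an indication of how the inter-rater reliability check reported in this section could be computed, a minimal Python sketch using the pingouin package is shown below; the paper does not state which software was actually used, the example scores are hypothetical, and the appropriate ICC form would need to be selected from the output.

```python
# Minimal sketch of an ICC calculation for the two trained observers. Data are
# hypothetical; pingouin reports several ICC forms, and the form matching the
# original analysis would need to be chosen.
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "workstation": list(range(14)) * 2,                 # 14 workstations scored twice
    "observer": ["A"] * 14 + ["B"] * 14,                # the two trained observers
    "final_score": [5, 3, 6, 4, 2, 7, 5, 3, 4, 6, 5, 2, 3, 4,
                    5, 4, 6, 4, 3, 7, 5, 3, 4, 5, 5, 2, 3, 4],
})

icc = pg.intraclass_corr(data=df, targets="workstation",
                         raters="observer", ratings="final_score")
print(icc[["Type", "ICC", "CI95%"]])
```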

2.5. Trained observer feedback

For participants in the Feedback Group (Table 1), verbal feedback was given by the trained
observer on the accuracy of their self-assessments, based on the observer's expert evaluation. The trained observer
indicated which assessments were incorrect and how they should have been scored. This feedback
occurred after the participant completed their assessment, but before they completed their discomfort
questionnaire (Section 2.6). To ensure that feedback was given consistently, one of the trained observers
was assigned to the Feedback (FB) Group.

2.6. Discomfort questionnaire

The Cornell University Discomfort Questionnaire [14] contains self-report information on discomfort
across 18 different body parts, which is further evaluated on the frequency of discomfort, the severity of
discomfort, and the degree of work interference that the discomfort causes. An adapted version of the
discomfort questionnaire was completed online by participants after they finished their assessments each
week. Localized discomfort scores were calculated by multiplying the body part’s discomfort frequency,
severity, and work interference value. The maximum discomfort score for this questionnaire is 1620 [14].
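
The localized scoring just described (frequency × severity × work interference, summed over 18 body parts) can be sketched as follows; the rating-to-weight mapping shown is the commonly used Cornell questionnaire weighting and is assumed here, since the adapted online version is not reproduced in the paper. With 18 body parts it yields the stated maximum of 18 × 10 × 3 × 3 = 1620.

```python
# Sketch of the discomfort scoring: frequency x severity x work interference per
# body part, summed over 18 body parts. The weights below are the commonly used
# Cornell questionnaire values (assumed here, not quoted from the paper).

FREQUENCY_WEIGHT = {
    "never": 0, "1-2 times last week": 1.5, "3-4 times last week": 3.5,
    "once every day": 5, "several times every day": 10,
}
SEVERITY_WEIGHT = {
    "slightly uncomfortable": 1, "moderately uncomfortable": 2, "very uncomfortable": 3,
}
INTERFERENCE_WEIGHT = {
    "not at all": 1, "slightly interfered": 2, "substantially interfered": 3,
}


def localized_discomfort(frequency: str, severity: str, interference: str) -> float:
    """Discomfort score for a single body part."""
    return (FREQUENCY_WEIGHT[frequency]
            * SEVERITY_WEIGHT[severity]
            * INTERFERENCE_WEIGHT[interference])


def total_discomfort(responses: dict) -> float:
    """Sum of localized scores over all reported body parts (maximum 1620 for 18 parts)."""
    return sum(localized_discomfort(*answers) for answers in responses.values())


if __name__ == "__main__":
    example = {
        "neck": ("several times every day", "moderately uncomfortable", "slightly interfered"),
        "lower back": ("3-4 times last week", "slightly uncomfortable", "not at all"),
    }
    print(total_discomfort(example))  # -> 43.5
```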

2.7. Workstation modification videos

The workstation modification videos included in ROSA online for each subsection (e.g. Fig. 1) were
filmed in generic offices in a local company prior to data collection. Modifications that could be
made without costing the company additional money to purchase new equipment were emphasized
(such as adding a rolled up towel to the back of a chair to add lumbar support). Upon completion of
the discomfort questionnaire each week, all participants in both groups had access to the workstation
modification videos and literature provided in ROSA online. Participants were asked to try and make
changes to their workstation based on the deficiencies in their current setup (as indicated by conducting
their assessment) and these videos. At the end of the study, feedback was given to all participants on
how to adjust their workstations to optimally suit their work habits and body types.

2.8. Data analysis

2.8.1. Experimental groups


To ensure that the distribution of participants between groups was comparable for all anthropometric
(height and body mass) and demographic information (time at company, time at job, initial level of
discomfort), participants were purposefully assigned and a one-way ANOVA was used to assess Group
differences (alpha set at 0.05). This process occurred after the initial training session, but prior to week
1 of the data collection.

2.8.2. Research question #1 and #2


To determine if worker-assessed ROSA scores differed from those determined by a trained observer, a
2 (Assessment Type: worker and observer) x 2 (Groups: FB, NoFB) x 4 (Time: week 1, 2, 3, 4) mixed
ANOVA was performed on the dependent variables (ROSA chair, monitor and telephone, mouse and
keyboard, and final scores). The between-subject factor was Group and the two within-subject factors
were Assessment Type and Time. Alpha was set at 0.05 for all comparisons. Pearson Product Moment
Correlations were used to determine the relationship between worker and trained observer ROSA final
scores. An R value of less than 0.1 was considered low, 0.3 to 0.5 was considered moderate, and greater
than 0.5 was taken to be indicative of a strong positive relationship between variables [15]. Significant
main effects of Time were further analysed with a Tukey’s HSD post hoc test.
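
As an illustration of the correlation step, the sketch below computes a Pearson r between worker- and observer-reported final scores and labels it using the interpretation bands described above (after Cohen [15]); the data values are hypothetical, and the 0.1 to 0.3 band, which the text does not explicitly label, is marked accordingly.

```python
# Sketch of the correlation analysis between worker- and observer-reported ROSA
# final scores, with interpretation bands taken from the text above. Example
# data are hypothetical.
from scipy.stats import pearsonr


def interpret_r(r: float) -> str:
    """Interpretation bands as described in Section 2.8.2 (after Cohen [15])."""
    r = abs(r)
    if r > 0.5:
        return "strong"
    if r >= 0.3:
        return "moderate"
    if r < 0.1:
        return "low"
    return "between low and moderate"  # the 0.1-0.3 band is not labelled in the text


worker = [4, 3, 5, 6, 2, 4, 5, 3]      # hypothetical worker-reported final scores
observer = [5, 3, 6, 6, 3, 4, 6, 4]    # hypothetical observer-reported final scores

r, p = pearsonr(worker, observer)
print(f"r = {r:.2f} ({interpret_r(r)}), p = {p:.3f}")
```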
This study was exploratory in nature and sought to establish the validity of worker-reported ROSA
scores through the online ROSA assessment process. Validity of self-assessments was deemed to have
been established if mean worker- and observer-reported scores were not significantly different from one
another, and if they were significantly correlated. Finally, a sensitivity and specificity analysis was
conducted on the worker and observer ROSA final scores in reference to the previously proposed cut-off
score of 5 [11]. The cut-off value of 5 is used to identify workstations that are at an increased risk of
worker-reported discomfort; it was originally derived by comparing discomfort and ROSA final score values
and identifying where significant increases in discomfort occurred [11].
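
A minimal sketch of the sensitivity and specificity calculation is given below. Workstations are classified as "at risk" when the ROSA final score reaches the cut-off of 5 and are compared against a reference classification of elevated reported discomfort; the boolean reference labels in the example are an assumption for illustration.

```python
# Sketch of the sensitivity/specificity analysis against the ROSA cut-off of 5.
# The reference labels (elevated discomfort: True/False) are assumed for
# illustration; the paper derives them from where significant increases in
# discomfort occurred [11].

def sensitivity_specificity(final_scores, elevated_discomfort, cutoff=5):
    flagged = [score >= cutoff for score in final_scores]
    tp = sum(f and d for f, d in zip(flagged, elevated_discomfort))
    fn = sum((not f) and d for f, d in zip(flagged, elevated_discomfort))
    tn = sum((not f) and (not d) for f, d in zip(flagged, elevated_discomfort))
    fp = sum(f and (not d) for f, d in zip(flagged, elevated_discomfort))
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity


if __name__ == "__main__":
    scores = [3, 6, 5, 2, 7, 4]                          # hypothetical ROSA final scores
    discomfort_flags = [False, True, False, False, True, True]
    print(sensitivity_specificity(scores, discomfort_flags))
```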

2.8.3. Research question #3


Pearson Product Moment Correlations were calculated to establish the relationships between worker-
reported discomfort and both worker-reported and trained observer ROSA scores. Correlations between
whole body and localized discomfort were made with ROSA subsection and final scores. The localized
discomfort scores related to the expected body parts that may experience discomfort or injury as a result
of office work (the head and neck: [16,17], upper limbs: [18], and back: [19]) were correlated with the
ROSA final, chair, monitor and telephone, and mouse and keyboard scores. This comparison was made
within each experimental Group (FB, NoFB), during each week of the experiment.

2.8.4. Research question #4


The effects of the two different training protocols on self-reported whole body musculoskeletal dis-
comfort over the course of the 4 week experiment were assessed using a 4 (Time: weeks 1, 2, 3 and 4) x
2 (Groups: FB, NoFB) mixed ANOVA. The between-subject factor was Group, and the within-subject
factor was Time. Alpha was set to 0.05 for all comparisons. Post hoc analysis was performed using
Tukey’s HSD test.
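
This 4 × 2 design (one within-subject factor, one between-subject factor) can be sketched as follows using the pingouin package; the paper does not state the software actually used, the column names are hypothetical, and the data below are synthetic, generated only so the example runs.

```python
# Sketch of the 4 (Time) x 2 (Group) mixed ANOVA on whole body discomfort,
# assuming long-format data (one row per participant per week). The data below
# are synthetic and the column names are hypothetical.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
n_participants = 20                                   # hypothetical; the study had 55
participant = np.repeat(np.arange(n_participants), 4)
week = np.tile([1, 2, 3, 4], n_participants)
group = np.repeat(np.where(np.arange(n_participants) < 10, "FB", "NoFB"), 4)
discomfort = rng.integers(20, 120, size=4 * n_participants) - week * 5  # loose downward trend

df = pd.DataFrame({"participant": participant, "week": week,
                   "group": group, "discomfort": discomfort})

aov = pg.mixed_anova(data=df, dv="discomfort", within="week",
                     subject="participant", between="group")
print(aov[["Source", "F", "p-unc"]])
```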

3. Results

3.1. Distribution of experimental groups

There were no significant differences in mean (SE) height, body mass, time at company, time at job,
or initial level of discomfort between the two experimental Groups (p > 0.05) (Table 1). As previously
mentioned, 4 participants withdrew from the study for various reasons, and their data were excluded
from the analyses.

3.2. Research question #1 and #2

A significant main effect of Assessment Type was seen in ROSA final scores (F[1,53] = 6.03, p <
0.05) and mouse and keyboard scores (F[1,53] = 4.73, p < 0.05), with worker-reported scores being
significantly lower than observer-reported scores (Fig. 2). A significant main effect of Group was also
seen in the ROSA final scores (F[1,53] = 4.01, p < 0.05), as well as mouse and keyboard scores (F[1,53]
= 8.50, p < 0.05) (Fig. 3). On average, the group that received feedback reported significantly lower
ROSA scores than the group that did not.

Fig. 2. Significant main effect of assessment type in the ROSA final and mouse and keyboard subsection (∗ statistically
significant at p < 0.05).

Fig. 3. Significant main effect of group in the ROSA final and mouse and keyboard subsection (∗ statistically significant at p <
0.05).

Fig. 4. Significant main effect of time for the ROSA final and subsection scores over weeks 1–4 (∗ statistically significant at
p < 0.05).

A significant main effect of Time was seen for all ROSA scores. ROSA final scores (F[3,159] = 6.03,
p < 0.05) and mouse and keyboard scores (F[3,159] = 8.07, p < 0.05) decreased from week 1 to week 4.
Chair (F[3,159] = 10.18, p < 0.05) and monitor and telephone ROSA scores (F[3,159] = 7.16, p < 0.05)
followed an increasing trend during the 4 week study. Post-hoc testing revealed significant differences in
the ROSA final score between weeks 1 (3.90(0.12)) and 4 (3.52(0.17)) (Fig. 4A). Significant differences
were also seen between week 1 (3.05(0.11)) and week 3 (3.47(0.12)) and week 2 (2.98(0.10)) and week
4 (3.26(0.09)) in the chair subsection (Fig. 4B), as well as week 1 (2.75(0.12)) and 3 (3.28(0.14)) and
weeks 2 (2.68(0.17)) and 4 (2.94(0.16)) in the monitor and telephone subsection (Fig. 4C). Finally,
significant differences emerged between week 1 (2.94(0.13)) and week 4 (2.53(0.13)) in the mouse and
keyboard subsection (Fig. 4D).

Fig. 5. Correlations between worker- and observer-reported ROSA final and subsection scores (A), in the feedback groups (B)
and over time (C).
Correlations between worker- and observer-reported scores ranged from moderate to strong (R =
0.48 and R = 0.60), with the chair subsection showing the strongest relationships (Fig. 5A). When
considering Group, worker- and observer-reported score relationships were typically stronger
in the No Feedback group. All relationships were statistically significant and ranged from moderate
to strong (Fig. 5B). The relationship between worker and observer scores increased over the span of
the study for the ROSA final score, as well as the mouse and keyboard subsection. In the remaining
subsections, the R values peaked during week 2, followed by a decrease in the following 2 weeks
(Fig. 5C).
The sensitivity and specificity analysis yielded average values of 36% and 76%, and 40% and 80%
for worker-reported and observer-reported scores in the NoFB group, respectively (Fig. 6). In the
FB group, mean sensitivity values for weeks 1–4 were 56% for worker-reported scores, and 36% for
observer-reported scores, while the corresponding mean specificity values were 50% and 55%.

Fig. 6. Sensitivity (Sens) and Specificity (Spec) analysis results for assessment type, group (Feedback (FB) and No Feedback
(NoFB)), and time.

3.3. Research question #3

Significant correlation values were larger in magnitude between the worker-reported scores and discom-
fort than for the observer scores in general (Figs 7A and B). The magnitude of the correlations was the
greatest for participants in the NoFB group between worker-reported ROSA final scores and whole body
discomfort. The fewest significant correlations were seen between monitor and telephone ROSA scores
and localized discomfort (Fig. 7). There were far fewer significant correlations between discomfort and
ROSA scores in the FB group (1 significant relationship) than in the NoFB group (18 significant rela-
tionships), and the overall mean correlation magnitude was less for the workers who received feedback
compared to those who did not.

Fig. 7. Correlations between ROSA scores and discomfort for worker-reported (A) and observer-reported scores (B), during
the 4 weeks of the experiment, as well as in the Feedback (FB) and No Feedback (NoFB) Groups (∗ statistically significant at
p < 0.05).

3.3.1. Additional ROSA score and discomfort relationships


Additional significant relationships between worker-reported discomfort and ROSA scores were seen
when comparing the various discomfort regions and ROSA subsection and final scores. These significant
relationships varied between R = 0.38 (total body discomfort and mouse and keyboard score, NoFB
Group, worker-reported ROSA score, week 3) and R = 0.68 (chair-related discomfort, mouse and
keyboard score, NoFB Group, observer-reported ROSA score, week 4) (Fig. 7).

3.4. Research question #4

A trend was observed for all localized and total body discomfort measures to decrease from week 1 to
week 4 (Fig. 8). A main effect of Time on reported discomfort emerged for total discomfort [F(3,159) =
5.64, p < 0.05], total discomfort without leg scores [F(3,159) = 4.83, p < 0.05], mouse and keyboard-
related discomfort [F(3,159) = 3.51, p < 0.05], and monitor and telephone-related discomfort [F(3,159)
= 3.28, p < 0.05] (Figs 8A, B, D, and E). Significant decreases in discomfort occurred between week 1
(43.2 (8.6)) and week 4 (49.9 (6.8)), as well as week 1 and week 2 (22.9 (5.0)) (Fig. 8A). The greatest
changes in mean discomfort across Groups were seen in the total body discomfort without leg scores,
with a 51.6% decrease in reported discomfort between weeks 1 and 2 (Fig. 8B).
There was no significant main effect of Time for chair-related discomfort (Fig. 8C), and no main effect of Group
(FB or NoFB) in any of the discomfort categories (total or localized discomfort). There were also no
significant interactions between Time and Group for any discomfort score.

4. Discussion

Worker-reported scores were found to be comparable to observer-reported scores for the monitor and
telephone and chair subsections. Significant differences were found for Assessment Type (worker or
observer) and Group (FB or NoFB) for ROSA final scores, and for the mouse and keyboard subsections.
Worker and observer scores were significantly correlated throughout all weeks for all subsections and
final scores. There were significant positive relationships between discomfort and ROSA scores for the
group that did not receive feedback, but no significant relationships were found for the group that did
receive feedback. Finally, in general, it was found that worker-reported discomfort decreased over the
course of the four week protocol.

Fig. 8. Main effects of Time on mean (SE) discomfort scores: total body discomfort (A), total body without leg discomfort
(B), chair-related discomfort (C), monitor and telephone-related discomfort (D), and keyboard and mouse-related discomfort
(E) (∗ statistically significant at p < 0.05).

4.1. Research question #1

Worker- and observer-reported scores were significantly different from one another in the final score
and the mouse and keyboard subsections, suggesting that these self-reported ROSA scores are not valid,
according to the definition used in this study. The results for each subsection and for the ROSA final
scores are discussed in turn below.

4.1.1. Chair
Observer- and worker-reported ROSA scores were not different for the chair subsection (comprised
of the chair height, depth, armrest and backrest subsections), and worker and observer-reported scores
were significantly correlated. This may be explained in part by the fact that assessments of the chair and
seated posture generally required workers to evaluate their legs and trunk. Self-assessments of postures
such as these, that involve larger segments of the body, have been shown to be moderately accurate when
compared to observer assessments in previous studies [20,21].

4.1.2. Monitor and telephone


Mean worker-reported monitor and telephone scores were not significantly different from those reported
by the observers. In the monitor and telephone subsection of ROSA, postures of the neck and head are
assessed, with one risk factor related to reaching to the phone. Previous research on head and neck
posture self-assessment has reported less than desired accuracy compared to observer assessments [3].
However, the more positive posture results associated with the online ROSA tool may have occurred due
to unique components that other self-assessment approaches do not utilize. The setup of pictures, text
and video in ROSA is fairly novel and may have provided enough additional information to workers to
enable them to more accurately assess these body parts.
There was a tendency for ROSA scores to increase between weeks 1 and 4 for the monitor and
telephone subsection, as well as the chair subsection. Most modern offices are comprised of furniture
that can be adjusted. The increase in these subsection scores over time may be a result of one piece
of equipment being adjusted, which had an effect on other scores for another piece of equipment. For
example, if the chair was too high, but the monitor was at an ideal height, an adjustment to the proper
height for the chair might result in the monitor now being too high. None of the scores in week 4 for any
subsections were significantly higher than the scores in week 1, indicating that any incorrect changes that
possibly occurred in the middle weeks of the study may have been identified by the worker, re-assessed
and re-adjusted in subsequent evaluations.

4.1.3. Mouse and keyboard


There was a significant difference between observer- and worker-reported ROSA mouse and keyboard
scores, with worker-reported scores being lower in magnitude on average. However, the magnitude of
the correlations between worker and observer scores in this subsection was relatively high (Fig. 5).
Self-assessments have been previously shown to be effective in providing an accurate evaluation of
keyboard and mouse working posture [3]. The difference between the current and past approaches in this
regard may have been a result of the ROSA tool itself. In the evaluation of shoulder position while using
the mouse in ROSA, there is a fixed option to select an abducted shoulder posture, as well as an additive
option to indicate any abducted shoulder postures caused by the keyboard and mouse being on different
surfaces. Both of these risk factors have a value of 2 in ROSA. If one of these factors was consistently
missed during self-assessments, this could result in the discrepancy between observer-assessment and
worker-assessment scores observed in the present study.

4.1.4. ROSA final score


As previously mentioned, the ROSA final score is determined from the scores achieved from the chair,
monitor and telephone, and keyboard and mouse subsections. The ROSA final score is achieved using
scoring charts (Fig. 1), and is highly reflective of the subsection wherein the highest score lies. As
there was a significant difference between observer- and worker-assessments in the mouse and keyboard
subsection, this would have had a marked influence on any assessment in which the mouse and keyboard
score was the highest score of the three subsections.
Worker-reported ROSA final scores were generally lower in magnitude than the observer-reported
ROSA scores (Fig. 3), which is contrary to much of the previous research regarding the self-reporting
of risk factors. Other research has reported a trend for people to over-report when identifying risk
factors related to musculoskeletal disorders [21,22]. However, these studies focused on industrial work
primarily (in manufacturing or automotive industries), and not computer work. While Heinrich et al. [3]
also indicated that there was a tendency to over-report exposure to risk factors in the office environment,
this pertained only to the duration of computer use and not posture.
Even if workers have a tendency to over-report risk factors, it is entirely possible that risk factors
related to office and computer work could be predisposed to being under-reported. One explanation for
the under-reporting of workers’ scores in the current study is related to the current economic climate
in the participant companies and the city where the study was conducted. While Windsor has one of
the highest unemployment rates in Canada at approximately 14% [23], the majority of workers who
participated in this study worked in the public service, which is regarded as one of the most secure
industries [24]. Job security is a key component in job satisfaction [25,26]. Research has indicated
that workers with higher levels of job satisfaction are less likely to report risk factors and discomfort in
the workplace [27,28]. Systematic differences in factors such as job satisfaction between workplaces
could explain the self-report results of this study compared to others, and highlight the importance of
considering psychosocial risk factors when assessing MSD risk in the workplace; something that the
current ROSA tool does not take into account.
The ROSA final score has an important practical application when considering the implementation
of an online training protocol into a business. Like other ergonomic risk checklists, final evaluation on
whether a job requires additional assessment or attention is based on one number that falls within specific
intervention guidelines (e.g. REBA [10] and RULA [9]). Sonne et al. [11] found that ROSA final scores
of greater than 5 were associated with significant increases in discomfort, and therefore recommended that
a value of 5 be used to determine when an office should receive a more in-depth evaluation into the risk
factors present, and ultimately to set appropriate interventions. In the current study, self-reported scores
were significantly different from observer scores in the mouse and keyboard subsection, as well as for the
ROSA final scores. As the final score is used to make judgments on whether a workstation requires additional
assessment, self-reported scores using the ROSA tool cannot be considered valid at this point in time.
In summary, based on the results reported here, the use of self-assessments performed by office
workers of their own workstations using ROSA online appears to be a valid method of assessing risk
factors related to the chair, monitor and telephone in an office environment. This conclusion is supported
by non-significant differences in worker and observer-reported scores, and significant, relatively large
magnitude positive correlations between these scores. The ROSA scores for the mouse and keyboard
section were significantly different between workers and observers, but they were significantly correlated.
Therefore, the ROSA final and mouse and keyboard worker-reported scores cannot be considered valid
measures at this time. Future work should be conducted in an attempt to increase the ease with which
risk factors in these subsections can be identified. This could be done by improving the posture diagrams
used in the tool. While careful consideration was given to the development of the online software used in
this study, this was the first attempt to create such a training program. Research has indicated that factors
such as the size of the posture categories used in the tool [29], how many posture category boundaries
there are [30], and the salience of images used on the tool interface [31] all need to be accounted for in
order to optimize viewer performance.

4.2. Research question #2

There was no significant interaction between Assessment Type (worker or observer) and Time for the
ROSA final score, or any of the subsection scores, indicating that there was no change in the difference
between the two Assessment Types throughout the course of the 4 weeks of the study. There were also
no significant interaction effects between Assessment Type, Time or Group, indicating that feedback
had no role in increasing or decreasing the accuracy of worker-reported scores. This result is promising
for the chair and monitor and telephone subsections, as a significant difference between worker- and
observer-reported ROSA scores was not observed at any point during the study. However, it is noteworthy
that workers did not improve in terms of being able to assess the mouse and keyboard condition over
the course of the month during which the study took place. The significant effect of Assessment
Type could be a result of participants not taking the time to fully complete the assessment process
(i.e. watching the videos each time they went through the assessment module). Previous research has
indicated that workers tend to terminate their learning experience early when they have control over
the training, particularly when considering computer-based applications [32], and when they receive
negative feedback on their performance [33].
Correlation coefficients between worker and observer-reported ROSA scores tended to increase be-
tween weeks 1 and 4 for all scores in the No Feedback Group. However, there was a trend for R values
to peak prior to the fourth week of the study for all scores in the Feedback Group (Fig. 5). It appears that
after participants had used the ROSA online application once, they became familiar enough to
perform a more accurate assessment the second time they logged in. After this point, it is possible that
workers who were receiving feedback may not have performed their assessments with the diligence that
they did in the first two weeks, and correlation values dropped off. This may be a result of the participants
losing interest in the training, as they were completing the same assessment repeatedly. Repeated work
can lead to decreased focus and reduced performance as a result of boredom [34].
The nature of the feedback given may have played a role in the lack of improvements in the validity
of worker-reported ROSA scores. Lee and Carnahan [35] found that when providing feedback on
performance, exact performance feedback was not as effective in improving results as providing feedback
that allowed for a margin of error, both above and below the desired target (also known as bandwidth).
Essentially, allowing workers to have a window of error that was deemed to be acceptable was seen
to increase retention over a period of time as opposed to correcting every single error. Workers in the
Feedback Group were corrected on every error they made in the current study, which may have resulted
in too much information for them to successfully process, and could have reduced the participants'
retention of information for their next assessment. In the future, the number of pieces of feedback should
be controlled during training and be provided using the principles of bandwidth knowledge of results.
Working within an error rate of 5–10% (actual performance compared to ideal performance) has been shown
to increase retention in participants when compared to those who did not receive feedback, or received
exact feedback over the course of a multi-week training program [36].
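
A minimal sketch of the suggested bandwidth approach is shown below: feedback is only returned when the worker's score deviates from the observer's by more than an acceptable margin, rather than correcting every discrepancy. The 10% tolerance and the wording of the message are illustrative assumptions.

```python
# Sketch of 'bandwidth knowledge of results' applied to ROSA self-assessments:
# withhold correction while the worker's score is within an acceptable error
# band of the observer's score. The tolerance value is a hypothetical example.
from typing import Optional


def bandwidth_feedback(worker_score: float, observer_score: float,
                       tolerance: float = 0.10) -> Optional[str]:
    """Return a feedback message only when the relative error exceeds the band."""
    if observer_score == 0:
        return None
    error = abs(worker_score - observer_score) / observer_score
    if error <= tolerance:
        return None  # within the acceptable band: no correction given
    direction = "under" if worker_score < observer_score else "over"
    return (f"Your score {direction}estimated the observer's score by {error:.0%}; "
            f"please review this subsection before your next assessment.")


if __name__ == "__main__":
    print(bandwidth_feedback(worker_score=4, observer_score=5))   # ~20% error -> message
    print(bandwidth_feedback(worker_score=5, observer_score=5))   # within band -> None
```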

4.3. Research question #3

Significant correlations of a magnitude comparable to those found by Sonne et al. [11] were found
in this study between discomfort and ROSA scores. Whole body discomfort and ROSA final score
correlations varied in magnitude between R = 0.40 and R = 0.70 [11]. One difference between the
current and previous studies was that total body discomfort scores were more highly correlated with
ROSA scores than discomfort scores that did not include leg discomfort [11]. Office workers tend
to sit for long periods of time throughout the day, a risk factor for the development of lumbar disc
herniation [37]. A symptom of disc herniation is sciatica (pain resulting from irritation of the sciatic
nerve, which can lead to shooting pain that extends into the leg [38]). Considering that sciatica has been
reported in up to 23% of all office workers [39], the authors suggest that it is important to include leg
discomfort in the analysis, as it could be a result of referred pain from a lower back injury.
The differences in the relationship between discomfort and ROSA scores between the current and
previous study in the development of ROSA [11] may partly be a result of the different factors introduced
herein. Sonne et al. [11] conducted assessments in a fairly traditional manner; workers were observed and
then they completed a paper version of the discomfort questionnaire. The introduction of feedback to the
assessment could have impacted how workers reported discomfort for a variety of reasons. The majority
of the feedback that was provided during the course of this study was negative in nature. Typically,
feedback was given to inform workers when they had scored their assessment incorrectly, and that they
needed to do something differently the next time. Van Dijk and Kluger [33] concluded that, in cases of
negative feedback, trainees may lose motivation and could possibly terminate their learning experience
early. Because the trained observers’ assessments were treated as the gold standard in this study, all
differences in worker assessment scores were effectively treated as incorrect answers. Furthermore, the
quantity of feedback that was provided may have acted against workers actually learning from their
errors. Stefanidis et al. [40] found that when attempting to learn new techniques, limited feedback
accompanied by video tutorials was more effective in improving performance than intense feedback
sessions. As feedback was given in the current study for every risk factor that was not scored the same
as the trained observer, there was a potentially large amount of information given to the worker after
the assessments. The impact of feedback may have caused the training and worker assessments to be
negatively affected, thereby preventing significant correlations between worker-reported ROSA scores
and discomfort.
Providing feedback as done here may also contribute to the appearance of a more traditional training
program. While workers were not pressured for time during the course of their assessments, they did
know they were going to receive evaluation on how they performed. This increases the structure of
the training program, and more closely represents a less effective, more tutorial-based training program
compared to completely open access approaches [41], which have been shown to improve overall training
satisfaction and efficacy.
In addition, the mere presence of an investigator during the worker self-assessments may have reduced
the level of autonomy that workers had in performing their online training. A true self-guided program
allows workers to access training materials whenever they choose, and complete tasks at their own
pace [41]. Future research should identify if there is a benefit to workers when using the online training
on their own schedule. Comparing these results to those from a control group would also ensure that the
goals of online training are actually being accomplished.

4.4. Research question #4

Decreases in discomfort over the 4 week period occurred across both feedback groups, and for both
total body discomfort as well as localized discomfort related to the monitor, telephone, mouse and
keyboard. A previous study of the effectiveness of ergonomic training on the relief of discomfort showed
that different types of ergonomic interventions can lead to reduced worker-reported discomfort [42].
Bohr [8] showed that a participatory approach to ergonomics, where workers were instructed on how to
make adjustments, followed by ergonomists helping the workers to make these changes, was the most
effective in reducing symptoms of musculoskeletal disorders. The video-based training incorporated
into ROSA appears to serve a similar purpose of educating workers on how to adjust office furniture, with
comparable results to previous studies.
Discomfort and ROSA final scores showed similar decreasing trends over the course of the study. De-
creasing ROSA scores may be reflective of risk factors being reduced or eliminated from the workstation
setup. This indicates that the changes made to offices based on the videos, literature and assessment
structure in the ROSA online application, may have been effective in reducing discomfort.
What is promising about the changes to both office conditions and discomfort reported here is that
no new furniture purchases were made during the course of the study. Any changes made to the offices
were a result of adjusting existing furniture and equipment, or using existing materials to improve the
setup of the offices. Menozzi et al. [43] found similar results in office ergonomics research, with all
forms of ergonomics training proving to be effective in reducing risk factors in the office environment.
Amick et al. [7] found that office ergonomic interventions were most successful when new furniture was
brought in (primarily a new chair), and then workers were trained on adjustments. While the number of
adjustments made by workers in the present study was not recorded, the primary investigator observed
nearly all workers performing adjustments to their furniture throughout the assessment process. Using
ROSA online appears to be an effective method of getting workers to adjust their furniture, which is
a less expensive method of improving the office than making office-wide furniture purchases without
assessing the need.
Discomfort data obtained via a self-report questionnaire are inherently limited. Discomfort is a
subjective measure and relies on workers’ perceptions, which can be affected by their circumstances.
Subjective measures such as this have tended to be over-reported by workers [21,22]. However, the
ease of collecting this type of data using a questionnaire makes this method an attractive alternative for
research purposes. Comparing ROSA scores to more objective outcomes such as injury or injury claim
data over extended periods of time may be a better approach for validating the risk scores estimated by
the ROSA tool.
Similar decreases were seen in risk factors (as reflected in a decrease in ROSA scores) as well as
worker-reported discomfort. While there was no control group to enable the authors to confirm that
self-guided training was more effective than the training used in this study, self-reported discomfort did
decrease in a manner similar to other training-based studies [8,43]. With this in mind, the effectiveness
of the training program described here for reducing discomfort is not fully supported, but results are
promising and suggest that future work to improve the approach seems warranted.

5. Conclusions

The results from this study can be summarized as follows:


1. Workers were able to accurately assess the risk factors associated with the chair, monitor and
telephone, but not with the mouse and keyboard or the ROSA final scores.
2. Worker- and observer-reported ROSA final scores had a similar decreasing trend over the 4 week
training period.
3. Providing augmented feedback to the worker on their performance negatively affected their reported
scores.
4. Worker-reported ROSA final scores and total body discomfort were more highly correlated than
observer-reported ROSA scores and discomfort.
5. Worker-reported discomfort decreased throughout the 4 weeks of the study.


The online version of ROSA allowed for over 200 assessments to be completed in a one-month time
period. The demonstrated speed and ease with which access to online ergonomics training and assess-
ments can be made warrants further research into how to increase the accuracy of worker-assessments
using the ROSA online tool.

Acknowledgements

Thanks to Mike Angelidis, Erika Santarrosa, Alison Schinkel-Ivy, Tim Burkhart, Jennifer Lembke,
Ryan Roach, Janice Forsythe, Anthony Lucas, the University of Windsor IT Department, and Pickering
Technical Services for their help with various aspects of this study and to the Centre of Research Expertise
for the Prevention of Musculoskeletal Disorders (CRE-MSD) for funding.

References
[1] Occupational Health and Safety Council for Ontario 2008, Part 1: MSD prevention guideline for Ontario. Toronto:
Workplace Safety and Insurance Board.
[2] Keir, PJ, Bach, JM, and Rempel, D, Effects of computer mouse design and task on carpal tunnel pressure. Ergonomics.
1999; 42, 1350-1360.
[3] Heinrich, J, Blatter, BM, and Bongers, PM, A comparison of methods for the assessment of postural load and duration
of computer use. Occ. Env. Med. 2004; 61, 1027-1031.
[4] Straker, L, Pollock, C, Burgess-Limerick, R, Skoss, R, and Coleman, J, The impact of computer display height and desk
design on muscle activity during information technology work by young adults. J Electromyogr Kinesiol. 2008; 18(4),
606-617.
[5] Harvey, R, and Peper, E, Surface electromyography and mouse use position. Ergonomics. 1997; 40(8), 781-789.
[6] Marshall, K, Working with computers: Computer orientation for foreign students. Perspectives. 2001; 75(1), 9-15.
[7] Amick III, BC, Robertson, MM, DeRango, K, Bazzani, L, Moore, A, Rooney, T, et al., Effect of office ergonomics
intervention on reducing musculoskeletal symptoms. Spine. 2003; 28(24), 2706-2711.
[8] Bohr, PC, Efficacy of office ergonomics education. J. Occ. Rehab. 2000; 10(4), 243–255.
[9] McAtamney, L, and Corlett, EN, RULA: a survey method for the investigation of work-related upper limb disorders.
Appl. Erg. 1993; 24, 45-57.
[10] Hignett, S, and McAtamney, L, Rapid entire body assessment (REBA). Appl. Erg. 2000; 31, 201-205.
[11] Sonne, MW, Villalta, DL, Andrews, DM, Development and evaluation of an office ergonomic risk checklist: ROSA –
Rapid office strain assessment. Appl. Erg. 2012; 43(1).
[12] Nietfeld, JL, Cao, L, and Osborne, JW, The effect of distributed monitoring exercises and feedback on performance,
monitoring accuracy, and self-efficacy. Metacog. Learn. 2006; 1(2), 159-179.
[13] Portney, L.G., Watkins, M.P. 2000. Foundation of clinical research applications to practice. Prentice Hall, Inc. New
Jersey, USA.
[14] Hedge, A, Morimoto, S, and McCrobie, D, Effects of keyboard tray geometry on upper body posture and comfort.
Ergonomics. 1999; 42, 1333-1349.
[15] Cohen, J, Statistical power analysis for the behavioural sciences. 2nd ed. New Jersey: Lawrence Erlbaum; 1988.
[16] Hagberg, M, and Wegman, DH, Prevalence rates and odds ratios of shoulder-neck diseases in different occupational
groups. Br. J. Ind. Med. 1987; 44, 602-610.
[17] Korhonen, T, Ketola, R, Toivonen, R, Luukkonen, R, Häkkänen, M, and Viikari-Juntura, E, Work related and individual
predictors for incident neck pain among office employees working with video display units. Occ. Env. Med. 2003; 60,
475-482.
[18] Gerr, F, Marcus, M, Ensor, C, Kleinbaum, D, Cohen, S, Edwards, A, et al. A prospective study of computer users: I.
Study design and incidence of musculoskeletal symptoms and disorders. Am. J. Ind. Med. 2002; 41, 221-235.
[19] Jensen, C, Finsen, L, Søgaard, K, and Christensen, H, Musculoskeletal symptoms and duration of computer and mouse
use. Int. J. Ind. Erg. 2002; 30, 265-275.
[20] Burdorf, A, Reducing random measurement error in assessing postural load on the back in epidemiologic surveys. Scand.
J. Work Env. Health. 1995; 21(1), 15-23.
[21] Wiktorin, C, Karlqvist, L, and Winkel, J, Validity of self-reported exposures to work postures and manual materials
handling. Scand. J. Work Env. Health. 1993; 19, 208-214.
[22] Andrews, DM, Norman, RW, Wells, RP, and Neumann, P, The accuracy of self-report and trained observer methods for
obtaining estimates of peak load information during industrial work. Int. J. Ind. Erg. 1997; 19, 445-455.
[23] Hall, D, Windsor’s unemployment rate falls but still highest in Canada. Windsor Star. 2009, July 30.
[24] Clark, A, Postel-Vinay, F, Job security and job protection. Ox. Econ. Papers. 2008; 61(2), 207-239.
[25] Blanchflower, DG, and Oswald, AJ, Is the UK moving up the international wellbeing rankings? 2000; National Bureau
of Economic Research Conference, May 4th , 2000.
[26] Heaney, C, Israel, B, and House, J, Chronic job insecurity among automobile workers: effects on job satisfaction and
health. Soc. Sci. Med. 1994; 38(10), 1431-1437.
[27] Bigos, SJ, Battie, MC, Spengler, DM., Fisher, LD, Fordyce, WE, Hansson, TH, et al. A prospective study of work
perceptions and psychosocial factors affecting the report of back injury. Spine. 1991; 16(1), 1-7.
[28] Demure, B, Luippold, RS, Bigelow, C, Ali, D, Mundt, KA, Liese, B, Video display terminal workstation improvement
program: I. Baseline associations between musculoskeletal discomfort and ergonomic features of workstations. J. Occ.
Env. Med. 2000; 42(8), 783-91.
[29] van Wyk, PM, Weir, PL, Andrews, DM, Fiedler, KM, and Callaghan, JP, Determining the optimal size for posture
categories used in video-based posture assessment methods. Ergonomics. 2009; 52(8), 921-30.
[30] Andrews, DM, Arnold, TA, Weir, PL, van Wyk, PM, and Callaghan, JP, Errors associated with bin boundaries in
observation-based posture assessment methods. Occ. Erg. 2008; 8(1), 11-25.
[31] Fiedler, KM, The effect of posture category salience on decision time and errors when using video-based posture
assessment methods. Unpublished Masters Thesis, University of Windsor, Windsor, ON, Canada; 2010.
[32] Brown, KG, Using computers to deliver training: Which employees learn and why? Pers. Psych. 2001; 54(2), 271-296.
[33] Van Dijk, D, and Kluger, AN, Positive (negative) feedback: Encouragement or discouragement. Proceedings of the 15th
Annual Conference of the Society for Industrial and Organizational Psychology. 2000; New Orleans, Louisiana.
[34] Fisher, CD, Boredom at work: A neglected concept. Hum. Relat. 1993; 46(3), 395-417.
[35] Lee, TD, and Carnahan, H, Bandwidth knowledge or results and motor learning: More than just a relative frequency
effect. Q. J. Exp. Psychol. 1990; 42, 777-789.
[36] Wright, DL, Smith-Munyon, VL, and Sidaway, B, How close is too close for precise knowledge of results? Res. Quart.
Exer. Sport. 1997; 68(2), 172-6.
[37] Callaghan, JP, and McGill, SM, Low back joint loading and kinematics during standing and unsupported sitting.
Ergonomics. 2001; 44, 280-294.
[38] Garfin, SR, Rydevik, B, Lind, B, and Massie, J, Spinal nerve root compression. Spine. 1995; 20(16), 1810-1820.
[39] Tuomi, K, Ilmarinen, J, Eskelinen, L, Järvinen, E, Toikkanen, J, and Klockars, M, Prevalence and incidence rates of
diseases and work ability in different work categories of municipal occupations. Scand. J. Work Env. Health. 1991; 17
(1), 67-74.
[40] Stefanidis, D, Korndorffer, JR, Heniford, BT, Scott, DJ, Limited feedback and video tutorials optimize learning and
resource utilization during laparoscopic simulator training. Surgery. 2007; 142(2), 202-206.
[41] Gist, ME, Schwoerer, C, and Rosen, B, Effects of alternative training methods on self-efficacy and performance in
computer software training. J. Appl. Psych. 1989; 74(6), 884-891.
[42] Bayeh, AD, and Smith, MJ, Effect of physical ergonomics on VDT worker’s health: A longitudinal field study in a
service organization. Int. J. Hum.-Comput. Int. 1999; 11, 109-135.
[43] Menozzi, M, Buol, AV, Waldmann, H, Kundig, S, Krueger, H, and Spieler, W, Training in ergonomics at VDU workplaces.
Ergonomics. 1999; 42(6), 835-845.
