Measurement and Data Collection
By: Ephrem Mannekulih
(MSc. in Biostatistics & Health/Inf., Asst. Prof.)
Study Variables
A variable is a characteristic of a person, object, or phenomenon
that can take on different values across different persons, objects,
or phenomena.
E.g., age, sex, educational status, monthly family income, marital status,
and religion.
Types of variables…cont’d
Confounding variable - A variable that is associated with the
problem and with a possible cause of the problem.
They may either strengthen or weaken the apparent relationship
between the problem and a possible cause.
In measuring,
We devise some form of scale as the range (in terms of set theory) and
Then transform or map the properties of objects from the domain onto
this scale.
Measurement Scales
The most widely used classification of measurement scales
is:
Nominal scale;
Ordinal scale;
Interval scale;
Ratio scale
Measurement Scales…Cont’d
Ordinal scale:
Scale of measurement in which data can be assigned into categories
that are ranked in terms of order.
Measurement Scales…Cont’d
Example of ordinal scale:
Pain level:
1. None
2. Mild
3. Moderate
4. Severe
The numbers have LIMITED meaning: 4 > 3 > 2 > 1 is all we know,
apart from their utility as labels.
Precise differences between the ranks do not exist.
Measurement Scales…Cont’d
Interval scale: Scale of measurement in which data are
measured on a continuous numerical scale and ranked in
terms of magnitude, with equal intervals between values.
Temperature (°F): 50, 55, 60, 65
Measurement Scales…Cont’d
It has no true zero point: zero on the scale does not indicate the
absence of the quantity (e.g., 0 °F does not mean "no temperature").
Measurement Scales…Cont’d
Ratio scale: The highest level of measurement, in which
data are measured on a continuous numerical scale, are
ranked in terms of magnitude, and have a true zero point.
Measurement Scales…Cont’d
Characterized by equality of ratios as well as equality of
intervals: both can be determined.
Someone who weighs 80 kg is twice as heavy as someone else who
weighs 40 kg.
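The interval/ratio distinction can be made concrete with a short sketch (Python is used purely for illustration; the unit conversions are standard, everything else is made up):

```python
# Sketch: why ratios are meaningful on a ratio scale (weight in kg)
# but not on an interval scale (temperature in °F).

def fahrenheit_to_celsius(f):
    """Convert °F to °C; both are interval scales with different zero points."""
    return (f - 32) * 5 / 9

# Ratio scale: the ratio survives a change of units (kg -> g),
# because zero really means "no weight".
weight_a_kg, weight_b_kg = 80, 40
assert weight_a_kg / weight_b_kg == (weight_a_kg * 1000) / (weight_b_kg * 1000) == 2.0

# Interval scale: the ratio changes when the (arbitrary) zero moves,
# so "100 °F is twice as hot as 50 °F" is not a meaningful statement.
ratio_f = 100 / 50                                                 # 2.0 in °F
ratio_c = fahrenheit_to_celsius(100) / fahrenheit_to_celsius(50)   # ≈ 3.78 in °C
print(ratio_f, round(ratio_c, 2))  # → 2.0 3.78
```

The same comparison expressed in Celsius gives a different "ratio", which is exactly why ratio statements are reserved for ratio scales.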
Why measurement Validity & Reliability?
The quality of research output depends on the validity of the
instruments we use.
Instruments are invalid when they do not measure correctly what they
are supposed to measure.
For example, a respondent may have very little knowledge but not admit
his ignorance, so his answers do not reflect what we intend to measure.
Information Bias…Cont’d
In analytical studies, usually one factor is known and another
is measured
Examples:
In case control studies, the ‘outcome’ is known and the ‘exposure’ is
measured
In cohort studies, the exposure is known and the outcome is measured
Types of Information Bias
Interviewer Bias:
An interviewer’s knowledge may influence the structure of questions
and the manner of presentation, which may influence responses
Recall Bias:
Those with a particular outcome or exposure may remember events
more clearly or amplify their recollections
Observer Bias:
Observers may have preconceived expectations of what they should
find in an examination
Information bias…
Hawthorne effect:
An effect first documented at the Hawthorne Works manufacturing plant:
people act differently if they know they are being watched
Surveillance bias:
The group with the known exposure or outcome may be followed more
closely or longer than the comparison group
Interviewer bias
To reduce it, use more than one observer or interviewer, but not too
many, since they cannot all be trained in an identical manner
Cont’d…
When interpreting study results, ask yourself these questions …
Given the conditions of the study, could bias have occurred?
In which direction is the distortion? – Is it towards the null or away from
the null?
Tests of Sound Measurement
Sound measurement must meet the tests of
Validity,
Reliability and
Practicality
In fact, these are the three major considerations one should
use in evaluating a measurement tool.
Validity vs. Reliability
Validity: how well a measurement agrees with an accepted value.
Reliability: how well a series of measurements agree with each other.
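One common way to quantify agreement between repeated measurements is test–retest correlation. The sketch below is illustrative only; the subject scores are invented:

```python
# Sketch: reliability as agreement between repeated measurements.
# Test-retest reliability is often summarised with a correlation
# coefficient; the data here are hypothetical.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# The same 5 subjects measured twice with the same instrument.
first  = [70, 82, 65, 90, 75]
second = [72, 80, 66, 91, 74]   # close agreement -> high reliability
r = pearson(first, second)
print(round(r, 3))  # a value near 1 suggests a reliable instrument
```

A reliable instrument can still be invalid: both series could agree well while systematically missing the accepted value.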
Test of Validity
Validity:
The degree to which an instrument measures what it is supposed to
measure.
The extent to which differences found with a measuring instrument
reflect true differences among those being measured
Face validity: a relatively intuitive, quick, and easy way to start
checking whether a new measure seems useful at first glance.
Test of Validity…Cont’d
Having face validity doesn’t guarantee that you have good
overall measurement validity or reliability
It’s considered a weak form of validity because it’s assessed
subjectively without any systematic testing and is at risk for
bias
But testing face validity is an important first step to reviewing
the validity of your test.
Once you’ve secured face validity, you can assess other
more complex forms of validity
Test of Validity…Cont’d
Content validity: the extent to which a measuring
instrument provides adequate coverage of the topic under
study.
High content validity means the test covers the topic extensively; it
can be improved by using trained and motivated persons to conduct the
research.
Developing Tools…Cont’d
Concept development;
At this step the researcher should arrive at an understanding of the
major concepts pertaining to the study.
The use of more than one indicator gives stability to the scores and
also improves their validity
Developing Tools…Cont’d
Formation of an index: obtain an overall index for the various
concepts concerning the research study.
It is the task of combining several dimensions of a concept, or different
measurements of a dimension, into a single index
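A minimal sketch of index formation, assuming a simple rescale-and-weight scheme; the indicator names, ranges, and weights are all hypothetical:

```python
# Sketch: forming a single index from several indicators.
# Each indicator is rescaled to 0-1, then combined with chosen weights.

def rescale(value, lo, hi):
    """Map a raw indicator value onto a 0-1 scale."""
    return (value - lo) / (hi - lo)

# Hypothetical socioeconomic-status indicators for one respondent.
indicators = {
    "income":    rescale(2500, lo=0, hi=10000),   # monthly income
    "education": rescale(12, lo=0, hi=20),        # years of schooling
    "assets":    rescale(3, lo=0, hi=10),         # household assets owned
}
weights = {"income": 0.5, "education": 0.3, "assets": 0.2}  # sum to 1

index = sum(weights[k] * indicators[k] for k in indicators)
print(round(index, 3))  # → 0.365
```

Rescaling first keeps any one indicator's units from dominating the index; the weights encode the judged importance of each dimension.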
Scaling
Researchers often face a problem of valid measurement when
the concepts to be measured are complex and abstract.
In scale construction, the items or statements that best meet this sort
of discrimination test are included in the final instrument.
Operationalizing variables
It is necessary to operationally define both the dependent
and independent variables
Concept:
• Subjective impression
• Cannot be measured
• No uniformity in its understanding among different people
• E.g., excellent, high achiever, rich
Variable:
• Measurable, with a degree of precision that varies from scale to
scale and variable to variable
• E.g., gender (male vs. female), age (in years, months), weight (in kg, g)
Concepts, indicators and variables
If you are using a concept in your study, you need to
operationalize it into measurable terms.
For this, you need to identify indicators.
But those selected must have a logical link with the concept.
Concepts → Indicators → Variables
Types of data
Primary data: data collected directly from individuals, subjects, or
respondents for the purpose of a given study, e.g. through:
Surveys
Experiments
Observation, etc.
Secondary data: data obtained from existing sources, e.g.:
Literature
Reports
Stages of data collection
Three Stages in the Data Collection Process
Stage 1: Permission to proceed
Quality control
Data Collection Methods:
Interviews
Self-administered questionnaires
Document review
Observation
Others
Interview Types
Face-to-Face, Telephone or Skype
Ideally, tape-record with the participant's permission and take notes
Unstructured
Focus on a broad area for discussion
Participant talks about topic in their own way
Semi-Structured
Common set of topics or questions for each interview
Questions vary depending on participant
Flexibility regarding the order of questions
Follow up on topics that emerge
Structured or Focused Interview
Identical set of questions for each interview
Questions asked in the same way, using the same words for each interview
Open Questions
• Can you tell me about...?
• When did you notice...?
• Why do you think that happened...?
• What happened then...?
• Do you think...?
• How did you know...?
• Did that affect....?
• How did you feel...?
• What impact did that have on....?
• Who else was there...?
• What did you see as the main...?
• Where was that....?
• What did you think....?
Questionnaire
Guidelines for Constructing Questionnaire/Schedule
The researcher must keep in view the problem to be studied and be clear
about the various aspects of the research problem
The design should depend on the nature of the information sought, the
sampled respondents, and the kind of analysis intended
A rough draft of the questionnaire/schedule should be prepared, giving
due thought to the appropriate sequence of questions
The researcher must invariably re-examine the rough draft and, if
needed, revise it for a better one
A pilot study should be undertaken to pre-test the questionnaire, which
may then be edited in light of the results of the pilot study
The questionnaire must contain simple but straightforward directions so
that respondents do not feel any difficulty in answering the questions
Interview Skills
Sensitivity to Interviewer/Interviewee Interaction: researching up,
across, or 'down'?
Listening Attentively without Passing Judgement: the purpose of the
interview is to hear the participant's perspective, experience, and views
Re-Focusing and Maintaining Control: if going off topic, given limited time
Learning the Language: be sensitive to the cultural setting and the
discourse commonly used
Encouraging Responses Non-Verbally: do not interrupt - use eye contact,
head nodding, 'um huh'
Checking Back: discuss emerging findings of interviews, surveys, focus
groups, etc. with participants
Encourage Discussion
Facilitating tasks:
Managing the discussion so the group stays focused on the topics
Ensuring the members of the group interact
Don't lead or influence the discussion
Awareness of group dynamics
Time management
Managing disagreements
Ensure all members participate
Self-administered Questionnaire:
It is a data collection tool in which written questions are presented
to be answered by the respondents in written form.
A self-administered questionnaire can be administered in different
ways
1. Through mailing to respondents
2. Gathering all or part of respondents, giving oral or written instructions,
and letting them fill out the questionnaires;
3. Hand-delivering questionnaires to respondents and collecting them
later
Advantages:-
Relatively cheap
No interviewer bias
Cont’d….
Disadvantages:-
Difficult to design and often require many rewrites before an
acceptable questionnaire is produced.
Observation Types:
Covert, Overt, Complete Observer or Complete Participant
Observation
Simple Observation
Researcher as objective outsider
Participant Observation
Researcher immersed in social situation
To achieve intimate knowledge of the setting or group
To understand people’s behaviours, cultural practices, power dynamics
etc
To understand why specific practices occur, how they originate, and
how they change over time
Observation Considerations
Ethical Issues
Covert vs. Overt Observation
Ethical Issues with covert?
Gaining informed consent from full group
Observer Effects (Hawthorne Effect)
Will people change their behaviour if they know they’re being
observed?
Losing objectivity if immersed in a group – ‘Going native’
Recording Data
Difficult to decide what to record
Time Consuming
Documentary Review
Documents are often readily available potential sources of
data
Contain large amounts of information
Secondary data sources:
Written records
Historical documents
Poetry
Data Quality Assurance Measures
Standardizing all the features and categories of data
Using consistent data formats and measurement standards
Rigorous data handling and analysis procedures:
Select data collection and storage tools that promote data
consistency
Training
Use of different sources of data
Combining Different Data Collection Techniques
Pre-testing
Supervision
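The measures above can be partly automated at collection time. A minimal sketch, assuming a record arrives as a dictionary; the field names, codes, and ranges are hypothetical:

```python
# Sketch: promoting data consistency with simple standardisation
# and range checks applied to each incoming record.

ALLOWED_SEX = {"male", "female"}  # hypothetical coding scheme

def clean_record(record):
    """Standardise formats and reject out-of-range values."""
    cleaned = dict(record)
    cleaned["sex"] = record["sex"].strip().lower()       # consistent format
    if cleaned["sex"] not in ALLOWED_SEX:
        raise ValueError(f"unknown sex code: {record['sex']!r}")
    if not (0 <= record["age"] <= 120):                  # plausibility check
        raise ValueError(f"implausible age: {record['age']}")
    return cleaned

rec = clean_record({"id": 7, "sex": " Male ", "age": 34})
print(rec["sex"])  # → male
```

Rejecting implausible values at entry is cheaper than cleaning them during analysis, and a shared cleaning function enforces one format across all data collectors.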