Chapter 7 The Basics of Experimentation

This document outlines the basics of experimentation in psychology, focusing on independent and dependent variables, operational definitions, and the evaluation of these definitions in terms of reliability and validity. It discusses the importance of defining variables operationally and provides examples of research studies to illustrate these concepts. Additionally, it addresses potential threats to internal validity and the structure of the method section in research reporting.


10/8/22

CHAPTER 7: The Basics of Experimentation
Christelle Ann Jimenez, RPm, MA (cand.)
Psychology Department, Institute of Arts and Sciences
Far Eastern University

Outline
• Independent and Dependent Variables
  ⚬ Some research examples identifying variables
• Operational Definitions
  ⚬ Defining the Independent Variable: Experimental Operational Definitions
  ⚬ Defining the Dependent Variable: Measured Operational Definitions
  ⚬ Defining Constructs Operationally
  ⚬ Defining Nonconstruct Variables
  ⚬ Defining Scales of Measurement
• Evaluating Operational Definitions
  ⚬ Reliability
  ⚬ Validity
  ⚬ Evaluating the Experiment: Internal Validity
  ⚬ Extraneous Variables and Confounding
  ⚬ Classic Threats to Internal Validity
• Planning the Method Section


Main features of a psychological experiment
• Antecedent conditions are manipulated, with at least two treatment conditions
• Responses are recorded
• An inference is made, establishing cause-and-effect statements

Independent and dependent variables
• Experimental hypothesis: states a potential relationship between the independent and dependent variables
  ⚬ If A occurs, then we expect B to follow
  ⚬ Independent variable (IV)
    ■ the dimension that the experimenter intentionally manipulates
    ■ the variable is "independent" because its values are created by the experimenter and are not affected by anything else that happens in the experiment
    ■ examples: aspects of the physical environment (e.g., lighting, noise level); the given task (easy vs. hard, nonsense syllables vs. real words); psychological states (anxious vs. nonanxious, happy vs. sad)


Independent and dependent variables
⚬ Independent variable (IV)
  ■ must be given at least two possible values in every experiment
  ■ levels of the independent variable create the different treatment conditions within the experiment
  ■ in quasi-experiments, the IV is not manipulated but selected
    • make sure samples are randomly assigned to avoid confounding variables
⚬ Dependent variable (DV) / dependent measures
  ■ the particular behavior we expect to change because of our experimental treatment; the outcome we are trying to explain
⚬ We are testing the effects of the IV on the DV
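The random-assignment safeguard mentioned above can be sketched in a few lines of plain Python (an illustration only; the function name and condition labels are ours, not from the lecture):

```python
import random

def randomly_assign(participants, conditions, seed=None):
    """Shuffle the participant pool, then deal participants round-robin
    into the treatment conditions, so group sizes stay balanced and
    individual differences are spread across conditions by chance."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    groups = {condition: [] for condition in conditions}
    for i, person in enumerate(pool):
        groups[conditions[i % len(conditions)]].append(person)
    return groups

# Example: 8 participants assigned to two levels of a task-difficulty IV
groups = randomly_assign(range(8), ["easy", "hard"], seed=42)
print({c: len(members) for c, members in groups.items()})  # {'easy': 4, 'hard': 4}
```

Because assignment is determined only by the shuffle, no extraneous participant characteristic can vary systematically with the treatment condition.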


Research examples
⚬ Schachter (1959)
  ■ If people are anxious, then they will want to affiliate, or be with others
  ■ "Misery loves company"
  ■ Potential relationship between two variables: anxiety and affiliation
  ■ IV: anxiety
  ■ DV: affiliation


Research examples
⚬ Hess (1975)
  ■ Large pupils make people more attractive
  ■ use of belladonna (makes pupils look wider)
  ■ participants rated four photos
  ■ IV: pupil size
  ■ DV: attractiveness
⚬ Identifying variables
  ■ What did the experimenter manipulate? This is the independent variable.
  ■ What was used to assess the effect of the independent variable? This is the dependent variable.
  ■ IV: What will you manipulate or vary to test the hypothesis?
  ■ DV: What behavior are you trying to explain; what will you measure to find out whether your independent variable had an effect?
  ■ NO MANIPULATION = NO EXPERIMENTAL HYPOTHESIS


Operational definitions
⚬ IV and DV must be defined:
  ■ conceptually and
  ■ operationally
⚬ Operational definition: specifies the precise meaning of a variable within an experiment
  ■ defines a variable in terms of observable operations, procedures, and measurements
  ■ clearly describes the operations involved in manipulating or measuring the variables in an experiment

Defining the IV: experimental operational definitions
⚬ Experimental operational definitions
  ■ explain the precise meaning of the IV; these definitions describe exactly what was done to create the various treatment conditions of the experiment
  ■ include all steps that were followed to set up each value of the IV
  ■ statements of operating procedures, sets of instructions that tell others how to carry out an experiment


Defining the DV: measured operational definitions
⚬ Measured operational definitions
  ■ describe exactly what procedures we follow to assess the impact of different treatment conditions
  ■ example: identifying a test by name: Culture Fair Intelligence Test (CFIT)

Defining constructs operationally
⚬ Hypothetical constructs
  ■ unseen processes postulated to explain behavior
  ■ operationalized as IV and DV
    ■ IV: as treatment conditions
    ■ DV: as measurements


Defining nonconstruct variables
⚬ lighting, crying, etc.
⚬ define as specifically as possible

Defining scales of measurement


Evaluating operational definitions
⚬ Reliability: consistency and dependability
  ■ hungry vs. not hungry example: every time the operational definition is applied, we get the same results
1. Interrater reliability: different observers take measurements of the same responses; used in content analysis
2. Test-retest reliability: comparing scores of people who have been measured twice with the same instrument
3. Interitem reliability: the extent to which different parts of a questionnaire, test, or other instrument designed to assess the same variable attain consistent results
  ⚬ internal consistency across individual items
  ⚬ split-half reliability
  ⚬ Cronbach's α (alpha)


Evaluating operational definitions
⚬ Validity
  ■ actually studying the variables that we intend to study
  ■ manipulation check
1. Face validity
  a. when the procedure is self-evident
  b. the least stringent type of validity, because it does not provide any real evidence
2. Content validity
  a. Does the content of our measure fairly reflect the content of the quality we are measuring?
  b. Are all aspects of the content represented appropriately?
3. Predictive validity
  a. Do your procedures yield information that enables us to predict future behavior or performance?
  b. Schachter study: bringing the subjects into a large waiting room


Evaluating operational definitions
4. Concurrent validity
  a. evaluated by comparing scores on the measuring instrument with another known standard for the variable being studied
  b. comparative rather than predictive
5. Construct validity
  a. the most important aspect of validity
  b. deals with the transition from theory to research application
  c. start with a general idea of the qualities that characterize the construct we want to test; then seek ways to test it empirically
  d. Have I succeeded in creating a measuring device that measures the construct I want to test?
  e. test against other related constructs
    • convergent and discriminant validity

Evaluating the experiment: internal validity
• Internal validity
  ⚬ the degree to which a researcher is able to state a causal relationship between antecedent conditions and the subsequent behavior
• External validity
  ⚬ how well the findings of the experiment generalize or apply to situations that were not tested directly
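Several of the checks above — test-retest reliability, concurrent validity, convergent validity — reduce to correlating two aligned sets of scores (first vs. second administration, or new instrument vs. known standard). A minimal Pearson correlation in plain Python (function name and sample numbers are ours, for illustration only):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two aligned score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# A new measure tracking an established standard perfectly gives r = 1.0
print(round(pearson_r([10, 12, 14, 16], [50, 60, 70, 80]), 6))  # 1.0
```

High positive r supports concurrent (or convergent) validity; for discriminant validity, we instead want r near zero against measures of unrelated constructs.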


Problems of internal validity
• Extraneous variables
  ⚬ factors that are not the focus of the experiment but can influence the findings
  ⚬ examples: individual differences, equipment failures, inconsistent instructions, condition of subjects
• Confounding variables
  ⚬ when the value of an extraneous variable changes systematically across different conditions of an experiment

Classic threats to internal validity
• History
• Maturation
• Testing
• Instrumentation
• Statistical Regression
• Selection
• Subject Mortality
• Selection Interactions


The method section
• a place to describe what you did in your experiment
  ⚬ Participants
  ⚬ Materials
  ⚬ Procedure
• Participants
  ⚬ gender
  ⚬ ages
  ⚬ how many participated
  ⚬ how many did not complete
  ⚬ reasons for dropping out
  ⚬ other characteristics
• Materials
  ⚬ all items presented to the subjects
    ⚬ films
    ⚬ questionnaires
    ⚬ stories
• Procedure
  ⚬ list the procedures in chronological order
