Lecture 1 Research Method


Research Process

Lecture 1

ACCOUNTING RESEARCH INSTITUTE/ FACULTY OF ACCOUNTANCY

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


RESEARCH PROCESS Research Problem

What is a problem?
. . . any situation where a gap exists between the actual and the desired state. A problem does not necessarily mean that something is seriously wrong. It could simply indicate an interest in improving an existing situation. Thus, problem definitions can include both existing problems in the current situation as well as the quest for idealistic states in the future.

RESEARCH PROCESS Problem Identification

How are problems identified?


1. Observation: the manager/researcher senses that changes are occurring, or that some new behaviors, attitudes, feelings, communication patterns, etc., are surfacing in one's environment. The manager may not understand exactly what is happening, but can definitely sense that things are not what they should be.

2. Preliminary Data Collection: use of interviews, both unstructured and structured, to get an idea or feel for what is happening in the situation.
3. Literature Survey: a comprehensive review of the published and unpublished work from secondary sources of data in the areas related to the problem.

RESEARCH PROCESS Problem Identification

A literature survey ensures that:


1. Important variables likely to influence the problem are not left out of the study.
2. A clearer idea emerges regarding what variables are most important to consider, why they are important, and how they should be investigated.
3. The problem is more accurately and precisely defined.
4. The interviews cover all important topics.
5. The research hypotheses are testable.
6. The research can be replicated.
7. One does not reinvent the wheel; that is, time is not wasted trying to rediscover something that is already known.
8. The problem to be investigated is perceived by the scientific community as relevant and significant.

RESEARCH PROCESS Problem Identification

What are some business problems you are aware of or have confronted?

RESEARCH PROCESS Problem Definition


PROBLEM DEFINITION STEPS:

1. Understand and define the complete problem. If more than one problem is identified, separate and prioritize them in terms of by whom and when they will be dealt with.
2. Identify and separate out measurable symptoms to determine the root problem versus easily observable symptoms. For example, a manager may identify declining sales or lost market share as the problem, but the real problem may be bad advertising, low salesperson morale, or ineffective distribution. Similarly, low productivity may be a symptom of employee morale or motivation problems, or supervisor issues.
3. Determine the unit of analysis: individuals, households, businesses, objects (e.g., products, stores), geographic areas, etc., or some combination.
4. Determine the relevant variables, including specifying independent and dependent relationships, constructs, etc.

RESEARCH PROCESS Problem Definition


Examples of Well-Defined problems:
1. Has the new packaging affected the sales of the product?
2. How do price and quality rate on consumers' evaluation of products?
3. Is the effect of participative budgeting on performance moderated by control systems?
4. Does better automation lead to greater asset investment per dollar of output?
5. Has the new advertising message resulted in higher recall?
6. To what extent do the organizational structure and type of information systems account for the variance in the perceived effectiveness of managerial decision-making?
7. Will expansion of international operations result in an improvement in the firm's image and value?
8. What are the effects of downsizing on the long-range growth patterns of companies?
9. What are the components of quality of life?
10. What are the specific factors to be considered in creating a data warehouse for a manufacturing company?

RESEARCH PROCESS Definitions


Variable = the observable and measurable characteristics/attributes the researcher specifies, studies, and draws conclusions about.

Types of Variables:

Independent variable = also called a predictor variable, it is a variable or construct that influences or explains the dependent variable in either a positive or negative way.

Dependent variable = also known as a criterion variable, it is a variable or construct the researcher hopes to understand, explain and/or predict.

Moderator variable = a variable that has an effect on the independent-dependent variable relationship.

Mediating variable = also known as an intervening variable, it is a variable that surfaces as a function of the independent variable and explains the relationship between the independent and dependent variables.

RESEARCH PROCESS Definitions continued . . .

Measurement = the process of determining the direction and intensity of feelings about persons, events, concepts, ideas, and/or objects of interest that are defined as being part of the business problem. Measurement involves two processes: (1) identification/development of constructs; and (2) scale measurement.

Construct = also referred to as a concept, it is an abstract idea formed in the mind based on a set of facts or observations. Examples include trust, service quality, and role ambiguity.

Scale measurement = using a set of symbols or numbers to represent the range of possible responses to a research question.

Role Ambiguity Construct


Conceptual/theoretical definition = the difference between the information available to the person (actual knowledge) and that which is required for adequate performance of a role.

Operational definition = the amount of uncertainty an individual feels regarding job role responsibilities and expectations from supervisors, other employees and customers.

Measurement scale = consists of 45 items assessed using a 5-point scale, with category labels 1 = very certain, 2 = certain, 3 = neutral, 4 = uncertain, and 5 = very uncertain.

Examples of items:
How much freedom of action I am expected to have.
How I am expected to handle non-routine activities on the job.
The sheer amount of work I am expected to do.
To what extent my boss is open to hearing my point of view.
How satisfied my boss is with me.
How I am expected to interact with my customers.

Service Quality Construct


Conceptual/theoretical definition = the difference between an individual's expectations of service and their actual experiences.

Operational definition = how individuals react to their actual service experience with a company relative to their expectations that a company will possess certain service characteristics.

Measurement scale = consists of 82 items assessed using a 7-point scale, with category labels 1 = not at all essential to 7 = absolutely essential.

Examples of items:
Employees of excellent companies will give prompt service to customers.
Excellent companies will have the customers' best interests at heart.
Excellent companies will perform services right the first time.
Employees of excellent companies will never be too busy to respond to customer requests.
Excellent companies will give customers individual attention.
Materials associated with products and services of excellent companies (such as pamphlets or statements) will be visually appealing.
Source: Parasuraman, Zeithaml & Berry, JM, Fall 1985, p. 44.

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


RESEARCH PROCESS Theory/Practice

What is theory ??

RESEARCH PROCESS Theory/Practice

Theory = a systematic set of relationships providing a consistent and comprehensive explanation of a phenomenon. In practice, a theory is a researcher's attempt to specify the entire set of dependence relationships explaining a particular set of outcomes.

Theory is based on prior empirical research, past experiences and observations of behavior, attitudes, or other phenomena, and other theories that provide a perspective for developing possible relationships.
Theory is used to prepare a theoretical framework for the research.

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


RESEARCH PROCESS Hypotheses

HYPOTHESES = PRECONCEPTIONS THE RESEARCHER DEVELOPS REGARDING THE RELATIONSHIPS REPRESENTED IN THE DATA, TYPICALLY BASED ON THEORY, PRACTICE OR PREVIOUS RESEARCH.

Examples:
The average number of cups of coffee students drink during finals will be greater than the average they consume at other times.
Younger, part-time employees of Samouel's restaurant are more likely to search for a new job.

RESEARCH PROCESS Theoretical Framework

Theoretical Framework = a written description that includes a conceptual model. It integrates all the information about the problem in a logical manner, describes the relationships among the variables, explains the theory underlying these relationships, and indicates the nature and direction of the relationships. The process of developing a theoretical framework involves conceptualization which is a visual specification (conceptual model) of the theoretical basis of the relationships you would like to examine.

RESEARCH PROCESS Theory/Practice

Basic Features of a Good Theoretical Framework:


1. The variables/constructs considered relevant to the study are clearly identified and labeled.
2. The discussion states how the variables/constructs are related to each other, e.g., dependent, independent, moderator, etc.
3. If possible, the nature (positive or negative) of the relationships as well as the direction is hypothesized on the basis of theory, previous research or researcher judgment.
4. There is a clear explanation of why you expect these relationships to exist.
5. A visual (schematic) diagram of the theoretical framework is prepared to clearly illustrate the hypothesized relationships.

RESEARCH PROCESS Conceptual Models

Moderation example:
Price (Independent Variable) → Purchase Likelihood (Dependent Variable)
Discount Level Restrictions (Moderator Variable) affects this relationship.

Simple example:
Price (Independent Variable) → Purchase Likelihood (Dependent Variable)

RESEARCH PROCESS Conceptual Models

Full mediation:
Price → Perceived Value (Mediator Variable) → Purchase Likelihood

Partial mediation:
Price (Independent Variable) → Purchase Likelihood (Dependent Variable), both directly and through Perceived Value (Mediator Variable)

CONCEPTUAL MODELS SAMOUEL'S EMPLOYEE DATABASE


Model 1: Supervision, Work Groups and Compensation → Employee Commitment

Model 2: Supervision, Work Groups and Compensation → Intention to Search

Potential Hypotheses:
Commitment is positively related to supervision, work groups and compensation.
Intention to Search is negatively related to supervision, work groups and compensation.

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


RESEARCH DESIGN TYPES

Research Design Alternatives by Purpose:

(1) Exploratory: to formulate the problem, develop hypotheses, identify constructs, establish priorities for research, refine ideas, clarify concepts, etc.

(2) Descriptive: to describe characteristics of certain groups, estimate the proportion of people in a population who behave in a given way, and to make directional predictions.

(3) Causal: to provide evidence of the relationships between variables, the sequence in which events occur, and/or to eliminate other possible explanations.

Research Design Approaches

TWO BROAD APPROACHES:


1. Qualitative. 2. Quantitative.

RESEARCH DESIGN
ROLE OF QUALITATIVE RESEARCH:
Search of academic, trade and professional literature (both traditional and Internet).
Use of interviews, brainstorming, focus groups.
Internalization of how others have undertaken both qualitative and quantitative research.
Use of existing questionnaires/constructs.

Outcome of Qualitative Research:


Improve conceptualization.
Clarify research design, including data collection approach.
Draft questionnaire.

RESEARCH DESIGN

ROLE OF QUANTITATIVE RESEARCH:


Quantify data and generalize results from sample to population.
Facilitates examination of a large number of representative cases.
Structured approach to data collection.
Enables extensive statistical analysis.

Outcome of Quantitative Research:


Validation of qualitative research findings.
Confirmation of hypotheses, theories, etc.
Recommendation of a final course of action.

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


DATA COLLECTION

Approaches:
Observation
  Human
  Mechanical/Electronic Devices

Surveys
  Self-Completion
    Mail/Overnight Delivery/Fax
    Electronic
  Interviewer-Administered
    Face-to-Face (Home, Work, Mall, Focus Groups)
    Telephone

DATA COLLECTION

Selection of data collection approach?

Budget.
Knowledge of issues (qualitative vs. quantitative).
Respondent participation (e.g., taste test, ad test, card sorts, visual scaling).
Time available.

DATA COLLECTION

Types of Data:

Primary
Secondary

PRIMARY DATA

Primary Data Sources:

Informal discussions; brainstorming
Focus groups
Observational methods
Structured & unstructured surveys
Experiments

PRIMARY DATA FOCUS GROUPS


Focus Groups = bring a small group of people (10-12) together for an interactive, spontaneous discussion of a particular topic or concept. Discussion is led by a trained moderator and usually lasts about 1½ hours.

Typical Objectives:
To identify and define problems.
To generate new ideas about products, services, delivery methods, etc.
To test advertising themes, positioning statements, company and product names, etc.
To discover new constructs and measurement methods.
To understand customer needs, wants, attitudes, behaviors, preferences and motives.

PRIMARY DATA OBSERVATIONS

CONSIDERATIONS:

Methods: human/mechanical/electronic.
Useful where the respondent cannot or will not articulate the answer.
Cannot be used to measure thoughts, feelings, attitudes, opinions, etc.

PRIMARY DATA QUESTIONNAIRES

PURPOSE OF QUESTIONNAIRES:
To obtain information that cannot be easily observed or is not already available in written or electronic form.
Questionnaires enable researchers to measure concepts/constructs.

QUESTIONNAIRE DESIGN
Steps in Questionnaire Design:
1. Initial Considerations: problem, objectives, target population, sampling, etc.
2. Clarification of Concepts: select variables, constructs, measurement approach, etc.
3. Developing the Questionnaire:
   Length and sequence.
   Types of questions.
   Sources of questions.
   Wording, coding, layout and instructions.
4. Pre-testing the Questionnaire.
5. Questionnaire Administration Planning.

QUESTIONNAIRE DESIGN

Two Types of Questions:
1. Open-ended
2. Closed-ended

OPEN-ENDED QUESTIONS = PLACE NO CONSTRAINTS ON RESPONDENTS; I.E., THEY ARE FREE TO ANSWER IN THEIR OWN WORDS AND TO GIVE WHATEVER THOUGHTS COME TO MIND.

CLOSED-ENDED QUESTIONS = THE RESPONDENT IS GIVEN THE OPTION OF CHOOSING FROM A NUMBER OF PREDETERMINED ANSWERS.

QUESTIONNAIRE DESIGN
EXAMPLES OF OPEN-ENDED QUESTIONS:

HOW DO YOU TYPICALLY DECIDE WHICH RESTAURANT YOU WILL EAT AT?

WHICH MUTUAL FUNDS HAVE YOU BEEN INVESTING IN FOR THE PAST YEAR?

HOW ARE YOUR INVESTMENT FUNDS PERFORMING?

DO YOU THINK AIRPORT SECURITY IS BETTER NOW THAN IT WAS SIX MONTHS AGO?

QUESTIONNAIRE DESIGN
OPEN-ENDED QUESTIONS

Typically used in exploratory/qualitative studies.
Typically used in personal interview surveys involving small samples.
Allows the respondent freedom of response.
The respondent must be articulate and willing to spend time giving a full answer.
Data are in narrative form, which can be time consuming and difficult to code and analyze.
Possible researcher bias in interpretation.
Narrative is analyzed using content analysis; software is available (e.g., NUD*IST).

QUESTIONNAIRE DESIGN

CLOSED-END QUESTIONS:

Single Answer
Multiple Answer
Rank Order
Numeric
Likert-Type Scales
Semantic Differential

Examples of Closed-end Questions:


1. Did you check your email this morning?
   __ Yes __ No
2. Do you believe Enron senior executives should be put in jail?
   __ Yes __ No
3. Should the U.K. adopt the Euro or keep the Pound?
   __ Adopt the Euro __ Keep the Pound
4. Which countries in Europe have you traveled to in the last six months?
   __ Belgium __ Germany __ France __ Holland __ Italy __ Switzerland __ Spain __ Other (please specify) _____________
5. How often do you eat at Samouel's Greek Cuisine restaurant?
   __ Never __ 1-4 times per year __ 5-8 times per year __ 9-12 times per year __ More than 12 times per year

QUESTIONNAIRE DESIGN

CLOSED-END QUESTIONS

Typically used in quantitative studies.
Assumption is that the researcher has the knowledge to pre-specify response categories.
Data can be pre-coded and are therefore in a form amenable for use with statistical packages (e.g., SPSS, SAS); data capture is therefore easier.
More difficult to design, but simplifies analysis.
Used in studies involving large samples.
Limited range of response options.
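The pre-coding idea above can be sketched as a simple mapping from category labels to numeric codes. This is a minimal illustration, not from the lecture; the labels and code values are hypothetical assumptions.

```python
# Hypothetical pre-coding scheme for one closed-ended frequency question.
# The labels and numeric codes are illustrative assumptions, not lecture data.
FREQUENCY_CODES = {
    "Never": 0,
    "1-4 times per year": 1,
    "5-8 times per year": 2,
    "9-12 times per year": 3,
    "More than 12 times per year": 4,
}

# Three hypothetical respondents' answers to the question
answers = ["Never", "5-8 times per year", "1-4 times per year"]

# Convert labels to codes, ready for export to a statistical package
coded = [FREQUENCY_CODES[a] for a in answers]
print(coded)  # [0, 2, 1]
```

Once responses are coded this way, each question becomes a numeric column that packages such as SPSS or SAS can analyze directly.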

QUESTIONNAIRE DESIGN

BROAD CONSIDERATIONS

Sequencing of questions.
Identification of concepts.
How many questions are required to capture each concept.
Question wording.
Overall length of questionnaire.
Placing of sensitive questions.
Ability of respondents.
Level of measurement.
Open-ended versus closed-end questions.

QUESTIONNAIRE DESIGN

QUESTIONNAIRE SEQUENCE
Opening Questions → Research Topic Questions → Classification Questions

QUESTIONNAIRE DESIGN

PREPARING GOOD QUESTIONS:

Use simple words.
Be brief.
Avoid ambiguity.
Avoid leading questions.
Avoid double-barreled questions.
Check questionnaire layout.
Prepare clear instructions.
Watch question sequence.

QUESTIONNAIRE DESIGN

Double-Barreled Questions:
To what extent do you agree or disagree with the following statements?

Harrods employees are friendly and helpful. Harrods employees are courteous and knowledgeable.

QUESTIONNAIRE DESIGN
Pre-testing Questionnaires:

Objective: to identify possible shortcomings of the questionnaire.
Approaches: informal or formal.
Can assess:
  clarity of instructions
  cover letter
  clarity of questions
  adequacy of codes and categories for pre-coded questions
  quality of responses
  likely response rate
  ability to perform meaningful analyses
  time to complete the questionnaire
  cost of data collection
  which questions are relevant
  whether key questions have been overlooked
  sources of bias

No hard and fast rules.

Scale Development

SCALES = THE APPROACH USED TO MEASURE CONCEPTS (CONSTRUCTS).


Two Options:
1. Use published scales.
2. Develop original scales.

MEASUREMENT SCALES
TYPES OF SCALES:

Metric (interval & ratio):
  Likert-type
  Summated-Ratings (Likert)
  Numerical
  Semantic Differential
  Graphic-Ratings

Nonmetric (nominal & ordinal):
  Categorical
  Constant Sum Method
  Paired Comparisons
  Rank Order
  Sorting

MEASUREMENT SCALES Metric


EXAMPLES OF LIKERT-TYPE SCALES:

When I hear about a new restaurant, I eat there to see what it is like.
Strongly Agree (1), Agree Somewhat (2), Neither Agree nor Disagree (3), Disagree Somewhat (4), Strongly Disagree (5)

When I hear about a new restaurant, I eat there to see what it is like.
Strongly Agree (1) . . . Strongly Disagree (5)

MEASUREMENT SCALES Metric


SUMMATED RATINGS SCALES:
A scaling technique in which respondents are asked to indicate their degree of agreement or disagreement with each of a number of statements. A subject's attitude score (summated rating) is the total obtained by summing over the items in the scale and dividing by the number of items to get the average.

Example: My sales representative is . . .
                 SD    D    N    A    SA
Courteous        ___  ___  ___  ___  ___
Friendly         ___  ___  ___  ___  ___
Helpful          ___  ___  ___  ___  ___
Knowledgeable    ___  ___  ___  ___  ___

MEASUREMENT SCALES Metric


Alternative Approach to Summated Ratings scales:

When I hear about a new restaurant, I eat there to see what it is like.
Strongly Agree (1), Agree Somewhat (2), Neither Agree nor Disagree (3), Disagree Somewhat (4), Strongly Disagree (5)

I always eat at new restaurants when someone tells me they are good.
Strongly Agree (1), Agree Somewhat (2), Neither Agree nor Disagree (3), Disagree Somewhat (4), Strongly Disagree (5)

This approach includes a separate labeled Likert scale with each item (statement). The summated rating is a total of the responses for all the items divided by the number of items.
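The computation described above (sum the item responses, divide by the number of items) can be sketched in a few lines. The item names and responses below are hypothetical, not from the lecture data.

```python
# A minimal sketch of a summated (Likert) rating: the average score
# across the items of a multi-item scale.
def summated_rating(responses):
    """Sum over the items and divide by the number of items."""
    responses = list(responses)
    return sum(responses) / len(responses)

# One hypothetical respondent's answers to four 5-point items
# (1 = Strongly Disagree ... 5 = Strongly Agree)
items = {"courteous": 4, "friendly": 5, "helpful": 3, "knowledgeable": 4}
score = summated_rating(items.values())
print(score)  # 4.0
```

Negatively worded items would need to be reverse-coded (e.g., 6 minus the response on a 5-point scale) before averaging, a detail the slides do not cover.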

MEASUREMENT SCALES Metric

NUMERICAL SCALES: Example:


Using a 10-point scale, where 1 is "not at all important" and 10 is "very important," how important is ______ in your decision to do business with a particular vendor?

Note: fill in the blank with an attribute, such as reliable delivery, product quality, complaint resolution, and so forth.

MEASUREMENT SCALES Metric


SEMANTIC DIFFERENTIAL SCALES:
A scaling technique in which respondents are asked to check which space between a set of bipolar adjectives or phrases best describes their feelings toward the stimulus object.

Example: My sales representative is . . .
Courteous  ___ ___ ___ ___ ___  Discourteous
Friendly   ___ ___ ___ ___ ___  Unfriendly
Helpful    ___ ___ ___ ___ ___  Unhelpful
Honest     ___ ___ ___ ___ ___  Dishonest

MEASUREMENT SCALES Metric GRAPHIC-RATINGS SCALES:


A scaling technique in which respondents are asked to indicate their ratings of an attribute by placing a check at the appropriate point on a line that runs from one extreme of the attribute to the other.

Please evaluate each attribute in terms of how important the attribute is to you personally (or to your company) by placing an X at the position on the horizontal line that most reflects your feelings.

                 Not Important                  Very Important
Courteousness    _____________________________________
Friendliness     _____________________________________
Helpfulness      _____________________________________
Knowledgeable    _____________________________________

MEASUREMENT SCALES Nonmetric CATEGORICAL SCALE:


Categorical scales are nominally measured opinion scales that have two or more response categories.

How satisfied are you with your current job?
[ ] Very Satisfied
[ ] Somewhat Satisfied
[ ] Neither Satisfied nor Dissatisfied
[ ] Somewhat Dissatisfied
[ ] Very Dissatisfied

Note: Some researchers consider this a metric scale when coded 1-5.

MEASUREMENT SCALES Nonmetric CONSTANT-SUM METHOD:


A scaling technique in which respondents are asked to divide some given sum among two or more attributes on the basis of their importance to them.

Please divide 100 points among the following attributes in terms of the relative importance of each attribute to you.
Courteous Service       ____
Friendly Service        ____
Helpful Service         ____
Knowledgeable Service   ____
Total                   100
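A practical wrinkle with constant-sum data is that each response must actually total the given sum. The sketch below, with hypothetical point values for the attributes above, shows the check and how the allocation converts directly to importance weights.

```python
# A minimal sketch of validating a constant-sum response: the points
# allocated across attributes must total exactly the required sum.
def is_valid_constant_sum(allocation, total=100):
    """True if the respondent's points add up to the required total."""
    return sum(allocation.values()) == total

# One hypothetical respondent's allocation of 100 points
response = {
    "Courteous Service": 30,
    "Friendly Service": 20,
    "Helpful Service": 25,
    "Knowledgeable Service": 25,
}
print(is_valid_constant_sum(response))  # True

# Relative importance weights follow directly from the allocation
weights = {attr: pts / 100 for attr, pts in response.items()}
```

Responses that fail the check are typically returned to the respondent (online surveys) or discarded during data cleaning.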

MEASUREMENT SCALES Nonmetric


PAIRED COMPARISON METHOD:
A scaling technique in which respondents are given pairs of stimulus objects and asked which object in each pair they prefer most.

Please circle the attribute describing a sales representative which you consider most desirable.
Courteous versus Knowledgeable
Friendly versus Helpful
Helpful versus Courteous

MEASUREMENT SCALES Nonmetric

SORTING:
A scaling technique in which respondents are asked to indicate their beliefs or opinions by arranging objects (items) on the basis of perceived importance, similarity, preference or some other attribute.

MEASUREMENT SCALES Nonmetric


RANK ORDER METHOD:
A scaling technique in which respondents are presented with several stimulus objects simultaneously and asked to order or rank them with respect to a specific characteristic.
Please rank the following attributes on how important each is to you in relation to a sales representative. Place a 1 beside the attribute which is most important, a 2 next to the attribute that is second in importance, and so on.

Courteous Service       ___
Friendly Service        ___
Helpful Service         ___
Knowledgeable Service   ___

Scale Development
PRACTICAL DECISIONS WHEN DEVELOPING SCALES:

Number of items (indicators) to measure a concept?
Number of scale categories?
Odd or even number of categories? (Include a neutral point?)
Balanced or unbalanced scales?
Forced or non-forced choice? (Include "Don't Know"?)
Category labels for scales?
Scale reliability and validity?
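The reliability question in the list above is commonly answered with Cronbach's alpha, which the slides do not name explicitly, so treat this as one standard approach rather than the lecture's own. The data below are hypothetical.

```python
import statistics

# A sketch of Cronbach's alpha for internal-consistency reliability.
# rows = respondents; each row holds one respondent's scores on the
# items of a single multi-item scale. Data are hypothetical.
def cronbach_alpha(rows):
    k = len(rows[0])                                   # number of items
    columns = list(zip(*rows))                         # item-wise scores
    item_variance = sum(statistics.pvariance(c) for c in columns)
    total_variance = statistics.pvariance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - item_variance / total_variance)

# Five hypothetical respondents answering a three-item 5-point scale
data = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 3], [4, 4, 5]]
alpha = cronbach_alpha(data)  # a common rule of thumb treats > 0.7 as acceptable
```

In practice this calculation is usually run in a statistical package (e.g., the reliability procedures in SPSS or SAS) rather than coded by hand.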

Scale Development
CATEGORY LABELS FOR SCALES?
Verbal Label: How important is the size of the hard drive in selecting a laptop PC to purchase?
Very Unimportant (1), Somewhat Unimportant (2), Neither Important nor Unimportant (3), Somewhat Important (4), Very Important (5)

Numerical Label: How likely are you to purchase a laptop PC in the next six months?
Very Unlikely 1  2  3  4  5 Very Likely

Unlabeled: How important is the weight of the laptop PC in deciding which brand to purchase?
Very Unimportant ___ ___ ___ ___ ___ Very Important

MEASUREMENT SCALES

CHOOSING A MEASUREMENT SCALE:

Capabilities of respondents.
Context of scale application.
Data analysis approach.
Validity and reliability.

MEASUREMENT SCALES

ASSESSING MEASUREMENT SCALES:

Validity

Reliability
Measurement Error = occurs when the values obtained in a survey (observed values) are not the same as the true values (population values).

RESEARCH DESIGN

Types of Errors:

Nonresponse = problem definition, refusal, sampling, etc.
Response = respondent or interviewer.
Data Collection Instrument:
  Construct development.
  Scaling measurement.
  Questionnaire design/sequence, etc.
Data Analysis.
Interpretation.

SECONDARY DATA

Data that has been gathered previously for other purposes.

SECONDARY DATA

Secondary Data Issues:

Availability Relevance

Accuracy
Sufficiency

RESEARCH PROCESS Identify and Define Research Problem

Theory / Practice Hypotheses / Conceptualization Research Design


Data collection Data Analysis Findings


Research Design & Data Collection

LEARNING CHECKPOINT:

Define a research problem to be studied.
Identify the topics/concepts that will be covered to answer the research questions.
Identify the types of questions and/or scaling you will use.
How will you evaluate the questions/scales you use?
Determine the best way to collect the data.
Present group suggestions; defend them.
