
MEANING AND CHARACTERISTICS OF RESEARCH
MODULE 1
METHODS VS. METHODOLOGY

• Definition
  • Methods: Tools or techniques used to collect and analyse data.
  • Methodology: The rationale and philosophical framework guiding the choice and use of methods.
• Focus
  • Methods: Operational (how the research is conducted).
  • Methodology: Conceptual and strategic (why the methods are used in a particular way).
• Examples
  • Methods: Surveys, interviews, experiments, statistical analysis.
  • Methodology: Qualitative, quantitative, or mixed-method approaches; theoretical frameworks.
• Wayne Booth's Perspective
  • Methods: Concerned with evidence collection and analysis to address research questions.
  • Methodology: Explains the reasoning behind selecting specific methods.
• Kothari's Perspective
  • Methods: Focuses on the procedural techniques for gathering data.
  • Methodology: Emphasizes the scientific logic guiding research design.
WHAT IS RESEARCH?

• Research is defined as a systematic and scientific investigation aimed at discovering new knowledge, verifying
and testing existing knowledge, and applying this knowledge to solve problems.
• "A scientific and systematic search for pertinent information on a specific topic."
• Key elements of the definition:
1. A Systematic Process: Research follows a structured and organized process, including defining a problem,
formulating a hypothesis, gathering data, analyzing it, and interpreting the findings.
2. Purpose-driven: The goal of research is to answer questions, solve problems, or discover new facts and principles.
3. Pertinence: The information sought should be relevant to the topic or problem under study.
4. Problem-Oriented Investigation: Booth and his co-authors argue that research begins with identifying a problem
or a gap in understanding, followed by efforts to address that problem through careful study.
5. A Dialogical Process: Research is framed as a conversation with existing knowledge, where the researcher
engages with others' work and contributes new insights.
6. A Path to Persuasion: The authors highlight that research is not just about discovering truths but also about
effectively presenting findings to persuade others of their validity and significance.
POPPER’S FALSIFICATION PRINCIPLE

• Karl Popper’s (Austrian-British) falsification principle is a cornerstone of the philosophy of science.
• According to Popper, for a theory to be considered scientific, it must
be falsifiable, meaning it should make predictions that can be tested
and potentially proven false.
• Key Idea
• A theory is not scientific if it cannot, in principle, be refuted by
empirical evidence. Instead of seeking to prove a theory true, scientists
should attempt to disprove it.
• If a theory survives rigorous attempts at falsification, it gains
credibility but never becomes definitively proven.

• Example 1: Einstein's Theory of General Relativity


• Einstein's theory predicted that light from distant stars would bend
when passing near a massive object like the sun. This prediction was
tested during a solar eclipse in 1919 when light from stars near the sun
was observed to bend exactly as Einstein's theory predicted.
• Falsifiable Prediction: If the stars’ light did not bend, the theory of
general relativity would have been falsified.
• Outcome: The prediction matched observations, supporting the theory
DEFINITIONS OF RESEARCH

1. Redman and Mory in their book – “The Romance of Research, 1923”


• "Research is a systematized effort to gain new knowledge.”
• This definition emphasizes the structured and organized approach to investigating unknown aspects.

2. Clifford Woody in his article “The Values of Educational Research to the Classroom Teacher, 1927”
• "Research comprises defining and redefining problems, formulating hypotheses (a proposed explanation or
prediction about a specific phenomenon or relationship between variables ) or suggested solutions, collecting, organizing,
and evaluating data, making deductions, and reaching conclusions to determine whether they fit the formulated
hypothesis.”
• Woody's definition highlights the procedural and methodical steps involved in research.

3. Fred N. Kerlinger in his book “Foundations of Behavioural Research, 1973”


• "Research is a systematic, controlled, empirical, and critical investigation of hypothetical propositions about
the presumed relations among natural phenomena.”
• Kerlinger's perspective focuses on the scientific and empirical nature of research.
4. Best and Kahn in their book – “Research in Education, 1986”
• "Research is the systematic and objective analysis and recording of controlled observations that may lead to the
development of generalizations, principles, or theories, resulting in prediction and possibly ultimate control of
events.”
• This definition underscores the predictive and theoretical contributions of research.

5. John W. Creswell in his book – “A Concise Introduction to Mixed Methods Research, 2014”
• "Research is a process of steps used to collect and analyze information to increase our understanding of a topic or
issue.”
• Creswell’s definition simplifies the process of research into key steps, focusing on understanding.

6. Leedy and Ormrod in their book – “Practical Research Planning and Design, 2005”
• "Research is a systematic process of collecting, analysing, and interpreting information (data) in order to
increase our understanding of a phenomenon about which we are interested or concerned.”
• They stress the interpretive aspect of research.
PURPOSE OF RESEARCH

1. Exploration : Research is conducted to explore an area where little is known. It aims to uncover the nature of the
problem or generate new ideas and insights.

2. Description: Descriptive research focuses on portraying an accurate profile of situations, events, or phenomena.
It seeks to describe the characteristics of a particular group, situation, or phenomenon.

3. Diagnosis : Research aims to determine the causes of a specific issue or phenomenon. Diagnostic studies
investigate the underlying factors responsible for certain events or behaviors.

4. Hypothesis Testing : Research is often undertaken to test hypotheses or theories, either to validate or refute
them. This involves examining the relationships between variables and predicting outcomes.

5. Prediction and Control : Research can also aim to predict future events or behaviors based on current
knowledge and findings. It may also provide ways to control certain factors or conditions to achieve desired
outcomes.

6. Action-Oriented : Some research is conducted with the specific intent to solve problems or provide actionable insights or recommendations.
CRITERIA OF GOOD RESEARCH

1. Clarity in Purpose:
• Good research begins with a clear and well-defined question or problem.
• It articulates why the research matters and to whom it is relevant.

2. Relevance to the Community:


• Effective research contributes to the ongoing scholarly conversation or addresses a real-world issue of importance.

3. Systematic Approach:
• It follows a structured process for gathering, analysing, and interpreting evidence.
• The methodology should be appropriate to the research question.

4. Grounded in Evidence:
• Good research relies on credible, verifiable, and appropriately sourced evidence.
• Claims and conclusions must be supported by data and logical reasoning.

5. Engagement with Existing Knowledge:


• It situates itself within the context of prior research, acknowledging and building upon existing work.
6. Logical Argumentation:
• The reasoning should be coherent, well-structured, and free from fallacies.
• Arguments should follow logically from the evidence presented.

7. Original Contribution:
• Good research adds something new to the field, whether through insights, solutions, or innovative approaches.

8. Persuasive Communication:
• Findings and arguments should be presented clearly and effectively, engaging the intended audience.
• Good research anticipates counterarguments and addresses them convincingly.

9. Ethical Conduct:
• Respects ethical standards in the collection, use, and presentation of data.
• Avoids plagiarism and ensures proper citation of sources.

10. Reflective and Open-Minded:


• Acknowledges limitations and is open to alternative perspectives or interpretations.

11. Reproducibility and Transparency:
• The methodology and data collection processes should be transparent enough that others can verify the findings or replicate the study.

OBJECTIVES OF GOOD RESEARCH

1. To Address a Clear Research Question: Good research begins with a clear, focused research question or problem,
guiding the entire investigation.

2. To Contribute to Knowledge: The primary objective is to contribute new insights, theories, or perspectives to the
field by either challenging existing assumptions or filling gaps in knowledge.

3. To Engage with Existing Literature: Research should not only explore new ideas but also build upon or critically
engage with existing work, contributing to ongoing academic conversations.

4. To Provide Evidence-Based Solutions: A central aim is to gather, analyze, and present evidence that answers the
research question or resolves the problem effectively.

5. To Communicate Findings Effectively: Good research must be presented in a way that is clear, coherent, and
persuasive to its intended audience, ensuring the findings have an impact.

6. To Reflect and Review: The research process should involve continuous self-reflection, critically assessing one’s own assumptions, methods, and conclusions.
QUALITIES OF GOOD RESEARCH

1. Clarity: Good research is clearly articulated. The research question, methodology, and findings
should be easy to understand and logically structured.

2. Relevance: The research should address an important issue that contributes meaningfully to the
field. It must have significance to the academic community or society.

3. Systematic and Methodical: Research should be well-organized, following a logical and structured approach. The methodology must be suitable for answering the research question.

4. Critical Engagement with Sources: Effective research demonstrates a strong engagement with
existing literature, acknowledging previous work and positioning the new research within the
broader scholarly context.

5. Objectivity and Rigor: Good research is objective, based on sound reasoning, and avoids bias. It applies its methods rigorously and consistently throughout the study.

6. Originality: Research should offer something new, whether it's new findings, a new
perspective, or a new approach to an existing problem.

7. Credibility: The research must be credible, with evidence and sources that are trustworthy, and
conclusions that logically follow from the data.

8. Ethical Integrity: Ethical considerations are central to good research. This includes proper
citation, honest reporting of results, and respect for participants (if applicable).

9. Persuasiveness: Good research should not only present evidence but also persuade others of its
value and the validity of its conclusions.

10. Replicability and Transparency: To ensure credibility and enable further research, good
research should be transparent in its methodology and data collection processes, allowing others
to replicate or build on the study.
DEFINING RESEARCH PROBLEMS

• WAYNE C. BOOTH (THE CRAFT OF RESEARCH) OUTLINES THE FOLLOWING KEY POINTS:

1. Distinguishing Practical Problems from Research Problems


• A practical problem involves real-world challenges requiring immediate solutions (e.g., improving
efficiency or solving operational issues).
• A research problem involves questions about knowledge that require deeper understanding (e.g., why
something happens or how it can be improved systematically).

2. Three Components of a Research Problem


• A Topic: A general area of interest or subject matter.
• A Question: A specific issue or aspect that needs to be explored.
• Significance: Why answering this question matters to the academic field or to a broader audience.

3. Developing a Problem Statement


• Booth encourages researchers to explicitly state what is not yet known or understood (RESEARCH GAP)
and why resolving this uncertainty is important.
• They propose framing the problem as a gap in existing knowledge and articulating how your research will address it.
• EXAMPLE :

• Practical vs. Research Problems:


• Practical Problem Example: "Plastic waste is polluting rivers in urban areas."
• Focus: Solving a real-world issue (e.g., implementing waste management systems).
• Research Problem Example: "What are the socio-economic factors influencing the success of community-led
plastic waste management programs?"
• Focus: Understanding the underlying principles that contribute to solving the practical problem.

• Three Components of a Research Problem:

1.A Topic: What is the general subject or area of interest?


Example: "Environmental sustainability."
2.A Question: What is the specific issue or gap in understanding?
Example: "Why are community-led sustainability initiatives more successful in some regions than
others?"
3.Significance: Why is this question important to the academic field or society?
Example: "Understanding these factors can help policymakers design more effective sustainability
C.R. KOTHARI (RESEARCH METHODOLOGY: METHODS AND TECHNIQUES)

• C.R. Kothari provides a systematic approach to defining a research problem, focusing on clarity, feasibility, and
relevance. Key points include:
1. Identification of the Problem : A research problem is the issue that the researcher seeks to address. It should emerge
from thorough observation, discussion, and study of existing knowledge.

2. Criteria for Selecting a Problem


• Interest: The problem should align with the researcher's curiosity or area of expertise.
• Relevance: The problem should contribute meaningfully to the field of study.
• Feasibility: The problem should be researchable within the constraints of time, resources, and expertise.
• Originality: The problem should offer scope for new insights or findings.

3. Steps in Defining the Problem


• Understanding the Problem Clearly: Gather all possible information and analyse the background of the problem.
• Reformulation: Express the problem in precise terms, breaking it into sub-problems if necessary.
• Delimitation: Specify the boundaries of the problem to keep the research focused and manageable.

4. Necessity of Defining the Problem


• Clear problem definition ensures that research objectives are specific and the methodology is appropriate. Poorly
defined problems can lead to vague or irrelevant outcomes.
• Steps with Examples:

1.Identification of the Problem:


Example: A researcher observes high dropout rates in rural schools.

2.Understanding the Problem Clearly:


Gather data on factors such as family income, accessibility, teaching quality, and cultural
attitudes toward education.

3.Reformulation:
Restate the problem in specific terms:
"What socio-economic and institutional factors contribute to high dropout rates among rural
high school students?”

4.Delimitation:
Narrow the focus:
"How does family income affect dropout rates among rural high school students in the
FRAMING RESEARCH QUESTIONS

Wayne C. Booth : Steps for Framing Research Questions:


1. Start with a Topic: Identify a general area of interest.
• Example: "The role of technology in education.”

2. Move to a Specific Question: Narrow the topic to a focused inquiry.


• Example: "How does the use of AI in education affect student engagement in high school classrooms?”

3. Justify the Question: Explain why the question matters by showing its significance.
• Example: "Understanding the impact of AI can help educators design effective teaching strategies and enhance
learning outcomes.”

4. Evaluate the Feasibility: Ensure the question is researchable within the given constraints.
• Example Question: "What specific AI tools are most effective in increasing engagement among high school
students in STEM subjects?"
C R KOTHARI : Steps for Framing Research Questions:
1. Understand the Research Problem: Identify the core issue and its context.
• Example Problem: "Rising unemployment among graduates."

2. Break Down the Problem: Frame questions that address specific aspects of the problem.
• Example Questions:
• "What are the main reasons for graduate unemployment in urban areas?"
• "How do skill mismatches contribute to unemployment?"

3. Ensure Clarity and Focus: The questions should be clear, concise, and free from ambiguity.
• Example: Avoid broad questions like "Why is there unemployment?" and instead ask, "What skills are lacking among
unemployed graduates in the technology sector?"

4. Consider Research Scope and Feasibility: Frame questions that can be answered within the constraints of time,
resources, and data availability.
• Example: "What percentage of unemployed graduates lack industry-relevant certifications in urban centers?"
BLOOM'S TAXONOMY WHILE FRAMING RESEARCH QUESTIONS

• In 1956, Benjamin Bloom with collaborators Max Englehart, Edward Furst, Walter Hill, and David Krathwohl published a
framework for categorizing educational goals: Taxonomy of Educational Objectives.
• Bloom's Taxonomy consists of six levels of cognitive skills, arranged from lower-order to higher-order thinking:
1. Remembering (recall of facts and basic concepts)
2. Understanding (explaining ideas or concepts)
3. Applying (using information in new situations)
4. Analysing (breaking information into parts to explore relationships)
5. Evaluating (justifying a decision or stance)
6. Creating (producing new or original work)

EXAMPLE: Indian foreign policy framed at different cognitive levels of Bloom's Taxonomy

1. Remembering
• (Basic recall of facts and information)
• What are the key principles of India’s foreign policy as outlined in the Panchsheel Agreement?
• When did India establish diplomatic relations with its neighboring countries, such as China and Pakistan?
• What are the main objectives of India's Look East and Act East Policies?
2. Understanding
• (Explaining ideas or concepts)
• How does India's policy of non-alignment shape its foreign relations in a multipolar world?
• In what ways has India’s foreign policy evolved since independence?
• How does India balance its relations with the United States and Russia in contemporary geopolitics?

3. Applying
• (Using knowledge in new situations)
• How can India's foreign policy strategies be applied to enhance its role in the Indo-Pacific region?
• How has India's approach to multilateral forums like BRICS and the United Nations impacted its global influence?
• What lessons from India's foreign policy during the Cold War can be applied to its current stance on global power
dynamics?

4. Analysing
• (Breaking down information to explore relationships and patterns)
• What are the key differences between India’s Look East and Act East policies, and how have they impacted India-
ASEAN relations?
• How do economic considerations shape India's foreign policy towards the Gulf countries?
5. Evaluating
• (Assessing and justifying decisions or strategies)
• To what extent has India's foreign policy been successful in countering China’s Belt and Road Initiative?
• Are India's strategic partnerships with countries like Japan and Australia effective in countering regional security threats?
• How effective has India’s foreign policy been in achieving energy security through its engagement with West Asia?

6. Creating
• (Generating new ideas, frameworks, or strategies)
• What innovative foreign policy strategies could India adopt to strengthen its leadership in the Global South?
• How can India leverage its diaspora to enhance its soft power and influence in global affairs?
• What alternative approaches can India take to resolve long-standing border disputes with China and Pakistan while ensuring
regional stability?
Formulation of Hypothesis

• A hypothesis is a testable statement that predicts a relationship between two or more variables. Central to any hypothesis are the independent variable and the dependent variable.
• These variables help define and structure the hypothesis, guiding the research process.

1. Independent Variable
• Definition: The variable that is manipulated, controlled, or changed by the researcher to observe its effect.
• Role in Hypothesis: It acts as the "cause" or predictor in the relationship.
• Example: In the hypothesis "Increased study time leads to higher test scores," the independent variable is study time.

2. Dependent Variable
• Definition: The variable that is measured or observed in response to changes in the independent variable.
• Role in Hypothesis: It acts as the "effect" or outcome in the relationship.
• Example: In the same hypothesis "Increased study time leads to higher test scores," the dependent variable is test scores.

• Key Steps in Formulating a Hypothesis:


1. Identify the Research Problem: Understand the issue or phenomenon to be studied.
2. Conduct Preliminary Research: Gather background information from existing literature or observations.
3. Define Variables: Clearly identify the independent (cause) and dependent (effect) variables.
4. State the Hypothesis: Frame it as a clear, concise, and testable statement.
• Relationship Between Independent and Dependent Variables in a Hypothesis
• A hypothesis predicts how changes in the independent variable will result in changes to the dependent variable.
• The relationship may be:
• Positive: Both variables increase together (e.g., "Higher income leads to increased spending").
• Negative: One variable increases while the other decreases (e.g., "Increased screen time reduces sleep quality").
• No Relationship: The hypothesis may propose no significant effect (null hypothesis).
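• A minimal sketch of how the direction of such a relationship can be checked on data, assuming Python with NumPy and SciPy is available; the variables reuse the study time / test score example above, but the figures and thresholds are illustrative only.

```python
# Illustrative only: checking the direction of a hypothesised relationship
# between an independent variable (study time) and a dependent variable
# (test scores) with a Pearson correlation on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
study_hours = rng.uniform(0, 10, 100)                       # independent variable
test_scores = 50 + 4 * study_hours + rng.normal(0, 5, 100)  # dependent variable

r, p_value = stats.pearsonr(study_hours, test_scores)
if p_value >= 0.05:
    print(f"No significant relationship detected (r = {r:.2f}, p = {p_value:.3f})")
elif r > 0:
    print(f"Positive relationship: both variables rise together (r = {r:.2f})")
else:
    print(f"Negative relationship: one rises as the other falls (r = {r:.2f})")
```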

• Examples of Hypotheses with Variables in International Relations


1. Economic Sanctions and Political Behaviour
• Hypothesis: "Imposing economic sanctions leads to a decline in a target country's military expenditure."
• Independent Variable: Imposition of economic sanctions.
• Dependent Variable: Military expenditure of the target country.

2. Trade Agreements and Conflict Reduction


• Hypothesis: "Countries that engage in free trade agreements are less likely to engage in armed conflict with each other."
• Independent Variable: Participation in free trade agreements.
• Dependent Variable: Likelihood of armed conflict.
3. Military Alliances and National Security
• Hypothesis: "Membership in a strong military alliance increases a state's perception of national security."
• Independent Variable: Membership in a military alliance.
• Dependent Variable: Perception of national security.

4. Foreign Aid and Policy Alignment


• Hypothesis: "Higher levels of foreign aid from major powers result in greater alignment of recipient countries' voting
behaviour in the United Nations."
• Independent Variable: Levels of foreign aid received.
• Dependent Variable: Alignment in UN voting behaviour with the donor country.

5. Diplomatic Engagement and Conflict Resolution


• Hypothesis: "Increased diplomatic engagement leads to faster resolution of territorial disputes."
• Independent Variable: Level of diplomatic engagement (e.g., number of negotiations, summits).
• Dependent Variable: Time taken to resolve territorial disputes.

6. Climate Agreements and Emission Reductions


• Hypothesis: "Participation in international climate agreements reduces greenhouse gas emissions in signatory countries."
• Independent Variable: Participation in climate agreements.
• Dependent Variable: Greenhouse gas emissions.
7. Nuclear Weapons and Deterrence
• Hypothesis: "States with nuclear weapons face fewer direct military threats from non-nuclear states."
• Independent Variable: Possession of nuclear weapons.
• Dependent Variable: Number of direct military threats faced.

A HYPOTHESIS HAS TWO MAIN ASPECTS, GENERALITY AND SPECIFICITY, WHICH DEFINE ITS SCOPE AND FOCUS.

Generality refers to how broadly a hypothesis applies across different contexts, situations, or populations.
• Purpose: General hypotheses aim to identify overarching patterns or relationships that are applicable to a wide range
of scenarios.
• Advantages:
• Facilitates the development of theories or universal principles.
• Provides a foundation for exploratory studies.
• Example:
• In international relations: "Democracies are less likely to go to war with one another."
• This hypothesis is broad and seeks to capture a general pattern observed in international politics.
Specificity focuses on narrowly defined and precise relationships between variables that are testable in a given
context.
• Purpose: Specific hypotheses allow researchers to test concrete predictions with measurable variables.
• Advantages:
• Enhances accuracy and clarity in research.
• Facilitates rigorous empirical testing and detailed conclusions.
• Example:
• In international relations: "Countries with democratic governments that share trade agreements are 30% less
likely to engage in military conflicts over the next decade."
• This hypothesis specifies the type of government, trade agreements, likelihood of conflict, and a time frame.

Balancing Generality and Specificity :


• Effective hypotheses balance generality and specificity to ensure they are meaningful yet practical:
• Generality ensures relevance to broader theoretical or practical issues.
• Specificity ensures testability and empirical precision.
• For example:
• General Hypothesis: "Economic interdependence reduces the likelihood of conflict between nations."
• Specific Hypothesis: "A 10% increase in bilateral trade volume reduces the likelihood of military conflict between
trading nations by 5% over five years."
HYPOTHESIS VS. RESEARCH STATEMENT

• Purpose
  • Hypothesis: A tentative assumption made to test its validity through empirical research. It serves as the foundation for inquiry and experimentation.
  • Research Statement: A declarative sentence that conveys a fact, observation, or opinion without the necessity of empirical testing.
• Testability
  • Hypothesis: Testable and falsifiable through data collection and analysis; it predicts relationships between variables. Example: "Higher literacy rates lead to improved economic development."
  • Research Statement: Does not require testing and may simply assert a fact or observation. Example: "Literacy rates are high in developed countries."
• Relationship Between Variables
  • Hypothesis: Defines the relationship between dependent and independent variables. Example: "Increasing public investment in education reduces unemployment rates."
  • Research Statement: May or may not involve variables and does not necessarily propose a relationship. Example: "Public investment in education is important for national growth."
• Exploratory
  • Hypothesis: Focused and aimed at guiding research.
  • Research Statement: General and does not necessarily guide the inquiry.
• Role in Research
  • Hypothesis: A central component of the research process, guiding the formulation of research questions and shaping the design of the study.
  • Research Statement: Often serves as a contextual or supporting assertion to set the stage for research but lacks the predictive element.
• Argumentative Value
  • Hypothesis: Integral to constructing an argument because it anticipates potential conclusions and invites scrutiny. Example: "Countries that invest in renewable energy see a decline in carbon emissions within a decade."
  • Research Statement: May contribute to background information or descriptive context without demanding empirical validation. Example: "Renewable energy sources are becoming more popular globally."
• Empirical Focus
  • Hypothesis: Demands empirical investigation, often involving data collection, testing, and validation.
  • Research Statement: Can stand independently without the need for empirical support.
NULL HYPOTHESIS

• The null hypothesis is like saying, "Nothing special is happening."


• It's a starting idea that says there is no difference or no effect in what you're testing.
• For example:
1. If you're testing a new drug, the null hypothesis says, "The drug doesn't work, it has no effect."
2. If you're flipping a coin, the null hypothesis says, "The coin is fair, and heads and tails are equally likely."
• You use the null hypothesis to check if your experiment shows something surprising.
• If your data shows something unexpected, you might reject the null hypothesis. If your data doesn't show anything
unusual, you keep the null hypothesis.
• Key Points:
• Default assumption: The null hypothesis typically assumes that any observed differences or effects are due to random
chance or natural variability, rather than a true effect.
• Objective of testing: In hypothesis testing, the null hypothesis is tested against an alternative hypothesis (denoted as H1 or HA), which proposes that there is an effect, relationship, or difference.
• Reject or fail to reject: The test determines whether the data provide enough evidence to reject the null hypothesis in favour of the alternative hypothesis.
• It's important to note that you cannot "prove" the null hypothesis; rather, you either reject it or fail to reject it.
• Examples:
1. Medical Testing Example: Suppose you're testing a new drug.
• Null Hypothesis : The new drug has no effect on patients (i.e., the mean improvement in health for the
treatment group is equal to the mean improvement in health for the control group).
• Alternative Hypothesis :The new drug does have an effect on patients (i.e., the mean improvement for the
treatment group is different from the control group).
2. Coin Flip Example: Testing if a coin is fair.
• Null Hypothesis :The coin is fair (i.e., the probability of heads is 50%).
• Alternative Hypothesis : The coin is not fair (i.e., the probability of heads is not 50%).
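
• A minimal sketch of the coin flip example above, assuming Python with SciPy is available; the observed count of 62 heads in 100 flips is a made-up figure for illustration. A two-sided binomial test either rejects or fails to reject the null hypothesis that the coin is fair.

```python
# Illustrative only: testing H0 "the coin is fair (P(heads) = 0.5)"
# against H1 "the coin is not fair", using a two-sided binomial test.
from scipy.stats import binomtest

heads, flips = 62, 100   # hypothetical observed data
alpha = 0.05             # significance level

result = binomtest(heads, flips, p=0.5, alternative="two-sided")
print(f"Observed {heads}/{flips} heads, p-value = {result.pvalue:.3f}")

if result.pvalue < alpha:
    print("Reject the null hypothesis: the coin does not appear to be fair.")
else:
    print("Fail to reject the null hypothesis: no evidence the coin is unfair.")
```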

HYPOTHESIS : TYPE 1 AND TYPE 2 ERROR

• In statistical hypothesis testing, Type 1 and Type 2 errors are two possible mistakes that can occur when making a decision
about a hypothesis.
• 1. Type I Error (False Positive):
• Definition: A Type 1 error occurs when you reject the null hypothesis when it is actually true.
• Example: Suppose a pharmaceutical company is testing a new drug and the null hypothesis states that the drug has
no effect. If you conclude that the drug is effective (reject the null hypothesis) when, in fact, it has no effect, you
have made a Type 1 error.
• Consequences: A Type 1 error is often considered more serious because it leads to a conclusion that something
exists when, in reality, it does not (a false positive).
• Significance Level (α):
• The probability of committing a Type 1 error is denoted by α, which is the significance level of the test.
• For example, a 5% significance level (α = 0.05) means there is a 5% chance of making a Type 1 error.

2. Type II Error (False Negative):


• Definition: A Type 2 error occurs when you fail to reject the null hypothesis when it is actually false.
• Example: Using the same pharmaceutical example, the null hypothesis would state that the drug has no effect. If
the drug is actually effective but you fail to reject the null hypothesis (concluding the drug is not effective), you
have made a Type 2 error.
• Consequences: A Type 2 error is less serious in some contexts, but it can still lead to missing important findings or
failing to detect an effect that exists (a false negative).
• Power (1 − β):
• The probability of avoiding a Type 2 error is called the power of the test. A higher power means the test is more likely to detect an effect when one exists.
• The probability of committing a Type 2 error is denoted by β.

• Balancing Type I and Type II Errors:


• There is often a trade-off between the two errors.
• Reducing the probability of a Type I error (α) increases the likelihood of a Type II error (β), and vice versa.
• The goal is to choose a significance level (α) that balances the risk of both types of errors based on the context of the test.
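
• To make these two error rates concrete, here is a small simulation sketch, assuming Python with NumPy and SciPy; the sample size, effect size, and number of simulated studies are arbitrary choices for illustration. It repeatedly runs a two-sample t-test, first when the null hypothesis is true (to estimate the Type I error rate, which should sit near α) and then when a real effect exists (to estimate the power, 1 − β).

```python
# Illustrative simulation: estimating the Type I error rate (alpha) and the
# power (1 - beta) of a two-sample t-test. All numbers are arbitrary examples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
alpha = 0.05            # significance level: tolerated Type I error rate
n, n_sims = 30, 5000    # observations per group, number of simulated studies

# Case 1: H0 is true (both groups share the same mean), so every rejection
# is a false positive (Type I error).
false_positives = sum(
    stats.ttest_ind(rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)).pvalue < alpha
    for _ in range(n_sims)
)

# Case 2: H0 is false (the second group's mean is shifted by 0.5), so every
# rejection is a correct detection; each failure to reject is a Type II error.
detections = sum(
    stats.ttest_ind(rng.normal(0.0, 1.0, n), rng.normal(0.5, 1.0, n)).pvalue < alpha
    for _ in range(n_sims)
)

print(f"Estimated Type I error rate: {false_positives / n_sims:.3f} (close to alpha = {alpha})")
print(f"Estimated power (1 - beta):  {detections / n_sims:.3f}")
```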
