4683-Methods of Social Research-I (Assignment-II)
ASSIGNMENT No. 2
(Units 5-8)
Q.1 Why is sampling carried out? Briefly explain the sampling techniques used in research.
Q.3 What are the demerits/weaknesses of personal experience, tradition, and authority as sources of knowledge? Discuss.
Q.4 How have the internet and computers made research more feasible? Discuss.
Answer No. 01
Sampling
In statistical analysis, sampling is the act of selecting a predefined number of observations from a larger population. Depending on the sort of study being done, a variety of methods, including systematic sampling and simple random sampling, may be employed to draw samples from a broader population.
❖ In order to assess the quality and completeness of account balances, Certified Public
Accountants sample the data during audits.
❖ Block sampling, judgement sampling, random sampling, and systematic sampling are examples
of sampling types.
❖ To determine the requirements and preferences of their target market, businesses employ
sampling as a marketing strategy.
Sampling is a strategy for choosing certain individuals or a subset of the population in order to draw statistical conclusions from them and estimate the characteristics of the entire population. Researchers frequently use various sampling techniques in market research so that they do not have to study the full population in order to gather useful information.
It serves as the foundation of any study design, and it is also a time- and money-efficient strategy. For the best results, sampling techniques may be implemented in research survey software.
It is nearly impossible to perform a research study that includes every member of the population; for instance, a pharmaceutical company cannot investigate the negative side effects of a medicine on the entire population of the nation. The researcher selects a sample in this instance.
There are two forms of sampling used in market research: probability sampling and non-probability sampling. Let's examine these two sampling techniques in more detail.
Probability sampling is a sampling approach in which a researcher selects a few criteria and randomly
selects individuals of a population. With the use of this selection parameter, each member has an equal
chance of being included in the sample.
Non-probability sampling: In non-probability sampling, participants are chosen arbitrarily by the researcher, based on convenience or judgement rather than random selection; there is no fixed or predetermined selection procedure. Because of this, it is challenging to ensure that every element of a population has an equal chance of being represented in the sample.
1. Simple random sampling
Each person is picked purely at random, and everyone in the population has an equal likelihood of being chosen. Giving each person in a population a number and then selecting which ones to include from a table of random numbers is one method of producing a random sample. For instance, if your sampling frame consists of 1,000 people labelled 0 to 999, use groups of three digits from the random number table to select your sample: if the first three digits in the random number table were 094, select the person labelled "94", and so on.
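As a rough illustration of this procedure, the Python sketch below draws a simple random sample; the sampling frame of 1,000 labelled people, the sample size of 50, and the fixed seed are hypothetical values chosen only for the example.

```python
import random

# Hypothetical sampling frame: 1,000 people labelled 0 to 999,
# mirroring the example in the text.
sampling_frame = list(range(1000))

random.seed(42)  # fixed seed only so the sketch is repeatable
sample = random.sample(sampling_frame, k=50)  # each person has an equal chance of selection

print(sorted(sample)[:10])  # first few selected labels
```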
Simple random sampling lowers selection bias while allowing the sampling error to be quantified, as is the case with all probability sampling techniques. A distinct benefit is that it is the most user-friendly probability sampling technique. Its drawback is that it may not select enough people who have the feature of interest, especially if that feature is not common. Additionally, it can be challenging to construct a complete sampling frame, and it may be difficult to get in touch with the selected individuals, particularly if multiple communication methods are needed (email, phone, and mail) and the sample units are dispersed throughout a large geographic region.
2. Systematic sampling
People are chosen from the sampling frame at regular intervals. The interval is selected to guarantee a sufficient sample size: if the population contains x individuals and you need a sample of size n, every (x/n)th person in the population should be chosen. For instance, if you needed a sample size of 100 from a population of 1,000, you would select every 1000/100 = 10th member of the sampling frame.
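The interval calculation described above can be sketched in a few lines of Python; the systematic_sample helper, the frame of 1,000 people, and the sample size of 100 are illustrative assumptions, not part of any standard library.

```python
import random

def systematic_sample(frame, n):
    """Select every (len(frame) // n)-th member after a random start."""
    k = len(frame) // n          # sampling interval, e.g. 1000 // 100 = 10
    start = random.randrange(k)  # random starting point within the first interval
    return frame[start::k][:n]   # take every k-th member, capped at n selections

frame = list(range(1000))        # hypothetical frame of 1,000 people
print(systematic_sample(frame, 100)[:5])
```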
In many cases, systematic sampling is easier to use and more practical than simple random sampling. However, if there are underlying patterns in the order of the individuals in the sampling frame and the sampling interval coincides with the periodicity of that pattern, bias can result. For instance, if a group of students were to be sampled to learn their opinions on college amenities, and the Student Record Department's central list of all students is arranged so that the sexes alternate, choosing an even interval (for instance, every 20th student) would result in a sample of either all males or all females. Although the bias in this scenario is obvious and easily remedied, this will not always be the case.
3. Stratified sampling
In this approach, the population is first separated into subgroups (or strata) that share a common trait. It is employed when we want to guarantee that all subgroups are represented and we can reasonably anticipate that the measurement of interest will vary between the subgroups. For instance, to ensure equal representation of men and women, we may stratify the sample by sex in a study of stroke outcomes. Equal sample sizes are then taken from each stratum to create the research sample.
It may also be desirable to select non-equal sample sizes from each stratum. For instance, if there were three hospitals in the county, each with a different number of nurses (Hospital A has 500 nurses, Hospital B has 1,000, and Hospital C has 2,000), it would be reasonable to select the sample numbers from each hospital proportionally (e.g. 10 from Hospital A, 20 from Hospital B and 40 from Hospital C). Instead of using a simple random sample, which would over-represent nurses from hospitals A and B, this guarantees a more realistic and accurate evaluation of the health outcomes of nurses throughout the county. At the analysis stage, it is important to consider how the sample was stratified.
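A minimal sketch of this proportional allocation, using the hypothetical hospital figures quoted above (Python, illustrative only):

```python
# Hypothetical strata sizes taken from the hospital example above.
strata = {"Hospital A": 500, "Hospital B": 1000, "Hospital C": 2000}
total_sample = 70  # overall sample size to be spread across the strata

population = sum(strata.values())
allocation = {name: round(total_sample * size / population) for name, size in strata.items()}

print(allocation)  # {'Hospital A': 10, 'Hospital B': 20, 'Hospital C': 40}
```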
Stratified sampling improves the accuracy and representativeness of the results by reducing sampling error. However, it requires knowledge of the appropriate characteristics of the sampling frame (the details of which are not always available), and it can be challenging to decide which characteristic or characteristics to stratify by.
4. Clustered sampling
Subgroups of the population are utilized as the sampling unit rather than individuals in a clustered
sample. The population is split into clusters, which are groupings that are chosen at random to be a part
of the research. Typically, clusters have previously been recognized; for instance, certain general
practitioners' offices or towns may be designated as clusters. All participants in the selected clusters
are then included in the research when using single-stage cluster sampling. In a two-stage cluster
sampling procedure, some people are then randomly chosen from each cluster to be included. The
analysis has to take clustering into consideration. A (one-stage) cluster sample is a suitable illustration;
it is used yearly in England in the General Household survey. The whole population of the chosen
clusters (households) is included in the survey.
When research spans a large geographic area, cluster sampling may be more efficient than simple random sampling. For instance, it is simpler to reach many people in a few GP practices than a small number of people in many GP practices. The danger of bias is increased, however, if the selected clusters are not representative of the population, which increases sampling error.
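The following Python sketch contrasts single-stage and two-stage cluster sampling; the practice_*/patient_* identifiers, cluster sizes, and number of selected clusters are made-up values used only to illustrate the two procedures.

```python
import random

# Hypothetical clusters: GP practices, each holding a list of patient identifiers.
clusters = {f"practice_{i}": [f"patient_{i}_{j}" for j in range(30)] for i in range(20)}

random.seed(1)
chosen = random.sample(list(clusters), k=4)  # stage 1: pick whole practices at random

# Single-stage cluster sample: everyone in the chosen practices is included.
single_stage = [p for c in chosen for p in clusters[c]]

# Two-stage cluster sample: randomly pick some patients within each chosen practice.
two_stage = [p for c in chosen for p in random.sample(clusters[c], k=10)]

print(len(single_stage), len(two_stage))  # 120 and 40
```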
1. Convenience sampling
Because participants are chosen based on their availability and desire to participate, convenience
sampling is perhaps the simplest technique of sampling. Although useful findings can be achieved, they
are subject to considerable bias since the sample may not be representative of other factors, such as age
or sex, and because those who volunteer to participate may differ from those who opt not to (volunteer
bias). The possibility of volunteer bias exists for all non-probability sampling techniques.
2. Quota sampling
Market researchers frequently employ this sampling technique. Interviewers are instructed to try to recruit
a certain number of participants of a particular category. For instance, the interviewer may be instructed
to go out and choose 20 adult males, 20 adult women, 10 adolescent girls, and 10 adolescent boys to
interview about their viewing of television. The quotas selected should ideally proportionally reflect
the traits of the underlying population.
Although quota sampling is very simple and potentially representative, the chosen sample may not be representative of other characteristics that were not taken into account (a consequence of the non-random nature of the sampling).
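A simple way to picture quota filling is the sketch below; the quota categories and sizes come from the television-viewing example above, while the try_recruit helper is a hypothetical illustration rather than a standard routine.

```python
# Hypothetical quotas taken from the television-viewing example above.
quotas = {"adult male": 20, "adult female": 20, "teenage girl": 10, "teenage boy": 10}
recruited = {group: [] for group in quotas}

def try_recruit(person_id, group):
    """Accept a volunteer only if their quota group exists and is not yet full."""
    if group in quotas and len(recruited[group]) < quotas[group]:
        recruited[group].append(person_id)
        return True
    return False

print(try_recruit("respondent_001", "adult female"))  # True while the quota is still open
```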
3. Judgement sampling
This strategy, sometimes referred to as selective or subjective sampling, depends on the researcher's judgement when deciding whom to ask to participate. Researchers may therefore implicitly select a "representative" sample to suit their purposes, or target individuals who explicitly fit certain criteria. The media frequently use this strategy when doing qualitative research and polling the public.
The benefit of judgement sampling is that it produces a variety of replies while taking up little time and
money (particularly useful in qualitative research). The findings, while theoretically inclusive, may not
necessarily be representative, and they are subject to researcher error in addition to volunteer bias.
4. Snowball sampling
When examining difficult-to-reach groups, the social sciences frequently employ this methodology. As
more subjects who are known to the existing subjects are nominated, the sample grows in size like a
snowball. For instance, in a study of risk behaviours among intravenous drug users, participants may be asked to suggest other users to be interviewed.
When it is difficult to choose a sample frame, snowball sampling may be useful. However, there is a
high chance of selection bias when choosing friends and acquaintances of individuals who have already
been examined (choosing a large number of people with similar characteristics or views to the initial
individual identified).
Answer No. 02
Hypothesis
A hypothesis is an assumption made on the basis of some evidence. It is the starting point of any inquiry, translating the research questions into predictions. Its constituent parts include variables, the population, and the relationships between the variables. A hypothesis used to examine the link between two or more variables is known as a research hypothesis.
Characteristics of Hypothesis
Sources of Hypothesis
Types of Hypothesis
❖ Simple hypothesis
❖ Complex hypothesis
❖ Directional hypothesis
❖ Non-directional hypothesis
❖ Null hypothesis
❖ Associative and causal hypothesis
Simple Hypothesis
It shows a relationship between one dependent variable and one independent variable. For instance, eating more vegetables will help you lose weight more quickly. In this scenario, increasing your vegetable intake is the independent variable, while weight loss is the dependent variable.
Complex Hypothesis
It depicts the link between at least two dependent variables and at least two independent variables.
Increased consumption of fruits and vegetables lowers body weight, promotes healthy skin, and lowers
the risk of many illnesses including heart disease.
Directional Hypothesis
It demonstrates a researcher's intellect and dedication to a certain result. Furthermore, the nature of the
link between the variables may be predicted. For instance, four-year-olds who eat well over the course
of five years score better on the IQ scale than those who don't. This demonstrates the impact and its
trajectory.
Non-directional Hypothesis
It is employed when there is no theory involved. It states that there is a relationship between the two variables without speculating on the precise nature (direction) of the link.
Null Hypothesis
It makes a claim that conflicts with the research hypothesis: a negative assertion that there is no connection between the independent and dependent variables. It is denoted by the symbol "H0".
Associative and Causal Hypothesis
An associative hypothesis states that a change in one variable is accompanied by a change in the other variable. The causal hypothesis, on the other hand, proposes a cause-and-effect relationship between two or more variables.
Examples of Hypothesis
❖ An example of a straightforward hypothesis is that daily use of sugary beverages causes obesity.
❖ An illustration of a null hypothesis is that all lilies have the same number of petals.
❖ A person will feel less worn out if they get 7 hours of sleep as opposed to sleeping fewer hours. This is an example of a directional hypothesis.
Functions of Hypothesis
The following are the functions that a hypothesis performs:
Researchers use hypotheses to record their expectations about how an experiment will turn out. The stages that make up the scientific process are as follows:
❖ Developing a question
❖ Background investigation
❖ Forming a hypothesis
❖ Making an experiment plan
❖ Gathering data
❖ Evaluating the results
❖ Summarizing the test
❖ Delivering the results
Answer No. 03
SOURCE OF KNOWLEDGE
Introduction
There is no question that research has been the key to our culture's progress, eradicating ignorance in certain sectors by revealing previously unknown information and inspiring improved processes and goods. If we are to come closer to the truth, research is necessary.
Human knowledge also functions on two levels. At the most fundamental level, it serves as the
foundation for everyday human behaviors as when a teacher helps pupils with arithmetic problems or
when a doctor applies his or her expertise to heal patients.
At the second level, knowledge is used to identify improvements to what already exists.
All research moves forward from the limits of existing knowledge; it takes us beyond those current limits.
Relevant research initiatives both advance current knowledge and make new discoveries, but it is methodical study that contributes to the body of knowledge, and it is important to stress this arrangement: research is not a pointless collection of fresh facts. Anyone who wants to may validate research-based information and confirm it, because the method by which it was acquired can be replicated, i.e., it is possible to repeat the procedure and verify the findings. This objectivity also gives research-based knowledge the authority of "third-party" verification.
People have, nonetheless, been making efforts to comprehend, communicate, explain, and control the
things and events around them. A few of the systems utilized to do this are: emotional perception,
reason, culture, authority, metaphysics, magic, expert opinion, personal experience, capture and re-
insertion science.
EXPERIENCE
Many of the questions you have can be answered by personal experience. Experience has led to the
transmission of a lot of wisdom from one generation to the next. Progress would be significantly slowed
down if individuals were unable to learn from their past experiences. In actuality, this capacity for
experience-based learning is a key aspect of intelligent behavior.
Experience has limits as a source of information. What an incident does to you relies on who you are.
In the same circumstance, two people will have totally different experiences. A forest that one person
finds to be a peaceful haven may be a dangerous wilderness to another. If one supervisor concentrated
on and reported the things that went right while the other supervisor concentrated on and reported the
things that went wrong, two supervisors monitoring the same classroom at the same time may
realistically provide quite different reports.
The fact that you frequently need to know things that you personally cannot learn through experience
is another drawback of experience.
AUTHORITY
People commonly turn to an authority, that is, they seek information from someone who has experience
with the issue or has some other source of expertise, for things that are difficult or impossible to know
from personal experience. People believe what acknowledged experts say to be true.
When we have concerns about our health or our finances, we see the doctor or the stockbroker,
respectively. In a dictionary, a learner can look up the preferred pronunciation of a term. When a new
teacher seeks an experienced one for advice, the beginning teacher can take the experienced teacher's
advice and try a certain method of teaching reading.
Even while authority is a great source of information, you should constantly inquire as to how they
came to their conclusions. A person's authority was once taken for granted merely because of the
position they held, such as king, chief, or high priest. People today are wary of putting their trust in
someone only because of their position or rank as an authority. They are more likely to believe what an authority says when that person is recognized as an expert in the particular field concerned.
Custom and tradition are closely tied to authority, and people rely on them for answers to a wide range of questions pertaining to both professional and everyday issues. To put it another way, a common question is, "How has this been done in the past?", and the response is then used as a springboard for action. Custom and tradition have had a significant impact on education, as teachers frequently look to past practice as a reliable guide. When the history of education is examined, however, it becomes clear that many long-standing traditions had to be abandoned because they were subsequently proven to be false. For years, for example, it was common practice to humiliate misbehaving pupils with dunce caps and other devices. Before accepting custom and tradition as trustworthy sources, it is therefore advisable to give them a rigorous evaluation.
A quick and simple source of knowledge is authority. However, you must take into account the limitations of authority as a source of information. To start, authorities can be mistaken: people frequently claim to be authorities in a given field when they lack the information to support their claims. Second, you may discover that experts disagree with one another on some topics, showing that their statements are frequently more opinion than fact.
Objectives
Effective knowledge management has many potential benefits, but there are also certain difficulties to
take into account.
Understanding a knowledge management system's limits is essential to its success. Some of the typical difficulties include:
❖ Creating precise procedures for gathering, preserving, and sharing corporate knowledge.
❖ Specifying the goals and scope of any knowledge management activities.
❖ Developing a business culture that values knowledge sharing between management and employees.
❖ Clearly defining objectives and tactics to help you take advantage of the collective wisdom (otherwise, it will be of no use to your business).
❖ Taking into account the budget, strategy, and training requirements for each new knowledge management system.
❖ Taking change management tactics into account when introducing new knowledge management techniques.
Answer No. 04
Everyone uses the internet on a regular basis. One might say that the internet and computers have
integrated themselves into daily life. Our daily activities, such as reading, socializing, finding
addresses, staying informed, watching movies, buying, talking, and contacting friends, are all being
impacted, either directly or indirectly. As a result, scientists are also conducting their studies in the virtual realm. Nowadays, people frequently conduct their research online, and nearly all academic and scientific institutions are involved. The internet has sped up and reduced the cost of data acquisition. Online surveys
and research are simple to perform and economical in terms of gathering and analyzing data. The
development of interactive new media technologies has had an impact on this emerging trend. Online
survey research is a new and developing field of technology. It is a creative activity that necessitates a
working knowledge of HTML code, scripting languages, and web creation software. These days, it is
simple to get such programmes online. In addition, there are many web portals that enable researchers
to build surveys and conduct them online. This saves time and effort for the researcher. This allows a
researcher to simultaneously serve a huge number of respondents from various geographic locations.
This essay will discuss how the internet is used in social science research and provide guidance on how
to conduct an online survey.
By sharing their findings online, researchers are also taking their studies into the virtual realm. It has recently become more common for Indian researchers and academics to conduct their studies online. The internet has developed into a helping hand for researchers in terms of compiling relevant reviews, creating questionnaires, gathering data, analyzing data, and publishing reports. Additionally, it gives researchers a forum to publish their research papers.
Online data collection is quick and affordable, and respondents also have the freedom to fill out the questionnaire at a time that is convenient for them. S. G. Williams (2012) said in her essay The Ethics of Internet Research that research done online is an economical method of data collection, analysis, and recruitment.
Researchers may reach people anywhere in the globe for their studies, including in distant places where doing research would otherwise be difficult or impossible. This possibility raises ethical issues such as choosing a secure location to complete a survey, doing research in a virtual setting, data protection, confidentiality, and conducting secondary analysis on archived support group data. The author also emphasized the need for researchers to abide by the fundamental ethical values of fairness, respect for others, and beneficence.
A researcher should always be aware of the rules and ethics governing the internet. This will safeguard him from getting into problems as well as let him use the internet for research in a better way. The way we undertake scholarly study might alter as a result of the Internet, which removes researchers' barriers based on geography.
Whether a research project is in the review stage, the data gathering stage, the data analysis stage, or
the report writing stage, computers and the internet are both tightly tied and crucial to each step.
A. Review of Literature
When conducting research, it is important to look into new fields of study and learn about related studies, in order to be aware of the most recent updates and studies that have been done on the same subject by others. Any researcher must also understand his research's methodology and subject by reading pertinent material. After the research, it enables the researcher to see what impact his work has had and to develop ideas for further research.
The obvious places to read are libraries, particularly when one is conducting research in an academic setting. Libraries may be wide-ranging or specialized resources, general or academic in function, for reference only or available for borrowing. The researcher can use public libraries and university libraries, and all of these can also be browsed on the internet. The internet is a vast source of information. There are virtual libraries which are free and user-friendly, from which anyone can download or upload a book or article. Some of these are paid, where one needs to sign up first and pay some amount to use them. Every scholar can benefit from these resources, since they can be used whenever, wherever, and however the researcher is most comfortable. Here are some instances of it:
The amount of information available online has significantly increased during the last several years.
There are several bibliographic sources accessible on CD-ROM and online via the internet, among
other media. Information on books, journal articles, conference papers, as well as statistics, maps,
contacts for organizations, email addresses, and other data may all be found online. Many original
materials, including journal articles, have complete texts available online for researchers to read.
Information on the internet is easily accessible, which makes it a very desirable source for study. Although it takes a lot of time to read because of the volume of material, it contains a lot of useful information. Internet searching for the purpose of reviewing literature must therefore be methodical and thoroughly controlled, and it necessitates a check on the quality of the material one is obtaining.
B. Data Gathering
Direct email, web surveys, and other electronic tools can all be used to gather data through the internet. Although it is an insecure medium, it helps researchers by removing all geographical and time restrictions. Numerous web sites exist that enable researchers to create questionnaires, upload them, and gather data, such as iperception.com, Psychsurveys.com, Surveymonkey.com, etc.
❖ An internet permission form should be formatted similarly to a cover letter and contain, if
necessary, all the components of a traditional signed consent. Web-based surveys should
provide participants the choice to "Agree" or "Not Agree." Online consent may not be
acceptable for studies containing extremely sensitive information. The signature line should
state, "By completing the survey, you are consenting to participate in the research."
❖ In the consent procedure, the researcher should disclose any sensitive data transfer, for example: "This research involves the transmission of data over the internet. Although every conceivable effort has been made to ensure the efficient use of technology, online communication secrecy cannot be guaranteed."
❖ Where applicable, an alternate method of completing the survey, such as printing it and sending
it in, should be made available.
❖ There should be two buttons at the conclusion of the survey: one to submit the data and the
other to delete it. These buttons serve to both guarantee that subjects can withdraw at any
moment and inform them that, in the event that they do, even after finishing the survey, their
data may be deleted before being forwarded to the researcher.
C. Data Analysis
The data are prepared for analysis after they have been coded. They may be programmed using the data
collection website, or if the data is gathered manually, MS Excel or Access can be used to code it. The
researcher must choose the inferential statistics needed after coding as well as the programmes he or
she will use to analyses the data. For data analysis, there are several programmes and pieces of software
accessible, including MS Excel, MS Access, SPSS, STATFIT, and others. There are websites that let
researchers to submit data and do web-based data analytics. These websites automatically summaries
the provided data and often offer a graphical depiction.
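As a small illustration of the kind of summary such tools produce, the Python sketch below computes basic descriptive statistics with the standard statistics module; the coded responses are invented example data.

```python
import statistics

# Hypothetical coded survey responses (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

summary = {
    "n": len(responses),
    "mean": statistics.mean(responses),
    "median": statistics.median(responses),
    "stdev": statistics.stdev(responses),
}
print(summary)
```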
D. Report Writing
When reports are produced on a computer, the writing process becomes simpler. It is straightforward for the researcher to access the sections he wants to update or edit, move text blocks around, make minor changes throughout the text, and check for spelling and grammatical errors. This also saves the researcher time and effort. While writing the report, the computer is also useful for creating graphs, diagrams, and tables. A researcher can use SmartArt graphics to make his report interesting and appealing. After authoring it, the researcher can post his report online.
Answer No. 05
Power of Prediction
The ability to foresee the future is one of a good hypothesis's valuable qualities. It not only resolves the difficult situation currently being studied, but it also makes predictions about what will happen in the future. Because of this power of prediction, a hypothesis is the best guide for research work.
Closeness to Observable Things
A hypothesis needs to be closely related to what can be observed. It is based on observation rather than on castles in the air; a hypothesis cannot be put forward about entities and things that we are unable to perceive. A hypothesis can only be verified using observable phenomena.
Simplicity
According to P.V. Young, "A hypothesis would be straightforward if a researcher had more insight into the problem"; hypotheses should be accessible to all laypeople. A hypothesis "should be as keen as a razor's blade," according to W-Ocean. A good hypothesis must therefore be straightforward and uncomplicated.
Clarity
The hypothesis must make sense conceptually. It should be clear and free from ambiguity, and it must employ vocabulary that is understandable to everyone.
Testability
A sound hypothesis should be practically testable; it must be possible to verify it through careful observation. Therefore, the main characteristic of a strong hypothesis is its testability.
Relevance to the Problem
An excellent hypothesis is one that is pertinent to a particular issue. A hypothesis serves as a guide for identifying and fixing an issue, so it must be in line with the problem.
Specific
It needs to be created to address a particular issue. There should not be any generalization in it; if generalization occurs, a hypothesis cannot reach the proper conclusions.
Fruitful for New Discoveries
It needs to be able to offer fresh ideas and sources of information, and it must lead to fresh scientific discoveries. One of the most renowned researchers, J.S. Mill, asserts that "hypotheses are the finest source of new information since they open up fresh avenues for discovery."
Consistency and Harmony
Internal harmony and consistency is one of the most important qualities of a strong hypothesis. It ought to be free from inconsistencies and conflicts, and variables that depend on one another must be closely related to one another.
❖ A theory should be able to be tested empirically. It should be made clear that some logical
conclusions may be drawn that are near to the level of concrete observation, allowing them to
be verified by field observation. In other words, the theories should be supported by empirical
data. The notions contained in the hypothesis must be adequately specified and have a clear
empirical relationship. Since "bad" cannot be defined precisely, the statement "Bad parents
beget Bad Children" is scarcely a statement that can qualify as a valid hypothesis.
❖ A hypothesis should be as close as possible to things that can be observed; without this, it would be impossible to compare it with actual data. As Cohen and Nagel correctly point out, "[a] hypothesis must be written in such a way that inferences may be drawn from it and, as a result, a determination as to whether it does or does not explain the studied facts can be made."
❖ Conceptually, the hypotheses must make sense. The preceding criterion makes this point
implicit. The terms used in the hypothesis should have formal definitions as well as, if practical,
operational definitions. While the operational definition will remove any doubt regarding what
would constitute the empirical proof or indicator of the concept on the plane of reality, the
formal definition or explication of the ideas will make clear what a certain concept stands for.
It is impossible to test an ambiguous hypothesis that is characterized by vague or poorly defined
concepts because there is, naturally, no basis for knowing what observable facts would
constitute the test. The ideas represented in the hypotheses should be defined in a way that is
understandable and widely accepted. This would guarantee ongoing research.
❖ Hypotheses must be specific. One may confidently predict that something will occur in the next five minutes, but such a prediction contains no useful detail precisely because it can hardly be refuted. We need to know what will happen; once we commit to one point of view or another, we become vulnerable, because if what was predicted does not occur, our forecast is invalidated. A scientific claim is valuable to the extent that it is open to possible refutation. Researchers are frequently tempted to formulate hypotheses that are so broad and expansive in scope that they are impossible to evaluate; this inclination is self-defeating.
❖ It would be wise for the researchers to steer clear of using notions in their hypothesis for which
appropriate concrete indicators have not yet arisen. A detailed definition of the indexes that will
be used should be included in the hypothesis. For instance, it is necessary to define social class
in terms of factors like income, employment, education, etc. The obvious benefit of such precise definitions is that they guarantee that research will be relevant and doable. Additionally, specificity contributes to the credibility of the findings, since the more specific a statement or forecast is, the less likely it is to come true merely by accident or coincidence.
❖ The hypotheses should, it is advised, be connected to some body of theory or theoretical
direction. This need relates to a hypothesis's theoretical justification, namely, what will be the
theoretical benefits of testing the hypothesis? Research will serve to qualify, support, correct,
or deny the theory if the hypothesis is connected to one. Only via exchange between the current
body of data and theory can a science become cumulative. Some may voice doubts of this kind: will forays into new disciplines, where no clear theoretical system has yet formed, be stifled if hypotheses must generally be derived from some theoretical base? Will such hypotheses not result in pointless repetitions?
❖ These criticisms lack much weight since artistically formulated hypotheses like these can serve
to not only elaborate, expand, and improve a theory but also to indicate significant connections
between it and other theories. Deriving hypotheses from a corpus of theory can therefore lead
to a scientific leap into fresh fields of understanding. According to Parsons, "Theory informs us
what we wish to know in addition to formulating what we know." If a body of theory served as
the foundation for a hypothesis, it would be able to articulate that hypothesis as a statement
about what would happen. This would give the hypothesis the ability to predict the future.
❖ The ability to forecast is one of a good hypothesis' most useful qualities. The effectiveness of
hypotheses for predicative purposes represents a significant development in scientific
understanding.
❖ The preferable hypothesis, according to Cohen and Nagel, "is one which can forecast what will
happen, and from which we may deduce what has already happened, even if we did not know
(it had happened) at the time the hypothesis was developed."
❖ In the earlier example, the prediction that Catholics should have lower suicide rates than
Protestants does not only have a predictive potential but also provides the theoretical
groundwork for claiming that married people, a minority group, or a tribal community should
be expected to have lower suicide rates because suicide rates would be lower in areas with good
social cohesiveness.
In this respect, a "good" hypothesis aids in our ability to infer information about the past even though
we were not aware of it at the time.
The hypotheses should be connected to available methodologies. Of course, if one is assessing a problem's research potential, this is a reasonable methodological requirement that applies to any issue.
It is difficult for a researcher to come up with good research questions if they are unaware of the
methods that may be used to examine their hypothesis. To put it another way, the hypotheses should
only be developed after careful consideration of the approaches and procedures that may be utilized to
gauge the ideas or variables included in the hypotheses. This should not be interpreted as indicating that it is improper to formulate hypotheses that are presently too complex for available techniques to handle. We must not lose sight of the fact that if a problem is significant enough to serve as a
viable frame of reference, it may be helpful regardless of whether it can be verified or tested using
current approaches. Even if at the time their broader concepts couldn't be handled by the methods at
hand, Marx and Durkheim's writings have been of utmost significance to sociology.
Finally, it's important to keep in mind that asking seemingly difficult questions can really spur
technological advancement. It is undeniable that critiques of important studies that were at the time
deemed insufficient due to the limits of current procedures provided some of the motivation for modern
improvements in methodology.