Computers and Researchers
1. Performing calculations at enormous speed, the computer has become one of the
most useful research tools of modern times. Computers are ideally suited to the data
analysis required by large research projects. Researchers are essentially concerned with
the storage of large volumes of data, their fast retrieval when required, and the processing
of data with the aid of various techniques. In all these operations, computers are of great help.
2. Computers can perform many statistical calculations easily and quickly. Computation
of means, standard deviations, correlation coefficients, 't' tests, analysis of variance,
analysis of covariance, multiple regression, factor analysis and various nonparametric
analyses are just a few of the programs and subprograms available at almost all
computer centres.
3. Techniques involving trial-and-error processes are frequently employed in research
methodology. These involve a great deal of calculation and repetitive work. Computers are
well suited to such techniques, reducing the drudgery of researchers on the one hand
and producing the final result rapidly on the other.
4. The storage facility which computers provide is of immense help to researchers, who
can make use of stored data whenever they require it.
The above description clearly indicates the usefulness of computers to researchers in data
analysis. Using computers, researchers can carry out their work faster and with greater
reliability.
In spite of all this sophistication, we should not forget that computers are basically machines
that only compute; they do not think. The human brain remains supreme and will continue to
be so. As such, researchers should be fully aware of the following limitations
of computer-based analysis:
No Self-Intelligence
A computer has no intelligence of its own: it works according to the instructions given to it
by the user, and it gives wrong output if the inputs supplied by humans are wrong.
Although the concept of artificial intelligence suggests that computers can think, such
systems still depend on sets of instructions. A computer cannot take decisions on its own;
it can only perform the tasks it is instructed to perform.
No Feeling
Lack of feeling is another limitation. A computer cannot feel as humans do; it has no
emotions, feelings or personal knowledge. On the other hand, it does not get tired and keeps
on doing its tasks, and it can do very risky work of which human beings are not capable.
No Learning Power
A computer has no learning power. It cannot perform tasks without instructions, and it does
not improve by repeating them: given the same instructions, it simply executes them again in
the same way. It can solve problems, but it cannot learn from them; it can only work
according to the instructions given.
We have described above some important tests often used for testing hypotheses, on the basis
of which important decisions may be made. But these tests have several limitations
which a researcher should always bear in mind. The important limitations are as follows:
1. The tests should not be used in a mechanical fashion. It should be kept in view that
testing is not decision-making itself; the tests are only useful aids for decision-
making. Hence “proper interpretation of statistical evidence is important to intelligent
decisions.”
2. Tests do not explain the reasons why a difference exists, say between the
means of two samples. They simply indicate whether the difference is due to
fluctuations of sampling or to other reasons, but they do not tell us
which other reason(s) cause the difference.
3. Results of significance tests are based on probabilities and as such cannot be
expressed with full certainty. When a test shows that a difference is statistically
significant, then it simply suggests that the difference is probably not due to chance.
4. Statistical inferences based on significance tests cannot be said to be entirely
correct evidence concerning the truth of the hypotheses. This is especially so in the case
of small samples, where the probability of drawing erroneous inferences is
generally higher. For greater reliability, the sample size should be sufficiently large.
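Limitations 2 and 3 can be made concrete with a small two-sample 't' test sketch in standard-library Python. The data are invented, and the critical value is the kind of figure one would look up in a t table:

```python
import statistics

# two independent samples (figures invented)
group_a = [23, 25, 21, 24, 26, 22]
group_b = [19, 20, 22, 18, 21, 20]

na, nb = len(group_a), len(group_b)
mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)

# pooled variance and the two-sample t statistic
pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
t = (mean_a - mean_b) / (pooled * (1 / na + 1 / nb)) ** 0.5

# 2.228 is the two-tailed 5% critical value for df = 10, from a t table;
# "significant" only means the difference is probably not due to chance --
# it says nothing about WHY the difference exists
significant = abs(t) > 2.228
```

Note how the result is a probabilistic verdict, not an explanation: the test flags the difference as unlikely to be sampling fluctuation, and nothing more.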
MULTIVARIATE ANALYSIS
In univariate statistics there are one or more independent variables (X1, X2, ...) and only one
dependent variable (Y). Multivariate analysis is concerned with two or more dependent
variables (Y1, Y2, ...) considered simultaneously for multiple independent variables (X1,
X2, etc.). The manual effort needed to solve multivariate problems was an obstacle to their
earlier use. Recent advances in computer software and hardware have made it possible
to solve more problems using multivariate analysis. Software programs available
for multivariate problems include SPSS, S-Plus, SAS, and Minitab.
SPSS (Statistical Package for the Social Sciences) is the most popular tool among
statisticians. The latest version is IBM SPSS Statistics 20 (SPSS was acquired by IBM after
version 19). It provides a wide range of analysis facilities.
ANALYSIS OF VARIANCE (ANOVA)
The ANOVA technique is important in the context of all those situations where we want to
compare more than two populations such as in comparing the yield of crop from several
varieties of seeds, the gasoline mileage of four automobiles, the smoking habits of five
groups of university students, and so on. In such circumstances one generally does not want to
consider all possible combinations of two populations at a time, for that would require a great
number of tests before we could arrive at a decision. This would also consume a lot
of time and money, and even then certain relationships might be left unidentified (particularly
the interaction effects). Therefore, one quite often uses the ANOVA technique to
investigate the differences among the means of all the populations simultaneously.
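The core of one-way ANOVA, comparing several population means at once through an F ratio, can be sketched in standard-library Python. The yields and the tabled critical value below are illustrative only:

```python
import statistics

# yields of a crop from three seed varieties (figures invented)
groups = [
    [20, 21, 23, 16, 20],   # variety A
    [18, 20, 17, 15, 25],   # variety B
    [25, 28, 22, 28, 32],   # variety C
]

k = len(groups)                      # number of populations compared
n = sum(len(g) for g in groups)      # total observations
grand_mean = statistics.mean(x for g in groups for x in g)

# between-group and within-group sums of squares
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

# F ratio: between-group variance over within-group variance
F = (ss_between / (k - 1)) / (ss_within / (n - k))

# 3.89 is the approximate 5% critical value for (2, 12) degrees of
# freedom, taken from an F table
significant = F > 3.89
```

One F test replaces the three pairwise comparisons (A vs B, A vs C, B vs C) that would otherwise be needed, which is exactly the economy of effort described above.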
The Information and Library Network (INFLIBNET) programme was started by the University
Grants Commission (UGC) in April 1991. It is a cooperative venture for pooling, sharing,
and optimizing library resources in the country. It aims to provide a channel for
academicians and researchers to exchange information from sources within the country
and abroad. It is a major programme for the modernization of libraries and information services
in the country using computer and communication technologies. INFLIBNET includes
participants from colleges, universities, R&D institutes, institutes of higher learning,
information centres, institutes of national importance, and document resource centres
(DRCs). All disciplines, such as science, technology, medicine, agriculture, fine arts,
humanities, social sciences, etc., are covered under this programme. The INFLIBNET
programme has been set up with the following objectives:
To promote R&D and develop necessary facilities and create technical positions for
realizing the objectives of the Centre;
MICROSOFT EXCEL
With so many specialist software packages available, why use Excel for statistical analysis?
Convenience and cost are two important reasons: many of us have access to Excel on our
own computers and do not need to source and invest in other software. Another benefit,
particularly for those new to data analysis, is that it removes the need to learn a new software
program while getting to grips with the analysis techniques. Excel also integrates easily with
other Microsoft Office products, which can be helpful when preparing reports or
presentations.
As a spreadsheet, Excel can be used for data entry, manipulation and presentation but it also
offers a suite of statistical analysis functions and other tools that can be used to run
descriptive statistics and to perform several different and useful inferential statistical tests that
are widely used in business and management research. In addition, it provides all of the
standard spreadsheet functionality, which makes it useful for other analysis and data
manipulation tasks, including generating graphical and other presentation formats. Finally,
even if using bespoke statistical software, Excel can be helpful when preparing data for
analysis in those packages.
Many basic analysis projects involving primarily data exploration, descriptive statistics and
simple inferential statistics can be successfully completed using standard Excel. More
advanced projects, especially those involving multivariate analysis, are more challenging in
Excel, and in such cases it is worth considering specialist analysis software such as IBM
SPSS.
Excel includes a large number of tools that can be used for general data analysis. Here our
primary concern is with those that are relevant to statistical and related analysis techniques.
Statistical functions
Excel offers a broad range of built-in statistical functions. These are used to carry out specific
data manipulation tasks, including statistical tests. An example is the AVERAGE function,
which calculates the arithmetic mean of the cells in a specified range.
Data Analysis Tool Pak
The Data Analysis Tool Pak is an Excel add-in. It contains more extensive functions,
including some useful inferential statistical tests. An example is the Descriptive Statistics
routine that will generate a whole range of useful statistics in one go.
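For comparison, the kind of one-shot summary the Descriptive Statistics routine produces can be sketched in standard-library Python (the data are invented):

```python
import statistics

data = [4, 8, 6, 5, 3, 7, 9, 5]

# a one-shot summary of the kind the Descriptive Statistics routine produces
summary = {
    "count": len(data),
    "mean": statistics.mean(data),
    "median": statistics.median(data),
    "stdev": statistics.stdev(data),
    "min": min(data),
    "max": max(data),
}
```

In Excel each of these would be a separate worksheet function (COUNT, AVERAGE, MEDIAN, STDEV.S, MIN, MAX); the add-in's value is producing them all at once.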
Charts
Excel's built-in charts (graphs) cover most of the standard chart types and are invaluable in
data exploration and presentation.
Pivot tables
Pivot tables provide a way of generating summaries of your data and organising data in ways
that are more useful for particular tasks. They are extremely useful for creating contingency
tables, cross-tabulations and tables of means or other summary statistics. A brief introduction
to creating pivot tables is given in the guide Data exploration in Excel: univariate analysis.
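What a pivot table computes for a contingency table (cross-tabulation) can be sketched in a few lines of standard-library Python; the survey records are invented:

```python
from collections import Counter

# survey-style records of (gender, response); data invented
records = [
    ("F", "Yes"), ("M", "No"), ("F", "Yes"), ("M", "Yes"),
    ("F", "No"), ("M", "No"), ("F", "Yes"), ("M", "No"),
]

# the cell counts of the contingency table a pivot table would produce
table = Counter(records)

for (gender, response), count in sorted(table.items()):
    print(gender, response, count)
```

A pivot table does the same counting interactively, and additionally lays the cells out as a two-way grid with row and column totals.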
MICROSOFT POWERPOINT
Microsoft PowerPoint is application software used to present data and information using
text, images and diagrams, with animations and transition effects, in slides that help to
explain a topic or idea to an audience easily and practically. It can be a powerful tool for
creating clear, well-structured presentations with strong visual impact. However, over-use
or misuse can detract from your presentation. Following the guidelines in this study guide
will ensure that you use PowerPoint effectively to support your presentation and engage
your audience.
Writing a thesis is stressful, and preparing an oral defense can be even more painful. But it
doesn't have to be: with proper preparation and a good presentation, you will be better
equipped when the time comes to defend your thesis. A good presentation helps your thesis
defense because it captures the panel's attention and gives you cues and reminders on what
to say. It also keeps your data organized while looking visually good, and provides a flow
and structure for the rest of your presentation. The right PowerPoint template, together with
a powerful outline built on best practices and layouts, is specifically designed to help you
defend your thesis in both its written and oral presentation.
During the presentation, PowerPoint offers a variety of advantages to the presenter and
listeners. To progress through a slide show, the presenter just needs to click a button, which
allows them to maintain eye contact with the audience and use their hands for
emphasis. A PowerPoint presentation often has a pleasing appearance and interesting
graphics, which keeps the audience interested. Moreover, it can be projected on a big screen
for a large auditorium or classroom.
IMPORTANCE AND USE OF DATABASES
Data are facts, numbers, letters, and symbols that describe an object, idea, condition,
situation, or other factors. A data element is the smallest unit of information to which
reference is made. Data in a database may be characterized as predominantly word
oriented (e.g., as in a text, bibliography, directory, dictionary), numeric (e.g., properties,
statistics, experimental values), image (e.g., fixed or moving video, such as a film of
microbes under magnification or time-lapse photography of a flower opening), or sound (e.g.,
a sound recording of a tornado or a fire).
The now-common practice of downloading material from online databases has made it easy
for researchers and other users to acquire data, which frequently have been produced with
considerable investments of time, money, and other resources.
Factual data are both an essential resource for and a valuable output from scientific research.
It is through the formation, communication, and use of facts and ideas that scientists conduct
research. Throughout the history of science, new findings and ideas have been recorded and
used as the basis for further scientific advances and for educating students. The process of
scientific inquiry typically has begun with the formulation of a working hypothesis, based
usually on limited observation and data, followed by experimentation designed to test the
hypothesis. The experimentation results in the accumulation of new data used to confirm or
refute the original hypothesis.
Your topic statement determines the type of database, the kind of information, and the date
range of the sources that you will use. It is important to clarify whether your topic will
require research from journals, magazines, newspapers, and books, or just journals.