
APPRECIATION OF COMPUTING IN DIFFERENT FIELDS

Computers are used in many fields of daily life. Engineers, doctors, students, teachers, and government organizations all use computers to perform specific tasks, for entertainment, or simply to finish office work. Computers have made our lives easier: they work with great precision and accuracy and complete in a short time tasks that would take far longer to do manually. Computers have taken industries and businesses to a whole new level. They are used at home for work and entertainment, in offices, in hospitals, and in government organizations. Here we discuss some of the uses of computers in various fields.
Biology
Computational biology involves the development and application of data-analytical and
theoretical methods, mathematical modeling and computational simulation techniques to the study
of biological, behavioral, and social systems. The field is broadly defined and includes foundations
in biology, applied mathematics, statistics, biochemistry, chemistry, biophysics, molecular
biology, genetics, genomics, computer science and evolution.
Computational biology is different from biological computing, which is a subfield of computer
science and computer engineering using bioengineering and biology to build computers, but is
similar to bioinformatics, which is an interdisciplinary science using computers to store and process
biological data.
Computational Biology, which includes many aspects of bioinformatics, is the science of using
biological data to develop algorithms or models to understand biological systems and relationships.
Until recently, biologists did not have access to very large amounts of data. This data has now
become commonplace, particularly in molecular biology and genomics. Researchers were able to
develop analytical methods for interpreting biological information, but were unable to share them
quickly among colleagues.
Computational biology has been used to help sequence the human genome, create accurate
models of the human brain, and assist in modeling biological systems.
Applications of Computational Biology
Initially, computational biology focused on the study of the sequence and structure of biological
molecules, often in an evolutionary context. Beginning in the 1990s, however, it extended
increasingly to the analysis of function. Functional prediction involves assessing the sequence and
structural similarity between an unknown and a known protein and analyzing the proteins’
interactions with other molecules. Such analyses may be extensive, and thus computational biology
has become closely aligned with systems biology, which attempts to analyze the workings of large
interacting networks of biological components, especially biological pathways.
Biochemical, regulatory, and genetic pathways are highly branched and interleaved, as well as
dynamic, calling for sophisticated computational tools for their modeling and analysis. Moreover,
modern technology platforms for the rapid, automated (high-throughput) generation of biological
data have allowed for an extension from traditional hypothesis-driven experimentation to data-
driven analysis, by which computational experiments can be performed on genome-wide databases
of unprecedented scale. As a result, many aspects of the study of biology have become unthinkable
without the power of computers and the methodologies of computer science.
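To make the idea of sequence comparison concrete, here is a minimal Python sketch of the kind of similarity measure that underlies functional prediction: a naive percent-identity score between two short, made-up protein fragments. Real analyses use full alignment tools such as BLAST; the sequences and function name below are illustrative only.

    # Minimal sketch: percent identity between two equal-length protein fragments.
    # The sequences are made-up toy fragments, not real proteins.

    def percent_identity(seq_a: str, seq_b: str) -> float:
        """Fraction of positions at which two equal-length sequences match."""
        if len(seq_a) != len(seq_b):
            raise ValueError("sequences must be the same length for this naive comparison")
        matches = sum(1 for a, b in zip(seq_a, seq_b) if a == b)
        return matches / len(seq_a)

    unknown_protein = "MKTAYIAKQR"   # hypothetical query fragment
    known_protein   = "MKTAYIARQR"   # hypothetical annotated fragment

    print(f"identity: {percent_identity(unknown_protein, known_protein):.0%}")  # identity: 90%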
Sociology
Computational sociology is a branch of sociology that uses computationally intensive methods
to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex
statistical methods, and analytic approaches like social network analysis, computational sociology
develops and tests theories of complex social processes through bottom-up modeling of social
interactions.
It involves the understanding of social agents, the interaction among these agents, and the effect
of these interactions on the social aggregate. Although the subject matter and methodologies
in social science differ from those in natural science or computer science, several of the approaches
used in contemporary social simulation originated from fields such as physics and artificial
intelligence. Some of the approaches that originated in this field have been imported into the
natural sciences, such as measures of network centrality from the fields of social network
analysis and network science.
In relevant literature, computational sociology is often related to the study of social
complexity. Social complexity concepts such as complex systems, non-linear interconnections between macro and micro processes, and emergence have entered the vocabulary of computational sociology.
A practical and well-known example is the construction of a computational model in the form of an
"artificial society", by which researchers can analyze the structure of a social system.
Gaming
An important use of computers at home is playing games. Many different types of games are available, and they are a source of entertainment and recreation. Some games are specially developed to improve mental ability and thinking power.
Other Areas
Home Budget
A computer can be used to manage a home budget. You can easily calculate your expenses and income: list all expenses in one column and income in another, then apply calculations to these columns to plan your budget. There is also specialized software that can manage your income and expenses and generate useful reports.
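For illustration, here is a minimal Python sketch of the two-column idea: income in one collection, expenses in another, and a simple calculation over both. All figures are made up.

    # Toy home-budget calculation with made-up monthly figures
    income   = {"salary": 2500, "freelance": 400}
    expenses = {"rent": 900, "groceries": 350, "transport": 120, "utilities": 180}

    total_income  = sum(income.values())
    total_expense = sum(expenses.values())

    print(f"Income:  {total_income}")
    print(f"Expense: {total_expense}")
    print(f"Savings: {total_income - total_expense}")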
Working from Home
People can manage office work from home. The owner of a company can check employees' work and oversee the office without being physically present.
Entertainment
People can find entertainment on the internet. They can watch movies, listen to music, stream videos, and download content. They can also watch live matches online.
Information
People can find almost any type of information on the internet. Educational and informative websites offer books, tutorials, and other resources for improving knowledge and learning new things.
Chatting & Social Media
People can chat with friends and family over the internet using software such as Skype. They can also interact with friends on social media websites such as Facebook, Twitter, and Google Plus, and share photos and videos with them.
Uses of Computers in Education
Computer Based Training (CBT)
Computer based training (CBT) programs are supplied on CD-ROM. These programs include text, graphics, and sound, and audio and video lectures are recorded on the CDs. CBT is a low-cost solution for educating people and makes it easy to train a large number of people.
Computer Aided Learning (CAL)
Computer aided learning is the process of using information technology to support teaching and enhance the learning process. The use of computers can reduce the time spent preparing teaching material, as well as the administrative load of teaching and research. The use of multimedia projectors and PowerPoint presentations has improved the quality of teaching and helped the learning process.
Distance Learning
Distance learning is a newer learning methodology in which the computer plays the key role. Many institutes provide distance learning programs. The student does not need to come to the institute; the institute provides the reading material, and the student attends a virtual classroom. In a virtual classroom, the teacher delivers the lecture from his or her own workplace, and the student attends it at home by connecting to a network. The student can also ask the teacher questions.
Online Examination
The trend of online examination is becoming popular. Examinations such as the GRE, GMAT, and SAT are conducted online all over the world. The questions are marked by computer, which minimizes the chance of mistakes and enables results to be announced on time.
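As a rough illustration of computer marking, the Python sketch below compares a candidate's multiple-choice answers against an answer key. The question IDs and answers are hypothetical.

    # Toy auto-marking of a multiple-choice paper against an answer key
    answer_key = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"}
    candidate  = {"Q1": "B", "Q2": "C", "Q3": "A", "Q4": "C"}

    score = sum(1 for q, correct in answer_key.items() if candidate.get(q) == correct)
    print(f"Score: {score}/{len(answer_key)}")  # Score: 3/4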
Uses of Computers in Business
The use of computer technology in business provides many facilities. Businesspeople use computers to interact with their customers anywhere in the world, many business tasks are performed more quickly and efficiently, and computers help reduce the overall cost of doing business. Computers can be used in business in the following ways.
Marketing
An organization can use computers to market its products. Marketing applications provide information about products to customers. Computers are also used to manage distribution systems, advertising, and selling activities, and they can help in deciding pricing strategies. Companies can learn more about their customers and their needs.
Stock Exchange
The stock exchange is one of the most important institutions for business. Many stock exchanges use computers to conduct bids. Stockbrokers perform all trading activities electronically: they connect to computer systems that match buyers with sellers. This reduces cost, as no paper and no special building are required to conduct these activities.
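A toy Python sketch of the matching idea follows: a buyer is matched with a seller whenever the bid price meets or exceeds the asking price. Real exchange matching engines are far more sophisticated; all traders and prices here are invented.

    # Toy order matching: pair a buyer with a seller when bid >= ask
    buy_orders  = [("buyer1", 101.0), ("buyer2", 99.5)]    # (trader, bid price)
    sell_orders = [("seller1", 100.0), ("seller2", 102.0)]  # (trader, ask price)

    for buyer, bid in buy_orders:
        for seller, ask in sell_orders:
            if bid >= ask:
                print(f"Trade: {buyer} buys from {seller} at {ask}")
                sell_orders.remove((seller, ask))  # seller's order is filled
                break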
Uses of computers in Medical Field
Hospital Management System
Specialized hospital management software is used to automate the day-to-day procedures and operations at hospitals. These tasks include online appointments, payroll, and admittance and discharge records.
Patient History
Hospital management systems can store data about patients. Computers are used to record patients' diseases and symptoms and the medicines that are prescribed to them.
Patients Monitoring
Monitoring systems are installed in medical wards and intensive care units to monitor patients continuously. These systems can monitor pulse, blood pressure, and body temperature, and can alert medical staff to any serious situation.
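A simplified Python sketch of this alerting idea is shown below: each reading is compared with a normal range and flagged if it falls outside. The thresholds and readings are illustrative only, not clinical guidance.

    # Toy vital-signs check: flag readings that fall outside a normal range
    normal_ranges = {
        "pulse_bpm":     (60, 100),
        "systolic_mmHg": (90, 140),
        "temperature_c": (36.1, 37.8),
    }

    reading = {"pulse_bpm": 118, "systolic_mmHg": 128, "temperature_c": 38.4}

    for sign, value in reading.items():
        low, high = normal_ranges[sign]
        if not (low <= value <= high):
            print(f"ALERT: {sign} = {value} outside normal range {low}-{high}")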
Life Support Systems
Specialized devices, such as hearing aids, are used to help impaired patients.
Diagnosis Purpose
A variety of software is used to investigate symptoms and prescribe medication accordingly. Sophisticated systems are used for tests such as CT scans, ECGs, and other medical tests.

ASSOCIATION FOR COMPUTING MACHINERY (ACM)


ACM brings together computing educators, researchers, and professionals to inspire dialogue, share
resources, and address the field's challenges. As the world’s largest computing society, ACM
strengthens the profession's collective voice through strong leadership, promotion of the highest
standards, and recognition of technical excellence. ACM supports the professional growth of its
members by providing opportunities for lifelong learning, career development, and professional
networking.
Founded at the dawn of the computer age, ACM’s reach extends to every part of the globe, with
more than half of its 100,000 members residing outside the U.S. Its growing membership has led
to Councils in Europe, India, and China, fostering networking opportunities that strengthen ties
within and across countries and technical communities. Their actions enhance ACM’s ability to raise
awareness of computing’s important technical, educational, and social issues around the world.

Special Interest Groups Form around ACM’s Powerful, Vibrant Communities


Networking opportunities in ACM’s 37 Special Interest Groups (SIGs) are always expanding,
reflecting the growth of computing’s discrete disciplines and technical communities. The leading
representatives of their fields, ACM SIGs sponsor annual conferences, workshops, and symposia
serving practitioner‐ and research‐based constituencies. Because they provide objective arenas for
novel, often competing ideas, many of these meetings have become premier global events.
Chapters: ACM's "Local Neighborhoods"
ACM’s broad‐based infrastructure supports more than 860 professional and student chapters
worldwide. These "local neighborhoods" offer opportunities for members to gain access to critical
research and establish personal networking systems.

ACM, Member-driven, Volunteer-led


ACM offers volunteer opportunities for members and non-members that create networking
possibilities and enhance career development. At the grassroots level, ACM volunteers serve a
growing international community of researchers, practitioners and students by lending valuable
assistance at conferences, publications, webinars, and other events.
Volunteers have a direct and critical impact on ACM’s governance through the ACM Council, the
highest governing authority. Volunteers also serve on the ACM Executive Committee and numerous
other boards and task forces.
Volunteers – members and non-members alike – hold leadership roles in ACM journal publications
as Editors‐in‐Chief, Associate Editors, and reviewers. They also comprise the ACM Education Board,
which provides curriculum recommendations for four‐year universities as well as community
colleges. The Board’s 2013 recommendations in computer science have even been translated into
Chinese.
ACM’s ‘Big Tent’ Philosophy Embraces Diversity
The ACM community is as diverse as the subfields that comprise computer science, from educators
and researchers in academia to practitioners in project management, industrial research, and
software development, engineering, and application design.
This diversity extends to gender and ethnicity. The ACM Women's Council (ACM-W)
advocates internationally for full engagement of women in all aspects of computing. The ACM‐
sponsored Richard Tapia Celebration of Diversity in Computing brings together students, faculty,
researchers, and professionals from all backgrounds. It provides a supportive networking
environment for under‐represented groups across a range of computing and information
technology fields.
Supporting Tomorrow’s Problem Solvers Today
ACM Student Chapters enable students to fully engage in its professional activities. Participants
from more than 500 colleges and universities worldwide enhance their learning through the
exchange of ideas with other students and established professionals.
ACM offers $1.5 million in scholarships and an affordable Student Membership. Both
undergraduate and graduate student members can compete in ACM Student Research
Competitions, an internationally recognized venue hosted at ACM conferences and sponsored by
Microsoft. They benefit by sharing their research with peers and academic and industry luminaries,
gaining recognition and experience, and earning rewards.
The IBM-sponsored ACM International Collegiate Programming Contest (ACM-ICPC) is a renowned
competition for student programmer teams worldwide. ACM also presents the Doctoral Dissertation
Award to recognize superior research in computer science and engineering.

Giving Credit where Credit Is Due


ACM recognizes excellence through its eminent awards for technical and professional achievements
and contributions in computer science and information technology. It also names as Fellows and
Distinguished Members those members who, in addition to professional accomplishments, have
made significant contributions to ACM's mission.
ACM's prestigious A.M. Turing Award is accompanied by a $1 million prize provided by Google for
contributions of lasting and major technical importance to the computing field. Other prominent
ACM awards recognize achievements by young computing professionals, educators, theoretical
computer scientists, software systems innovators, and pioneers who have made humanitarian and
cross-discipline contributions.
Providing Tireless Advocacy of Critical Public Policy Issues
ACM leverages its international respect and leadership to shape public policy worldwide. Through
its geographically distributed policy entities in Europe and North America, ACM helps develop policy
statements, issue briefs, white papers, and reports to provide policymakers with knowledge-based
analysis that accelerates computing innovations which benefit society. It also delivers expertise on
education policy, women in computing, and diversification of computing.
ACM Publications - Advancing Research, Technology, and Innovation
As a leading global source for scientific information, ACM promotes computer research and
innovation through its journals, magazines, and the proceedings of more than 170 annual
conferences and symposia. ACM authors are among the world's leading thinkers in computing and
information technologies, providing original research and firsthand perspectives.
ACM also provides access to the ACM Digital Library (DL), a comprehensive and expanding
database of literature and detailed bibliographic resources for computing professionals from a wide
range of publishers. The DL currently includes more than 1 million articles authored by leading
researchers in computing. The flagship magazine Communications of the ACM provides industry
news, commentary, observations, and practical research.
Guiding Members with a Framework of Ethical Conduct
The ACM Code of Ethics identifies the elements of every member’s commitment to ethical
professional conduct. It outlines fundamental considerations that contribute to society and human
well-being and those that specifically relate to professional responsibilities, organizational
imperatives, and compliance with the code.

HISTORY OF COMPUTERS: A BRIEF TIMELINE


The computer was born not for entertainment or email but out of a need to solve a serious
number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than
seven years to tabulate the U.S. Census results. The government sought a faster way to get the job
done, giving rise to punch-card based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was available in these early
models. The following brief history of computing is a timeline of how computers evolved from their
humble beginnings to the machines of today that surf the Internet, play games and stream
multimedia in addition to crunching numbers.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to
automatically weave fabric designs. Early computers would use similar punch cards.
1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine
that would be able to compute tables of numbers. The project, funded by the English government,
is a failure. More than a century later, however, the world's first computer was actually built.
1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing
the task in just three years and saving the government $5 million. He establishes a company that
would ultimately become IBM.
1936: Alan Turing presents the notion of a universal machine, later called the Turing machine,
capable of computing anything that is computable. The central concept of the modern computer
was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to
build the first computer without gears, cams, belts or shafts.
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California,
garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29
equations simultaneously. This marks the first time a computer is able to store information on its
main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build
the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital
computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the
Census Bureau to build the UNIVAC, the first commercial computer for business and government
applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent
the transistor. They discovered how to make an electric switch with solid materials and no need for
a vacuum.
1953: Grace Hopper develops the first compiler for a computer programming language; her work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the
IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed
by a team of programmers at IBM led by John Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby
was awarded the Nobel Prize in Physics in 2000 for his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a
graphical user interface (GUI). This marks the evolution of the computer from a specialized
machine for scientists and mathematicians to technology that is more accessible to the general
public.
1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed
compatibility issues. Written in the C programming language, UNIX was portable across multiple
platforms and became the operating system of choice among mainframes at large companies and
government entities. Due to the slow nature of the system, it never quite gained traction among
home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM)
chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to
be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting
multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the
"world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and
Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the
success of this first endeavor, the two childhood friends form their own software company,
Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fool's Day and roll out the Apple I, the first computer with a single circuit board, according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the
first time, non-geeks could write programs and make a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer
Faire. It offers color graphics and incorporates an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet
program.
1979: Word processing becomes a reality as MicroPro International releases WordStar. "The
defining change was to add margins and word wrap," said creator Rob Barnaby in email to Mike
Petrie in 2000. "Additional changes included getting rid of command mode and adding a print
function. I was the technical brains — I figured out how to do it, and did it, and documented it."
1981: The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-
DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears &
Roebuck and Computerland sell the machines, marking the first time a computer is available
through outside distributors. It also popularizes the term PC.
1983: Apple's Lisa is the first personal computer with a GUI. It also features a drop-down menu
and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable
computer with the familiar flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia Britannica. This was the
company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced
audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years before the World Wide
Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a
small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later,
only 100 dot-coms had been registered.
1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva,
develops HyperText Markup Language (HTML), giving rise to the World Wide Web.
1993: The Pentium microprocessor advances the use of graphics and music on PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme
Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the
market.
1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's
court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its
operating system.
1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the
Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides protected memory
architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft
rolls out Windows XP, which has a significantly redesigned GUI.
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market.
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser.
Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based
mobile phone operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well
as an Intel-based iMac. Nintendo's Wii game console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and
advances in touch and handwriting recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the
dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.
2012: Facebook gains 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been
any quantum-computing platform that had the capability to program new algorithms into their
system. They're usually each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College
Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular
Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties
that we may be able to harness for rapid, scalable information storage and processing," Anne
Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of
molecules exist, and each molecule has a unique three-dimensional atomic structure as well as
variables such as shape, size, or even color. This richness provides a vast design space for
exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current
logic-based, digital architectures." [Computers of the Future May Be Minuscule Molecular Machines]
