
C How to Program

Ninth Edition, Global Edition

Chapter 1
Introduction to Computers
and C

Copyright © 2023 Pearson Education, Ltd. All Rights Reserved


1.1 Introduction
• C is one of the world’s longest-established programming languages and,
according to the Tiobe Index, the world’s most popular
• Software (that is, the C instructions you write, also called code)
controls hardware (that is, computers and related devices)
• C is widely used in industry for a broad range of tasks

• Today’s popular desktop operating systems—Windows, macOS and Linux—
are partially written in C, as are web browsers (e.g., Google Chrome
and Mozilla Firefox), database management systems (e.g., Microsoft SQL
Server, Oracle and MySQL) and more
• Data hierarchy: from individual bits (1s and 0s) to databases, which store the
massive amounts of data that organizations need to implement
contemporary applications such as Google Search, Netflix and Airbnb
• Many integrated development environments (IDEs) are available in which you
can compile, build and run C applications
Hardware and Software
• Computers can perform calculations (+ - * /) and make logical decisions
(&, |, …) phenomenally faster than human beings can
• Today’s personal computers and smartphones can perform billions of
calculations in one second—more than a human can perform in a lifetime
• Supercomputers already perform thousands of trillions (quadrillions) of
instructions per second!
– As of December 2020, Fujitsu’s Fugaku is the world’s fastest
supercomputer—it can perform 442 quadrillion calculations per
second (442 petaflops)!
• Computers process data under the control of sequences of instructions
called computer programs
• A computer consists of various physical devices referred to as hardware
– keyboard, screen, mouse, solid-state disks, hard disks, memory, DVD drives
and processing units
Computer Organization
• Regardless of physical differences, computers can be envisioned as divided
into various logical units or sections
• Input unit
• Output unit
• Memory unit: RAM (Random Access Memory)
• Arithmetic and logic unit (ALU)
• Control unit
• Multicore processors implement multiple processors on a single integrated-circuit
(IC) chip and can perform many operations simultaneously (dual-core, quad-core
and now up to 72 or more cores)
• Secondary storage unit (solid-state drives (SSDs), USB flash drives, hard drives)
Data Hierarchy
C uses the ASCII (American Standard Code for
Information Interchange) character set by default
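As a quick illustration (a minimal sketch, assuming an ASCII-based system), a C
char can be printed both as a character and as its underlying integer code:

   #include <stdio.h>

   int main(void) {
      char letter = 'A'; // on ASCII systems, 'A' is stored as the integer 65

      printf("character: %c\n", letter);  // prints "character: A"
      printf("ASCII code: %d\n", letter); // prints "ASCII code: 65"
      return 0;
   }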
Machine Languages, Assembly Languages and High-Level Languages
1. Machine languages.
2. Assembly languages.
3. High-level languages.
• Any computer can directly understand only its own machine language, defined by its hardware design
• Machine languages consist of strings of 1s and 0s that instruct computers to
perform their most elementary operations one at a time
• Machine languages are machine-dependent—a particular machine language
can be used on only one type of computer. They are cumbersome for humans
• Assembly languages use English-like abbreviations to represent elementary operations
• Translator programs called assemblers were developed to convert assembly-
language programs to machine language at computer speeds

• High-level languages were developed in which single statements can accomplish
substantial tasks
• Translator programs called compilers convert high-level-language source code into
machine language
• High-level languages look almost like everyday English and contain commonly
used mathematical notations
• C is among the world’s most widely used high-level programming languages.
• Compiling a large high-level language program into machine
language can take considerable computer time
• Interpreters execute high-level-language programs directly
• Interpreters avoid compilation delays, but interpreted code runs
slower than compiled programs
• Languages such as Java and Python use a clever mixture of compilation and
interpretation to run programs
1.6 The C Programming Language
• The C Programming Language
– C evolved from two earlier languages, BCPL and B
– C is widely used to develop systems that demand performance
– operating systems
– embedded systems
– real-time systems
– communications systems

• Third-party and open-source C libraries are widely available—GitHub
lists over 32,000 repositories in its C category: https://github.com/topics/c
– Other popular programming languages? Which is best?
Typical C Program-Development Environment (IDE)
• C systems generally consist of several parts
– a program-development environment
– the language
– the C standard library
• C programs typically go through six phases to be executed
– Edit
– Preprocess
– Compile
– Link
– Load
– Execute
Phase 1: Creating a Program
• Consists of editing a file in an editor program
Phases 2 and 3: Preprocessing and Compiling a C Program
• You give the command to compile the program
• Compiler translates it into machine-language code
• The compilation command first invokes a preprocessor, which
– performs text manipulations on a program’s source-code files,
– inserts the contents of other files, and
– performs text replacements, as the sketch below illustrates
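A minimal sketch showing both preprocessor jobs—#include inserts a header
file’s contents and #define performs a simple text replacement (the PI macro
is just an illustrative choice):

   #include <stdio.h> // preprocessor inserts the contents of the stdio.h header

   #define PI 3.14159 // preprocessor replaces each later PI with 3.14159

   int main(void) {
      double radius = 2.0;
      // after preprocessing, the next line reads 2 * 3.14159 * radius
      printf("circumference: %f\n", 2 * PI * radius);
      return 0;
   }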
• A syntax error occurs when the compiler cannot recognize
a statement because it violates the language rules
– The compiler issues an error message to help you
locate and fix the incorrect statement.
• The C standard does not specify the wording of error messages
• Syntax errors are also called compile errors or compile-
time errors.
Phase 4: Linking
• C programs use functions defined elsewhere
– standard libraries, open-source libraries or private libraries of a
particular project.
• The object code produced by the C compiler typically contains
“holes” where the missing functions belong
• A linker links a program’s object code with the code for the missing
functions to produce an executable image (with no missing pieces), as
sketched below
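For example, a program that calls sqrt from the C standard math library
compiles to object code with a “hole” where sqrt’s code belongs; the linker
fills it in from the library. A sketch assuming the common gcc toolchain
(other compilers use analogous commands):

   // sqrt_demo.c — sqrt is declared in math.h but defined in the math library
   #include <math.h>
   #include <stdio.h>

   int main(void) {
      printf("square root of 2 = %f\n", sqrt(2.0));
      return 0;
   }

   // Typical build steps:
   //   gcc -c sqrt_demo.c           compile to object code (a "hole" for sqrt)
   //   gcc sqrt_demo.o -lm -o demo  link the math library (-lm) to fill the hole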
Phase 5: Loading
• Before a program can execute, the operating system
must load it into memory
• Loader transfers the executable image from disk to
memory
• Additional components from shared libraries that support
the program also are loaded
Phase 6: Execution
• Finally, the computer, under the control of its CPU,
executes the program one instruction at a time
Problems That May Occur at Execution Time
• Errors that occur as programs run are called runtime
errors or execution-time errors
• Fatal errors cause a program to terminate immediately
without successfully performing its job
• Nonfatal errors allow programs to run to completion, often
producing incorrect results, as the sketch below illustrates
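A minimal sketch of both kinds of error (hypothetical values; note that integer
division by zero is undefined behavior in C and on most systems causes a fatal
runtime error):

   #include <stdio.h>

   int main(void) {
      int sum = 10 + 20 + 30; // three values

      // Nonfatal (logic) error: the program runs to completion but the
      // result is wrong—we divided by the wrong count
      printf("average: %d\n", sum / 2); // should be sum / 3

      // Fatal error: integer division by zero typically terminates the
      // program immediately
      int zero = 0;
      printf("this may never print: %d\n", sum / zero);
      return 0;
   }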
Standard Input, Standard Output and Standard Error Streams
• Most C programs input and/or output data

• Certain C functions take their input from stdin (the standard input stream),
which is normally the keyboard
• Data is often output to stdout (the standard output stream), which is normally
the computer screen
• Data also may be output to devices such as disks and printers

• There’s also a standard error stream referred to as stderr, which is normally
connected to the screen and used to display error messages, as the sketch
below shows
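A minimal sketch that touches all three streams—reading from stdin with scanf,
reporting results on stdout and diagnostics on stderr:

   #include <stdio.h>

   int main(void) {
      int age = 0;

      printf("Enter your age: "); // stdout—normally the screen

      if (scanf("%d", &age) != 1) { // stdin—normally the keyboard
         fprintf(stderr, "error: invalid input\n"); // stderr—error messages
         return 1;
      }

      printf("You are %d years old\n", age);
      return 0;
   }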
Internet, World Wide Web, the Cloud and IoT
• In the late 1960s, ARPA—the Advanced Research Projects Agency of the United
States Department of Defense—rolled out plans for networking the main computer
systems of approximately a dozen ARPA-funded universities and research
institutions
• This network became known as the ARPANET, the precursor to today’s Internet

• Today’s fastest Internet speeds are on the order of billions of bits per second, with
trillion-bits-per-second (terabit) speeds already being tested
– In 2020, Australian researchers successfully tested a 44.2-terabits-per-second
Internet connection
• The ARPANET’s main benefit proved to be the capability for quick and easy
communication via what came to be known as electronic mail (e-mail)
• Billions of people worldwide now use the Internet to communicate quickly and easily

• The protocol (set of rules) for communicating over the ARPANET became known
as the Transmission Control Protocol (TCP)
• TCP ensured that messages, consisting of sequentially numbered pieces called
packets, were properly delivered from sender to receiver, arrived intact and were
assembled in the correct order
The Internet: A Network of Networks
• In parallel with the early evolution of the Internet, organizations worldwide
were implementing their own networks
• One challenge was to enable these different networks to communicate with
each other
• ARPA accomplished this by developing the Internet Protocol (IP), which
created a true “network of networks,” the Internet’s current architecture
• The combined set of protocols is now called TCP/IP

• Each Internet-connected device has an IP address

• Businesses rapidly realized that, by using the Internet, they could improve
their operations and offer new and better services to their clients
• As a result of their investments, Internet bandwidth—the information-carrying
capacity of communications lines—has increased tremendously, while
hardware costs have plummeted
The World Wide Web: Making the Internet User-Friendly
• The World Wide Web (simply called “the web”) is a collection of hardware
and software associated with the Internet that allows computer users to
locate and view documents on almost any subject
• In 1989, Tim Berners-Lee of CERN (the European Organization for Nuclear
Research) began developing HyperText Markup Language (HTML)—
the technology for sharing information via “hyperlinked” text documents
• He also wrote communication protocols such as HyperText Transfer
Protocol (HTTP) to form the backbone of his new hypertext information
system
• In 1994, Berners-Lee founded the World Wide Web Consortium
(W3C, https://www.w3.org), devoted to developing web technologies


The Cloud
• More and more computing today is done “in the cloud”—that is, using
software and data distributed across the Internet worldwide
• Cloud computing allows you to increase or decrease computing
resources to meet your needs at any given time
– Shifts to the service provider the burden of managing these apps
(such as installing and upgrading the software, security, backups
and disaster recovery).
• The apps you use daily are heavily dependent on various cloud-
based services. These services use massive clusters of computing
resources (computers, processors, memory, disk drives, etc.) and
databases that communicate over the Internet with each other and
the apps you use. A service that provides access to itself over the
Internet is known as a web service.
The Cloud—Software as a Service
• Cloud vendors focus on service-oriented architecture (SOA)
technology
• They provide “as-a-Service” capabilities that applications
connect to and use in the cloud.
• Common services provided by cloud vendors include:
– Big Data as a Service (BDaaS)
– Hadoop as a Service (HaaS)
– Infrastructure as a Service (IaaS)
– Platform as a Service (PaaS)
– Software as a Service (SaaS)
– Storage as a Service (STaaS)
The Cloud—Mashups
• Mashups enable you to rapidly develop powerful software
applications by combining (often free) complementary web
services and other forms of information feeds
• ProgrammableWeb (https://programmableweb.com/) provides
a directory of nearly 24,000 web services and almost 8,000
mashups.
• They also provide how-to guides and sample code for working
with web services and creating your own mashups
• Some of the most widely used web services are Google Maps
and others provided by Facebook, Twitter and YouTube
The Internet of Things
• A thing is any object with an IP address and the ability to send, and in some
cases receive, data automatically over the Internet
– a car with a transponder for paying tolls,
– monitors for parking-space availability in a garage,
– a heart monitor implanted in a human,
– water-quality monitors,
– a smart meter that reports energy usage,
– radiation detectors,
– item trackers in a warehouse,
– mobile apps that can track your movement and location,
– smart thermostats that adjust room temperatures based on weather
forecasts and activity in the home, and
– intelligent home appliances.
Software Technologies
• Refactoring
– Reworking programs to make them clearer and easier to maintain while
preserving their correctness and functionality.
– Many IDEs contain built-in refactoring tools

• Design patterns
– Proven architectures for constructing flexible and maintainable object-
oriented software
– The field of design patterns tries to enumerate those recurring patterns,
encouraging software designers to reuse them to develop better-quality
software using less time, money and effort.
• Software Development Kits (SDKs)
– The tools and documentation that developers use to program
applications.
How Big Is Big Data?
• Working with data is now as crucial as writing programs
• According to IBM, approximately 2.5 quintillion bytes (2.5
exabytes) of data are created daily, and 90% of the
world’s data was created in the last two years
• According to IDC, the global data supply will reach 175
zettabytes (equal to 175 trillion gigabytes or 175 billion
terabytes) annually by 2025
How Big Is Big Data? Megabytes (MB)
• One megabyte is about one million (actually 2^20) bytes
• Many of the files we use daily require one or more MBs of
storage
– MP3 audio files—High-quality MP3s range from 1 to 2.4 MB
per minute.
– Photos—JPEG format photos taken on a digital camera can
require about 8 to 10 MB per photo.
– Video—Each minute of video can require many megabytes
of storage. For example, on one of our iPhones, the
Camera settings app reports that 1080p video at 30 frames-
per-second (FPS) requires 130 MB/minute and 4K video at
30 FPS requires 350 MB/minute.
How Big Is Big Data? Gigabytes (GB)
• One gigabyte is about 1000 megabytes (actually 2^30 bytes)
• A dual-layer DVD can store up to 8.5 GB, which translates to:
– as much as 141 hours of MP3 audio,
– approximately 1000 photos from a 16-megapixel camera,
– approximately 7.7 minutes of 1080p video at 30 FPS, or
– approximately 2.85 minutes of 4K video at 30 FPS.
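These conversions follow from the per-file sizes given earlier; a quick check
of the first two (a sketch of the arithmetic, rounding freely):

   8.5 GB ≈ 8500 MB
   MP3 audio at ~1 MB/minute:       8500 MB ÷ 1 MB/min ≈ 8500 min ≈ 141 hours
   16-megapixel photos at ~8.5 MB:  8500 MB ÷ 8.5 MB ≈ 1000 photos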
• The current highest-capacity Ultra HD Blu-ray discs can store
up to 100 GB of video
• Streaming a 4K movie can use between 7 and 10 GB per hour
(highly compressed).
How Big Is Big Data? Terabytes (TB)
• One terabyte is about 1000 gigabytes (actually 2^40 bytes)
• Recent disk drives for desktop computers come in sizes up to
20 TB, which is equivalent to:
– approximately 28 years of MP3 audio,
– approximately 1.68 million photos from a 16-megapixel
camera,
– approximately 226 hours of 1080p video at 30 FPS, or
– approximately 84 hours of 4K video at 30 FPS.
• Nimbus Data now has the largest solid-state drive (SSD) at
100 TB, which can store five times the 20-TB examples of
audio, photos and video listed above
How Big Is Big Data? Petabytes, Exabytes and Zettabytes
• There are over four billion people online, creating about
2.5 quintillion bytes of data each day
– 2500 petabytes (each petabyte is about 1000
terabytes) or 2.5 exabytes (each exabyte is about
1000 petabytes).
• A March 2016 AnalyticsWeek article stated that by 2021
there would be over 50 billion devices connected to the
Internet and, by 2020, there would be 1.7 megabytes of
new data produced per second for every person on the
planet
• Multiplied across the world’s roughly 7.7 billion people, that’s about
– 13 petabytes of new data per second,
– 780 petabytes per minute,
– 46,800 petabytes (46.8 exabytes) per hour, or
– 1,123 exabytes per day—1.123 zettabytes (ZB) per
day (each zettabyte is about 1000 exabytes)
• Equivalent to over 5.5 million hours (over 600 years) of
4K video every day or approximately 116 billion photos
every day!
How Big Is Big Data? Computing Power Over the Years
• Today’s processor performance is often measured in terms of
FLOPS (floating-point operations per second)
• Currently, the fastest supercomputer—Fujitsu’s Fugaku—is capable
of 442 petaflops.
• Distributed computing can link thousands of personal computers via
the Internet to produce even more FLOPS
• Companies like IBM are now working toward supercomputers
capable of exaflops (10^18 FLOPS)


• The quantum computers now under development
theoretically could operate at 18,000,000,000,000,000,000
times the speed of today’s “conventional computers”
– In one second, a quantum computer theoretically could do
staggeringly more calculations than the total that have
been done by all computers since the world’s first
computer appeared
– Could wreak havoc with blockchain-based
cryptocurrencies like Bitcoin
– Engineers are already rethinking blockchain to prepare for
such massive increases in computing power.
How Big Is Big Data? Processing the
World’s Data Requires Lots of Electricity
• Data from the world’s Internet-connected devices is exploding

• Processing that data requires tremendous amounts of energy

• According to a recent article, energy use for processing data in 2015 was
growing at 20% per year and consuming approximately three to five percent
of the world’s power
– Total data-processing power consumption could reach 20% by 2025

• Another enormous electricity consumer is the blockchain-based
cryptocurrency Bitcoin
• Processing just one Bitcoin transaction uses approximately the same amount
of energy as powering the average American home for a week
• According to some estimates, a year of Bitcoin transactions consumes more
energy than many countries
How Big Is Big Data? Big-Data
Opportunities
• The big-data explosion is likely to continue exponentially for years to come.

• It’s crucial for businesses, governments, the military, and even individuals to
get a handle on all this data
• Big data’s appeal to big business is undeniable, given the rapidly
accelerating accomplishments
• Many companies are making significant investments and getting valuable
results through technologies like big data, machine learning and natural-
language processing
• This is forcing competitors to invest as well, rapidly increasing the need for
computing professionals with computer-science and data-science experience
• This growth is likely to continue for many years
