
EPS 142: COMPUTER FUND. AND PROGRAMMING

COMPUTER
- A machine used for "data-processing".
- Converts raw data (input) into information (output).

DATA vs. INFORMATION
 Data – a collection of facts
- the raw material
- e.g., letters typed on the keyboard
 Information – data that has undergone processing
- processed data
- anything that is new to you

METHODS OF ELECTRONIC DATA-PROCESSING
1. Batch Processing
2. Real Time
3. Online – 24/7 operational
4. Distributive System

AREAS OF ELECTRONIC DATA-PROCESSING
1. Business
2. Scientific

DATA PROCESSING OPERATIONS (RVDCSCSMSRF)
1. Recording
2. Verifying
3. Duplicating
4. Classifying
5. Sorting
6. Calculating
7. Summarizing and Reporting
8. Merging
9. Storing
10. Retrieving
11. Feedback

Recording
- is usually a manual operation.
- refers to the transfer of data onto some form of document.
- relates to documentation resulting from calculations.
- e.g., computing gross pay.

Verifying
- recorded data are carefully checked for any errors and reread for correctness.
- e.g., punched cards and typed reports.
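
A minimal Python sketch of the verifying operation, assuming records are simple name/hours/rate entries; the field names and validation rules are illustrative, not taken from the notes.

# Sketch of "verifying": recorded data are re-read and checked for
# errors before further processing. The record layout is made up.

def verify(record):
    """Return a list of error messages for one recorded entry."""
    errors = []
    if not record.get("name"):
        errors.append("missing name")
    if not (0 <= record.get("hours", -1) <= 80):
        errors.append("hours out of range")
    if record.get("rate", 0) <= 0:
        errors.append("rate must be positive")
    return errors

records = [
    {"name": "Cruz", "hours": 40, "rate": 75.0},
    {"name": "", "hours": 95, "rate": 75.0},   # will be flagged
]

for r in records:
    problems = verify(r)
    print(r["name"] or "<blank>", "OK" if not problems else problems)
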
Duplicating
- to copy or duplicate data.
- reproducing data onto many forms or documents.
- e.g., duplicating using carbon paper.

Classifying
- separates data into categories.
- also known as "Identifying".

Identifying
- arranging items with like characteristics into groups or classes.

Coding
- is the method of classifying.

3 TYPES OF CODES
1. Numeric
- e.g., a person's social security number or ID.
2. Alphabetic
- e.g., grades such as A, B, and C, or names of persons.
3. Alphanumeric
- e.g., an automobile license plate, or course and year.
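
A small Python sketch of classifying by code, grouping sample values into the three code types above; the sample values and the code_type helper are made up for illustration.

# Sketch of "classifying": items with like characteristics are grouped
# into classes, here by the kind of code they use.
from collections import defaultdict

def code_type(code):
    """Classify a code as numeric, alphabetic, or alphanumeric."""
    if code.isdigit():
        return "numeric"
    if code.isalpha():
        return "alphabetic"
    return "alphanumeric"

samples = ["123456789", "ABC", "NAB1234", "B", "2024-BSIT-1A"]

groups = defaultdict(list)
for s in samples:
    groups[code_type(s)].append(s)

for kind, items in groups.items():
    print(kind, items)
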
Sorting
- arranging data in a specific order.
- to arrange or rearrange data in a predetermined sequence to facilitate processing.
- Numeric sorting usually requires less time than alphabetic sorting in machine-based processing.
- e.g., a telephone book is sorted into alphabetical order.

Key
- the data item which determines the sorting.
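
A short Python sketch of sorting with a key, using the telephone-book example; the directory entries are invented.

# Sketch of "sorting": records arranged in a predetermined sequence.
# The key is the data item that determines the order.

directory = [
    {"name": "Santos", "phone": "555-0137"},
    {"name": "Aquino", "phone": "555-0102"},
    {"name": "Reyes",  "phone": "555-0188"},
]

# Alphabetical order, like a telephone book: the key is the name field.
by_name = sorted(directory, key=lambda entry: entry["name"])

for entry in by_name:
    print(entry["name"], entry["phone"])
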
Calculating
- arithmetic manipulation of the data.
- is a crucial phase of data manipulation, because the outcome of this operation becomes part of the output.
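
A small Python sketch of calculating, reusing the gross-pay example mentioned under Recording; the overtime rule (1.25x beyond 40 hours) is an assumption, not from the notes.

# Sketch of "calculating": arithmetic manipulation whose result
# becomes part of the output. The overtime rule is illustrative.

def gross_pay(hours, rate):
    regular = min(hours, 40) * rate
    overtime = max(hours - 40, 0) * rate * 1.25
    return regular + overtime

print(gross_pay(40, 75.0))   # 3000.0
print(gross_pay(45, 75.0))   # 3468.75
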
Summarizing and Reporting
- a collection of data is condensed.
- certain conclusions from the data are represented in a meaningful format.

To be of value, data must often be condensed or sifted so that the resulting output reports will be clear, concise and effective.
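
A short Python sketch of summarizing and reporting: raw sales rows are condensed into per-branch totals; the figures are invented.

# Sketch of "summarizing and reporting": detail rows are condensed
# into a short, meaningful report.

sales = [
    ("Branch A", 12000), ("Branch B", 8500),
    ("Branch A", 6400),  ("Branch C", 15200),
]

totals = {}
for branch, amount in sales:
    totals[branch] = totals.get(branch, 0) + amount

print(f"{'Branch':<10}{'Total':>10}")
for branch in sorted(totals):
    print(f"{branch:<10}{totals[branch]:>10,}")
print(f"{'Overall':<10}{sum(totals.values()):>10,}")
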
Merging
- takes two or more sets of data.
- puts them together to form a single sorted set of data.
- e.g., sales reports from different store branches are merged to form an overall sales report.
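
A minimal Python sketch of merging, combining two already-sorted branch reports into one sorted set with heapq.merge; dates and amounts are invented.

# Sketch of "merging": two or more sorted sets of data are combined
# into a single sorted set, like branch reports rolled into one.
import heapq

branch_a = [("2024-01-05", 6400), ("2024-01-12", 12000)]
branch_b = [("2024-01-07", 8500), ("2024-01-15", 15200)]

# heapq.merge assumes each input is already sorted (here, by date).
overall = list(heapq.merge(branch_a, branch_b))
for date, amount in overall:
    print(date, amount)
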
Storing
- placing similar data into files for future reference.

METHODS OF STORING (MEE)
1. Manual (ledger book)
2. Electromechanical (punched cards)
3. Electronic (memory of the computer)

Data should be stored only if the value of having them in the future exceeds the storage cost.

Retrieving
- recovering stored data and/or information when needed.
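
A small Python sketch of electronic storing and retrieving using a CSV file; the file name and record layout are illustrative.

# Sketch of "storing" and "retrieving": records are placed in a file
# for future reference and read back when needed.
import csv

records = [("Cruz", 40, 75.0), ("Santos", 45, 75.0)]

# Storing: write similar data into a file for future reference.
with open("payroll.csv", "w", newline="") as f:
    csv.writer(f).writerows(records)

# Retrieving: recover the stored data when needed.
with open("payroll.csv", newline="") as f:
    for name, hours, rate in csv.reader(f):
        print(name, hours, rate)
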
Feedback
- is the comparison of the output(s) against the goal set in advance.
- any discrepancy is analyzed, corrected, and fed back to the proper stage in the processing operation.

METHODS OF DATA PROCESSING:

1. Transaction Processing
- is deployed in mission-critical situations.
- these are situations which, if disrupted, will adversely affect business operations.
- e.g., processing stock exchange transactions.

Availability
- is the most important factor in transaction processing.

Availability can be influenced by factors such as:
1. Hardware
2. Software

2. Distributed Processing
- breaks down large datasets and stores them across multiple machines or servers.
- it rests on the Hadoop Distributed File System (HDFS).
- has a high fault tolerance.
- Distributed processing can also be immensely cost-saving.

3. Real-time Processing
- is like transaction processing.
- is used in situations where output is expected in real time.
- computes incoming data as quickly as possible.

GPS-tracking applications are the most common example of real-time data processing.

Real-time processing is preferred over transaction processing in cases where approximate answers suffice.

 Stream processing is a common application of real-time data processing.
- popularized by Apache Storm, stream processing analyzes data as it comes in.
 Google BigQuery and Snowflake are examples of cloud data platforms that employ real-time processing.
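
A generic Python sketch of stream processing in the real-time style, with simulated GPS readings processed as they arrive; it is not tied to Apache Storm, BigQuery, or Snowflake.

# Sketch of stream processing: incoming readings are handled the
# moment they arrive instead of being held for a later batch.
# The readings are simulated, not real GPS data.
import random
import time

def gps_readings(n):
    """Simulate a stream of (latitude, longitude) fixes."""
    for _ in range(n):
        yield (14.5995 + random.uniform(-0.01, 0.01),
               120.9842 + random.uniform(-0.01, 0.01))
        time.sleep(0.1)  # stand-in for real arrival delays

for lat, lon in gps_readings(5):
    # Each reading is processed as it comes in.
    print(f"position update: {lat:.5f}, {lon:.5f}")
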

4. Batch Processing
- is when chunks of data, stored over a period, are analyzed together, or in batches.
- required when a large volume of data needs to be analyzed for detailed insights.
- it saves on computational resources.
- preferred over real-time processing when accuracy is more important than speed.

The efficiency of batch processing is also measured in terms of throughput. Throughput is the amount of data processed per unit time.
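
A small Python sketch of batch processing with a throughput measurement (records processed per second); the batch contents are arbitrary.

# Sketch of batch processing: records accumulated over a period are
# analyzed together, and throughput is data processed per unit time.
import time

batch = list(range(1_000_000))        # data stored up over a period

start = time.perf_counter()
total = sum(x * x for x in batch)     # the whole batch analyzed at once
elapsed = time.perf_counter() - start

print("result:", total)
print(f"throughput: {len(batch) / elapsed:,.0f} records/second")
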

5. On-line Processing
- is an automated way to enter and process data or reports continuously as the source documents become available.
- A good example of online processing is bar code scanning.

CYCLE OF DATA-PROCESSING (EXPANDED CYCLE)

[Diagram] ORGANIZATION → INPUT → PROCESS ↔ STORAGE → OUTPUT → DISTRIBUTION
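
A rough Python sketch of the expanded cycle as plain functions (input, process, storage, output); the payroll data and output file name are illustrative.

# Sketch of the data-processing cycle: input -> process -> storage ->
# output (distribution would then pass the report onward).
import json

def take_input():
    return [{"name": "Cruz", "hours": 40, "rate": 75.0}]

def process(rows):
    return [{"name": r["name"], "pay": r["hours"] * r["rate"]} for r in rows]

def store(rows, path="payroll_output.json"):
    with open(path, "w") as f:
        json.dump(rows, f)

def output(rows):
    for r in rows:
        print(f"{r['name']}: {r['pay']:.2f}")

data = take_input()          # INPUT
results = process(data)      # PROCESS
store(results)               # STORAGE
output(results)              # OUTPUT
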

CLASSIFICATIONS OF COMPUTERS

Computers can be classified into FOUR broad categories:

(1). On the basis of Generation
 First Generation
 Second Generation
 Third Generation
 Fourth Generation
 Fifth Generation

(2). On the basis of Size
 Super Computers
 Mainframe
 Minicomputers
 Mobile computers
 Micro Computers

(3). On the basis of Data Processed
 Analog
 Digital
 Hybrid

(4). On the basis of Purpose
 Special Purpose
 General Purpose

FIRST GENERATION (1940-1956)
- used vacuum tubes for circuitry and magnetic drums for memory.
- relied on machine language, the lowest-level programming language understood by computers, to perform operations.
- they could only solve one problem at a time.
- punched cards and paper tape were used for input, and output was displayed on printouts.

SECOND GENERATION (1956-1963): TRANSISTORS
- far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.
- used magnetic core technology.
- Assembly languages were introduced during the second generation, which allowed programmers to specify instructions in words.

THIRD GENERATION (1964-1971): INTEGRATED CIRCUITS
- Integrated circuit development was the
hallmark of the third generation of
computers.
- Users interacted with third generation
computers through keyboards and
monitors.
- These were interfaced with an operating
system.
- allowed the device to run many processes concurrently, with a central program that monitored the memory.

FOURTH GENERATION (1971-PRESENT): MICROPROCESSORS
- brought the microprocessor, which made it possible to build thousands of integrated circuits onto a single silicon chip.
Intel 4004 chip (1971)
- located all the components of the
computer - from the central processing
unit and memory to input/output controls
- on a single chip.

FIFTH GENERATION - PRESENT AND BEYOND: ARTIFICIAL INTELLIGENCE
- based on quantum computation, molecular and nanotechnology, and artificial intelligence.
- aims at developing computing devices that respond to natural language input and are capable of learning and self-organization.
