ICT - Reviewer

History of Programming

Jacquard Loom (1801) – Invented by Joseph Marie Jacquard, it used punched cards to
control patterns woven on a loom, influencing early computing.
Analytical Engine (1837) – Designed by Charles Babbage, this was a proposed mechanical
general-purpose computer.
First Algorithm (1842-1843) – Ada Lovelace published the first machine algorithm, written for
Babbage’s Analytical Engine, laying the foundation for programming.
Tabulating Machine (1890) – Herman Hollerith created an electromechanical machine using
punched cards for data processing.
Turing Machine (1936) – Alan Turing introduced the theoretical computing machine,
foundational for modern computing theory.
Plankalkül (1943-1945) – Konrad Zuse designed this early high-level programming language,
but it was not implemented at the time.
Short Code (1949) – John Mauchly developed one of the first high-level languages for an
electronic computer.
Autocode (1952) – Alick Edwards Glennie developed the first compiled high-level
programming language.
Fortran (1957) – Created by John Backus for numeric and scientific computing, it is one of the
oldest languages still in use today. Formula Translation
Algol (1958) – Developed by a committee for scientific use, it influenced languages like Pascal,
C, and Java. Algorithmic Language
COBOL (1959) – Designed by the CODASYL committee, building heavily on Grace Murray
Hopper’s earlier work, for business computing; widely used in finance and administration.
Common Business Oriented Language
LISP (1959) – John McCarthy created this language for artificial intelligence research, which is
still in use today. List Processing Language
BASIC (1964) – Developed by John George Kemeny and Thomas Eugene Kurtz, BASIC was
designed for simplicity and ease of use, especially in education. Beginner’s All-purpose
Symbolic Instruction Code
Pascal (1970) – Developed by Niklaus Wirth for structured programming and data
structuring. Named after the mathematician Blaise Pascal (not an acronym).
Smalltalk (1972) – Alan Kay, Adele Goldberg, and Dan Ingalls created this language, which
influenced Python, Java, and Ruby.
C (1972) – Dennis Ritchie at Bell Labs developed C, which became one of the most widely
used programming languages in the world.
SQL (1972) – Donald D. Chamberlin and Raymond F. Boyce developed this language for
managing and querying databases. Structured Query Language
MATLAB (1978) – Cleve Moler created this language for mathematical computing, especially in
academia and research. Matrix Laboratory
C++ (1983) – Bjarne Stroustrup expanded C with object-oriented features, widely used in
commercial applications and games.
Perl (1987) – Larry Wall created Perl, primarily for text processing and system administration.
Practical Extraction and Report Language
Python (1991) – Guido van Rossum developed Python as a simple, versatile language, now
widely used in web development, software, and data analysis.
Java (1995) – James Gosling developed this language for cross-platform applications,
especially in web and mobile app development.
PHP (1995) – Created by Rasmus Lerdorf, it became a popular language for server-side web
development. Originally Personal Home Page, now PHP: Hypertext Preprocessor
Ruby (1995) – Yukihiro Matsumoto created this language to be fun and productive, widely
used in web development.
JavaScript (1995) – Brendan Eich of Netscape developed this language for dynamic web
pages and user interaction.
C# (2000) – Developed by Microsoft, it is widely used in Windows applications and is similar to
Java.
Swift (2014) – Apple developed this language as a modern replacement for Objective-C,
making it easier to write software for iOS and macOS.

Five Generations of Programming Language

First Generation: Machine Language (1GL)

● Programming Language: Binary (0s and 1s).


● Circuitry: Vacuum Tubes.
○ Pros:
■ Direct hardware interaction, fast execution.
■ Full control over hardware.
○ Cons:
■ Complex, error-prone, and difficult to learn.
■ Non-portable, requires unique code for different hardware.
■ Vacuum tubes were large, unreliable, and generated heat.

Second Generation: Assembly Language (2GL)

● Programming Language: Mnemonics (assembly code).


● Circuitry: Transistors.
○ Pros:
■ Easier than machine language and more readable.
■ Offers control over hardware while reducing complexity.
■ Faster than high-level languages.
○ Cons:
■ Still complex and requires knowledge of hardware architecture.
■ Non-portable and machine-dependent.
■ Requires an assembler to translate code.
Third Generation: High-Level Languages (3GL)

● Programming Language: C, FORTRAN, COBOL, Pascal, BASIC.


● Circuitry: Integrated Circuits (ICs).
○ Pros:
■ Easier to use with English-like syntax, abstracting hardware complexities.
■ Portable across platforms, allowing code to be transferred between
different systems.
■ Efficient for development, making writing and debugging faster.
○ Cons:
■ Slower execution compared to lower-level languages.
■ Requires a compiler or interpreter to translate the high-level language into
machine code.
■ Errors in logic are still common, even if the code is easier to read.

Fourth Generation: Declarative and Visual Languages (4GL)

● Programming Language: SQL, Python, Perl, Ruby.


● Circuitry: Microprocessors.
○ Pros:
■ Microprocessors allowed for greater computational power and efficiency,
driving the rise of more user-friendly programming environments.
■ Less programming effort: 4GLs allow developers to specify what they
want, rather than how to achieve it.
■ Reduced time to develop software, making it suitable for database
management, data analysis, and rapid application development.
■ Visual programming tools allow users to drag and drop components,
simplifying the development process.
○ Cons:
■ Reduced control over hardware, making it less efficient for
resource-intensive tasks.
■ Sometimes results in bloated or inelegant code, difficult to maintain.
■ Not as flexible or powerful as lower-level languages.
○ Impact: Microprocessors revolutionized computing in the 1970s, making 4GLs
popular for data-driven and visual applications like database queries, report
generation, and rapid application development.
Fifth Generation: Artificial Intelligence and Logic Programming (5GL)

● Programming Language: Prolog, Lisp, Python (AI-focused).


● Circuitry: Sensors.
○ Pros:
■ Designed for AI and logic-based programming, focusing on solving
complex problems through logical rules rather than traditional procedural
code.
■ Sensors enable interaction with real-world data, making 5GL
programming crucial for fields like robotics, AI, machine learning, and
IoT (Internet of Things).
■ Programmers define rules and constraints, and the system generates
solutions, often using data collected through sensors.
■ Great for tasks such as natural language processing, machine learning,
and knowledge representation.
○ Cons:
■ Complexity makes it unsuitable for general-purpose programming.
■ Logic-based languages can be slower for tasks outside of AI.
■ The reliance on real-world data and sensors can introduce complexity in
maintaining and interpreting data.
○ Impact: Sensors in the 5th generation enabled computers to interact with the
physical world. Combined with AI, they allowed for innovations like autonomous
vehicles, smart homes, and robotics. 5GLs excel in reasoning and
decision-making processes, benefiting from real-time data collected through
sensors.

Conversions

1. Weighted Multiplication (Multiplying each digit by its base raised to its positional power)

● This method is used when converting a number from a base (binary, octal, or
hexadecimal) to decimal. Each digit is multiplied by the base raised to the power of its
position (starting from 0 on the right).

Used for:

● Binary to Decimal Formula: Decimal = dₙ × 2ⁿ + dₙ₋₁ × 2ⁿ⁻¹ + ... + d₀ × 2⁰


Example: 1011₂ = (1 × 2³) + (0 × 2²) + (1 × 2¹) + (1 × 2⁰) = 11₁₀
● Octal to Decimal Formula: Decimal = dₙ × 8ⁿ + dₙ₋₁ × 8ⁿ⁻¹ + ... + d₀ × 8⁰
Example: 237₈ = (2 × 8²) + (3 × 8¹) + (7 × 8⁰) = 159₁₀
● Hexadecimal to Decimal Formula: Decimal = dₙ × 16ⁿ + dₙ₋₁ × 16ⁿ⁻¹ + ... + d₀ × 16⁰
Example: 1A3₁₆ = (1 × 16²) + (10 × 16¹) + (3 × 16⁰) = 419₁₀, where A = 10
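The weighted-multiplication method can be sketched in a few lines of Python (a minimal illustration; the helper name `to_decimal` is ours, not from the source):

```python
# Weighted multiplication: each digit times base^position, summed right to left.
def to_decimal(digits: str, base: int) -> int:
    total = 0
    for position, digit in enumerate(reversed(digits)):
        # int(digit, 16) maps '0'-'9' and 'A'-'F' to 0-15, covering bases 2, 8, and 16
        total += int(digit, 16) * base ** position
    return total

print(to_decimal("1011", 2))   # 11
print(to_decimal("237", 8))    # 159
print(to_decimal("1A3", 16))   # 419
```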
2. Division and Remainder (Repeated division by the target base and
reading remainders)

● This method is used for converting from decimal to any other base. The decimal
number is continuously divided by the target base, and the remainders are read in
reverse order.

Used for:

● Decimal to Binary Formula: Binary = remainder from dividing by 2, read bottom to top
Example: 13₁₀ → 1101₂
● Decimal to Octal Formula: Octal = remainder from dividing by 8, read bottom to top
Example: 159₁₀ → 237₈
● Decimal to Hexadecimal Formula: Hexadecimal = remainder from dividing by 16, read
bottom to top
Example: 419₁₀ → 1A3₁₆
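The division-and-remainder method above can be sketched as follows (the function name `from_decimal` is our own choice for illustration):

```python
# Repeated division: collect remainders, then read them bottom to top.
DIGITS = "0123456789ABCDEF"

def from_decimal(n: int, base: int) -> str:
    if n == 0:
        return "0"
    remainders = []
    while n > 0:
        n, r = divmod(n, base)        # quotient carries on; remainder is the next digit
        remainders.append(DIGITS[r])
    return "".join(reversed(remainders))  # reversing = "read bottom to top"

print(from_decimal(13, 2))    # 1101
print(from_decimal(159, 8))   # 237
print(from_decimal(419, 16))  # 1A3
```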

3. Group by 3 (Binary to Octal)

● This method is used when converting binary to octal. The binary digits are grouped into
sets of 3 (starting from the right), and each group is converted to its octal equivalent.

Used for:

● Binary to Octal Formula: Octal = convert each group of 3 binary digits to octal
Example: 110101₂ → 65₈
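A quick sketch of the group-by-3 method (helper name ours):

```python
def binary_to_octal(bits: str) -> str:
    # Left-pad so the length is a multiple of 3, then convert each 3-bit group.
    bits = bits.zfill((len(bits) + 2) // 3 * 3)
    return "".join(str(int(bits[i:i + 3], 2)) for i in range(0, len(bits), 3))

print(binary_to_octal("110101"))  # 65
```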

4. Group by 4 (Binary to Hexadecimal)

● This method is used when converting binary to hexadecimal. The binary digits are
grouped into sets of 4 (starting from the right), and each group is converted to its
hexadecimal equivalent.

Used for:

● Binary to Hexadecimal Formula: Hexadecimal = convert each group of 4 binary digits to
hexadecimal
Example: 1011011₂ → 5B₁₆
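The group-by-4 method looks almost identical in code; note that 1011011₂ (7 digits) must be padded to 01011011₂ before grouping (helper name ours):

```python
def binary_to_hex(bits: str) -> str:
    # Left-pad so the length is a multiple of 4, then convert each 4-bit group.
    bits = bits.zfill((len(bits) + 3) // 4 * 4)
    return "".join("0123456789ABCDEF"[int(bits[i:i + 4], 2)]
                   for i in range(0, len(bits), 4))

print(binary_to_hex("1011011"))  # 5B
```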
5. Octal to Binary (Each Octal Digit to 3-bit Binary Equivalent)

● This method is used when converting octal to binary. Each octal digit is directly
converted into its 3-bit binary equivalent.

Used for:

● Octal to Binary Formula: Binary = convert each octal digit to 3-bit binary
Example: 237₈ → 010011111₂
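The digit-by-digit expansion can be written as a one-liner (helper name ours):

```python
def octal_to_binary(octal: str) -> str:
    # format(d, "03b") gives the zero-padded 3-bit pattern for each octal digit.
    return "".join(format(int(digit, 8), "03b") for digit in octal)

print(octal_to_binary("237"))  # 010011111
```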

6. Hexadecimal to Binary (Each Hexadecimal Digit to 4-bit Binary Equivalent)

● This method is used when converting hexadecimal to binary. Each hexadecimal digit is
directly converted into its 4-bit binary equivalent.

Used for:

● Hexadecimal to Binary Formula: Binary = convert each hexadecimal digit to 4-bit binary
Example: 1A3₁₆ → 000110100011₂
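The same digit-by-digit expansion works for hexadecimal, with 4-bit patterns instead of 3-bit (helper name ours):

```python
def hex_to_binary(hexa: str) -> str:
    # format(d, "04b") gives the zero-padded 4-bit pattern for each hex digit.
    return "".join(format(int(digit, 16), "04b") for digit in hexa)

print(hex_to_binary("1A3"))  # 000110100011
```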

7. Octal to Hexadecimal (First Convert to Binary, then Convert to Hexadecimal)

● This method is used when converting octal to hexadecimal. First, convert the octal
number to binary by replacing each octal digit with its 3-bit binary equivalent. Then,
group the binary digits in sets of 4 and convert them to hexadecimal.

Used for:

● Octal to Hexadecimal Formula: Hexadecimal = octal → binary → group into 4-bit, then
convert to hexadecimal
Example: 237₈ → 010011111₂ → 9F₁₆
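The two-step route can be sketched by chaining the digit expansion and the group-by-4 step; as a sanity check, 237₈ = 159₁₀ = 9F₁₆ (helper name ours):

```python
def octal_to_hex(octal: str) -> str:
    bits = "".join(format(int(d, 8), "03b") for d in octal)   # step 1: octal -> binary
    bits = bits.zfill((len(bits) + 3) // 4 * 4)               # pad to a multiple of 4 bits
    hexa = "".join("0123456789ABCDEF"[int(bits[i:i + 4], 2)]
                   for i in range(0, len(bits), 4))           # step 2: 4-bit groups -> hex
    return hexa.lstrip("0") or "0"                            # drop padding zeros

print(octal_to_hex("237"))  # 9F
```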

8. Hexadecimal to Octal (First Convert to Binary, then Convert to Octal)

● This method is used when converting hexadecimal to octal. First, convert the
hexadecimal number to binary by replacing each hexadecimal digit with its 4-bit binary
equivalent. Then, group the binary digits in sets of 3 and convert them to octal.
Used for:

● Hexadecimal to Octal Formula: Octal = hexadecimal → binary → group into 3-bit, then
convert to octal
Example: 1A3₁₆ → 000110100011₂ → 643₈ (the grouping yields 0643₈; the leading zero is dropped)
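The reverse route is symmetric: expand hex digits to 4-bit patterns, then regroup in threes; as a sanity check, 1A3₁₆ = 419₁₀ = 643₈ (helper name ours):

```python
def hex_to_octal(hexa: str) -> str:
    bits = "".join(format(int(d, 16), "04b") for d in hexa)   # step 1: hex -> binary
    bits = bits.zfill((len(bits) + 2) // 3 * 3)               # pad to a multiple of 3 bits
    octal = "".join(str(int(bits[i:i + 3], 2))
                    for i in range(0, len(bits), 3))          # step 2: 3-bit groups -> octal
    return octal.lstrip("0") or "0"                           # drop the leading zero

print(hex_to_octal("1A3"))  # 643
```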

Summary of Formulas Grouped

● Weighted Multiplication: Binary to Decimal, Octal to Decimal, Hexadecimal to Decimal


● Division and Remainder: Decimal to Binary, Decimal to Octal, Decimal to Hexadecimal
● Group by 3: Binary to Octal
● Group by 4: Binary to Hexadecimal
● Octal to Binary: Convert each octal digit to a 3-bit binary equivalent
● Hexadecimal to Binary: Convert each hexadecimal digit to a 4-bit binary equivalent
● Octal to Hexadecimal: First convert to binary, then group into 4-bit and convert to
hexadecimal
● Hexadecimal to Octal: First convert to binary, then group into 3-bit and convert to octal
