Final Computer Architecture

This document discusses key aspects of computer architecture including qubits, quantum properties, qubit representation, readout, parallelism, algorithms, and hardware. It provides definitions for qubits and describes how they can exist in superposition and how this enables massive parallelism. It outlines some important quantum algorithms like Shor's and Grover's and notes they provide exponential or quadratic speedups over classical algorithms. It also describes the major hardware components needed for quantum computing like qubit technology, cryogenics, logic gates, and control electronics.

Uploaded by ALVIN CHEGE
© All Rights Reserved

COMPUTER ARCHITECTURE

EASTSIDE: Group 4
1. Felicia Njambi Njoroge - BIT/2023/72645
2. Clifford Mbithuka - BSCCS/2023/67094
3. Gloria Jebichii - BSCCS/2023/59987
4. Baraka Victor Gagné - BSCCS/2023/72811
5. Sheilah Akinyi Onyango - BIT/2023/61685
6. Ruth Waithira - BIT/2023/72091
7. Makena Debra Karimi - BSCCS/2023/61880
8. Kinyua Beatrice Wanjiku - BSCCS/2023/73026
9. Kevin Nyangweso - BIT/2024/33671
10. Rael Odong - BIT/2023/62536
11. Lynne Mungai - BIT/2023/72047
12. Abdiwahab Mahamud - BSCCS/2023/60787
13. Mark Kabugi - BSCCS/2023/62623
14. Erick Nashon - BSCIS/2021/96255
15. Jackson Kilong - BIT/2023/60434
16. Jackton Owuor - BIT/2023/63527
17. Tiffany Wanjiku - BIT/2023/73126
18. Sharon Atabo - BSCCS/2023/72873
19. Yahya Kala - BIT/2023/68934
20. Jurkuch Thon Barach - BIT/2024/34750
21. Rachael Mutethya - BIT/2023/72371
22. Sheila Waithera Muriithi - BIT/2024/32455
23. Reberatha Nalemba - BIT/2023/72331
24. Morris James Thuo - BIT/2023/61080
25. Patrick Kirubi - BSCCS/2023/66604
26. Robert Ojock - BSCCS/2023/59727
Quantum computing
Quantum computing is a new paradigm of computing that harnesses quantum mechanical
phenomena like superposition and entanglement to perform calculations. Here are some key
things to know about quantum computing:

QUBITS
Qubits: The basic unit of information in quantum computing. Unlike traditional binary bits that
can only be 0 or 1, qubits can exist in a superposition of 0 and 1 simultaneously due to quantum
effects. This allows qubits to encode much more information.

Definition
Qubits, or quantum bits, are the quantum analogue of the classical bits 0 and 1 used in traditional
computing. However, instead of storing distinct 0 or 1 states, qubits can exist in a superposition
of both states simultaneously.

Example: A qubit could be in a linear combination, or superposition, of the |0⟩ and |1⟩ quantum
states, such as α|0⟩ + β|1⟩, where α and β are probability amplitudes. When measured, the
probability of outcome |0⟩ is |α|² and of outcome |1⟩ is |β|².
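As a small illustration (not part of the original notes), the Born-rule arithmetic above can be checked in a few lines of Python. The amplitudes α = β = 1/√2 are assumed values describing an equal superposition:

```python
import math
import random

# Assumed amplitudes for an equal superposition: alpha = beta = 1/sqrt(2).
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

# Born rule: P(|0>) = |alpha|^2, P(|1>) = |beta|^2; a valid state is normalized.
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)  # probabilities sum to 1

def measure():
    """Simulate one measurement: collapse to 0 with probability p0, else 1."""
    return 0 if random.random() < p0 else 1
```

With these amplitudes each outcome occurs with probability 1/2, matching |α|² = |β|² = 0.5.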

Physical Implementation:
Qubits utilize quantum properties like atomic spin or photon polarization to represent
information. Different physical systems used include trapped ions, superconducting circuits,
quantum dots, etc. They require cooling to near absolute zero to preserve quantum coherence.

Quantum Properties:
When multiple qubits are quantum entangled, they can jointly represent a number of states that
grows exponentially with each added qubit, rather than a single discrete 0/1 combination. This
massive parallelism is the source of quantum computing power.

Qubit Representation:
The state of a single qubit is represented by a vector with complex components (α, β), written
|ψ⟩ = α|0⟩ + β|1⟩ in Dirac notation. The time evolution of qubits and quantum logic
operations are defined through unitary transformations on these vectors.

Readout:
Special quantum circuits are used to measure qubits in the 0/1 basis at the end of computation.
Repeat measurements determine probabilities of each discrete state which provide usable results.
Maintaining a coherent superposition through the entire computation remains an engineering
challenge.
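To make the repeated-measurement idea concrete, here is a sketch in Python. It is illustrative only: p0 = 0.36 is a made-up probability corresponding to assumed amplitudes α = 0.6, β = 0.8.

```python
import random

# Assumed amplitudes alpha = 0.6, beta = 0.8 give P(|0>) = 0.36, P(|1>) = 0.64.
p0 = 0.6 ** 2

random.seed(1)  # fixed seed so the estimate is reproducible
shots = [0 if random.random() < p0 else 1 for _ in range(100_000)]

# The relative frequency of outcome 0 estimates |alpha|^2.
est_p0 = shots.count(0) / len(shots)
```

After many shots, `est_p0` converges to |α|² = 0.36, which is exactly how real hardware recovers probabilities from repeated 0/1 readouts.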

In summary, qubits are quantum systems with unique counterintuitive properties allowing new
approaches to information encoding, manipulation, and processing for next generation quantum
computers.
Parallelism
Thanks to superposition and entanglement, operations can be performed on qubits in
parallel, allowing quantum algorithms to evaluate many possibilities simultaneously. This
massive parallelism provides tremendous computational power.

● Quantum parallelism is one of the key features that gives quantum computing its
incredible potential power. It arises from a uniquely quantum effect known as
superposition.
● Instead of bits restricted to classical 0 or 1 states, qubits can exist in a linear combination
or superposition of 0 and 1 states simultaneously. For an n-qubit system with each qubit
in an equal superposition of 0 and 1, all 2ⁿ possible n-bit strings are encoded at once.

For example:
2 qubits can encode 4 states: |00⟩, |01⟩, |10⟩ and |11⟩
3 qubits encode 8 states - from |000⟩ to |111⟩
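The counting above can be verified with a short Python sketch (an enumeration of classical bit strings, not a quantum simulation):

```python
from itertools import product

def basis_states(n):
    """Enumerate the 2**n classical bit strings an n-qubit register spans."""
    return ["|" + "".join(bits) + ">" for bits in product("01", repeat=n)]

# 2 qubits -> 4 states, 3 qubits -> 8 states, matching the 2**n growth.
assert basis_states(2) == ["|00>", "|01>", "|10>", "|11>"]
assert len(basis_states(3)) == 2 ** 3
```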

● This ability to simultaneously represent exponentially many computational paths makes
quantum calculations intrinsically parallel.

● Now, operations on superposed qubits affect all the encoded states in parallel. A
sequence of operations transforms this massive superposition through interference and
amplitude changes.

● At the end of the calculation, upon measurement, the quantum parallelism collapses to a
single state yielding a result. Clever use of these quantum effects gives Shor's
factorization algorithm an exponential speedup, and Grover's search a quadratic speedup,
over classical methods.

● However, the catch is that directly observing all computational paths in detail is
forbidden by quantum mechanics! Still, performing special quantum transformations and
interference before measurement provides computational speedups.

In summary, qubits can encode many classically distinct states in superposition, and using
quantum gates we can manipulate all these possibilities in parallel to solve problems
faster - this highly useful phenomenon is quantum parallelism.

ALGORITHM
● Quantum algorithms like Shor's algorithm for factoring integers and Grover's algorithm
for search can provide exponential or quadratic speedups over their best known classical
counterparts. More quantum algorithms are being actively researched.

● Quantum algorithms are sequences of quantum logic operations used to perform
computational tasks on quantum computers. They harness key quantum effects to obtain
significant speedups over classical algorithms.

Some examples of important quantum algorithms:

● Shor's Algorithm: Used for integer factorization into primes. Provides exponential
speedup over classical counterparts and can break widely used RSA encryption.

● Grover's Algorithm: Quantum search algorithm that achieves a quadratic speedup. Used
to search unsorted databases with high probability in O(√N) queries versus O(N) classical
complexity.
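The quadratic gap is easy to tabulate. This sketch only compares query counts (classical worst case N versus Grover's well-known ≈ (π/4)√N iterations); it does not simulate the algorithm itself:

```python
import math

def grover_queries(n_items):
    """Approximate oracle queries Grover's algorithm needs: (pi/4) * sqrt(N)."""
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

# Classical unstructured search needs up to N queries; Grover needs ~sqrt(N).
for n in (10_000, 1_000_000):
    print(f"N={n}: classical worst case {n}, Grover about {grover_queries(n)}")
```

For a million items the gap is roughly 1,000,000 versus under 800 queries, which is where the practical appeal of the quadratic speedup comes from.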
● Quantum Fourier Transform: Efficiently computes discrete Fourier transform of quantum
wave functions with exponential speedup. Useful quantum subroutine.

● Quantum Phase Estimation: Estimates eigenvalues of a unitary operator, building on the
quantum Fourier transform. Used in many quantum chemistry simulations.

● Quantum Walk Based Algorithms: Quantum walks allow traversal of graphs similar to
classical random walks but faster, offering quadratic to exponential speedups for certain
graph problems.

● In general, quantum algorithms encode inputs as quantum superpositions, apply carefully
designed sequences of unitary quantum logic gates to transform the superposition, and
finally obtain outputs by measuring the result.

● Designing these algorithms involves understanding quantum interference, optimizing
gate sequences to achieve sufficient amplitude amplification, and mapping solutions to
classical problems via superposition states.

Hardware
● With rapid progress in quantum hardware development, practical applications of quantum
algorithms across chemistry, optimization, machine learning and cryptography could
unlock immense computing power.

● Building quantum computers requires isolating and controlling qubits to maintain fragile
quantum states for sufficient time to perform operations. Many different physical systems
like trapped ions, superconducting circuits, and photons are being engineered for this
purpose.

● Quantum computers rely on specialized hardware to generate, manipulate, and measure
quantum bits (qubits) to carry out computational tasks. The key hardware components
are:

● Qubit Technology: The underlying physical system used to implement qubits. Popular
qubit implementations include superconducting circuits, trapped ions, and photons. They
leverage quantum mechanical phenomena like energy levels, spin, and polarization to
represent information.

● Cryogenics: Most qubit technologies require extremely cold temperatures achieved using
cryostats, dilution refrigerators, etc. This protects the fragile quantum states from external
noise and environmental interference. Temperatures below 1 K (-272°C) are typical.

● Quantum Logic Gates: These hardware devices utilize electromagnetic pulses to
manipulate and entangle qubits. Like classical logic gates, quantum gates form the
building blocks of quantum circuits, transforming information encoded in qubits during
algorithm execution.
● Qubit Control Electronics: Carefully designed electronics provide precise delivery of
microwave pulses, instruction sequences, error detection, and real-time qubit readout.
High-frequency arbitrary waveform generators and field-programmable gate array
(FPGA) controllers are commonly used.

● Wiring Infrastructure: Different metal coatings and geometries, including coaxial cables,
waveguides, and resonators, connect the external electronics to the processor chip that
hosts the qubits.

● Input/Output Equipment: Used to encode input problems into qubits before computation
and, at the end, to record qubit measurement results into classical bits that yield usable
output after some post-processing.

● The quantum hardware has to maintain coherence, calibration precision, and tight
integration between these heterogeneous physical components for successful operation.

Applications of quantum computing


If large, practical quantum computers can be built, they hold the potential to accelerate progress
in fields like chemistry, AI, optimization, finance, and cryptography. Useful applications may
still be years away due to engineering challenges.

1. Cryptanalysis - Quantum algorithms like Shor's can break widely used public-key encryption
such as RSA and ECC, while Grover's algorithm weakens symmetric ciphers like AES. This
poses security threats to current communication systems and data. Quantum-secure
cryptography is being developed to counter this.

2. Quantum Chemistry Simulation - Highly accurate simulation of chemical systems and
reactions by representing atoms and bonds as qubits. Expected to advance drug discovery,
battery development, and production processes in sectors like agriculture and energy.

3. Financial Modeling - Complex risk analysis, fraud detection, portfolio optimization using
quantum algorithms on financial data to make markets more efficient.

4. Logistics & Optimization - Finding optimal routes, resource allocation plans for industries
like transportation, logistics and supply chains through quantum optimization algorithms.

5. Artificial Intelligence - Quantum machine learning holds promise for pattern recognition,
classification and modeling complex data like images, speech and texts at superior scales and
accuracies.

6. Climate Forecasting - Leveraging quantum simulation to create highly detailed models for
weather forecasting, climate change analysis exceeding capabilities of current supercomputers.

7. Space Exploration - Spacecraft on-board quantum computers running complex computations
like navigation algorithms, image analysis, and communication encoding.

The unique information encoding and processing properties of quantum computing promise to
push the frontiers in these fields. Advances towards fault-tolerant, scalable quantum computers
will turn many more proposed use cases into reality.

Challenges facing quantum computing


The loss of quantum properties into classical behavior remains a huge obstacle. Error correcting
codes and fault tolerant design architectures are needed for this technology to properly mature.
Here are some of the major challenges facing practical realizations of quantum computing:

1. Qubit Fidelity and Coherence Time: Current quantum bits have high error rates and lose
their quantum state quickly before useful computation can be done. Maintaining quantum
coherence remains difficult.

2. Quantum Error Correction: Algorithms have been developed to detect qubit errors using
redundancy and correct them. However, the overheads are still impractical for current noisy
qubits. Lower physical error rates are needed.

3. Qubit Scalability: A large number of qubits (hundreds at least) need to be created and
entangled reliably on a single quantum chip for meaningful applications. Current quantum
computers have fewer than 100 qubits.

4. Hardware Design: Engineering hardware equipment like cryostats for cooling, cabling,
electronics for control signals without introducing noise remains challenging. Supplying
microwave signals without crosstalk also needs precision.

5. Platform Uniformity: Variations in manufacturing quantum processors introduce control and
behavioral differences between qubits. This lack of consistency makes it hard to model systems
accurately during software design.

6. Software Tools: Limited availability of developer tools, compilers, simulators, and
frameworks hampers ease of programming quantum computers and mapping algorithms to real
hardware.

7. Application Maturity: Practical quantum use cases require developing sophisticated
algorithms, analysis of speedup over classical techniques, and pathways to achieving quantum
advantage - an area still evolving.

While rapid innovations are helping overcome some key limitations, quantum computing
technology still has open fundamental physics and engineering problems to solve.
VECTOR PROCESSING
Definition:
Vector processing is a computing technique that operates on vectors or arrays of data as a
single unit, rather than scalar data items one at a time. It relies extensively on parallelism to
achieve high throughput and efficiency.

Vector Processors: Computers designed for vector processing have components like vector
registers, pipelines, and arithmetic units optimized for matrix/vector operations. The vector
length denotes the maximum number of elements a single vector instruction can process.

Vectorized Code: Software programs implementing algorithms that leverage vectors maximize
speedups from vector hardware. Mathematical operations like additions or multiplications over
entire arrays happen in a single vector instruction, compared to loops processing element by
element.

Vectorization: Compilers can auto-vectorize code by turning nested loops performing repetitive
math into single instructions operating on vectors feeding vector pipelines. Explicit vectorization
using SIMD extensions like AVX speeds up vector processing further.
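Python has no direct SIMD syntax, but the scalar-versus-vector contrast described above can be approximated: the explicit loop processes one element per interpreted iteration, while the built-in `sum()` sweeps the whole array in one tight native loop, loosely analogous to feeding a vector pipeline. The analogy, not the timing, is the point:

```python
import time

data = list(range(1_000_000))

# "Scalar" path: one element handled per interpreted loop iteration.
t0 = time.perf_counter()
total_loop = 0
for x in data:
    total_loop += x
loop_time = time.perf_counter() - t0

# "Vector-like" path: a single call that processes the whole array in native code.
t0 = time.perf_counter()
total_builtin = sum(data)
builtin_time = time.perf_counter() - t0

assert total_loop == total_builtin  # identical result, very different cost
```

On typical interpreters the built-in path is several times faster, mirroring the fetch/decode savings vector hardware achieves per instruction.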

Advantages:
● Vector processing reduces instruction fetches/decodes and exploits data-level parallelism
effectively. It results in substantial speedups for data- and math-intensive HPC applications
involving simulations, modeling, machine learning, etc., with structured and predictable
memory access.
● Comparison with GPUs: Modern GPUs also use the vector processing paradigm and offer
greater parallelism. However, vector processors have faster context switching and remain
better suited to applications requiring complex flow control.

In summary, vector processing leverages pipelines and parallel execution units to efficiently
process vector/array data using a single instruction, bringing out massive throughput for
mathematical and scientific applications.

COMPUTER PERFORMANCE AND OPTIMIZATION
Computer Performance
● Speed: The time it takes to complete a processing task. Speed is measured in time units
like nanoseconds, clock cycles, etc. Faster completion indicates better performance.

● Throughput: The number of tasks completed per unit time. More tasks getting done per
second indicates better performance.

● Latency: The delay between the start of a task and getting the first response. Lower
latency signals better performance.

● Bandwidth: The amount of data that can be transferred or processed per unit time.
Higher bandwidth represents better performance.
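The four metrics above reduce to simple ratios. A toy calculation, with all workload numbers assumed purely for illustration:

```python
# Assumed workload figures for illustration only.
tasks_completed = 500          # jobs finished
elapsed_s = 2.0                # wall-clock seconds for the whole batch
bytes_transferred = 8_000_000  # data moved during the run

throughput = tasks_completed / elapsed_s    # tasks per second
avg_latency = elapsed_s / tasks_completed   # seconds per task (idealized, serial view)
bandwidth = bytes_transferred / elapsed_s   # bytes per second
```

Note the latency figure is only an average under serial execution; with concurrency, individual task latency and overall throughput must be measured separately.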

Factors Affecting Performance


● Hardware resources: Number of processing cores, CPU clock speed, memory size &
speed, storage I/O rates etc. Better hardware improves capability for faster processing.

● Software efficiency: Well-written programs utilizing parallelism, efficiently using
memory & storage, and minimizing overhead result in better utilization of available
hardware.

● Configuration settings: BIOS settings and OS services/environment variables tuned for a
workload type can help or negatively impact performance.

Computer Optimization
Different techniques used to improve computer performance:

❖ Code optimization: Refactoring code to use less memory, improving readability, and
finding areas to use parallel execution.
❖ Hardware upgrades: Increasing CPU cores, adding RAM or SSD storage, and
offloading suitable workloads to GPUs/accelerators.

❖ Cache optimization: Improper cache usage leads to frequent misses, degrading
performance. Optimized access patterns improve hit rates.
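The access-pattern point can be sketched in Python, with the caveat that lists of lists only weakly show the cache effect that dominates in C or Fortran: the row-order walk visits each inner list sequentially, while the column-order walk jumps between lists on every step. Treat the timings as illustrative.

```python
import time

N = 500
grid = [[1] * N for _ in range(N)]  # N x N table of ones

t0 = time.perf_counter()
row_sum = sum(grid[i][j] for i in range(N) for j in range(N))  # row-order walk
row_time = time.perf_counter() - t0

t0 = time.perf_counter()
col_sum = sum(grid[i][j] for j in range(N) for i in range(N))  # column-order walk
col_time = time.perf_counter() - t0

assert row_sum == col_sum == N * N  # same work, different access pattern
```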

❖ Storage enhancements: Faster media like SSDs over HDDs reduce access delays during
storage I/O operations.
❖ Balancing workloads: Evenly distributing concurrent transactions/jobs across hardware to
prevent congestion in one area.

❖ Benchmarking & profiling: Measuring existing performance via benchmark tests
highlights improvement areas, letting optimization effort focus where it realizes the
biggest gains.

The goal behind computer performance tuning is to speed up computing tasks within given
hardware, software and cost constraints. It takes a structured approach of defining metrics,
measurement and focused enhancements.

COMMON TECHNIQUES AND METRICS USED FOR MEASURING COMPUTER SPEEDS
Techniques for Measuring Speeds
1. Benchmarking: Running standardized benchmark suites that simulate real-world
workloads and usage scenarios and measure responsiveness. Popular CPU benchmarks include
Geekbench, PassMark, and Cinebench.

2. Monitoring tools: Using operating system monitors and utilities like Windows
Performance Monitor or top to track utilization of computing resources like CPU, memory,
disk, and network. Spikes indicate bottlenecks.

3. Profiling: Techniques like code instrumentation and using profiling software to understand
the execution times of specific code segments/functions/operations. Helps optimize slow areas.
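As a minimal example of profiling, Python's standard-library `cProfile` module reports per-function call counts and cumulative times; the workload function here is invented for the demo:

```python
import cProfile
import io
import pstats

def square_sum(n):
    """Deliberately simple workload to profile."""
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
result = square_sum(100_000)
profiler.disable()

# Collect the top entries, sorted by cumulative time, into a string report.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
```

Reading the report line for `square_sum` against its callees shows where time is actually spent, which is exactly the "optimize slow areas" step described above.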

Popular Speed Metrics


1. Clock speed: The rate at which a processor's logic units and components are synchronized,
typically measured in GHz. It does not directly translate to real-world speed.

2. Throughput: Number of computations or tasks completed per unit of time - indicates overall
processing rates.
3. Instructions per second (IPS): Measures number of machine language instructions a CPU
core can process each second. Dependent on clock speeds.

4. Floating Point Operations Per Second (FLOPS): Common measure for computing
capacity useful for math/scientific programs using floating point calculations.

5. Execution time: Total end-to-end time taken to run a program or complete a benchmark
providing a real-world speed indication for tasks.
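Execution time is the most direct metric to collect yourself. A small sketch using `time.perf_counter`, taking the best of several runs to reduce noise; the sorting workload is an arbitrary stand-in for a real benchmark:

```python
import time

def best_time(fn, *args, repeats=5):
    """Best wall-clock execution time of fn(*args) over several runs."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - t0)
    return best

# Time sorting a reversed list of 100,000 integers.
elapsed = best_time(sorted, list(range(100_000, 0, -1)))
```

Taking the minimum rather than the mean filters out interference from other processes, a common convention in micro-benchmarking.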

REAL-WORLD COMPUTER ARCHITECTURE
Real-world computer architectures refer to the actual design and organization of components in
computers that we use on a daily basis, as opposed to abstract theoretical models. These
architectures encompass hardware components, software systems, and their interaction to fulfill
specific computational requirements. Here are some examples of real-world computer
architectures:
1. x86 Architecture: The x86 architecture, developed by Intel and AMD, is one of the most
widely used CPU architectures in personal computers, servers, and workstations. It powers a vast
array of devices, from laptops and desktops to high-performance servers, and is known for its
compatibility, performance, and scalability.
2. ARM Architecture: The ARM architecture is prevalent in mobile devices, embedded
systems, and IoT devices due to its power efficiency and performance. ARM processors are used
in smartphones, tablets, wearable devices, and many other consumer electronics, as well as in
industrial automation and automotive systems.
3. Cloud Computing Architectures: Cloud computing platforms, such as Amazon Web
Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), employ complex
architectures comprising distributed systems, virtualization technologies, networking
infrastructure, and storage solutions. These architectures enable scalable, on-demand access to
computing resources over the internet, supporting a wide range of services and applications.
4. Embedded Systems Architectures: Embedded systems are specialized computer systems
designed to perform specific functions within larger systems or devices. Architectures like
Arduino, Raspberry Pi, and FPGA-based systems are used in applications such as IoT devices,
automotive electronics, industrial control systems, and consumer electronics.

Today's personal computers and smartphones predominantly use the Von Neumann architecture
consisting of:

★ CPU (Central Processing Unit)
- Execution units like ALUs, FPUs
- Caches - L1, L2, L3
- Registers and control unit
★ Memory
- RAM (Random Access Memory)
- ROM (Read Only Memory)

★ Buses
- Interconnect CPU to memory and peripherals
- Address bus, data bus, control signals

★ I/O Interfaces
- Standard ports and controllers to handle keyboard, mouse, USB devices, display,
network, storage drives

★ Storage Devices
- Hard disk drives (HDD)
- Solid state drives (SSD)

★ Operating System
- Resource management
- Driver software
- API and application interfaces

★ Programming Languages and System Software
- Languages to write applications
- Compilers, linkers, debuggers

The components are engineered to balance computing capabilities, power efficiency, cost and
space requirements for general purpose or specialized needs.

Modern microprocessor design strives to maximize real-world performance through:
- Parallel execution units
- Pipelining
- Caching mechanisms
- Branch prediction capabilities
The architecture organization and choice of components differ across form factors like personal
computers, high performance servers, mobile devices, embedded devices and supercomputers -
each optimized for a set of priorities.
