Final Computer Architecture
EASTSIDE: Group 4
1. Felicia Njambi Njoroge - BIT/2023/72645
2. Clifford Mbithuka - BSCCS/2023/67094
3. Gloria Jebichii - BSCCS/2023/59987
4. Baraka Victor Gagné - BSCCS/2023/72811
5. Sheilah Akinyi Onyango - BIT/2023/61685
6. Ruth Waithira - BIT/2023/72091
7. Makena Debra Karimi - BSCCS/2023/61880
8. Kinyua Beatrice Wanjiku - BSCCS/2023/73026
9. Kevin Nyangweso - BIT/2024/33671
10. Rael Odong - BIT/2023/62536
11. Lynne Mungai - BIT/2023/72047
12. Abdiwahab Mahamud - BSCCS/2023/60787
13. Mark Kabugi - BSCCS/2023/62623
14. Erick Nashon - BSCIS/2021/96255
15. Jackson Kilong - BIT/2023/60434
16. Jackton Owuor - BIT/2023/63527
17. Tiffany Wanjiku - BIT/2023/73126
18. Sharon Atabo - BSCCS/2023/72873
19. Yahya Kala - BIT/2023/68934
20. Jurkuch Thon Barach - BIT/2024/34750
21. Rachael Mutethya - BIT/2023/72371
22. Sheila Waithera Muriithi - BIT/2024/32455
23. Reberatha Nalemba - BIT/2023/72331
24. Morris James Thuo - BIT/2023/61080
25. Patrick Kirubi - BSCCS/2023/66604
26. Robert Ojock - BSCCS/2023/59727
Quantum computing
Quantum computing is a new paradigm of computing that harnesses quantum mechanical
phenomena like superposition and entanglement to perform calculations. Here are some key
things to know about quantum computing:
QUBITS
Qubits: The basic unit of information in quantum computing. Unlike traditional binary bits that
can only be 0 or 1, qubits can exist in a superposition of 0 and 1 simultaneously due to quantum
effects. This allows qubits to encode much more information.
Definition
Qubits, or quantum bits, are the quantum analogue of the classical bits 0 and 1 used in traditional
computing. However, instead of storing distinct 0 or 1 states, qubits can exist in a superposition
of both states simultaneously.
Example: A qubit can be in a linear combination, or superposition, of the |0⟩ and |1⟩ quantum states, written α|0⟩ + β|1⟩, where α and β are complex probability amplitudes satisfying |α|² + |β|² = 1. When measured, the probability of outcome |0⟩ is |α|² and the probability of outcome |1⟩ is |β|².
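To make the measurement rule concrete, the short Python/NumPy sketch below (illustrative only; the amplitudes α = β = 1/√2 are an assumed example) samples repeated measurements of a qubit and recovers |α|² and |β|² as observed frequencies.

import numpy as np

# Qubit state alpha|0> + beta|1> with alpha = beta = 1/sqrt(2) (assumed example)
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# Born rule: probability of each outcome is the squared amplitude
probs = np.abs(state) ** 2          # [0.5, 0.5]

# Simulate 10,000 measurements in the 0/1 basis
rng = np.random.default_rng()
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print("P(0) ~", np.mean(outcomes == 0))   # close to |alpha|^2 = 0.5
print("P(1) ~", np.mean(outcomes == 1))   # close to |beta|^2  = 0.5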
Physical Implementation:
Qubits utilize quantum properties like atomic spin or photon polarization to represent
information. Different physical systems used include trapped ions, superconducting circuits,
quantum dots, etc. They require cooling to near absolute zero to preserve quantum coherence.
Quantum Properties:
When multiple qubits are entangled, the system is no longer limited to discrete 0/1 states: it can occupy a superposition over a number of basis states that grows exponentially with each added qubit (2ⁿ states for n qubits). This massive parallelism is the source of quantum computing's power.
Qubit Representation:
The state of a single qubit is represented by a vector with complex components (α, β), denoted |ψ⟩ in Dirac notation. The time evolution of qubits and the action of quantum logic gates are defined by unitary transformations on these vectors.
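As an illustration of unitary evolution, the Python/NumPy sketch below (a minimal example, not a full simulator) applies the standard Hadamard gate H to |0⟩, producing the equal superposition (|0⟩ + |1⟩)/√2, and verifies that H is unitary.

import numpy as np

# Hadamard gate: a 2x2 unitary matrix
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)   # |0> as the vector (1, 0)

psi = H @ ket0                           # |psi> = H|0> = (|0> + |1>)/sqrt(2)
print(psi)                               # [0.7071..., 0.7071...]
print(np.allclose(H @ H.conj().T, np.eye(2)))  # True: H is unitary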
Readout:
Special quantum circuits are used to measure qubits in the 0/1 basis at the end of a computation. Repeated measurements estimate the probability of each discrete outcome, which yields usable results. Maintaining a coherent superposition through the entire computation remains an engineering challenge.
In summary, qubits are quantum systems with unique counterintuitive properties allowing new
approaches to information encoding, manipulation, and processing for next generation quantum
computers.
Parallelism
Thanks to superposition and entanglement, operations can be performed on qubits in
parallel, allowing quantum algorithms to evaluate many possibilities simultaneously. This
massive parallelism provides tremendous computational power.
● Quantum parallelism is one of the key features that gives quantum computing its
incredible potential power. It arises from a uniquely quantum effect known as
superposition.
● Instead of bits restricted to classical 0 or 1 states, qubits can exist in a linear combination
or superposition of the 0 and 1 states simultaneously. For an n-qubit system with each qubit
in an equal superposition of 0 and 1, all 2ⁿ possible n-bit strings are encoded at once.
For example:
2 qubits can encode 4 states: |00⟩, |01⟩, |10⟩ and |11⟩
3 qubits can encode 8 states: |000⟩ through |111⟩
● Now, operations on superposed qubits act on all of the encoded states in parallel. A
sequence of operations transforms this massive superposition through interference and
amplitude changes.
● At the end of the calculation, upon measurement, the quantum parallelism collapses to a
single state, yielding a result. Clever use of quantum effects gives algorithms like Grover's
search a quadratic speedup, and Shor's factorization an exponential speedup, over the best
known classical methods.
● However, the catch is that directly observing all computational paths in detail is
forbidden by quantum mechanics. Still, applying carefully chosen quantum transformations
and interference before measurement provides computational speedups.
In summary, qubits can encode many classically distinct states in superposition, and using
quantum gates we can manipulate all these possibilities in parallel to solve problems
faster. This highly useful phenomenon is quantum parallelism.
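A minimal sketch of this scaling, assuming an ideal noiseless state-vector simulation in Python/NumPy: applying a Hadamard gate to each of n qubits starting from |0...0⟩ yields 2ⁿ equal amplitudes, i.e., every n-bit string encoded at once.

import numpy as np

def equal_superposition(n):
    """Apply a Hadamard to each of n qubits starting from |0...0>."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.array([1.0])            # 0-qubit state (scalar)
    for _ in range(n):
        # Tensor in one more qubit prepared as H|0> = (|0> + |1>)/sqrt(2)
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

for n in (2, 3):
    psi = equal_superposition(n)
    print(n, "qubits ->", len(psi), "amplitudes, each", psi[0])
# 2 qubits -> 4 amplitudes, each 0.5
# 3 qubits -> 8 amplitudes, each 0.3535... (1/sqrt(8))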
ALGORITHM
● Quantum algorithms like Shor's algorithm for factoring integers and Grover's algorithm
for search can provide exponential or quadratic speedups over their best known classical
counterparts. More quantum algorithms are being actively researched.
● Shor's Algorithm: Used for integer factorization into primes. Provides exponential
speedup over classical counterparts and can break widely used RSA encryption.
● Quantum Walk Based Algorithms: Quantum walks allow traversal of graphs similar to
classical random walks but faster, offering quadratic to exponential speedups for certain
graph problems.
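As a toy illustration (a classical state-vector simulation, not real quantum hardware), the Python/NumPy sketch below runs Grover's search over an assumed 8-item space with one marked index, boosting the marked item's probability in about √N iterations.

import numpy as np

N = 8                      # 3-qubit search space of 8 items (assumed example)
marked = 5                 # index of the item we are searching for (assumed)

# Start in an equal superposition over all N items
state = np.full(N, 1 / np.sqrt(N))

# Grover iteration: oracle phase flip + inversion about the mean
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~2 for N = 8
for _ in range(iterations):
    state[marked] *= -1                    # oracle: flip the marked amplitude
    state = 2 * state.mean() - state       # diffusion: reflect about the average

print("P(marked) =", state[marked] ** 2)   # ~0.945 after 2 iterations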
Hardware
● With rapid progress in quantum hardware development, practical applications of quantum
algorithms across chemistry, optimization, machine learning and cryptography could
unlock immense computing power.
● Building quantum computers requires isolating and controlling qubits to maintain fragile
quantum states for sufficient time to perform operations. Many different physical systems
like trapped ions, superconducting circuits, and photons are being engineered for this
purpose.
● Qubit Technology: The underlying physical system used to implement qubits. Popular
qubit implementations include superconducting circuits, trapped ions, and photons. They
leverage quantum mechanical phenomena like energy levels, spin, and polarization to
represent information.
● Cryogenics: Most qubit technologies require extremely cold temperatures achieved using
cryostats, dilution refrigerators, etc. This protects the fragile quantum states from external
noise and environmental interference. Temperatures below 1 K (-272°C) are typical.
● Wiring Infrastructure: Wiring with different metal coatings and geometries, including coaxial
cables, waveguides, and resonators, connects the external electronics to the processor chip
hosting the qubits.
● Input/Output Equipment: Used to encode input problems into qubits before computation
and record qubit measurement results at the end into classical bits for usable output after
some post-processing.
● The quantum hardware has to maintain coherence, calibration precision, and tight
integration between these heterogeneous underlying physical components for successful
operation.
Applications
1. Cryptanalysis - Shor's algorithm can break widely used public-key encryption such as RSA
and ECC, while Grover's algorithm weakens symmetric ciphers like AES. This poses security
threats to current communication systems and data; quantum-secure cryptography is being
developed to counter this.
2. Financial Modeling - Complex risk analysis, fraud detection, and portfolio optimization using
quantum algorithms on financial data to make markets more efficient.
3. Logistics & Optimization - Finding optimal routes and resource allocation plans for industries
like transportation, logistics and supply chains through quantum optimization algorithms.
4. Artificial Intelligence - Quantum machine learning holds promise for pattern recognition,
classification and modeling of complex data like images, speech and text at superior scales and
accuracies.
5. Climate Forecasting - Leveraging quantum simulation to create highly detailed models for
weather forecasting and climate change analysis, exceeding the capabilities of current
supercomputers.
Challenges
1. Qubit Fidelity and Coherence Time: Current quantum bits have high error rates and lose
their quantum state quickly, before useful computation can be done. Maintaining quantum
coherence remains difficult.
2. Quantum Error Correction: Algorithms have been developed to detect qubit errors using
redundancy and correct them. However, the overheads are still impractical for current noisy
qubits. Lower physical error rates are needed.
3. Qubit Scalability: A large number of qubits (hundreds at least) need to be created and
entangled reliably on a single quantum chip for meaningful applications. Current quantum
computers offer only on the order of tens to a few hundred noisy qubits.
4. Hardware Design: Engineering hardware such as cryostats for cooling, cabling, and control
electronics without introducing noise remains challenging. Supplying microwave control signals
without crosstalk also demands precision.
While rapid innovations are helping overcome some key limitations, quantum computing
technology still has open fundamental physics and engineering problems to solve.
VECTOR PROCESSING
Definition:
Vector processing is a computing technique that operates on vectors or arrays of data as a
single unit, rather than scalar data items one at a time. It relies extensively on parallelism to
achieve high throughput and efficiency.
Vector Processors: Computers designed for vector processing have components like vector
registers, pipelines, and arithmetic units optimized for matrix/vector operations. The vector
length denotes the maximum number of data elements processed per instruction.
Vectorized Code: Software programs implementing algorithms that leverage vectors maximize
the speedups available from vector hardware. Mathematical operations such as additions or
multiplications over entire arrays are issued as single vector instructions instead of loops
processing element by element.
Vectorization: Compilers can auto-vectorize code by turning nested loops performing repetitive
math into single instructions operating on vectors feeding vector pipelines. Explicit vectorization
using SIMD extensions like AVX speeds up vector processing further.
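A simple way to see the effect from software: NumPy dispatches whole-array arithmetic to compiled, SIMD-vectorized loops. The sketch below (NumPy standing in for vector hardware; timings vary by machine) compares an element-by-element Python loop against a single vectorized array operation.

import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar style: one element at a time
t0 = time.perf_counter()
c_loop = np.empty(n)
for i in range(n):
    c_loop[i] = a[i] * b[i]
t_loop = time.perf_counter() - t0

# Vectorized style: one array operation over all elements
t0 = time.perf_counter()
c_vec = a * b
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s  vectorized: {t_vec:.4f}s")
print(np.allclose(c_loop, c_vec))   # True: identical results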
Advantages:
● Vector processing reduces instruction fetches/decodes and exploits data-level parallelism
effectively. It yields substantial speedups for data- and math-intensive HPC applications
involving simulations, modeling, machine learning, etc., with structured and predictable
memory access.
● Comparison with GPUs: Modern GPUs also use the vector processing paradigm and offer
even more parallelism. However, vector processors have faster context switching and remain
better suited to applications requiring complex flow control.
In summary, vector processing leverages pipelines and parallel execution units to efficiently
process vector/array data using a single instruction, bringing out massive throughput for
mathematical and scientific applications.
Computer Performance
Key indicators used to evaluate computer performance:
● Throughput: The number of tasks completed per unit time. More tasks getting done per
second indicates better performance.
● Latency: The delay between the start of a task and the first response. Lower
latency signals better performance.
● Bandwidth: The amount of data that can be transferred or processed per unit time.
Higher bandwidth represents better performance.
Computer Optimization
Different techniques used to improve computer performance:
❖ Code optimization: Refactoring code to use less memory, improve readability, and find
opportunities for parallel execution.
❖ Hardware upgrades: Increasing CPU cores, adding RAM or SSD storage, or offloading
suitable workloads to GPUs/accelerators.
❖ Storage enhancements: Faster media like SSDs over HDDs reduce access delays during
storage I/O operations.
❖ Balancing workloads: Evenly distributing concurrent transactions/jobs across hardware to
prevent congestion in one area.
The goal behind computer performance tuning is to speed up computing tasks within given
hardware, software and cost constraints. It takes a structured approach of defining metrics,
measurement and focused enhancements.
Performance measurement techniques:
1. Monitoring tools: Using operating system monitors and utilities like Windows
Performance Monitor and top to track utilization of computing resources such as CPU, memory,
disk and network. Spikes indicate bottlenecks.
2. Profiling: Techniques like code instrumentation and profiling software to understand
the execution times of specific code segments, functions and operations. This helps optimize
slow areas.
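As a small example of the profiling step, Python's built-in cProfile module reports per-function call counts and times; the workload functions below are hypothetical stand-ins for real application code.

import cProfile

def slow_sum(n):
    """Deliberately slow: accumulate in a Python-level loop."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def fast_sum(n):
    """Closed form for the sum of squares 0^2 + ... + (n-1)^2."""
    return (n - 1) * n * (2 * n - 1) // 6

def workload():
    slow_sum(2_000_000)
    fast_sum(2_000_000)

# Prints time spent in each function, sorted by cumulative time,
# showing that slow_sum dominates the execution profile
cProfile.run("workload()", sort="cumulative")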
Key performance metrics:
1. Throughput: Number of computations or tasks completed per unit of time - indicates overall
processing rates.
2. Instructions per second (IPS): Measures the number of machine language instructions a CPU
core can process each second. Dependent on clock speed.
3. Floating Point Operations Per Second (FLOPS): Common measure of computing
capacity for math/scientific programs using floating point calculations.
4. Execution time: Total end-to-end time taken to run a program or complete a benchmark,
providing a real-world speed indication for tasks.
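For instance, execution time and throughput can be measured directly with a wall-clock timer; in the sketch below the task() function is a hypothetical stand-in workload.

import time

def task():
    # Stand-in workload: sum the first 100,000 integers
    return sum(range(100_000))

n_tasks = 100
start = time.perf_counter()
for _ in range(n_tasks):
    task()
elapsed = time.perf_counter() - start     # execution time (seconds)

print(f"execution time: {elapsed:.3f} s")
print(f"throughput:     {n_tasks / elapsed:.1f} tasks/s")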
Today's personal computers and smartphones predominantly use the von Neumann architecture,
consisting of:
★ Central Processing Unit (CPU)
- Fetches, decodes and executes instructions
- Registers, ALU, control unit and caches
★ Main Memory (RAM)
- Holds both program instructions and data in a single address space
★ Buses
- Interconnect CPU to memory and peripherals
- Address bus, data bus, control signals
★ I/O Interfaces
- Standard ports and controllers to handle keyboard, mouse, USB devices, display,
network, storage drives
★ Storage Devices
- Hard disk drives (HDD)
- Solid state drives (SSD)
★ Operating System
- Resource management
- Driver software
- API and application interfaces
The components are engineered to balance computing capabilities, power efficiency, cost and
space requirements for general purpose or specialized needs.