ITC Lect 02 (Computer Components AI) - REVISED
Components of Computer
Data Representation
and
AI
Dr. Zahid Halim
Parts of a Computer
• There are two main parts of a computer: hardware and software
– Hardware is all of the parts of the computer you can see and touch. E.g.,
• Monitor, Keyboard, Processor, Memory, Circuits, Cables, etc.
– Software refers to the parts of the computer that do not have a material form. E.g.,
• Data, Programs, Protocols, etc.
Motherboard: Sockets & Slots
(figure: power supply, power connector, memory slot, slot for microprocessor, DRAM, RJ = Registered Jack)
Computer: Internal Components
Microprocessor
Graphics Card
Data Representation
• Modern computers are digital devices
– A digital device works with discrete data, such as the digits
1 and 0
– An analog device works with continuous data
Number System Conversions
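The conversion procedure itself is not shown in this extract; as a minimal sketch (written for these notes, not taken from the lecture), converting between decimal and binary by repeated division and by positional weights looks like this in Python:

```python
# Minimal sketch of decimal <-> binary conversion (illustrative, not from the slides).

def decimal_to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to a binary digit string
    by repeatedly dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))      # remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))   # remainders come out least-significant first

def binary_to_decimal(bits: str) -> int:
    """Convert a binary digit string back to decimal by summing
    each digit times its positional weight (a power of 2)."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

print(decimal_to_binary(13))      # "1101"
print(binary_to_decimal("1101"))  # 13
```

Python's built-in bin() and int(s, 2) perform the same conversions; the explicit loops just make the place-value arithmetic visible.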
What is Intelligence?
Hardware
• Human brain: 10^11 neurons, 10^14 synapses, cycle time: 10^-3 sec
• Computer: 10^7 transistors, 10^10 bits of RAM, cycle time: 10^-9 sec
• Conclusion
– In the near future we may have computers with as many processing elements as our brain, but with:
• far fewer interconnections (wires or synapses)
• much faster updates
Classical AI
• The principles of intelligence are separate from any hardware / software /
wetware implementation
• Look for these principles by studying how to perform tasks that require
intelligence
• Mycin (1980)
– Expert-level performance in diagnosis of blood infections
• Today: thousands of systems
– Everything from diagnosing cancer to designing dentures
– Often outperform doctors in clinical trials
– Major hurdle today: the non-expert part, i.e., doctor/machine interaction
Autonomous Systems
• Commonsense Knowledge
– needed to operate in messy, complex, open-ended
worlds
• Your kitchen vs. GM factory floor
– understand unconstrained Natural Language
• Speech Recognition
– “word spotting” feasible today
– continuous speech – rapid progress
– it turns out that the “low level” signal is not as ambiguous as we once thought
• Translation / Understanding
– very limited progress
The spirit is willing but the flesh is weak. (English)
The vodka is good but the meat is rotten. (Russian)
• Alternatives?
• Open Mind
• KnowItAll
Historical Perspective
Recurrent Themes
• Neural nets vs AI
– McCulloch & Pitts 1943
– Died out in the 1960s, revived in the 1980s
• Neural nets are a vastly simplified model of real neurons, but still useful & practical – massive parallelism
• A particular family of learning and representation techniques
• Logic vs Probability
– In the 1950s, logic seemed more computationally & expressively attractive (McCarthy, Newell)
• attempts to extend logic “just a little” to deal with the fact that the
world is uncertain!
– 1988 – Judea Pearl’s work on Bayes nets
• provided efficient computational framework
– Today – no longer rivals
• hot topic: combining probability & first-order logic
(Re-)Current Themes
• Combinatorial Explosion
• Micro-world successes don’t scale up.
• How to organize and accumulate large amounts of knowledge?
• How to translate from informal, ill-structured statements to
formal reasoning (e.g., understand a story)?
• What are reasonable simplifying assumptions?
Why “Learn” ?
• Machine learning is programming computers to optimize a
performance criterion using example data or past experience.
• There is no need to “learn” to calculate payroll
• Learning is used when:
– Human expertise does not exist (navigating on Mars)
– Humans are unable to explain their expertise (speech recognition)
– Solution changes in time (routing on a computer network)
– Solution needs to be adapted to particular cases (user biometrics)
Data Mining
• Retail: Market basket analysis, Customer relationship
management (CRM)
• Finance: Credit scoring, fraud detection
• Manufacturing: Optimization, troubleshooting
• Medicine: Medical diagnosis
• Telecommunications: Quality of service optimization
• Bioinformatics: Motifs, alignment
• Web mining: Search engines
• ...
Classification
• Example: Credit scoring
• Differentiating between low-risk and high-risk customers from their income and savings (see the sketch below)
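As a hedged illustration of the credit-scoring idea above (the customer figures, class labels, and the nearest-centroid rule below are invented for this sketch and are not from the lecture):

```python
# Illustrative nearest-centroid classifier for the credit-scoring example.
# The (income, savings) data and risk labels are made up for the sketch.
import math

training = [
    ((60_000, 20_000), "low"),
    ((75_000, 35_000), "low"),
    ((25_000, 1_000),  "high"),
    ((30_000, 2_500),  "high"),
]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# One centroid (mean income, mean savings) per risk class
centroids = {}
for label in {"low", "high"}:
    pts = [features for features, lab in training if lab == label]
    centroids[label] = centroid(pts)

def classify(income, savings):
    """Assign the class whose centroid is closest in (income, savings) space."""
    return min(centroids,
               key=lambda c: math.dist((income, savings), centroids[c]))

print(classify(70_000, 30_000))  # expected: "low"
print(classify(28_000, 1_500))   # expected: "high"
```

A real system would scale the features and use a proper classifier, but the principle is the one the slide describes: the class regions are learned from labeled examples rather than hand-coded.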
Classification: Applications
• Aka Pattern recognition
• Face recognition: Pose, lighting, occlusion (glasses, beard),
make-up, hair style
• Character recognition: Different handwriting styles.
• Speech recognition: Temporal dependency.
– Use of a dictionary or the syntax of the language.
– Sensor fusion: Combine multiple modalities; e.g., visual (lip image) and acoustic for speech
• Medical diagnosis: From symptoms to illnesses
• ...
Face Recognition
(figure: face recognition test images)
Regression
Regression Applications
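This extract gives no detail on regression itself; as a minimal sketch (the mileage/price data below are invented), ordinary least squares fits a line y ≈ w·x + w0 to example pairs by minimizing squared error:

```python
# Minimal one-variable least-squares regression (illustrative data).
xs = [10, 20, 30, 40, 50]        # e.g., mileage in thousands of km
ys = [9.0, 8.1, 7.3, 6.2, 5.4]   # e.g., price in thousands of dollars

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form slope and intercept that minimize the sum of squared errors
w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
w0 = mean_y - w * mean_x

predict = lambda x: w * x + w0
print(round(predict(35), 2))     # predicted price for 35,000 km
```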
Unsupervised Learning
• Learning “what normally happens”
• No output
• Clustering: Grouping similar instances (see the sketch after this list)
• Example applications
– Customer segmentation in CRM
– Image compression: Color quantization
– Bioinformatics: Learning motifs
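A minimal sketch of the clustering idea (k-means on invented 2-D data; not code from the lecture). Customer segmentation and color quantization amount to running the same alternation on customer features or on pixel colors:

```python
# Minimal k-means clustering sketch (illustrative; data are made up).
import math, random

points = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9),    # one apparent group
          (5.0, 4.8), (5.2, 5.1), (4.9, 5.3)]    # another apparent group
k = 2

random.seed(0)
centers = random.sample(points, k)   # initialize centers from the data

for _ in range(10):                  # a few alternating steps suffice here
    # Assignment step: attach each point to its nearest center
    clusters = [[] for _ in range(k)]
    for p in points:
        i = min(range(k), key=lambda j: math.dist(p, centers[j]))
        clusters[i].append(p)
    # Update step: move each center to the mean of its cluster
    for j, members in enumerate(clusters):
        if members:
            centers[j] = (sum(x for x, _ in members) / len(members),
                          sum(y for _, y in members) / len(members))

print(centers)   # two centers, one near each group
```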
Reinforcement Learning
• Learning a policy: A sequence of outputs
• No supervised output but delayed reward
• Credit assignment problem
• Game playing
• Robot in a maze (see the sketch below)
• Multiple agents, partial observability, ...
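A hedged sketch of the “robot in a maze” item: tabular Q-learning on a tiny invented corridor world. The only reward arrives at the goal, and the update rule gradually propagates it back to earlier state-action pairs, which is the credit-assignment problem mentioned above:

```python
# Tiny tabular Q-learning sketch (illustrative corridor world, invented rewards).
import random

N_STATES = 5          # states 0..4, goal at state 4
ACTIONS = [-1, +1]    # step left or right
alpha, gamma, eps = 0.5, 0.9, 0.2

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
random.seed(0)

for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy action choice: mostly exploit, sometimes explore
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == N_STATES - 1 else 0.0   # delayed reward only at the goal
        # Q-learning update: bootstrap from the best action in the next state
        best_next = max(Q[(s_next, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s_next

policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
print(policy)   # should prefer +1 (move right) once the reward has propagated back
```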