
Teacher Lesson
Class XI
Omar Faruk Siddique
Lecturer in ICT
BAF Shaheen College, Chattogram
Chapter 3: Number System
Topic: Computer Code
What is Code?

• Generally, code means a system of words, letters, figures, or symbols used to represent others, especially for the purposes of secrecy.
• For example, "The Americans cracked their diplomatic code."
• From a computing point of view, code means programming instructions.
• For example, assembly code, machine code, etc.


Types of Code
❑ Numerical Code (see the short example after this list)
✓ BCD Code – Binary Coded Decimal
✓ Binary Code – digits 0 and 1
✓ Octal Code – digits 0 to 7
✓ Decimal Code – digits 0 to 9
❑ Alphanumeric Code
✓ ASCII Code – American Standard Code for Information Interchange
✓ EBCDIC Code – Extended Binary Coded Decimal Interchange Code
✓ Unicode – Universal Code
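
As a quick illustration of the numerical codes above, here is a short Python sketch (the value 45 is only an example) that prints the same number in binary, octal, and decimal form:

    value = 45                # any decimal number

    print(bin(value))         # 0b101101 -> Binary code  (digits 0 and 1)
    print(oct(value))         # 0o55     -> Octal code   (digits 0 to 7)
    print(value)              # 45       -> Decimal code (digits 0 to 9)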
BCD Code
• Binary Coded Decimal (BCD) is a way to represent decimal numbers using binary digits: each decimal digit is encoded separately as a 4-bit group.
• For example, the 8421 code is the most common BCD code; it is also known as NBCD (Natural Binary Coded Decimal).
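
A minimal Python sketch of the 8421 idea (the helper name to_bcd is only illustrative): each decimal digit becomes its own 4-bit binary group.

    def to_bcd(number):
        """Return the 8421 BCD string for a non-negative integer."""
        return " ".join(format(int(digit), "04b") for digit in str(number))

    print(to_bcd(59))    # 0101 1001       (5 -> 0101, 9 -> 1001)
    print(to_bcd(407))   # 0100 0000 0111  (4 -> 0100, 0 -> 0000, 7 -> 0111)

Note that BCD differs from plain binary: 59 in binary is 111011, but in BCD it is 0101 1001.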
ASCII Code
• ASCII (American Standard Code for Information Interchange) is the most common character encoding format for text data in computers and on the internet. Standard ASCII defines unique values for 128 characters: letters, digits, punctuation, special symbols, and control codes.
• ASCII-7 – the standard 7-bit code (128 characters).
• ASCII-8 – an extended 8-bit form (256 characters).
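
A small Python example showing the ASCII values behind a few characters; ord() gives the numeric code and chr() maps it back:

    for ch in ("A", "a", "0", " "):
        print(repr(ch), "->", ord(ch))     # 'A' -> 65, 'a' -> 97, '0' -> 48, ' ' -> 32

    print(chr(65))                         # 'A' : from number back to character
    print(format(ord("A"), "07b"))         # 1000001 : the 7-bit ASCII-7 pattern for 'A'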
EBCDIC Code
• Extended Binary Coded Decimal Interchange Code (EBCDIC) is an eight-bit encoding scheme, used mainly on IBM mainframes, that standardizes how computers interpret characters, punctuation, and other symbols.
• Uses 8 bits per character, allowing 256 characters.
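
To see how EBCDIC differs from ASCII, the sketch below encodes the same text both ways. Python's built-in "cp500" codec is one common EBCDIC variant (IBM International); exact byte values can differ slightly between EBCDIC variants.

    text = "CODE"

    print(list(text.encode("ascii")))   # [67, 79, 68, 69]     ASCII byte values
    print(list(text.encode("cp500")))   # [195, 214, 196, 197] EBCDIC byte values

The same letters map to completely different 8-bit values, which is why ASCII and EBCDIC systems cannot exchange text without conversion.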
Unicode
• An international encoding standard for use with different languages and scripts, in which each letter, digit, or symbol is assigned a unique numeric value (code point) that applies across different platforms and programs.
• Commonly described as a 16-bit code, although modern Unicode defines more than 65,536 code points and is stored using encodings such as UTF-8 and UTF-16.
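
A short Python sketch showing Unicode code points for characters from different scripts, and how one character can be stored in different encodings:

    for ch in ("A", "অ", "€"):             # Latin, Bengali, and the euro sign
        print(ch, "U+%04X" % ord(ch))      # U+0041, U+0985, U+20AC

    print("অ".encode("utf-16-be"))         # b'\t\x85'       : two bytes (the 16-bit form)
    print("অ".encode("utf-8"))             # b'\xe0\xa6\x85' : three bytes in UTF-8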
Thank you