How many bits would you need to represent both the lower and uppercase letters in the English alphabet?

  • Every time a character is typed on a keyboard, a code number is transmitted to the computer.
  • These code numbers are stored in binary according to an agreed character set, such as ASCII.
  • The table below shows a version of ASCII that uses 7 bits to code each character. The biggest number that can be held in 7 bits is 1111111 in binary (127 in decimal), so 128 different characters can be represented in the ASCII character set (using codes 0 to 127). That is more than enough to cover all of the characters on a standard English-language keyboard.
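A quick Python sketch (the sample string is just an illustration) showing that keyboard characters map to code numbers in the 0 to 127 range:

```python
# ord() gives a character's ASCII code number; chr() goes back.
codes = [ord(c) for c in "Hello!"]
print(codes)            # [72, 101, 108, 108, 111, 33]
# 7 bits hold values 0..127, so 2**7 = 128 distinct codes exist.
assert all(0 <= c <= 127 for c in codes)
print(2**7)             # 128
```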


"Originally based on the English alphabet, ASCII encodes 128 specified characters into 7-bit binary integers as shown by the ASCII chart above. The characters encoded are numbers 0 to 9, lowercase letters a to z, uppercase letters A to Z, basic punctuation symbols, control codes that originated with Teletype machines, and a space. For example, lowercase j would become binary 1101010 and decimal 106. ASCII includes definitions for 128 characters: 33 are non-printing control characters (many now obsolete) that affect how text and space are processed and 95 printable characters, including the space." (from Wikipedia)
  • ASCII has been used for a long time, but it has some serious shortcomings:
    1. It covers only the English alphabet.
    2. It is limited to 7 bits, so it can only represent 128 distinct characters.
    3. It cannot represent non-Latin scripts, such as Chinese.
  • Character form of a decimal digit: in ASCII, a digit character is not the same as its numeric value. For example, the ASCII code 011 0100 prints the character '4', but as a binary value it equals the decimal number 52. ASCII character codes therefore cannot be used directly for arithmetic.
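This distinction between a digit character and its numeric value is easy to see in Python (a minimal sketch):

```python
# The character '4' has ASCII code 52 (binary 0110100);
# the code is not the numeric value 4.
print(ord('4'))                  # 52
print(format(ord('4'), '07b'))   # 0110100
# To get the numeric value, subtract the code for '0' (48).
print(ord('4') - ord('0'))       # 4
```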

* History of data encoding

As discussed last time, one of the fundamental requirements for a code set to be useful in WAN communications is that the sender and the receiver must agree on the meaning of each combination of ones and zeros.  A 2-bit code set, for example, can have only four discrete meanings: one meaning each for the combinations 00, 01, 10, and 11.  Go to three bits and you get eight codes; four bits yield 16, and five bits yield 32.
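The doubling pattern described above can be tabulated in a couple of lines of Python:

```python
# An n-bit code set has 2**n distinct combinations.
for n in range(2, 8):
    print(f"{n} bits -> {2**n} codes")
```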

The first widely accepted code set was Baudot code, developed more than 100 years ago.  With five bits, and therefore 32 code combinations, there were enough bit combinations available to have a unique code for each of the 26 letters of the alphabet.

However, 26 letters plus the 10 digits 0 through 9 exceed the 32 combinations.  Rather than going to an additional bit, two unique codes are used to signal a shift between the "letters" interpretation of the code and the "figures" interpretation.  Since both "letters" and "figures" tend to come in groups, this works fine for simple applications.

However, there's one big problem.  With just five bits, there's no way to distinguish between UPPERCASE and lowercase letters.  Going to a 6-bit code with 64 combinations would still be minimal, because it would take 62 combinations for the letters and digits, with only two codes left for punctuation. 

Consequently, the minimal code set must consist of seven bits, and that's exactly what the American Standard Code for Information Interchange (ASCII) uses.  This code, which has become the de facto standard for data communications, has 128 combinations, with a unique code for each letter in both uppercase and lowercase.  In fact the binary code for each uppercase and lowercase letter is the same except for one bit, which is sometimes called the "shift" bit.
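The "shift" bit is easy to verify in Python: uppercase and lowercase codes differ only in bit 5 (value 32):

```python
print(format(ord('A'), '07b'))   # 1000001
print(format(ord('a'), '07b'))   # 1100001
print(ord('a') ^ ord('A'))       # 32: only bit 5 differs
# The same single-bit difference holds for every letter pair.
assert all(ord(c) ^ ord(c.upper()) == 32
           for c in "abcdefghijklmnopqrstuvwxyz")
```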


Copyright © 2004 IDG Communications, Inc.

1a. What is the minimum number of bits required to uniquely represent the characters of the English alphabet? (Consider uppercase characters alone.)

The number of unique bit patterns using i bits is 2^i, and we need at least 26 unique bit patterns. The cleanest approach is to compute log2(26) and take the ceiling, which yields 5. Trial and error is also an acceptable approach.

1b. How many more characters can be uniquely represented without requiring additional bits?

With 5 bits, we can represent up to 32 (2^5) unique bit patterns, so we can represent 32 - 26 = 6 more characters without requiring additional bits.
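The ceiling-of-log calculation from 1a and the spare-code count from 1b can be checked directly in Python:

```python
import math

# Minimum bits for 26 distinct symbols: ceil(log2(26)).
bits = math.ceil(math.log2(26))
print(bits)            # 5
# Spare codes left over with 5 bits:
print(2**bits - 26)    # 6
```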

2. Using 7 bits to represent each number, write the representations of 23 and -23 as signed magnitude and 2's complement integers.

        Signed Magnitude   1's Complement   2's Complement
  23    0010111            0010111          0010111
 -23    1010111            1101000          1101001
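These representations can be checked in Python; masking with 0b1111111 keeps every result to 7 bits (a sketch to verify the table, not part of the original exercise):

```python
WIDTH = 7
MASK = (1 << WIDTH) - 1   # 0b1111111

n = 23
print(format(n, '07b'))           # 0010111  (+23, same in all three systems)
print('1' + format(n, '06b'))     # 1010111  signed magnitude: sign bit + magnitude
print(format(~n & MASK, '07b'))   # 1101000  1's complement: flip every bit
print(format(-n & MASK, '07b'))   # 1101001  2's complement: flip bits, then add 1
```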


How many bits are needed to represent the 26 letters of the alphabet?

If you want to represent one character from the 26-letter Roman alphabet (A-Z), then you need log2(26) ≈ 4.7 bits. In practice, you will need 5 bits.

How many bits would be needed to represent all of the letters in the alphabet?

5 bits (2^5) gives you 32 unique values, which is enough unique values to hold the alphabet and a space.

What is the average number of bits required to encode a character?

In ASCII, every character is encoded with the same number of bits: 7 bits per character in standard ASCII, though characters are usually stored in 8-bit bytes (extended ASCII uses all 8 bits).

How many ways are there to encode the 26 letters of the alphabet using 8-bit codes?

With 8 bits over the alphabet {0, 1} there are 2^8 = 256 possible codes. Assigning a distinct code to each of the 26 letters is an arrangement of 256 elements taken 26 at a time, i.e. 256 x 255 x ... x 231 = 256!/230!.
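This arrangement count can be computed with Python's standard library (math.perm, available in Python 3.8+):

```python
import math

# Assign a distinct 8-bit code (256 available) to each of the
# 26 letters: permutations of 256 taken 26 at a time.
ways = math.perm(256, 26)
assert ways == math.factorial(256) // math.factorial(256 - 26)
print(ways)
```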