
Difference Between EBCDIC and ASCII

EBCDIC vs ASCII

The American Standard Code for Information Interchange (ASCII) and the Extended Binary Coded Decimal Interchange Code (EBCDIC) are two character encoding schemes, more commonly known by their respective acronyms. The main difference between the two is the number of bits they use to represent each character. EBCDIC uses 8 bits per character, while the original ASCII standard used only 7, out of concern that spending 8 bits on characters that can be represented with 7 would be wasteful.

The main consequence of that difference is the number of characters each scheme can accommodate. With 8 bits, EBCDIC can represent up to 2^8 = 256 characters, while 7-bit ASCII has a maximum of 2^7 = 128.
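The arithmetic above can be checked in a line or two of Python:

```python
# An n-bit fixed-width encoding can distinguish 2**n distinct codes.
def code_points(bits):
    return 2 ** bits

print(code_points(7))  # 128 -- the 7-bit ASCII range
print(code_points(8))  # 256 -- the 8-bit EBCDIC range
```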

Although EBCDIC was very popular, largely due to the popularity of IBM machines at the time, it had several problems that irritated programmers. The first is how it arranges the letters of the alphabet. In ASCII, all the letters are in consecutive order: the capital letters form one contiguous group and the small letters another. In EBCDIC, the letters are split into groups of at most 9, with gaps between them. This non-intuitive layout comes from EBCDIC's punched card origins and is quite difficult for programmers to deal with.
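The difference in letter layout is easy to demonstrate in Python, whose standard codecs include cp037, one common EBCDIC variant (exact code points vary by EBCDIC code page):

```python
# In ASCII, the capital letters occupy consecutive code points.
ascii_codes = [ord(c) for c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ"]
assert all(b - a == 1 for a, b in zip(ascii_codes, ascii_codes[1:]))

# In EBCDIC (cp037), the alphabet is split into groups with gaps
# between them: A-I, J-R, and S-Z. Note the jump from 0xC9 to 0xD1.
print(hex("I".encode("cp037")[0]))  # 0xc9 -- end of the A-I group
print(hex("J".encode("cp037")[0]))  # 0xd1 -- start of the J-R group
```

The gap means a loop like `for c in range(ord('A'), ord('Z') + 1)`, which works fine for ASCII, silently picks up non-letter codes under EBCDIC.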

IBM's control of the EBCDIC encoding has led to many problems when it comes to updates. Both ASCII and EBCDIC were later extended to accommodate more characters. The extended ASCII code pages replaced some code points while keeping most of the others intact, so they remain largely compatible. The different versions of EBCDIC, by contrast, are highly incompatible with each other.

As the encoding needs of computers outgrew both ASCII and EBCDIC, other standards emerged. The most recent is Unicode, which incorporates ASCII: the first 128 code points of Unicode are the ASCII characters, so Unicode-aware software can open ASCII files without any problems. The EBCDIC encoding, on the other hand, is not compatible with Unicode, and EBCDIC-encoded files appear as gibberish unless they are explicitly converted.
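That compatibility story can also be verified in a few lines of Python, again using the cp037 codec as a stand-in for EBCDIC:

```python
# ASCII bytes decode cleanly as UTF-8, because Unicode's first
# 128 code points are exactly the ASCII characters.
ascii_bytes = "HELLO".encode("ascii")
assert ascii_bytes.decode("utf-8") == "HELLO"

# The same text encoded as EBCDIC does not line up with Unicode:
# interpreting its bytes with a Unicode-compatible decoder yields
# gibberish rather than the original text.
ebcdic_bytes = "HELLO".encode("cp037")
print(ebcdic_bytes.decode("latin-1"))  # 'ÈÅÓÓÖ', not 'HELLO'
```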

Summary:

1. EBCDIC uses 8 bits per character while the original ASCII standard used 7
2. EBCDIC can represent more characters than ASCII
3. ASCII uses a linear ordering of letters while EBCDIC does not
4. Different versions of ASCII are mostly compatible while different versions of EBCDIC are not
5. EBCDIC isn't compatible with modern encodings while ASCII is
