


Discussion Topic 2: Why use Unicode rather than the ASCII character set?

Computers only "understand" numbers. If we want to manipulate letters and symbols, we must allocate a number to each different letter and symbol. It doesn't matter what number we give to any symbol, as long as everyone does the same.
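To make the idea concrete, here is a minimal Java sketch (the class name CharAsNumber is ours, purely for illustration). A char variable is interchangeable with its code number:

public class CharAsNumber {
    public static void main(String[] args) {
        char letter = 'A';
        int code = letter;                           // a char widens to its number: 65
        System.out.println(letter + " = " + code);   // prints: A = 65
        System.out.println((char) (code + 1));       // 66 narrows back to: B
    }
}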

Most computer programming languages use a coding scheme called ASCII to do this. In this scheme, for example, the letter 'A' is number 65, 'B' is 66 and so on. ASCII itself defines 128 symbols, and its extended variants at most 256, since one byte can hold 256 distinct values (0 to 255). Java, on the other hand, uses the Unicode coding scheme, where each symbol has a number between 0 and 65535 (two bytes). This means that a text string in Java requires twice as much memory as it would in most other languages.
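As a rough sketch of what the two-byte scheme buys (the class name UnicodeDemo is ours; Character.BYTES needs Java 8 or later):

public class UnicodeDemo {
    public static void main(String[] args) {
        char ascii = 'A';        // code 65: also representable in ASCII
        char greek = '\u03A9';   // Greek capital omega, code 937: no place in ASCII
        System.out.println(ascii + " = " + (int) ascii);   // A = 65
        System.out.println(greek + " = " + (int) greek);   // Ω = 937
        System.out.println("bytes per char: " + Character.BYTES);   // 2
    }
}

The omega simply has no number in a one-byte scheme; a language built on ASCII would need a locale-specific code page or an external library to handle it.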

Why did the Java developers make this decision? What are the advantages and disadvantages?

