I noticed your question was about the number of characters you can store, not the number of bytes. The difference is explained in this SO answer: Difference between BYTE and CHAR in column datatypes.
Let us assume the database character set is AL32UTF8 (Oracle's UTF-8), which is the recommended setting in recent versions of Oracle. In this case, some characters take more than one byte to store in the database.
If you define the field as VARCHAR2(11 BYTE), Oracle can use up to 11 bytes for storage, but you may not actually be able to store 11 characters in the field, because some of them take more than one byte to store, e.g. non-English characters.
By defining the field as VARCHAR2(11 CHAR) you tell Oracle it can use enough space to store 11 characters, no matter how many bytes it takes to store each one. A single character may require up to 4 bytes.
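Here is a minimal sketch of the two declarations, assuming an AL32UTF8 database; the table and column names are made up for illustration:

```sql
-- Hypothetical table with the same declared length under both semantics
CREATE TABLE length_demo (
  name_byte VARCHAR2(11 BYTE),   -- capacity measured in bytes
  name_char VARCHAR2(11 CHAR)    -- capacity measured in characters
);

-- 11 ASCII characters = 11 bytes: fits in both columns
INSERT INTO length_demo (name_byte, name_char)
VALUES ('hello world', 'hello world');

-- 'müller-lüdi' is 11 characters but 13 bytes in UTF-8 (each 'ü' takes 2 bytes)
INSERT INTO length_demo (name_char) VALUES ('müller-lüdi');  -- fits
INSERT INTO length_demo (name_byte) VALUES ('müller-lüdi');  -- ORA-12899: value too large for column
```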
A sample tool showing the difference between characters and bytes: http://mothereff.in/byte-counter
Also note that character length semantics do not affect the 4000-byte maximum length of a VARCHAR2 (Oracle 11g, see the Oracle doc). Declaring a column as VARCHAR2(4000 CHAR) will still allow fewer than 4000 characters if some of the characters require multiple bytes of storage.
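A rough sketch of that cap, again assuming AL32UTF8; the table name is hypothetical, and PL/SQL is used only because a plain SQL string literal cannot exceed 4000 bytes:

```sql
CREATE TABLE max_demo (big_text VARCHAR2(4000 CHAR));

DECLARE
  v VARCHAR2(8000 CHAR);               -- PL/SQL variables may exceed 4000 bytes
BEGIN
  v := RPAD('é', 2001, 'é');           -- 2001 characters, 4002 bytes in UTF-8
  INSERT INTO max_demo VALUES (v);     -- ORA-12899: exceeds the 4000-byte limit
END;
/
```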