tl;dr
    Character.toString( Integer.parseInt( "e13a" , 16 ) )
See this code run at Ideone.com.
Code point
Parse your input string as a hexadecimal (base 16) number. The resulting `int` holds that same value, conventionally displayed in decimal (base 10).
That number represents a code point, the number permanently assigned to each of the over 144,000 characters defined in Unicode. Code points range from zero to just over one million, with most of that range unassigned.
    String input = "e13a" ;
    int codePoint = Integer.parseInt( input , 16 ) ;
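As a quick sanity check, you can print the parsed value (the `main` wrapper here is my addition, not part of the original code):

```java
public class ParseDemo {
    public static void main(String[] args) {
        String input = "e13a";
        // Radix 16 tells the parser to read the digits as hexadecimal.
        int codePoint = Integer.parseInt(input, 16);
        System.out.println(codePoint); // 57658 in decimal
    }
}
```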
Instantiate a `String` object whose content is the character identified by that code point.
    String output = Character.toString( codePoint ) ;
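Note that the `Character.toString(int)` overload taking a code point arrived in Java 11. On earlier versions, `Character.toChars` offers an equivalent route; a sketch comparing the two:

```java
public class CodePointToString {
    public static void main(String[] args) {
        int codePoint = Integer.parseInt("e13a", 16);
        // Java 11+: accepts a code point int directly.
        String modern = Character.toString(codePoint);
        // Pre-Java 11: toChars yields the UTF-16 char unit(s),
        // even for code points beyond the 16-bit range.
        String legacy = new String(Character.toChars(codePoint));
        System.out.println(modern.equals(legacy)); // true
    }
}
```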
Avoid `char`
The `char` type has been essentially broken since Java 2, and legacy since Java 5. As a 16-bit value, a `char` is physically incapable of representing most characters.
To work with individual characters, use code point integers as seen above.
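To see why `char` falls short, consider a character outside the 16-bit range, such as the emoji at code point U+1F637 (this particular example is mine, not from the original answer):

```java
public class CharLimitDemo {
    public static void main(String[] args) {
        String mask = "😷"; // U+1F637, beyond what a single char can hold
        // length() counts 16-bit char units: two surrogates here.
        System.out.println(mask.length()); // 2
        // Counting code points reports the true number of characters.
        System.out.println(mask.codePointCount(0, mask.length())); // 1
        // Stream the code points rather than iterating chars.
        mask.codePoints().forEach(cp ->
            System.out.println("U+" + Integer.toHexString(cp).toUpperCase())
        ); // U+1F637
    }
}
```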