I have a problem with UTF-8 encoding in Java. I have a UTF-8 encoded .txt file, and I have checked in Notepad++ that the file really is UTF-8 encoded. When I try to read the file, the special letters are not shown correctly.
I use the following piece of code:
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Scanner;

try {
    // Read the file with an explicit UTF-8 charset so the platform
    // default encoding is not used for decoding.
    Scanner sc = new Scanner(new FileInputStream("file.txt"), "UTF-8");
    String str;
    while (sc.hasNextLine()) {
        str = sc.nextLine();
        roadNames.add(str); // roadNames is an ArrayList<String> declared elsewhere
        System.out.println(str);
    }
    sc.close();
} catch (IOException e1) {
    System.out.println("The file was not found....");
}
It shows the special letters correctly in Eclipse, where I have set the default encoding to UTF-8, but not when I run my generated JAR file.
The only thing that actually works for me is to make a .bat file containing "java -Dfile.encoding=utf-8 -jar executable.jar", but I do not think that is a good solution.
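For reference, a quick way to see which default the JVM actually picked up (just a diagnostic sketch; file.encoding is the same property the .bat file overrides):

import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        // Prints the default charset the JVM chose at startup. When a JAR is
        // launched normally on Western European Windows this is typically
        // Cp1252, not UTF-8, unless -Dfile.encoding overrides it.
        System.out.println(System.getProperty("file.encoding"));
        System.out.println(Charset.defaultCharset());
    }
}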
Furthermore, this also works:
PrintStream out = new PrintStream(System.out, true, "UTF-8");
out.println(str);
Update
When I say
"The special letters are not shown correctly"
I mean that System.out.println prints a string in which the special letters are replaced, for example ├à instead of å.
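As far as I understand it, this kind of garbling appears when UTF-8 bytes are decoded with a legacy DOS code page. A minimal sketch of that assumption (the choice of Cp850 and the letter Å are mine, picked because their UTF-8 bytes render as exactly ├à under that code page):

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class MojibakeDemo {
    public static void main(String[] args) {
        // "Å" is encoded in UTF-8 as the two bytes 0xC3 0x85.
        byte[] utf8Bytes = "Å".getBytes(StandardCharsets.UTF_8);

        // Decoding those bytes with the old DOS code page Cp850 (the default
        // in many Western European Windows consoles) produces two separate
        // characters instead of one letter.
        String garbled = new String(utf8Bytes, Charset.forName("Cp850"));
        System.out.println(garbled); // prints ├à
    }
}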
It turns out that
PrintStream out = new PrintStream(System.out, true, "UTF-8");
out.println(str);
does not work after all - sorry about that.
The real problem is not that I want the console to print what is inside the text document. Each line in the text document contains a name, and this name is added to an ArrayList. I then have a JTextField which, as I begin typing into it, tries to autocomplete what I have typed by searching for the best matching name in the ArrayList. This would work perfectly if it were not for the encoding problem: the special letters inside the JTextField are not shown correctly. They are only shown correctly when I use the -Dfile.encoding=utf-8 argument.
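To make the use case concrete, this is roughly what the autocomplete lookup does (a simplified sketch; findBestMatch, the prefix matching, and the two Danish street names are placeholders, not my actual code):

import java.util.ArrayList;
import java.util.List;

public class Autocomplete {
    // Returns the first name that starts with what the user typed,
    // or null if nothing matches. If the names were read with the
    // wrong encoding, a prefix like "å" never matches the garbled data.
    static String findBestMatch(List<String> roadNames, String typed) {
        for (String name : roadNames) {
            if (name.toLowerCase().startsWith(typed.toLowerCase())) {
                return name;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<String> roadNames = new ArrayList<>();
        roadNames.add("Århusgade");
        roadNames.add("Østerbrogade");
        System.out.println(findBestMatch(roadNames, "år")); // Århusgade
    }
}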