I am a client of an Oracle DB. I use the ODBC driver "Oracle in instant_client_12_2" to access the DB from R and other tools. The issue I have is the following:
Let's say I create the following data frame in R and write it to a table in the DB:
library(odbc)
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10, uid = name, pwd = pwd, encoding = 'utf8')
df <- data.frame(TEST = c('abcd', 'eöäü', 'ßdef' ))
dbWriteTable(con, "ENCODING_TEST", df, overwrite = TRUE)
The result I get is that the special characters (ä, ö, ü, ß) do not arrive correctly in the table.
There are several threads on this topic, but maybe I did something wrong, or the suggested solutions just do not work for me. Here is an extract of what I have already tried:
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10, name, pwd, encoding = 'windows-1252')
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10, name, pwd, encoding = 'cp1252')
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10, name, pwd, encoding = 'latin1')
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10, name, pwd)
Sys.setenv(NLS_CHARACTERSET = "AL32UTF8") # I tried also other combinations
Sys.setenv(NLS_LANG = "AL32UTF8")
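One variant I have not tried yet (based on the hint that NLS_LANG needs the full LANGUAGE_TERRITORY.CHARSET form and must be set before the Oracle client libraries are loaded, i.e. in a fresh R session before the first connection; the German locale below is just my assumption) would be:

```r
# Set NLS_LANG in its full form BEFORE odbc/the Oracle client initializes
Sys.setenv(NLS_LANG = "GERMAN_GERMANY.AL32UTF8")

library(odbc)
con <- dbConnect(odbc::odbc(), "MYDB", timeout = 10,
                 uid = name, pwd = pwd, encoding = "UTF-8")

df <- data.frame(TEST = c("abcd", "eöäü", "ßdef"))
dbWriteTable(con, "ENCODING_TEST", df, overwrite = TRUE)

# Read the table back to check the round trip
dbReadTable(con, "ENCODING_TEST")
```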
When I run the following command in DBeaver (SQL Client):
SELECT * FROM NLS_DATABASE_PARAMETERS WHERE PARAMETER LIKE '%ERSET';
I get:
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_CHARACTERSET AL32UTF8
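In case it helps, I could also run the same check through the R connection and compare it with the session-level settings my ODBC driver actually negotiates (both are standard Oracle dictionary views; `con` is the connection from above):

```r
library(odbc)

# Session-level NLS settings as negotiated by the ODBC driver
dbGetQuery(con, "SELECT * FROM NLS_SESSION_PARAMETERS")

# Database character sets, same query as in DBeaver
dbGetQuery(con, "SELECT * FROM NLS_DATABASE_PARAMETERS
                 WHERE PARAMETER LIKE '%ERSET'")
```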
By the way, I have no such issues in DBeaver: I can export everything fine when I set the encoding to windows-1252 in the export wizard.
Another problem, possibly with the same root cause: when I load data from the Oracle server into Power BI, the characters ä, ü, ö are transformed into a, u, o.
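If it matters: as far as I understand, on Windows the Oracle client can also pick up NLS_LANG from the registry, which would affect Power BI as well. What is configured there for the installed Oracle homes can be inspected with (the exact registry path may differ per installation):

```shell
reg query "HKLM\SOFTWARE\ORACLE" /s /f NLS_LANG
```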
Does anyone have an idea what I have to change so that the correctly encoded values are written to the DB?