Let's consider this simple creation of a CSV file from a data frame that contains special characters:

d <- data.frame(x = "Édifice", y="Arrêt")
write.table(x = d, file = "test.csv", sep = ",", row.names = F, col.names = F, quote = F, fileEncoding = "UTF-8")

The CSV file looks as expected:

Édifice,Arrêt

But when I open this CSV in Excel, the accented characters come out garbled:

[screenshot: the accents display as mojibake in Excel]

I have tried using readr, collapsing the columns and writing them with writeLines, writing with write.xlsx, and checking the encoding options. None of these worked.

My constraint is that the input is a data frame, and the output must be a CSV readable in Excel.

Xavier Prudent
  • R produces your CSV just fine; it seems the issue is how to open a UTF-8 encoded CSV in Excel. – Gregor Thomas Mar 05 '20 at 21:33
  • Suggested duplicate: [Is it possible to force Excel to recognize UTF-8 CSVs automatically?](https://stackoverflow.com/q/6002256/903061) – Gregor Thomas Mar 05 '20 at 21:35
  • Though, inspired by some of the answers there, you could potentially try saving the file in UTF-16 encoding or adding a BOM as described in `?file` – Gregor Thomas Mar 05 '20 at 21:39
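The BOM idea from the last comment can be sketched in base R, with no extra packages: write the UTF-8 byte-order mark as the very first bytes of the file, then append the data. The file name `test_bom.csv` is illustrative.

```r
# A sketch of the BOM approach: Excel uses the byte-order mark (EF BB BF)
# to detect that the file is UTF-8 encoded.
d <- data.frame(x = "Édifice", y = "Arrêt")

con <- file("test_bom.csv", open = "w", encoding = "UTF-8")
writeLines("\ufeff", con, sep = "")            # BOM as the first character
write.table(d, con, sep = ",", row.names = FALSE,
            col.names = FALSE, quote = FALSE)  # then the data, as before
close(con)
```

This keeps the output a plain CSV, so the constraint of a data frame in and an Excel-readable CSV out is preserved.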

1 Answer

Same problem with German umlauts. I use write_excel_csv from readr:

library(readr)
write_excel_csv(x = d, path = "test.csv", col_names = F)
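This works because `write_excel_csv()` prepends a UTF-8 byte-order mark, which is what lets Excel detect the encoding. Note that in recent readr versions the `path` argument is deprecated in favour of `file`, so the call today would look like:

```r
# Same answer with the current readr argument name (`file` instead of `path`).
library(readr)

d <- data.frame(x = "Édifice", y = "Arrêt")
write_excel_csv(d, file = "test.csv", col_names = FALSE)
```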
stefan