I have a French CSV file that contains special characters (é, à, è, ç). I loaded this CSV file into HDFS via Spark 2 / Scala 2.11.
I transformed the data, then transferred my DataFrame to Elasticsearch 5.6. These special characters appear as garbled characters in the Kibana dashboard.
I want to replace these special characters with their plain ASCII equivalents, like:
é = e
è = e
à = a
I tried two approaches:
val urlCleaner = (s: String) => {
  if (s == null) null else s.replaceAll("é", "e")
}
And
val newsjoined_df2 = My_Dataframe.withColumn("nom_equipe", when(col("nom_equipe").equalTo("é"), "e").otherwise(col("nom_equipe")))
But neither of them works. Can someone please suggest a solution?
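For reference, this is the kind of normalization I am trying to achieve. A minimal plain-JVM sketch (no Spark required) using `java.text.Normalizer`, which I assume could then be wrapped in a Spark UDF:

```scala
import java.text.Normalizer

object AccentStripper {
  // Decompose accented characters into base letter + combining mark
  // (NFD: "é" becomes "e" followed by U+0301), then strip all
  // combining marks (Unicode category M) with a regex.
  def stripAccents(s: String): String =
    if (s == null) null
    else Normalizer.normalize(s, Normalizer.Form.NFD).replaceAll("\\p{M}", "")

  def main(args: Array[String]): Unit = {
    // Prints: equipe a Geneve, ca va
    println(stripAccents("équipe à Genève, ça va"))
    // In Spark, this function could presumably be registered with
    // org.apache.spark.sql.functions.udf and applied via withColumn.
  }
}
```

The column name `nom_equipe` and the UDF-wrapping step are assumptions about my own pipeline; the core question is whether this Normalizer approach is the right way to do it before indexing into Elasticsearch.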