While loading data from Oracle and writing to PostgreSQL, I am running into a weird issue. I am unable to write a string containing spaces to Postgres and get the error below:
Caused by: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO xyz("col1","col2") VALUES ('7643'::numeric,'xyz/xyz xyzxy xyz/xyz xyzxy ') was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch
So I tried trimming the column in the DataFrame, but that is not working; the data is the same before and after trimming.
from pyspark.sql.functions import trim, col

# attempt to strip surrounding whitespace from col2
data = data.withColumn("trimmed", trim(col("col2")))
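Based on the 0x00 mentioned in the error, I suspect the real culprit may be an embedded null character rather than ordinary whitespace, which trim() would not remove. This is a rough sketch of what I am considering instead (using regexp_replace to drop NUL characters; the "\u0000" pattern and reusing the name "col2" are my own guesses), but I am not sure it is the right approach:

from pyspark.sql.functions import regexp_replace, col

# sketch: replace embedded NUL (0x00) characters with an empty string,
# since trim() only removes whitespace, not null bytes
data = data.withColumn("col2", regexp_replace(col("col2"), "\u0000", ""))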
I am very new to PySpark and data cleaning, so any help is highly appreciated.