If you use from_unixtime, it takes whole seconds (a bigint), so the fractional part is dropped and you will not get the .SSS milliseconds. In this case, casting to timestamp is the better option.
import org.apache.spark.sql.functions.expr

val df1 = spark.createDataFrame(Seq(("1", 43784.2892847338))).toDF("id", "Column1")
// 25569 is the Excel serial number of 1970-01-01; subtracting it gives days since
// the Unix epoch, and multiplying by 86400 gives Unix seconds, which cast to timestamp.
val finalDF = df1.withColumn("Column1_timestamp", expr("(Column1 - 25569) * 86400.0").cast("timestamp"))
finalDF.show(false)
+---+----------------+-----------------------+
|id |Column1 |Column1_timestamp |
+---+----------------+-----------------------+
|1 |43784.2892847338|2019-11-15 06:56:34.201|
+---+----------------+-----------------------+
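For anyone wondering where 25569 comes from, here is a quick sanity check with plain java.time (no Spark needed) that reproduces the same instant. Note that Excel serial dates effectively count days from 1899-12-30 (serial 0, because of Excel's 1900 leap-year quirk):

```scala
import java.time.{Instant, LocalDate}
import java.time.temporal.ChronoUnit

// Days between the Excel epoch (1899-12-30) and the Unix epoch (1970-01-01)
val offset = ChronoUnit.DAYS.between(LocalDate.of(1899, 12, 30), LocalDate.of(1970, 1, 1))
println(offset)  // 25569

// Apply the same formula as the Spark expression above
val serial = 43784.2892847338
val epochMillis = ((serial - offset) * 86400.0 * 1000).round
println(Instant.ofEpochMilli(epochMillis))  // 2019-11-15T06:56:34.201Z
```

The milliseconds (.201) survive because the multiplication is done in floating point before the cast, rather than being truncated to whole seconds the way from_unixtime would.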