
I have a DataFrame with a bigint column. How do I convert a bigint column to a timestamp in Scala Spark?

Anurag Sharma

1 Answer


You can use the from_unixtime/to_timestamp functions in Spark to convert a bigint column to a timestamp.

Example:

spark.sql("select timestamp(from_unixtime(1563853753,'yyyy-MM-dd HH:mm:ss')) as ts").show(false)
+-------------------+
|ts                 |
+-------------------+
|2019-07-22 22:49:13|
+-------------------+

(or)

spark.sql("select to_timestamp(1563853753) as ts").show(false)
+-------------------+
|ts                 |
+-------------------+
|2019-07-22 22:49:13|
+-------------------+

Schema:

spark.sql("select to_timestamp(1563853753) as ts").printSchema
root
 |-- ts: timestamp (nullable = false)
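
The same conversion can be applied directly to a DataFrame column with `withColumn`. A minimal sketch, assuming a column named `ts` holding epoch seconds (the column name and sample value are illustrative):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_timestamp}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("bigint-to-timestamp")
  .getOrCreate()
import spark.implicits._

// Assumed sample data: a bigint column "ts" of epoch seconds
val df = Seq(1563853753L).toDF("ts")

// to_timestamp with no format interprets a numeric column as epoch seconds
val converted = df.withColumn("ts", to_timestamp(col("ts")))

converted.printSchema() // ts: timestamp
converted.show(false)
```

Equivalently, `col("ts").cast("timestamp")` performs the same epoch-seconds interpretation for a long column.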

Refer to this link for more details on converting between different timestamp formats in Spark.

notNull
  • Is there a command to convert the column? Say I have a df with a column named 'ts' and I want that column to be converted from bigint to date time, how to do that? – Cr4zyTun4 Jan 19 '22 at 11:23