I have a query with a column that converts a universal datetime field (not a timestamp) to local time based on the timezone. In Oracle I could do this with intervals, as in the snippet below, but Spark SQL wouldn't accept the interval arithmetic. How can I do this in Spark SQL?
CASE WHEN c.timezone IN (4, 5) THEN to_char(b.universal_datetime + NUMTODSINTERVAL(3, 'HOUR'), 'yyyy/mm/dd HH24:MI:SS')
     WHEN c.timezone IN (8)    THEN to_char(b.universal_datetime, 'yyyy/mm/dd HH24:MI:SS')
     WHEN c.timezone IN (7)    THEN to_char(b.universal_datetime + NUMTODSINTERVAL(1, 'HOUR'), 'yyyy/mm/dd HH24:MI:SS')
     WHEN c.timezone IN (6)    THEN to_char(b.universal_datetime + NUMTODSINTERVAL(2, 'HOUR'), 'yyyy/mm/dd HH24:MI:SS')
     WHEN c.timezone IN (10)   THEN to_char(b.universal_datetime - NUMTODSINTERVAL(3, 'HOUR'), 'yyyy/mm/dd HH24:MI:SS')
     WHEN c.timezone IN (9)    THEN to_char(b.universal_datetime - NUMTODSINTERVAL(1, 'HOUR'), 'yyyy/mm/dd HH24:MI:SS')
     ELSE 'Other'
END AS Local_Time,
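
Would an epoch-second workaround like the sketch below be the right direction? It shifts the datetime by whole hours using unix_timestamp/from_unixtime arithmetic instead of intervals, and swaps Oracle's to_char pattern for Spark's Java-style date_format pattern ('yyyy/MM/dd HH:mm:ss' in place of 'yyyy/mm/dd HH24:MI:SS'). I haven't verified it against my cluster's Spark version:

-- Sketch: shift by whole hours via epoch seconds, avoiding INTERVAL syntax
CASE WHEN c.timezone IN (4, 5) THEN date_format(from_unixtime(unix_timestamp(b.universal_datetime) + 3 * 3600), 'yyyy/MM/dd HH:mm:ss')
     WHEN c.timezone IN (8)    THEN date_format(b.universal_datetime, 'yyyy/MM/dd HH:mm:ss')
     WHEN c.timezone IN (7)    THEN date_format(from_unixtime(unix_timestamp(b.universal_datetime) + 1 * 3600), 'yyyy/MM/dd HH:mm:ss')
     WHEN c.timezone IN (6)    THEN date_format(from_unixtime(unix_timestamp(b.universal_datetime) + 2 * 3600), 'yyyy/MM/dd HH:mm:ss')
     WHEN c.timezone IN (10)   THEN date_format(from_unixtime(unix_timestamp(b.universal_datetime) - 3 * 3600), 'yyyy/MM/dd HH:mm:ss')
     WHEN c.timezone IN (9)    THEN date_format(from_unixtime(unix_timestamp(b.universal_datetime) - 1 * 3600), 'yyyy/MM/dd HH:mm:ss')
     ELSE 'Other'
END AS Local_Time,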