spark.sql("""
  select
    case when (from_unixtime(unix_timestamp(dt, 'yyyyMMdd'), 'yyyyMMdd') == dt
               or from_unixtime(unix_timestamp(dt, 'MMddyyyy'), 'MMddyyyy') == dt)
         then dt else '' end as dt,
    case when (from_unixtime(unix_timestamp(dt, 'yyyyMMdd'), 'yyyyMMdd') == dt
               or from_unixtime(unix_timestamp(dt, 'MMddyyyy'), 'MMddyyyy') == dt)
         then 'Y'
         else 'dt: should be present in "yyyyMMdd" or "MMddyyyy" format' end as dt_flag
  from input
""").show(false)

In the above code I've written a query that accepts two date formats, yyyyMMdd and MMddyyyy.

In addition, multiple delimiters should be accepted for the date values: no delimiter, periods, commas, slashes, and dashes.

I'm not sure how to add those; please help me with this. Thanks in advance!

1 Answer


You can create multiple columns, one per accepted date format.

Once done, you can use coalesce to combine them, keeping the first non-null parse.

An example of the approach is demonstrated here
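As a sketch of the idea in plain Scala (Spark-free, using java.time; the pattern list below is illustrative, not taken from the question): trying each accepted format in order and keeping the first successful parse is exactly what coalesce over one `to_date(dt, fmt)` column per format does in Spark.

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import scala.util.Try

// Accepted patterns: bare digits plus period-, slash-, and dash-delimited variants.
val patterns = Seq(
  "yyyyMMdd", "MMddyyyy",
  "yyyy.MM.dd", "MM.dd.yyyy",
  "yyyy/MM/dd", "MM/dd/yyyy",
  "yyyy-MM-dd", "MM-dd-yyyy"
)

// Return the first pattern that parses the input, mimicking
// coalesce(to_date(dt, f1), to_date(dt, f2), ...) in Spark.
def parseAny(s: String): Option[LocalDate] =
  patterns.view
    .flatMap(p => Try(LocalDate.parse(s, DateTimeFormatter.ofPattern(p))).toOption)
    .headOption

println(parseAny("20211012"))   // Some(2021-10-12)
println(parseAny("10/12/2021")) // Some(2021-10-12)
println(parseAny("not-a-date")) // None
```

Note that pattern order decides which format wins when an input is valid under more than one pattern, so list the most likely format first.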

Vaebhav
  • I think that example uses withColumn. How can I incorporate it into my query? Can you please show that in my query, @Vaebhav? – Only developer Oct 12 '21 at 06:04
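One way to fold the coalesce idea back into a plain spark.sql query, rather than using withColumn, is to build the expression as a string from the list of accepted formats. This is a sketch: the format list is illustrative, and it assumes ANSI mode is off so that to_date returns null when a string fails to parse.

```scala
// Accepted formats, including delimited variants (illustrative list).
val formats = Seq("yyyyMMdd", "MMddyyyy", "yyyy.MM.dd", "MM/dd/yyyy", "MM-dd-yyyy")

// Builds: coalesce(to_date(dt, 'f1'), to_date(dt, 'f2'), ...)
// The first format that parses successfully wins.
val parsedExpr =
  formats.map(f => s"to_date(dt, '$f')").mkString("coalesce(", ", ", ")")

println(parsedExpr)

// The expression can then be embedded in the original query, e.g.:
//   spark.sql(s"select dt, case when $parsedExpr is not null then 'Y' " +
//             s"else 'dt: unsupported date format' end as dt_flag from input")
//     .show(false)
```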