Having dates in one column, how can I create a column containing the ISO week date?
The ISO week date is composed of a year, a week number and a weekday:
- the year is not the same as the year returned by the year function
- the week number is the easy part: it can be obtained with weekofyear
- the weekday should return 1 for Monday and 7 for Sunday, while Spark's dayofweek cannot do this out of the box (a possible remapping is sketched below)
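For the weekday part, I imagine a remapping along these lines could work (untested sketch, using the my_date column from the example below; Spark's dayofweek returns 1 for Sunday through 7 for Saturday):
# shift dayofweek (1=Sunday .. 7=Saturday) to the ISO convention (1=Monday .. 7=Sunday)
iso_weekday = (F.dayofweek('my_date') + 5) % 7 + 1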
Example dataframe:
from pyspark.sql import SparkSession, functions as F
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([
    ('1977-12-31',),
    ('1978-01-01',),
    ('1978-01-02',),
    ('1978-12-31',),
    ('1979-01-01',),
    ('1979-12-30',),
    ('1979-12-31',),
    ('1980-01-01',)],
    ['my_date']
).select(F.col('my_date').cast('date'))
df.show()
#+----------+
#| my_date|
#+----------+
#|1977-12-31|
#|1978-01-01|
#|1978-01-02|
#|1978-12-31|
#|1979-01-01|
#|1979-12-30|
#|1979-12-31|
#|1980-01-01|
#+----------+
Desired result:
+----------+-------------+
| my_date|iso_week_date|
+----------+-------------+
|1977-12-31| 1977-W52-6|
|1978-01-01| 1977-W52-7|
|1978-01-02| 1978-W01-1|
|1978-12-31| 1978-W52-7|
|1979-01-01| 1979-W01-1|
|1979-12-30| 1979-W52-7|
|1979-12-31| 1980-W01-1|
|1980-01-01| 1980-W01-2|
+----------+-------------+
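For completeness, here is a rough, untested sketch of the direction I have in mind. It assumes that weekofyear already follows the ISO convention and that the ISO year can be taken as the calendar year of the Thursday in the same week; the date_add expression for reaching that Thursday is my own guess:
iso_weekday = (F.dayofweek('my_date') + 5) % 7 + 1
# assumed: the ISO year is the calendar year of that week's Thursday
iso_year = F.year(F.expr("date_add(my_date, 4 - ((dayofweek(my_date) + 5) % 7 + 1))"))
result = df.withColumn(
    'iso_week_date',
    F.concat_ws(
        '-',
        iso_year,
        F.concat(F.lit('W'), F.lpad(F.weekofyear('my_date'), 2, '0')),
        iso_weekday,
    )
)
Is there a cleaner or more reliable way to get this result?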