I am working with a JSON object and want to convert object.hours into a relational table using a Spark SQL DataFrame/Dataset.
I tried to use explode, but it only works on array and map columns, not on a struct of structs like hours.
The JSON object is below:
{
  "business_id": "abc",
  "full_address": "random_address",
  "hours": {
    "Monday": {
      "close": "02:00",
      "open": "11:00"
    },
    "Tuesday": {
      "close": "02:00",
      "open": "11:00"
    },
    "Friday": {
      "close": "02:00",
      "open": "11:00"
    },
    "Wednesday": {
      "close": "02:00",
      "open": "11:00"
    },
    "Thursday": {
      "close": "02:00",
      "open": "11:00"
    },
    "Sunday": {
      "close": "00:00",
      "open": "11:00"
    },
    "Saturday": {
      "close": "02:00",
      "open": "11:00"
    }
  }
}
I want to turn this into a relational table like the one below:
CREATE TABLE "business_hours" (
  "id" integer NOT NULL PRIMARY KEY,
  "business_id" integer NOT NULL FOREIGN KEY REFERENCES "businesses",
  "day" integer NOT NULL,
  "open_time" time,
  "close_time" time
)
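
For reference, here is a minimal sketch in Scala of one way this flattening could work, assuming the JSON above sits in a file named businesses.json (a hypothetical path) and that all seven day fields are present in the inferred schema. The idea is to wrap the named day structs into an array of (day, open, close) structs so that explode has an array column to operate on; mapping the day names to the integer day column and generating the surrogate id are left out.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().appName("business-hours").getOrCreate()

// multiLine is needed because each JSON record spans several lines
val df = spark.read.option("multiLine", "true").json("businesses.json")

val days = Seq("Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday")

// Wrap the seven named day structs into an array of (day, open_time, close_time)
// structs so that explode has something to work on.
val hoursArray = array(days.map { d =>
  struct(
    lit(d).as("day"),
    col(s"hours.$d.open").as("open_time"),
    col(s"hours.$d.close").as("close_time")
  )
}: _*)

val businessHours = df
  .select(col("business_id"), explode(hoursArray).as("h"))
  .select(col("business_id"), col("h.day"), col("h.open_time"), col("h.close_time"))

businessHours.show(false)

The resulting DataFrame has one row per (business_id, day) pair with open_time and close_time columns, which matches the shape of business_hours apart from the integer day encoding and the primary key.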