I'm writing a function that runs data validation using the Great Expectations library. For Spark DataFrames, Great Expectations requires the expectations to be written programmatically, so I'm trying to call all of the expectation methods in a single for loop driven by a parsed YAML file.
validator, context = start_spark_df(data, table)
for exp in yaml[table]:
    validator.<exp>  # pseudocode: call the expectation method whose name is in exp
The YAML looks like this:

name_of_the_table:
    expect_column_values_to_not_be_null: customer
    expect_column_values_to_be_of_type: customer, int
So ideally the loop would execute

validator.expect_column_values_to_not_be_null("customer")
validator.expect_column_values_to_be_of_type("customer", "int")
in one for loop.
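For context, here is a minimal sketch of the kind of dispatch I'm after, using Python's built-in getattr() to look up a method by its string name. The DummyValidator class and the dict structure are stand-ins I made up to mirror the YAML above; the real code would use the Great Expectations validator instead:

```python
class DummyValidator:
    """Stand-in for the Great Expectations validator, for illustration only."""

    def __init__(self):
        self.calls = []

    def expect_column_values_to_not_be_null(self, column):
        self.calls.append(("not_null", column))

    def expect_column_values_to_be_of_type(self, column, type_):
        self.calls.append(("of_type", column, type_))


# Hypothetical result of parsing the YAML shown above
config = {
    "name_of_the_table": {
        "expect_column_values_to_not_be_null": "customer",
        "expect_column_values_to_be_of_type": "customer, int",
    }
}

validator = DummyValidator()
for exp_name, raw_args in config["name_of_the_table"].items():
    # Split the comma-separated YAML value into individual arguments
    args = [a.strip() for a in raw_args.split(",")]
    # getattr() resolves the method from its string name at runtime
    getattr(validator, exp_name)(*args)

print(validator.calls)
```

The open question is whether this getattr()-style dispatch is the right approach for the real validator, or whether there is a more idiomatic way.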
Is there any way to do this? I couldn't find anything similar on Stack Overflow. Thanks a bunch in advance.