I have a use case that I am trying to implement with Spark in AWS Glue. I have one table that stores a SQL query as a column value, and I need to run that query from my script.
For example: select src_query from table1;
This gives me another query, shown below:
select table2.col1, table3.col2 from table2 join table3;
Now I want to run this second query, collect its result into a DataFrame, and proceed from there.
source_df = (
    spark.read.format("jdbc")
    .option("url", Oracle_jdbc_url)
    .option("dbtable", "table1")
    .option("user", Oracle_Username)
    .option("password", Oracle_Password)
    .load()
)
When I run this, the data from table1 is stored in source_df. One of the columns of table1 stores a SQL query, for example: select col1, col2 from tabl2;
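I believe I can pull the stored query text out of source_df with something like the sketch below, assuming the column is called src_query (as in my first example) and that there is only one row:

# collect the first row and read the src_query column from it
stored_query = source_df.select("src_query").first()["src_query"]
# stored_query should now hold something like "select col1, col2 from tabl2"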
Now I want to run the query mentioned above and store its result in a DataFrame, something like:
final_df2 = (
    spark.read.format("jdbc")
    .option("url", Oracle_jdbc_url)
    .option("query", "select col1,col2 from tabl2")
    .option("user", Oracle_Username)
    .option("password", Oracle_Password)
    .load()
)
How can I get the query out of the first DataFrame and run it as a query, so that its result lands in another DataFrame?
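For context, here is a rough sketch of what I have in mind, reusing the stored_query string collected above instead of the hard-coded query; I am not sure whether this is the right approach, and I am assuming the trailing semicolon has to be removed before the string is passed to the JDBC query option:

# strip whitespace and any trailing semicolon from the collected query text
stored_query = stored_query.strip().rstrip(";")

# run the stored query against Oracle and load the result into a new DataFrame
final_df2 = (
    spark.read.format("jdbc")
    .option("url", Oracle_jdbc_url)
    .option("query", stored_query)
    .option("user", Oracle_Username)
    .option("password", Oracle_Password)
    .load()
)

Is collecting the query string to the driver like this the intended way to do it, or is there a better pattern when the queries come from a table?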