I have a DataFrame in Spark with sample accounts, which has 5 different columns.
val sampledf = sqlContext.sql("select * from Sampledf")
I have another table, OracleTable, in an Oracle DB, which has millions of records.
I want to filter the accounts present in OracleTable with respect to the accounts in sampledf:
select * from OracleTable where column in (select column from Sampledf)
I realized that in Oracle we cannot provide more than 1,000 values in an IN condition.
And the subquery above is not working, due to the huge amount of data in OracleTable.
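For context: as far as I can tell, Sampledf is not visible to Oracle, so the closest I can get to the subquery above is to load OracleTable over JDBC and do the filtering on the Spark side, roughly like the sketch below (the JDBC URL, credentials, and the acctnum column name are placeholders). This is where the huge data hurts, because the whole table has to be transferred before anything is filtered:

// Rough sketch of the Spark-side equivalent; URL, user, and password are placeholders.
// Pulling all of OracleTable over JDBC is the painful part (millions of rows).
val oracleDF = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/service") // placeholder
  .option("dbtable", "OracleTable")
  .option("user", "scott")       // placeholder
  .option("password", "secret")  // placeholder
  .load()

// Same effect as the IN subquery: keep only the Oracle rows whose acctnum is in sampledf.
val filtered = oracleDF.join(sampledf, oracleDF("acctnum") === sampledf("acctnum"), "leftsemi")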
I want to achieve the query below:
select column from OracleTable where acctnum in (1,2,3,...,1000) or acctnum in (1001,...,2000) or ...

Basically, all the accounts from sampledf, in batches of 1,000 accounts.
Since we can't give more than 1,000 values at once (that's the limitation in Oracle), we can give 1,000 at a time.
How can I generate this kind of dynamic query? Do I need to create an Array from the DataFrame?
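Something like the sketch below is what I have in mind: collect the account numbers from sampledf, batch them into groups of 1,000, build one big WHERE clause, and push it down to Oracle through the JDBC dbtable option. The acctnum column name and the JDBC connection details are placeholders, and I am assuming the account numbers are numeric, as in the example above:

// Rough, untested sketch. "acctnum" and the JDBC settings are placeholders.

// 1. Collect the sample account numbers to the driver
//    (sampledf holds only the sample accounts, so collect() should be fine).
val accounts: Array[String] = sampledf
  .select("acctnum")
  .distinct()
  .rdd
  .map(_.get(0).toString) // assuming numeric account numbers; strings would need quoting
  .collect()

// 2. Batch into groups of at most 1,000 and build "acctnum in (...) or acctnum in (...)".
val whereClause = accounts
  .grouped(1000)
  .map(batch => s"acctnum in (${batch.mkString(",")})")
  .mkString(" or ")

// 3. Wrap the filter in the dbtable subquery so Oracle does the filtering.
val pushedQuery = s"(select * from OracleTable where $whereClause) t"

val filteredDF = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/service") // placeholder
  .option("dbtable", pushedQuery)
  .option("user", "scott")       // placeholder
  .option("password", "secret")  // placeholder
  .load()

That way only the matching rows would come back over JDBC instead of the whole table, but I don't know if building a giant WHERE clause like this is a sensible approach.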
I just need a workaround; how can I proceed? Any suggestions will be helpful.