
I have a SQL query that basically joins two tables and returns accomm_sk. If the accomm_sk value is NULL, a Spark UDF should be called to look the key up in a third table; otherwise the joined result is used. How can I use this function in Spark SQL, given that Spark is not allowing it to be registered as a UDF?

Spark UDF:

def GeneratedAccommSk(localHash):
    # Look up the surrogate key for this hash in the staging dimension table
    query = 'select accommodation_sk from staging.accomm_dim where accomm_hash="{}"'.format(localHash)
    accommSk_Df = spark.sql(query)
    accomm_count = accommSk_Df.filter(accommSk_Df.accommodation_sk.isNotNull()).count()
    if accomm_count != 0:
        accomm_sk = accommSk_Df.select('accommodation_sk').collect()[0].asDict()['accommodation_sk']
    else:
        # No match: generate a new surrogate key via the JVM random-number generator
        func = sc._gateway.jvm.RandomNumberGenerator()
        accomm_sk = func.generateRandomNumber().encode('ascii', 'ignore')
    return accomm_sk

Spark SQL:

# Calling the GeneratedAccommSk UDF from Spark SQL
rate_fact_df = spark.sql("""
    select case when b.accommodation_sk is not null then b.accommodation_sk
                else GeneratedAccommSk(a.accomm_hash)
           end as accommodation_sk
    from staging.contract_test a
    join dim.accomm_dim b
      on a.accomm_hash = b.accommodation_hash
""")
marjun

1 Answer


That's not going to work, for at least two reasons: a UDF is executed on the worker nodes, where no SparkSession is available, so GeneratedAccommSk cannot call spark.sql to query staging.accomm_dim; and the Py4J gateway (sc._gateway) exists only on the driver, so the JVM RandomNumberGenerator cannot be reached from inside a UDF either.
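A UDF that does only local computation on its arguments can, however, be registered and called from Spark SQL. As a minimal sketch, assuming a pure-Python stand-in for the JVM RandomNumberGenerator (the name generated_accomm_sk and the seeding scheme are illustrative, not the original generator's behaviour):

from pyspark.sql import SparkSession
from pyspark.sql.types import LongType
import random

spark = SparkSession.builder.getOrCreate()

# The fallback key generator must be a plain function of its argument:
# no spark.sql lookups and no Py4J calls are possible inside a UDF.
def generated_accomm_sk(accomm_hash):
    rng = random.Random(accomm_hash)   # seeded per hash, illustrative choice only
    return rng.getrandbits(63)

spark.udf.register("GeneratedAccommSk", generated_accomm_sk, LongType())

Once registered this way, GeneratedAccommSk(accomm_hash) can be referenced in Spark SQL, but only for the key generation, not for the dimension lookup.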

Depending on the size of the lookup table (staging.accomm_dim), you should either collect it and use a local object (Lookup in spark dataframes) or perform yet another join, as sketched below.
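A minimal sketch of the join approach, using the table and column names from the question and the GeneratedAccommSk UDF registered as above; the left joins are an assumption, since with inner joins the surrogate key could never be NULL:

rate_fact_df = spark.sql("""
    select a.*,
           coalesce(b.accommodation_sk,                -- match in the published dimension
                    s.accommodation_sk,                -- fallback match in the staging dimension
                    GeneratedAccommSk(a.accomm_hash))  -- no match anywhere: generate a key
             as accommodation_sk
    from staging.contract_test a
    left join dim.accomm_dim b
      on a.accomm_hash = b.accommodation_hash
    left join staging.accomm_dim s
      on a.accomm_hash = s.accomm_hash
""")

If staging.accomm_dim is small, the alternative is to collect it into a dict on the driver, broadcast it, and resolve the key inside a plain UDF against the broadcast value instead of performing the second join.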