
I have two datasets, as shown below. I'm trying to find out how many products are associated with each game, i.e. keep a count of the number of associated products.

   scala> df1.show()
   gameid     | games     | users         | cnt_assoc_prod  
   -------------------------------------------
   1          | cricket   |[111, 121]     |
   2          | basketball|[211]          |
   3          | skating   |[101, 100, 98] |

   scala> df2.show()
   user  | products 
   ----------------------
   98    | "shampoo"   
   100   | "soap"          
   101   | "shampoo"
   111   | "shoes"
   121   | "honey"
   211   | "shoes"

I'm trying to iterate over each user in df1's users array and find the corresponding row in df2 by filtering on the user column.

df1.map { x =>
   var assoc_products = Set[String]()
   x.users.foreach(y =>
      assoc_products += df2.filter(z => z.user == y).first().products)
   x.cnt_assoc_prod = assoc_products.size
}

While applying the filter I get the following exception:

java.lang.NullPointerException
    at org.apache.spark.sql.Dataset.logicalPlan(Dataset.scala:784)
    at org.apache.spark.sql.Dataset.mapPartitions(Dataset.scala:344)
    at org.apache.spark.sql.Dataset.filter(Dataset.scala:307)

I'm using Spark version 1.6.1.

pushpavanthar

1 Answer


You can explode the users column in df1, join with df2 on the user column, and then group by game and count:

import org.apache.spark.sql.functions._   // explode, col, count
import sqlContext.implicits._             // enables the $"..." column syntax

(df1.withColumn("user", explode(col("users")))
    .join(df2, Seq("user"))
    .groupBy("gameid", "games")
    .agg(count($"products").alias("cnt_assoc_prod"))
).show

+------+----------+--------------+
|gameid|     games|cnt_assoc_prod|
+------+----------+--------------+
|     3|   skating|             3|
|     2|basketball|             1|
|     1|   cricket|             2|
+------+----------+--------------+
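
Note that count counts one row per matching user, so a product shared by two users of the same game is counted twice: for skating, users 98 and 101 both map to "shampoo", giving 3. If, as the Set in the question suggests, each distinct product should be counted only once, countDistinct is a drop-in replacement (a minimal variant of the query above, not part of the original answer):

// Same query, but counting each distinct product once per game,
// matching the Set-based logic in the question (skating would be 2).
(df1.withColumn("user", explode(col("users")))
    .join(df2, Seq("user"))
    .groupBy("gameid", "games")
    .agg(countDistinct($"products").alias("cnt_assoc_prod"))
).show
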
Psidom
    I learnt that there is trouble awaiting if we try to access one RDD within another from this [link](https://stackoverflow.com/a/34660825/3213772). Btw, above solution worked for me. Thanks!! – pushpavanthar Oct 04 '17 at 12:09
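
For reference, the nested-access restriction the linked answer describes can also be worked around by collecting the small lookup table to the driver and broadcasting it, so that no DataFrame is referenced inside another DataFrame's closure. A sketch along those lines, assuming df2 fits in driver memory and that the user and gameid columns are Int (not part of the original answer):

// Collect df2 into a driver-side Map[user -> product] and broadcast it,
// so executors never touch the df2 DataFrame itself.
val productByUser = sc.broadcast(
  df2.rdd.map(r => (r.getAs[Int]("user"), r.getAs[String]("products"))).collectAsMap())

val counts = df1.rdd.map { r =>
  val users = r.getAs[Seq[Int]]("users")
  // Distinct products, mirroring the Set in the question.
  val products = users.flatMap(u => productByUser.value.get(u)).toSet
  (r.getAs[Int]("gameid"), r.getAs[String]("games"), products.size)
}

counts.collect().foreach(println)
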