I am very new to Apache Spark. I am trying to create a JavaPairRDD from a HashMap. My map is of type HashMap<String, HashMap<Integer, String>>. How can I convert it into a JavaPairRDD? I have pasted my code below:
HashMap<String, HashMap<Integer, String>> canlist =
        new HashMap<String, HashMap<Integer, String>>();

for (String key : entityKey) {
    HashMap<Integer, String> clkey = new HashMap<Integer, String>();
    int f = 0;
    for (String val : mentionKey) {
        // do something (simiscore, longerLength, costs, m and v1 come from this omitted part)
        simiscore = (longerLength - costs[m.length()]) / (double) longerLength;
        if (simiscore > 0.6) {
            clkey.put(v1, val);
            System.out.print(
                    " The mention " + val + " added to link entity " + key);
        }
        f++;
        System.out.println("Scan Completed");
    }
    canlist.put(key, clkey);

    // my attempt to create the JavaPairRDD from canlist
    JavaPairRDD<String, HashMap<Integer, String>> rad;
    rad = context.parallelize(scala.collection.Seq(toScalaMap(canlist)));
}
public static <String, Object> Map<String, Object> toScalaMap(HashMap<String, Object> m) {
    return (Map<String, Object>) JavaConverters.mapAsScalaMapConverter(m).asScala().toMap(
            Predef.<Tuple2<String, Object>>conforms());
}
}
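From what I have read so far, I think the conversion might be done by copying the map entries into a List of Tuple2 and calling parallelizePairs, roughly like the sketch below. This is only a minimal standalone example of what I have in mind (the class name, the SparkConf setup, and the empty canlist are just placeholders; in my real code the map is filled by the loops above), and I am not sure whether this is the right way to do it:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class PairRddFromMap {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("PairRddFromMap").setMaster("local[*]");
        JavaSparkContext context = new JavaSparkContext(conf);

        // stand-in for the canlist map that is built in the loops above
        HashMap<String, HashMap<Integer, String>> canlist =
                new HashMap<String, HashMap<Integer, String>>();

        // copy each map entry into a Tuple2 so Spark can treat it as a (key, value) pair
        List<Tuple2<String, HashMap<Integer, String>>> tuples =
                new ArrayList<Tuple2<String, HashMap<Integer, String>>>();
        for (Map.Entry<String, HashMap<Integer, String>> e : canlist.entrySet()) {
            tuples.add(new Tuple2<String, HashMap<Integer, String>>(e.getKey(), e.getValue()));
        }

        // parallelizePairs turns the list of tuples into a JavaPairRDD
        JavaPairRDD<String, HashMap<Integer, String>> rad = context.parallelizePairs(tuples);

        System.out.println("Number of pairs: " + rad.count());
        context.stop();
    }
}

If that is the right direction, I assume the RDD should only be created once, after the outer loop finishes, instead of inside the loop like in my code above.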