I have a problem where a DefaultMutableTreeNode variable's value reverts to its defaults once I use it inside a Spark mapToPair() function. Here is my code:
public class CA implements Serializable {
    private final JavaRDD<String> input;
    private final List<IB> bList;

    public boolean FuncWithSpark() {
        /*
        !!! at this point, bList.get(0).getD().getRoot() returns a valid tree node
        */
        JavaPairRDD<String, List<String>> counters = input.mapToPair(new PairFunction<String, String, List<String>>() {
            @Override
            public Tuple2<String, List<String>> call(String s) throws Exception {
                /*
                !!! at this point, bList.get(0).getD().getRoot() returns an uninitialized tree node with default values
                */
                ...
            }
        });
        ...
    }

    public CA(JavaRDD<String> input, List<IB> bList) {
        this.input = input;
        this.bList = bList;
    }
}
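To narrow it down, here is a minimal round-trip sketch outside Spark. As far as I know, Spark serializes the closure (including the enclosing CA and its bList) with plain Java serialization by default, so this should mimic what happens to the tree node; SerDeTest and roundTrip are just names I made up for the sketch:

import java.io.*;
import javax.swing.tree.DefaultMutableTreeNode;

public class SerDeTest {
    // serialize and immediately deserialize, roughly what happens
    // when a closure is shipped from the driver to an executor
    static Object roundTrip(Object o) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        DefaultMutableTreeNode root = new DefaultMutableTreeNode("root");
        root.add(new DefaultMutableTreeNode("child"));

        DefaultMutableTreeNode copy = (DefaultMutableTreeNode) roundTrip(root);
        System.out.println(copy.getChildCount());             // children is not transient, so this prints 1
        System.out.println(copy.getFirstChild().getParent()); // parent is transient in the JDK, so this prints null
    }
}

If the node comes back with default values here as well, that would point at serialization rather than at Spark itself.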
The interfaces IB and ID and the classes CB and CD are defined like this:
public interface IB {
    ...
}

public interface ID {
    ...
}

public class CB implements IB, Serializable {
    private final ID d;

    public CB(ID d) {
        this.d = d;
    }

    public ID getD() {
        return this.d;
    }
}

public class CD implements ID, Serializable {
    private DefaultMutableTreeNode rootNode;

    public CD(DefaultMutableTreeNode rootNode) {
        this.rootNode = rootNode;
    }

    public DefaultMutableTreeNode getRoot() {
        return this.rootNode;
    }
}
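With the same roundTrip helper from above, one could also round-trip a CD instance directly to see whether the tree survives serialization on its own (someRoot is hypothetical, standing for a tree built elsewhere):

CD d = new CD(someRoot);              // someRoot: a DefaultMutableTreeNode built elsewhere
CD copy = (CD) SerDeTest.roundTrip(d);
System.out.println(copy.getRoot());   // does the root come back intact?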
My question is: what happens to the DefaultMutableTreeNode variable in CA.FuncWithSpark()? Is it caused by the Spark transformation (serializing and deserializing the closure), or by the fact that DefaultMutableTreeNode's member variables are protected with no accessors to them? Please give me a direction for tackling this problem. Thanks in advance for any help!
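In case serialization is the culprit, one direction I am considering is copying the Swing tree into a plain serializable structure before handing it to Spark, so nothing depends on DefaultMutableTreeNode's internals (PlainNode is just a name I made up for this sketch):

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import javax.swing.tree.DefaultMutableTreeNode;

public class PlainNode implements Serializable {
    final Object value;                              // the node's user object; assumed to be Serializable itself
    final List<PlainNode> children = new ArrayList<>();

    PlainNode(Object value) {
        this.value = value;
    }

    // recursively copy a Swing tree into this plain structure
    static PlainNode from(DefaultMutableTreeNode n) {
        PlainNode p = new PlainNode(n.getUserObject());
        for (int i = 0; i < n.getChildCount(); i++) {
            p.children.add(from((DefaultMutableTreeNode) n.getChildAt(i)));
        }
        return p;
    }
}

Would that be a reasonable workaround, or is there a way to keep DefaultMutableTreeNode itself?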