Spark Java API: Using join
將一組數(shù)據(jù)轉(zhuǎn)化為RDD后,分別創(chuàng)造出兩個PairRDD,然后再對兩個PairRDD進(jìn)行歸約(即合并相同Key對應(yīng)的Value),過程如下圖所示:
代碼實(shí)現(xiàn)如下:
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class SparkRDDDemo {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkRDD").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> rdd = sc.parallelize(data);

        // First PairRDD: (num, num * num)
        JavaPairRDD<Integer, Integer> firstRDD = rdd.mapToPair(new PairFunction<Integer, Integer, Integer>() {
            @Override
            public Tuple2<Integer, Integer> call(Integer num) throws Exception {
                return new Tuple2<>(num, num * num);
            }
        });

        // Second PairRDD: (num, letter), the letter derived from num * num
        JavaPairRDD<Integer, String> secondRDD = rdd.mapToPair(new PairFunction<Integer, Integer, String>() {
            @Override
            public Tuple2<Integer, String> call(Integer num) throws Exception {
                return new Tuple2<>(num, String.valueOf((char) (64 + num * num)));
            }
        });

        // Join on the shared keys: each element is (key, (value1, value2))
        JavaPairRDD<Integer, Tuple2<Integer, String>> joinRDD = firstRDD.join(secondRDD);

        // Format each joined element as "<key,<value1,value2>>"
        JavaRDD<String> res = joinRDD.map(new Function<Tuple2<Integer, Tuple2<Integer, String>>, String>() {
            @Override
            public String call(Tuple2<Integer, Tuple2<Integer, String>> tuple) throws Exception {
                int key = tuple._1();
                int value1 = tuple._2()._1();
                String value2 = tuple._2()._2();
                return "<" + key + ",<" + value1 + "," + value2 + ">>";
            }
        });

        List<String> resList = res.collect();
        for (String str : resList) {
            System.out.println(str);
        }

        sc.stop();
    }
}
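One detail worth noting: `join` on a pair RDD is an inner join, so keys that appear in only one of the two PairRDDs are dropped from the result (Spark offers `leftOuterJoin`, `rightOuterJoin`, and `fullOuterJoin` for the other behaviors). A plain-Java sketch of the drop behavior, using hypothetical data not from the example above:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class InnerJoinDrop {
    public static void main(String[] args) {
        Map<Integer, Integer> first = Map.of(1, 1, 2, 4, 6, 36);      // key 6 only here
        Map<Integer, String> second = Map.of(1, "A", 2, "D", 7, "X"); // key 7 only here
        List<Integer> joinedKeys = new ArrayList<>();
        for (Integer k : first.keySet()) {
            if (second.containsKey(k)) {
                joinedKeys.add(k); // only keys present on both sides survive
            }
        }
        Collections.sort(joinedKeys);
        System.out.println(joinedKeys); // keys 6 and 7 are dropped
    }
}
```

In the article's example every key 1..5 exists on both sides, which is why all five rows appear in the output.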
Summary

This example builds two PairRDDs from the same source RDD and combines them with join, producing one (key, (value1, value2)) tuple per key shared by both sides.