val sc = new SparkContext(sparkConf)
val rdd = sc.parallelize(List("jcdz", "酥炸", "包子", "油条", "油条", "油条", "汉堡"))
// Split each element on commas, map each word to (word, 1), then sum the counts per key
rdd.flatMap(_.split(",")).map((_, 1)).reduceByKey(_ + _).foreach(println)
}
The result is shown in the figure.
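To see what `reduceByKey` computes here without starting a Spark cluster, the same logic can be sketched with plain Scala collections. This is an illustrative stand-in, not the original post's code: the `LocalWordCount` object and `wordCount` helper are hypothetical names, and `groupBy` plays the role of the shuffle that `reduceByKey` performs across partitions.

```scala
// A plain-Scala sketch of the same word-count pipeline, no Spark required.
object LocalWordCount {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(","))    // same split step as the RDD version
      .map((_, 1))              // pair each word with the count 1
      .groupBy(_._1)            // group pairs by word (local stand-in for the shuffle)
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum the 1s per word

  def main(args: Array[String]): Unit = {
    val data = List("jcdz", "酥炸", "包子", "油条", "油条", "油条", "汉堡")
    wordCount(data).foreach(println) // "油条" appears three times, the rest once
  }
}
```

Running this prints each `(word, count)` pair once, which matches what the `foreach(println)` on the RDD produces (though Spark's output order depends on partitioning).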
Spark WordCount summation
Original article: https://blog.51cto.com/u_15084467/2720428