Change the scope of the Flink dependencies in the pom file (flink-java, flink-clients, flink-streaming) to provided.
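For reference, the dependency entries might look roughly like the following. This is a sketch: the exact artifact IDs (including the Scala suffix, e.g. _2.11) and the ${flink.version} property depend on the project's own pom and are assumptions here.

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
    <!-- provided: compiled against, but not bundled into the jar -->
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>

Because the flink run command already puts the Flink runtime on the classpath, these classes do not need to be shaded into the application jar, which is why the package shrinks so dramatically.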
Then run mvn clean package. The resulting jar is only about 12 KB (versus 46 MB when the scope is compile).
Then run the Flink jar on my own Mac with the following command:
flink run -c bigdata.batch.WordCount /Users/walker/tmp/learning-flink-1.0.jar
It runs successfully.
Input and output of the run:
test_in.txt:
flink spark flink flink spark hadoop flink hadoop flink flink spark flink flink
flink run -c bigdata.batch.WordCount /Users/walker/tmp/learning-flink-1.0.jar --input ~/tmp/test_in.txt --output ~/tmp/test_out.txt
test_out.txt:
flink 8
hadoop 2
spark 3
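The post does not show the source of bigdata.batch.WordCount, but a minimal batch WordCount that consumes the --input and --output arguments could look roughly like this. It is a sketch, not the author's exact code: it uses the DataSet API with ParameterTool, and the space-separated output format and the parallelism-1 sink are assumptions chosen to match the test_out.txt lines above.

package bigdata.batch;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.util.Collector;

public class WordCount {
    public static void main(String[] args) throws Exception {
        // --input and --output are parsed from the command-line arguments
        ParameterTool params = ParameterTool.fromArgs(args);
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = env.readTextFile(params.get("input"));

        DataSet<Tuple2<String, Integer>> counts = text
                .flatMap(new Tokenizer())  // split each line into (word, 1) pairs
                .groupBy(0)                // group by the word
                .sum(1);                   // sum the counts per word

        // write "word count" lines; parallelism 1 produces a single output file
        counts.writeAsCsv(params.get("output"), "\n", " ").setParallelism(1);
        env.execute("WordCount");
    }

    // splits lines on non-word characters and emits (word, 1)
    public static final class Tokenizer implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
            for (String word : line.toLowerCase().split("\\W+")) {
                if (!word.isEmpty()) {
                    out.collect(new Tuple2<>(word, 1));
                }
            }
        }
    }
}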
Original post: https://www.cnblogs.com/wooluwalker/p/11909393.html