Next, let's walk through an example configuration showing how to connect Flink to HDFS.
Add the dependencies to pom.xml:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-hadoop-compatibility_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
</dependency>
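The `${flink.version}` and `${hadoop.version}` placeholders must be defined in a `<properties>` block of the POM. A minimal sketch, where the version numbers are only illustrative assumptions and should match your cluster:

```xml
<properties>
    <!-- Illustrative versions only; use the versions matching your Flink and Hadoop setup -->
    <flink.version>1.9.0</flink.version>
    <hadoop.version>2.7.7</hadoop.version>
</properties>
```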
Place hdfs-site.xml and core-site.xml under the src/main/resources directory.
// Read a text file from HDFS with the DataSet API
final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
DataSource<String> text = env.readTextFile("hdfs://flinkhadoop:9000/user/wuhulala/input/core-site.xml");
// Trigger execution and print the file's lines
text.print();
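The `hdfs://flinkhadoop:9000/...` address must match the NameNode host and port configured as `fs.defaultFS` in core-site.xml. As a quick sanity check of how such a path breaks down (the hostname and path below are just the example's values), the components can be inspected with `java.net.URI`:

```java
import java.net.URI;

public class HdfsUriCheck {
    public static void main(String[] args) {
        // An HDFS path has the shape scheme://namenode-host:port/absolute/path
        URI uri = URI.create("hdfs://flinkhadoop:9000/user/wuhulala/input/core-site.xml");
        System.out.println(uri.getScheme()); // hdfs
        System.out.println(uri.getHost());   // flinkhadoop -> must match fs.defaultFS
        System.out.println(uri.getPort());   // 9000        -> must match fs.defaultFS
        System.out.println(uri.getPath());   // /user/wuhulala/input/core-site.xml
    }
}
```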
If the job fails with an HDFS permission error, you can disable permission checks in hdfs-site.xml:

<property>
    <name>dfs.permissions</name>
    <value>false</value>
</property>
Learning Flink from 0 to 1 (20): Reading HDFS Files with Flink
Original post: https://www.cnblogs.com/huanghanyu/p/13632836.html