Flink fails to read from HDFS: Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
Problem description
When reading a file on HDFS through the Flink API, the job fails with:
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
Caused by: org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-887458683-192.168.0.211-1679184223430:blk_1073742206_1383 file=/tmp/1.txt
Code:
test("从hdfs_文本文件中读取数据") {
//System.setProperty("HADOOP_USER_NAME", "root")
// 1. 获取流执行环境
val env = StreamExecutionEnvironment.getExecutionEnvironment
// 2. 将文本文件作为数据源
val ds: DataStream[String] = env.readTextFile("hdfs://worker01:8020/tmp/1.txt")
// 3. 打印DataStream
ds.print()
// 4. 出发程序执行
env.execute()
}
Cause analysis:
The client cannot communicate with the DataNode service, so the block read fails. The NameNode gives the client a DataNode address for each block; if that address is unreachable from the client machine (the data-transfer port is blocked, or the returned IP is an internal address the client cannot route to), the read ends with BlockMissingException.
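A quick way to confirm this is to probe the DataNode's data-transfer port from the machine that runs the Flink job. The sketch below is illustrative only; it assumes the DataNode hostname worker01 and the Hadoop 3.x default dfs.datanode.address port 9866.

import java.net.{InetSocketAddress, Socket}

object DataNodePortProbe {
  def main(args: Array[String]): Unit = {
    val socket = new Socket()
    try {
      // 3-second timeout; hostname and port are assumptions, adjust to your cluster
      socket.connect(new InetSocketAddress("worker01", 9866), 3000)
      println("DataNode port 9866 is reachable")
    } catch {
      case e: Exception => println(s"Cannot reach DataNode port 9866: ${e.getMessage}")
    } finally {
      socket.close()
    }
  }
}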
Solution:
Step 1: Check whether the DataNode data-transfer port (9866) is open in the firewall.
# Open port 9866
firewall-cmd --permanent --add-port=9866/tcp
# Restart the firewall
systemctl restart firewalld.service
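Restarting firewalld applies the permanent rule; firewall-cmd --reload achieves the same without interrupting the service, and firewall-cmd --list-ports lets you confirm that 9866/tcp is now listed.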
Step 2: Add an hdfs-site.xml file under the resources directory of the IDEA project.
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl" target="_blank" rel="external nofollow" ?>
<configuration>
<!-- 客户端连接datanode时是否使用datanode主机名(默认值:false)-->
<property>
<name>dfs.client.use.datanode.hostname</name>
<value>true</value>
</property>
</configuration>