While configuring Hadoop 3.x erasure coding (EC) today, an upload to HDFS failed with the following errors:
[mingyu@hadoop102 hadoop-3.1.3]$ hadoop fs -put 1.pptx /input
2023-03-14 19:40:43,384 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
2023-03-14 19:40:43,567 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:43,584 INFO hdfs.DataStreamer: Exception in createBlockOutputStream #1: blk_-9223372036854775423_1166
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:253)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1725)
at org.apache.hadoop.hdfs.StripedDataStreamer.nextBlockOutputStream(StripedDataStreamer.java:106)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
at org.apache.hadoop.hdfs.StripedDataStreamer.run(StripedDataStreamer.java:46)
2023-03-14 19:40:43,588 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[192.168.13.105:9866,DS-461042db-82e2-48a4-94dd-5d6e3d783035,DISK]
2023-03-14 19:40:43,592 WARN hdfs.DataStreamer: DataStreamer Exception
java.io.IOException: Unable to create new block.#1: failed, block==null
at org.apache.hadoop.hdfs.StripedDataStreamer.nextBlockOutputStream(StripedDataStreamer.java:114)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
at org.apache.hadoop.hdfs.StripedDataStreamer.run(StripedDataStreamer.java:46)
2023-03-14 19:40:43,593 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:43,755 WARN hdfs.DataStreamer: Could not get block locations. Source file "/input/1.pptx._COPYING_" - Aborting...#1: failed, block==null
2023-03-14 19:40:43,853 INFO hdfs.DataStreamer: Exception in createBlockOutputStream #3: blk_-9223372036854775421_1166
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:253)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1725)
at org.apache.hadoop.hdfs.StripedDataStreamer.nextBlockOutputStream(StripedDataStreamer.java:106)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
at org.apache.hadoop.hdfs.StripedDataStreamer.run(StripedDataStreamer.java:46)
2023-03-14 19:40:43,853 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[192.168.13.106:9866,DS-91b65fc7-a547-41ad-b8dc-b9729969c950,DISK]
2023-03-14 19:40:43,854 WARN hdfs.DataStreamer: DataStreamer Exception
java.io.IOException: Unable to create new block.#3: failed, block==null
at org.apache.hadoop.hdfs.StripedDataStreamer.nextBlockOutputStream(StripedDataStreamer.java:114)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
at org.apache.hadoop.hdfs.StripedDataStreamer.run(StripedDataStreamer.java:46)
2023-03-14 19:40:43,854 WARN hdfs.DataStreamer: Could not get block locations. Source file "/input/1.pptx._COPYING_" - Aborting...#3: failed, block==null
2023-03-14 19:40:43,854 WARN hdfs.DFSOutputStream: Failed: oldBytes=0, len=1048576, DFSStripedOutputStream:#3: failed, block==null
java.io.IOException: Could not get block locations. Source file "/input/1.pptx._COPYING_" - Aborting...#3: failed, block==null
at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1477)
at org.apache.hadoop.hdfs.DataStreamer.processDatanodeOrExternalError(DataStreamer.java:1256)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:667)
at org.apache.hadoop.hdfs.StripedDataStreamer.run(StripedDataStreamer.java:46)
2023-03-14 19:40:43,858 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:43,971 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:43,974 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:43,974 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2023-03-14 19:40:44,058 WARN hdfs.DFSOutputStream: Block group <1> failed to write 2 blocks. It's at high risk of losing data.
1. How the exception occurred
The operations at the time were as follows:
(1) Enable the RS-3-2-1024k erasure coding policy
hdfs ec -enablePolicy -policy RS-3-2-1024k
(2) Set the /input directory to use that erasure coding policy (a quick way to verify this is shown right after these steps)
hdfs ec -setPolicy -path /input -policy RS-3-2-1024k
(3) Upload a file from hadoop102 to /input
hadoop fs -put 1.pptx /input
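For reference, whether the policy actually took effect can be verified before uploading. These are standard hdfs ec subcommands, shown here only as a verification sketch; they were not part of my original steps:
hdfs ec -listPolicies              # RS-3-2-1024k should be listed with State=ENABLED
hdfs ec -getPolicy -path /input    # should print RS-3-2-1024k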
The upload in step (3) then failed with the exception shown at the top of this post.
Checking the uploaded file in the HDFS web UI showed that no data blocks had been written to hadoop105 or hadoop106.
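One way to narrow this down (an illustrative check, assuming the default DataNode transfer port 9866 seen in the log above) is to test connectivity from the client node to the excluded DataNodes:
ping -c 3 192.168.13.105
nc -zv 192.168.13.105 9866    # "No route to host" here points at the network or firewall, not HDFS itself
nc -zv 192.168.13.106 9866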
After a few hours of troubleshooting, I finally found the fix 😁😁
2. Solution
Turning off the firewall on hadoop105 and hadoop106 solved it:
systemctl status firewalld     # check the current firewall state
systemctl stop firewalld       # stop it for the current boot
systemctl disable firewalld    # keep it from starting again on reboot
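Disabling the firewall entirely is fine on a learning cluster. Where the firewall must stay on, a gentler alternative is to open only the DataNode ports. This is just a sketch, assuming firewalld and the Hadoop 3.x default ports; check hdfs-site.xml for the actual values in your cluster:
firewall-cmd --permanent --add-port=9866/tcp    # dfs.datanode.address (data transfer, the port in the error above)
firewall-cmd --permanent --add-port=9867/tcp    # dfs.datanode.ipc.address
firewall-cmd --permanent --add-port=9864/tcp    # dfs.datanode.http.address
firewall-cmd --reload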
Uploading again:
hadoop fs -put 1.pptx /input/2.pptx
All of the blocks are there this time.
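To double-check from the command line instead of the web UI, the block placement can also be inspected with standard HDFS tooling (not part of the original post):
hdfs ec -getPolicy -path /input                     # should report RS-3-2-1024k
hdfs fsck /input/2.pptx -files -blocks -locations   # lists every block in the group and the DataNodes that hold it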
3. Root cause
When I added the new servers to the cluster, I had not been careful about turning off their firewalls.
I have since added this step to my post on adding new servers to a cluster.
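A simple guard for next time is to check the firewall state on every node right after provisioning. A minimal sketch, assuming passwordless SSH and the hostnames used in this post (extend the list to the whole cluster):
for host in hadoop102 hadoop105 hadoop106; do    # add the remaining nodes here
  echo "== $host =="
  ssh "$host" "systemctl is-active firewalld; systemctl is-enabled firewalld"
done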