[hadoop@master ~]$ touch bigfile.tar
[hadoop@master ~]$ cat hadoop-2.5.2.tar.gz >> bigfile.tar
[hadoop@master ~]$ cat hadoop-2.5.2.tar.gz >> bigfile.tar
[hadoop@master ~]$
[hadoop@master ~]$ hadoop fs -put bigfile.tar /        # upload the file to the remote directory (/)
15/12/03 00:57:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/12/03 00:57:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.209.102:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1377)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1281)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:526)
15/12/03 00:57:26 INFO hdfs.DFSClient: Abandoning BP-2062059271-192.168.209.100-1448384244888:blk_1073741864_1040
15/12/03 00:57:26 INFO hdfs.DFSClient: Excluding datanode 192.168.209.102:50010
15/12/03 00:57:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.209.101:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1377)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1281)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:526)
15/12/03 00:57:26 INFO hdfs.DFSClient: Abandoning BP-2062059271-192.168.209.100-1448384244888:blk_1073741865_1041
15/12/03 00:57:26 INFO hdfs.DFSClient: Excluding datanode 192.168.209.101:50010
Cause:
1. A firewall was suddenly enabled on one of the node machines, so the client can no longer connect to its datanode (the firstBadLink address in the log names the offending node; see the check below).
2. A datanode process on some node was force-killed (reportedly).
3. A machine simply went down? If that alone were enough to break the write, what would HDFS's replication and fault-tolerance design be for? (speculation)
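To tell which cause applies, note that the firstBadLink address in the log names the exact datanode that refused the connection. A quick check from the client might look like the following (the addresses are the ones from the log above; having telnet installed and root SSH access to the node are assumptions):

[hadoop@master ~]$ telnet 192.168.209.102 50010                        # 50010 is the datanode data-transfer port named in the error
[hadoop@master ~]$ ssh root@192.168.209.102 'service iptables status'  # shows whether the firewall is running and which rules are loaded

If the connection to port 50010 hangs or is refused while the datanode process is still alive, the firewall is the likely culprit.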
Solution:
As root, stop the firewall: service iptables stop
Most likely one of the nodes was rebooted and iptables came back up with it.
Use chkconfig iptables off so the problem does not return after a reboot,
and use service iptables stop to shut it down right away without rebooting, as in the sketch below.
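Putting that together, a minimal fix on each affected datanode might be the following (run as root; the slave1 hostname is only a placeholder, and a CentOS 6-style init system with the iptables service is assumed):

[root@slave1 ~]# service iptables stop        # stop the firewall immediately, no reboot needed
[root@slave1 ~]# chkconfig iptables off       # keep it from starting again on the next reboot
[root@slave1 ~]# chkconfig --list iptables    # verify: every runlevel should now show "off"

Then retry the upload from the client and confirm the file landed:

[hadoop@master ~]$ hadoop fs -put bigfile.tar /
[hadoop@master ~]$ hadoop fs -ls /            # bigfile.tar should now appear in the listing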