Channel: Apache Timeline

"Hit max consecutive under-replication rotations" Error

Hi all,

While writing data from Flume to an HDFS sink, I'm getting this error:

[ERROR - org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:566)] Hit max consecutive under-replication rotations (30); will not continue rolling files under this path due to under-replication

I looked up the error online, and the suggested fix was to set dfs.replication as shown below. I did that, but the problem still persists.

My hadoop configuration hdfs-site.xml has the property

<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
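In case it matters: one way to confirm what the cluster actually resolves dfs.replication to, and what replication the already-written files got (assuming the hadoop CLI is on the PATH; the path below is just my sink path from the config further down):

```shell
# Print the effective replication factor as the configuration resolves it
hdfs getconf -confKey dfs.replication

# Print the actual replication factor (%r) of files the sink has written
hdfs dfs -stat %r /input1/event/*
```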

I also get this message 30 times before the above error message:
"Block Under-replication detected. Rotating file."

My flume conf file has the configuration:
a1.sinks.k1.channel = c1
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://xx.xx.xx.xx:8020/input1/event/%y-%m-%d/%H%M
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
a1.sinks.k1.hdfs.writeFormat = Text
a1.sinks.k1.hdfs.fileType = DataStream
#a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.rollCount = 1000
a1.sinks.k1.hdfs.batchSize = 10000
a1.sinks.k1.hdfs.rollSize = 0
a1.sinks.k1.hdfs.rollInterval = 30
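For comparison, the Flume user guide also lists an hdfs.minBlockReplicas property on the HDFS sink (it defaults to the value from the classpath's hdfs-site.xml); setting it explicitly in the same agent config would look like the sketch below. I have not verified this fixes the rotation error:

```properties
# Hypothetical addition to the a1/k1 sink above:
# caps how many replicas the sink requires before it stops
# treating a block as under-replicated and rotating the file.
a1.sinks.k1.hdfs.minBlockReplicas = 1
```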

Kindly let me know what I'm doing wrong.

Sincerely,
Sanjay Ramanathan
