
Flume: reading data from Kafka

Posted: 2016-10-09 20:03:46
a1.sources = r1
a1.sinks = k1
a1.channels = c1
 
# Use the built-in Kafka source
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
# ZooKeeper quorum that the Kafka source connects to
a1.sources.r1.zookeeperConnect = localhost:2181
a1.sources.r1.topic = kkt-test-topic
a1.sources.r1.batchSize = 100
a1.sources.r1.channels = c1
 
# Sink: write the events to HDFS
a1.sinks.k1.channel = c1
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://iz94rak63uyz/user/flume
a1.sinks.k1.hdfs.writeFormat = Text
a1.sinks.k1.hdfs.fileType = DataStream
# Roll files by size only (~1 MB); 0 disables time- and count-based rolling
a1.sinks.k1.hdfs.rollInterval = 0
a1.sinks.k1.hdfs.rollSize = 1000000
a1.sinks.k1.hdfs.rollCount = 0
a1.sinks.k1.hdfs.batchSize = 1000
a1.sinks.k1.hdfs.txnEventMax = 1000
a1.sinks.k1.hdfs.callTimeout = 60000
a1.sinks.k1.hdfs.appendTimeout = 60000
 
 
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 1000
 
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
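Note that the `zookeeperConnect` style of Kafka source configuration above is for older Flume releases. From Flume 1.7 onward the Kafka source talks to the brokers directly and the ZooKeeper properties are removed. A sketch of the equivalent source section for a newer Flume, assuming a broker on `localhost:9092`:

```properties
# Flume >= 1.7: Kafka source configured against the brokers, not ZooKeeper
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.kafka.bootstrap.servers = localhost:9092
a1.sources.r1.kafka.topics = kkt-test-topic
a1.sources.r1.batchSize = 100
a1.sources.r1.channels = c1
```

The rest of the agent (channel and HDFS sink) is unchanged.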

Command to start Flume:

flume-ng agent --conf conf --conf-file flume.conf --name a1 -Dflume.root.logger=INFO,console
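To smoke-test the pipeline once the agent is up, you can push a few messages into the topic and then look for the rolled files under the sink path. A sketch, assuming Kafka's console producer is on the PATH and the broker runs on `localhost:9092` (both are assumptions, not from the original post):

```shell
# Produce a test event into the topic the source is subscribed to
echo "hello flume" | kafka-console-producer.sh \
  --broker-list localhost:9092 --topic kkt-test-topic

# After the sink flushes a batch, the events should appear under the HDFS path
hdfs dfs -ls /user/flume
hdfs dfs -cat /user/flume/FlumeData.*
```

HDFS sink files stay open with a `.tmp` suffix until a roll condition is hit (here, ~1 MB of data), so small test batches may not be visible immediately.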

  


Original post: http://www.cnblogs.com/wbh1000/p/5943334.html
