Using flume-ng-sql-source to pull data from MySQL into Kafka for consumption by Storm


1. Download and build flume-ng-sql-source. Download URL: https://github.com/keedio/flume-ng-sql-source.git

   Build it following the installation instructions in the project's README, then copy the resulting jar (together with the MySQL JDBC driver it needs) into Flume's lib directory.
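A minimal sketch of those steps, assuming Maven is installed and Flume lives at /home/hadoop/export/server/apache-flume-1.7.0-bin (the path used in the configuration below); the jar version numbers will vary:

    git clone https://github.com/keedio/flume-ng-sql-source.git
    cd flume-ng-sql-source
    mvn package -DskipTests
    # put the plugin and the MySQL driver on Flume's classpath
    cp target/flume-ng-sql-source-*.jar /home/hadoop/export/server/apache-flume-1.7.0-bin/lib/
    cp /path/to/mysql-connector-java-*.jar /home/hadoop/export/server/apache-flume-1.7.0-bin/lib/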

2. Write the flume-ng configuration file

a1.channels = ch-1
a1.sources = src-1
a1.sinks = k1

########### sql source #################
# For each one of the sources, the type is defined
a1.sources.src-1.type = org.keedio.flume.source.SQLSource
a1.sources.src-1.hibernate.connection.url = jdbc:mysql://172.16.43.21:3306/test
# Hibernate Database connection properties
a1.sources.src-1.hibernate.connection.user = hadoop
a1.sources.src-1.hibernate.connection.password = hadoop
a1.sources.src-1.hibernate.connection.autocommit = true
a1.sources.src-1.hibernate.dialect = org.hibernate.dialect.MySQL5Dialect
a1.sources.src-1.hibernate.connection.driver_class = com.mysql.jdbc.Driver
a1.sources.src-1.run.query.delay = 5000
a1.sources.src-1.status.file.path = /home/hadoop/export/server/apache-flume-1.7.0-bin
a1.sources.src-1.status.file.name = sqlSource.status
# Custom query
a1.sources.src-1.start.from = 0
a1.sources.src-1.custom.query = select `id`, `str` from json_str where id > $@$ order by id asc
a1.sources.src-1.batch.size = 1000
a1.sources.src-1.max.rows = 1000
a1.sources.src-1.hibernate.connection.provider_class = org.hibernate.connection.C3P0ConnectionProvider
a1.sources.src-1.hibernate.c3p0.min_size = 1
a1.sources.src-1.hibernate.c3p0.max_size = 10

########### memory channel #################
a1.channels.ch-1.type = memory
a1.channels.ch-1.capacity = 10000
a1.channels.ch-1.transactionCapacity = 10000
a1.channels.ch-1.byteCapacityBufferPercentage = 20
a1.channels.ch-1.byteCapacity = 800000

########### kafka sink #################
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic = testuser
a1.sinks.k1.brokerList = test0:9092,test1:9092,test2:9092
a1.sinks.k1.requiredAcks = 1
a1.sinks.k1.batchSize = 20
a1.sinks.k1.channel = ch-1
a1.sources.src-1.channels = ch-1
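Save the listing to a file and start the agent as usual; the config file name below is my own choice, while the agent name a1, the brokers, and the topic testuser come from the listing:

    bin/flume-ng agent --conf conf --conf-file conf/mysql-kafka.conf --name a1 -Dflume.root.logger=INFO,console

    # on any Kafka host, confirm that rows are arriving
    bin/kafka-console-consumer.sh --bootstrap-server test0:9092 --topic testuser --from-beginning

(Older Kafka releases take --zookeeper instead of --bootstrap-server.)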

3. Problems encountered

After the MySQL rows are collected into Kafka, the records contain many extra double quotes. This is most likely because flume-ng-sql-source serializes each row as a CSV line: the CSV writer it uses wraps every field in double quotes and escapes embedded quotes by doubling them, so a JSON string stored in the str column arrives both quoted and double-escaped.

MySQL data format: (original screenshot not preserved)

Kafka data format: (original screenshot not preserved)
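As an illustration (assumed values, since the screenshots did not survive), a row such as

    id = 1, str = {"name":"tom","age":18}

shows up in the Kafka topic as

    "1","{""name"":""tom"",""age"":18}"

with every field quoted and each embedded quote doubled.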


Use Storm to clean up the format of the data read from Kafka, as sketched below.
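A minimal sketch of such a cleanup bolt (Storm 1.x API). It assumes the upstream KafkaSpout emits the raw message as the tuple's first string field, and that the id column never contains quotes or commas; the class name CsvCleanupBolt and the output field names are my own. For messier input, a real CSV parser (e.g. opencsv) would be safer than this hand-rolled split:

import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

// Recovers (id, json) from a record like "1","{""name"":""tom""}"
public class CsvCleanupBolt extends BaseBasicBolt {

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        String raw = input.getString(0).trim();
        // The first "," is the separator between the two fields,
        // because id is numeric and cannot contain quotes or commas.
        int sep = raw.indexOf("\",\"");
        if (sep < 0 || !raw.startsWith("\"") || !raw.endsWith("\"")) {
            return; // malformed record; a real topology would log or fail it
        }
        String id = raw.substring(1, sep);
        // Strip the outer quotes and undo the "" escaping inside the JSON
        String json = raw.substring(sep + 3, raw.length() - 1).replace("\"\"", "\"");
        collector.emit(new Values(id, json));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("id", "json"));
    }
}

Downstream bolts then receive the id and the original JSON string exactly as it was stored in MySQL.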

