Fetching Data from a Kafka Topic and Analyzing It in Storm


Fetching data from the Kafka topic

    import java.util.Arrays;

    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.TopologyBuilder;
    import storm.kafka.BrokerHosts;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    String zks = "x.x.x.x:2181,x.x.x.x:2181,x.x.x.x:2181";
    String topic = "test";
    String zkRoot = "/storm";
    String id = "word"; // defaults to "word"
    BrokerHosts brokerHosts = new ZkHosts(zks);

    /**
     * 1. Configure the spout
     */
    SpoutConfig spoutConf = new SpoutConfig(brokerHosts, topic, zkRoot, "KafkaSpout-reader");
    spoutConf.scheme = new SchemeAsMultiScheme(new StringScheme());
    spoutConf.forceFromStart = false; // do not force reading from the earliest offset
    spoutConf.zkServers = Arrays.asList(new String[] {"x.x.x.x", "x.x.x.x", "x.x.x.x"});
    spoutConf.zkPort = 2181;
    spoutConf.startOffsetTime = kafka.api.OffsetRequest.LatestTime();
    spoutConf.fetchMaxWait = 10000000;

    /**
     * 2. Create the TopologyBuilder and set the spout
     *    (KafkaSpout comes from storm-kafka-0.9.2-incubating.jar:
     *    public class storm.kafka.KafkaSpout extends backtype.storm.topology.base.BaseRichSpout)
     */
    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("KafkaSpout-reader", new KafkaSpout(spoutConf), 1); // spoutConf carries the SpoutConfig settings above
    builder.setBolt("xxx", new KafkaBolt(), 3).shuffleGrouping("KafkaSpout-reader");
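The `StringScheme` configured above tells the spout how to deserialize each Kafka message's raw byte payload into a single string tuple field. A minimal, framework-free sketch of that decoding step (the class name `StringSchemeSketch` is hypothetical, not part of storm-kafka):

```java
import java.nio.charset.StandardCharsets;

// Hypothetical standalone sketch of what a string scheme does:
// turn the raw byte[] payload of a Kafka message into a UTF-8 string
// that the spout can emit as a one-field tuple.
public class StringSchemeSketch {
    public static String deserialize(byte[] payload) {
        return new String(payload, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] message = "hello storm".getBytes(StandardCharsets.UTF_8);
        System.out.println(deserialize(message));
    }
}
```

Because the scheme emits plain strings, downstream bolts receive each Kafka message as ordinary text rather than bytes.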

Handing the fetched data off to Storm for processing
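Once the spout emits each message as a tuple, a downstream bolt's `execute()` method does the actual analysis. A minimal, framework-free sketch of the kind of per-message word-count logic such a bolt might run (the class and method names here are hypothetical, not from storm-kafka):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the per-message logic a word-count bolt's
// execute() method could apply to each tuple emitted by the KafkaSpout.
public class WordCountLogic {
    // Split one message on whitespace and tally word occurrences.
    public static Map<String, Integer> countWords(String line) {
        Map<String, Integer> counts = new HashMap<>();
        for (String word : line.trim().split("\\s+")) {
            if (!word.isEmpty()) {
                counts.merge(word, 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        // e.g. "storm kafka storm" -> {kafka=1, storm=2}
        System.out.println(countWords("storm kafka storm"));
    }
}
```

In a real topology this tallying would live inside a bolt class extending `BaseRichBolt`, with the counts emitted or stored rather than returned.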
