Specifying Partitions with a Kafka Producer
When we use a Kafka producer, the system assigns partitions by default, but we can also control which partition a message is stored in by controlling the message key.
First, create a SimplePartitioner class:
package com.teamsun.kafka.m001;

import kafka.producer.Partitioner;
import kafka.utils.VerifiableProperties;

public class SimplePartitioner implements Partitioner {

    // Kafka instantiates the partitioner reflectively and passes in the
    // producer properties, so a constructor taking VerifiableProperties is required.
    public SimplePartitioner(VerifiableProperties props) {
    }

    @Override
    public int partition(Object key, int numPartitions) {
        String k = (String) key;
        // Map the key's hash code into [0, numPartitions). Note that
        // Math.abs(Integer.MIN_VALUE) is still negative, so production code
        // should mask the sign bit instead.
        int partition = Math.abs(k.hashCode()) % numPartitions;
        System.out.println(partition);
        return partition;
    }
}
In this class we direct each message to a specific partition based on its key, here by hashing the key.
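To see how this scheme maps keys to partitions, here is a standalone sketch of the same hash-modulo computation (the key strings and the partition count of 4 are made up for the demo):

```java
public class PartitionDemo {
    // Same formula as SimplePartitioner.partition(): hash the key and take
    // the remainder modulo the partition count.
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode()) % numPartitions;
    }

    public static void main(String[] args) {
        int numPartitions = 4; // assumed partition count for illustration
        for (String key : new String[] {"key0", "key1", "key2", "key0"}) {
            System.out.println(key + " -> partition "
                    + partitionFor(key, numPartitions));
        }
    }
}
```

Because the assignment depends only on the key's hash code, the same key always lands on the same partition, which is what makes key-based ordering possible.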
Next, create a Kafka configuration interface:
package com.teamsun.kafka.m001;

public interface KafkaProperties {
    final static String zkConnect = "hadoop0:42182,hadoop1:42182,hadoop2:42182,hadoop3:42182";
    final static String groupId1 = "group1";
    final static String topic = "test3";
    final static String kafkaServerURL = "hadoop0,hadoop1,hadoop2,hadoop3";
    final static int kafkaServerPort = 9092;
    final static int kafkaProducerBufferSize = 64 * 1024;
    final static int connectionTimeOut = 20000;
    final static int reconnectInterval = 10000;
    final static String clientId = "SimpleConsumerDemoClient";
}
Create the producer:
package com.teamsun.kafka.m001;

import java.util.Properties;
import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class PartitionerProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("metadata.broker.list",
                "hadoop0:9092,hadoop1:9092,hadoop2:9092,hadoop3:9092");
        // Plug in the custom partitioner defined above.
        props.put("partitioner.class", "com.teamsun.kafka.m001.SimplePartitioner");
        props.put("request.required.acks", "1");
        Producer<String, String> producer = new Producer<String, String>(
                new ProducerConfig(props));
        String topic = "test3";
        for (int i = 0; i <= 1000000; i++) {
            String k = "key" + i;
            String v = k + "--value" + i;
            producer.send(new KeyedMessage<String, String>(topic, k, v));
            System.out.println(k + v);
        }
        producer.close();
    }
}
Note that the partitioner.class property references the SimplePartitioner class created earlier.
Create the consumer:
package com.teamsun.kafka.m001;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import com.teamsun.kafka.m001.KafkaProperties;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class KafkaConsumer1 extends Thread {
    private final ConsumerConnector consumer;
    private final String topic;

    public KafkaConsumer1(String topic) {
        consumer = kafka.consumer.Consumer
                .createJavaConsumerConnector(createConsumerConfig());
        this.topic = topic;
    }

    private static ConsumerConfig createConsumerConfig() {
        Properties props = new Properties();
        props.put("zookeeper.connect", KafkaProperties.zkConnect);
        props.put("group.id", KafkaProperties.groupId1);
        props.put("zookeeper.session.timeout.ms", "40000");
        props.put("zookeeper.sync.time.ms", "200");
        props.put("auto.commit.interval.ms", "1000");
        return new ConsumerConfig(props);
    }

    @Override
    public void run() {
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(topic, new Integer(1));
        Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumer
                .createMessageStreams(topicCountMap);
        KafkaStream<byte[], byte[]> stream = consumerMap.get(topic).get(0);
        ConsumerIterator<byte[], byte[]> it = stream.iterator();
        while (it.hasNext()) {
            System.out.println("1receive:" + new String(it.next().message()));
            // try {
            //     sleep(300); // delay 300 ms per message
            // } catch (InterruptedException e) {
            //     e.printStackTrace();
            // }
        }
    }
}
To run multiple consumers at the same time, give them the same group.id: the group consumes the topic's messages cooperatively, with each partition delivered to exactly one consumer in the group.
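As a rough illustration of how a group divides work, the following is a simplified sketch of range-style assignment (the real rebalancing protocol is handled by Kafka itself; the partition and consumer counts here are made up):

```java
import java.util.ArrayList;
import java.util.List;

public class GroupAssignmentDemo {
    // Simplified range-style assignment: split numPartitions as evenly as
    // possible across the consumers in one group, lower partitions first.
    static List<List<Integer>> assign(int numPartitions, int numConsumers) {
        List<List<Integer>> result = new ArrayList<List<Integer>>();
        int perConsumer = numPartitions / numConsumers;
        int extra = numPartitions % numConsumers;
        int next = 0;
        for (int c = 0; c < numConsumers; c++) {
            List<Integer> mine = new ArrayList<Integer>();
            int count = perConsumer + (c < extra ? 1 : 0);
            for (int i = 0; i < count; i++) {
                mine.add(next++);
            }
            result.add(mine);
        }
        return result;
    }

    public static void main(String[] args) {
        // Two consumers in the same group sharing a 4-partition topic:
        // each consumer ends up owning two partitions.
        System.out.println(assign(4, 2));
    }
}
```

This is why adding consumers to a group beyond the number of partitions yields idle consumers: there are no partitions left to assign to them.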