Kafka Java producer/consumer demo
Source: Internet · Editor: 程序博客网 · Time: 2024/03/29 22:39
Producer:
import java.io.File;
import java.io.FileInputStream;
import java.util.Properties;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import com.alibaba.fastjson.JSON;

public class KafkaProduce {

    private static Properties properties;

    static {
        properties = new Properties();
        String path = KafkaProduce.class.getResource("/").getFile().toString() + "kafka.properties";
        try {
            FileInputStream fis = new FileInputStream(new File(path));
            properties.load(fis);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Send a message.
     *
     * @param topic
     * @param key
     * @param value
     */
    public void sendMsg(String topic, String key, String value) {
        System.err.println("properties:" + JSON.toJSONString(properties));
        // Instantiate the producer
        KafkaProducer<String, String> kp = new KafkaProducer<String, String>(properties);
        // Wrap the message
        ProducerRecord<String, String> pr = new ProducerRecord<String, String>(topic, key, value);
        // Send asynchronously; the callback fires once the broker acknowledges
        kp.send(pr, new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception != null) {
                    // On failure, metadata may be null, so only report the error
                    System.out.println("send failed: " + exception.getMessage());
                } else {
                    System.out.println("record written at offset: " + metadata.offset());
                }
            }
        });
        // Close the producer; this flushes any pending records
        kp.close();
    }
}
Consumer:
import java.io.File;
import java.io.FileInputStream;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;
import kafka.serializer.StringDecoder;
import kafka.utils.VerifiableProperties;

public class KafkaConsume {

    private final static String TOPIC = "test";

    private static Properties properties;

    static {
        properties = new Properties();
        String path = KafkaConsume.class.getResource("/").getFile().toString() + "kafka.properties";
        System.err.println("path:" + path);
        try {
            FileInputStream fis = new FileInputStream(new File(path));
            properties.load(fis);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Consume messages.
     *
     * @throws Exception
     */
    public static void getMsg() throws Exception {
        ConsumerConfig config = new ConsumerConfig(properties);
        // Request one stream (consumer thread) for the topic
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(TOPIC, new Integer(1));
        StringDecoder keyDecoder = new StringDecoder(new VerifiableProperties());
        StringDecoder valueDecoder = new StringDecoder(new VerifiableProperties());
        ConsumerConnector consumer = kafka.consumer.Consumer.createJavaConsumerConnector(config);
        Map<String, List<KafkaStream<String, String>>> consumerMap =
                consumer.createMessageStreams(topicCountMap, keyDecoder, valueDecoder);
        KafkaStream<String, String> stream = consumerMap.get(TOPIC).get(0);
        ConsumerIterator<String, String> it = stream.iterator();
        // hasNext() blocks until the next message arrives
        while (it.hasNext()) {
            MessageAndMetadata<String, String> keyValue = it.next();
            String key = keyValue.key();
            String value = keyValue.message();
            System.err.println("key:" + key + " ; value:" + value);
        }
    }
}
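A side note on the two static blocks above: both open a FileInputStream for kafka.properties and never close it. A minimal sketch of the same Properties.load pattern with try-with-resources, which closes the stream even when loading fails (the inline string here is a hypothetical stand-in for the file contents):

```java
import java.io.StringReader;
import java.util.Properties;

public class LoadPropsSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical content standing in for kafka.properties
        String content = "bootstrap.servers=10.20.135.32:9092\n"
                       + "group.id=test-consumer-group\n";
        Properties props = new Properties();
        // try-with-resources guarantees the reader is closed,
        // unlike the FileInputStream left open in the static blocks
        try (StringReader reader = new StringReader(content)) {
            props.load(reader);
        }
        System.out.println(props.getProperty("bootstrap.servers"));
        System.out.println(props.getProperty("group.id"));
    }
}
```

The same shape works with a FileInputStream (or, more robustly inside a JAR, with getResourceAsStream) in place of the StringReader.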
kafka.properties configuration:
##produce
bootstrap.servers=10.20.135.32:9092
producer.type=sync
request.required.acks=1
serializer.class=kafka.serializer.DefaultEncoder
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
bak.partitioner.class=kafka.producer.DefaultPartitioner
bak.key.serializer=org.apache.kafka.common.serialization.StringSerializer
bak.value.serializer=org.apache.kafka.common.serialization.StringSerializer

##consume
zookeeper.connect=10.20.135.32:2181
group.id=test-consumer-group
zookeeper.session.timeout.ms=4000
zookeeper.sync.time.ms=200
enable.auto.commit=false
auto.commit.interval.ms=1000
auto.offset.reset=smallest
serializer.class=kafka.serializer.StringEncoder
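One caveat worth knowing about this file: because the producer and consumer sections share a single properties file, serializer.class is defined twice, and java.util.Properties keeps only the last value parsed. A small stdlib sketch demonstrating that behavior (the inline string mimics the two duplicate lines from the file above):

```java
import java.io.StringReader;
import java.util.Properties;

public class DuplicateKeyDemo {
    public static void main(String[] args) throws Exception {
        // The same key twice, as in the kafka.properties above
        String content = "serializer.class=kafka.serializer.DefaultEncoder\n"
                       + "serializer.class=kafka.serializer.StringEncoder\n";
        Properties props = new Properties();
        try (StringReader reader = new StringReader(content)) {
            props.load(reader);
        }
        // The later (consumer-section) value silently overrides the earlier one
        System.out.println(props.getProperty("serializer.class"));
        // -> kafka.serializer.StringEncoder
    }
}
```

If the producer genuinely needed kafka.serializer.DefaultEncoder, the two sections would have to live in separate files; here it is harmless because the producer relies on key.serializer/value.serializer instead.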