Kafka: Integrating Kafka with Spring
Preparation
Install a Kafka + ZooKeeper environment.
Create the required topics using the command-line tools.
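For the topic-creation step, Kafka's bundled CLI script can be used. A typical invocation for the ZooKeeper-based syntax of Kafka 1.x (host, partition count, replication factor, and the topic name `test` are placeholders to adjust for your environment):

```shell
# Create a topic named "test" (ZooKeeper-based syntax, Kafka <= 2.1)
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
    --replication-factor 1 --partitions 1 --topic test

# Verify the topic exists
bin/kafka-topics.sh --list --zookeeper localhost:2181
```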
POM file: bring in the spring-kafka jar. Two points need attention here:
1. The kafka-clients version must match the kafka-clients version on the server (to check the server's Kafka version, look in the libs directory under the Kafka installation for the jar file whose name starts with kafka-clients).
2. spring-kafka 2.0/2.x requires Spring 5.0.
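One way to satisfy point 1 is to exclude the transitive kafka-clients dependency from spring-kafka and pin it explicitly to the broker's version. A sketch (the 1.0.2 version below is only an example; substitute the version found in your server's libs directory):

```xml
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.1.8.RELEASE</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- pin kafka-clients to the broker's version (example value) -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.0.2</version>
</dependency>
```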
  ..........
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.1.8.RELEASE</version>
</dependency>
  ..........
XML Configuration
Producer
Configuration:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:context="http://www.springframework.org/schema/context"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context.xsd">

    <context:property-placeholder location="classpath*:config/application.properties" />

    <!-- Producer configuration parameters -->
    <bean id="producerProperties" class="java.util.HashMap">
        <constructor-arg>
            <map>
                <entry key="bootstrap.servers" value="${bootstrap.servers}" />
                <entry key="group.id" value="${group.id}" />
                <entry key="retries" value="${retries}" />
                <entry key="batch.size" value="${batch.size}" />
                <entry key="linger.ms" value="${linger.ms}" />
                <entry key="buffer.memory" value="${buffer.memory}" />
                <entry key="acks" value="${acks}" />
                <entry key="key.serializer" value="org.apache.kafka.common.serialization.StringSerializer" />
                <entry key="value.serializer" value="org.apache.kafka.common.serialization.StringSerializer" />
            </map>
        </constructor-arg>
    </bean>

    <!-- producerFactory bean needed to create the KafkaTemplate -->
    <bean id="producerFactory" class="org.springframework.kafka.core.DefaultKafkaProducerFactory">
        <constructor-arg>
            <ref bean="producerProperties" />
        </constructor-arg>
    </bean>

    <!-- kafkaTemplate bean: inject this bean and call its send() methods to publish messages -->
    <bean id="kafkaTemplate" class="org.springframework.kafka.core.KafkaTemplate">
        <constructor-arg ref="producerFactory" />
        <constructor-arg name="autoFlush" value="true" />
        <property name="defaultTopic" value="default" />
    </bean>
</beans>
As shown above, the XML configures KafkaTemplate's constructor arguments producerFactory and autoFlush, which correspond to a two-argument constructor in the KafkaTemplate source:
1. producerProperties: the configuration needed by the producer factory
2. producerFactory: the producer factory bean
3. kafkaTemplate: built from producerFactory plus the autoFlush flag, the two constructor arguments of the producer template class
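The three steps above boil down to: build a settings map, hand it to a factory, wrap the factory in a template. The settings map itself is plain Java. A minimal plain-JDK sketch with illustrative placeholder values (group.id from the XML is omitted here, since it is a consumer-side property that producers ignore):

```java
import java.util.HashMap;
import java.util.Map;

public class ProducerProps {
    // The producer settings from the XML <map>, with example values.
    public static Map<String, Object> producerProperties() {
        Map<String, Object> props = new HashMap<String, Object>();
        props.put("bootstrap.servers", "localhost:9092"); // example broker address
        props.put("retries", 1);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProperties().get("acks"));
    }
}
```

In the XML version this same map is simply registered as a `java.util.HashMap` bean and injected into `DefaultKafkaProducerFactory`.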
Sending messages:
// send(topic, partition, key, data): the partition argument is an Integer index, not a String
ListenableFuture<SendResult<String, String>> listenableFuture =
        kafkaTemplate.send("topic", 0, "key", "data");

// success callback
SuccessCallback<SendResult<String, String>> successCallback = new SuccessCallback<SendResult<String, String>>() {
    @Override
    public void onSuccess(SendResult<String, String> result) {
        // business logic on success
    }
};

// failure callback
FailureCallback failureCallback = new FailureCallback() {
    @Override
    public void onFailure(Throwable ex) {
        // business logic on failure
    }
};

listenableFuture.addCallback(successCallback, failureCallback);
Consumer
Configuration:
<!-- 1. Consumer configuration parameters -->
<bean id="consumerProperties" class="java.util.HashMap">
    <constructor-arg>
        <map>
            <entry key="bootstrap.servers" value="${bootstrap.servers}" />
            <entry key="group.id" value="${group.id}" />
            <entry key="enable.auto.commit" value="${enable.auto.commit}" />
            <entry key="session.timeout.ms" value="${session.timeout.ms}" />
            <entry key="key.deserializer"
                value="org.apache.kafka.common.serialization.StringDeserializer" />
            <entry key="value.deserializer"
                value="org.apache.kafka.common.serialization.StringDeserializer" />
        </map>
    </constructor-arg>
</bean>

<!-- 2. consumerFactory bean -->
<bean id="consumerFactory" class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
    <constructor-arg>
        <ref bean="consumerProperties" />
    </constructor-arg>
</bean>

<!-- 3. Consumer implementation class -->
<bean id="kafkaConsumerService" class="xxx.service.impl.KafkaConsumerSerivceImpl" />

<!-- 4. Listener container configuration -->
<bean id="containerProperties" class="org.springframework.kafka.listener.config.ContainerProperties">
    <!-- topics -->
    <constructor-arg name="topics">
        <list>
            <value>${dit.for.lease}</value>
            <value>${loan.pic}</value>
        </list>
    </constructor-arg>
    <property name="messageListener" ref="kafkaConsumerService" />
</bean>

<!-- 5. Concurrent message listener container; doStart() runs as the init method -->
<bean id="messageListenerContainer" class="org.springframework.kafka.listener.ConcurrentMessageListenerContainer" init-method="doStart">
    <constructor-arg ref="consumerFactory" />
    <constructor-arg ref="containerProperties" />
    <property name="concurrency" value="${concurrency}" />
</bean>
1. consumerProperties -> consumerFactory: load the configuration to build the consumer factory
2. messageListener -> containerProperties: load the container configuration (topics) and the message listener
3. consumerFactory + containerProperties -> messageListenerContainer: combine the factory and container configuration into a concurrent message listener container, and run its init method doStart
4. Note: KafkaConsumerSerivceImpl must implement the MessageListener interface
Consuming messages:
Option 1: implement the MessageListener interface directly and override onMessage with your own consumption logic.
public class KafkaConsumerSerivceImpl implements MessageListener<String, String> {
    @Override
    public void onMessage(ConsumerRecord<String, String> data) {
        // dispatch by topic
        if ("topic1".equals(data.topic())) {
            // logic 1
        } else if ("topic2".equals(data.topic())) {
            // logic 2
        }
    }
}
Option 2: use the @KafkaListener annotation and set the topic (SpEL expressions are supported). This makes it easy to split different topics into different methods handling different business logic (especially convenient when each needs its own transaction).
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;

public class KafkaConsumerSerivceImpl {
    @KafkaListener(topics = "${templar.pic}")
    void templarAgreementNoticewithhold(ConsumerRecord<String, String> data) {
        // consumption business logic
    }
}
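As a sketch of that split, each topic can get its own annotated method. The class below and the property keys order.topic / payment.topic are hypothetical examples, assuming spring-kafka 2.x with a property placeholder configured:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;

public class SplitTopicListeners {
    // "${order.topic}" / "${payment.topic}" are hypothetical keys
    // resolved from application.properties
    @KafkaListener(topics = "${order.topic}")
    public void onOrder(ConsumerRecord<String, String> data) {
        // order business logic (can run in its own transaction)
    }

    @KafkaListener(topics = "${payment.topic}")
    public void onPayment(ConsumerRecord<String, String> data) {
        // payment business logic
    }
}
```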
Java Annotation Configuration
Producer
Configuration:
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

/**
 * @description Kafka producer configuration
 */
@Configuration
@EnableKafka
public class KafkaProducerConfig {

    public KafkaProducerConfig() {
        System.out.println("kafka producer configuration");
    }

    @Bean
    public ProducerFactory<Integer, String> producerFactory() {
        return new DefaultKafkaProducerFactory<Integer, String>(producerProperties());
    }

    @Bean
    public Map<String, Object> producerProperties() {
        // NOTE: the original article read these values through a properties helper whose
        // class name was lost in extraction; "config" stands in for that helper here.
        Map<String, Object> props = new HashMap<String, Object>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, config.getString("kafka.producer.bootstrap.servers"));
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, config.getString("kafka.producer.key.serializer"));
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, config.getString("kafka.producer.value.serializer"));
        props.put(ProducerConfig.RETRIES_CONFIG, config.getInt("kafka.producer.retries"));
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, config.getInt("kafka.producer.batch.size", 1048576));
        props.put(ProducerConfig.LINGER_MS_CONFIG, config.getInt("kafka.producer.linger.ms"));
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, config.getLong("kafka.producer.buffer.memory", 33554432L));
        props.put(ProducerConfig.ACKS_CONFIG, config.getString("kafka.producer.acks", "all"));
        return props;
    }

    @Bean
    public KafkaTemplate<Integer, String> kafkaTemplate() {
        KafkaTemplate<Integer, String> kafkaTemplate =
                new KafkaTemplate<Integer, String>(producerFactory(), true);
        kafkaTemplate.setDefaultTopic(config.getString("kafka.producer.defaultTopic", "default"));
        return kafkaTemplate;
    }
}
Sending messages:
  Same as with the XML configuration.
Consumer
Configuration:
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

/**
 * @description Kafka consumer configuration
 */
@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    public KafkaConsumerConfig() {
        System.out.println("loading kafka consumer configuration...");
    }

    @Bean
    KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<Integer, String>>
            kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
                new ConcurrentKafkaListenerContainerFactory<Integer, String>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(3);
        return factory;
    }

    @Bean
    public ConsumerFactory<Integer, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<Integer, String>(consumerProperties());
    }

    @Bean
    public Map<String, Object> consumerProperties() {
        // NOTE: as in the producer configuration, "config" stands in for the original
        // properties helper whose class name was lost in extraction.
        Map<String, Object> props = new HashMap<String, Object>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, config.getString("kafka.consumer.bootstrap.servers"));
        props.put(ConsumerConfig.GROUP_ID_CONFIG, config.getString("kafka.consumer.group.id"));
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, config.getString("kafka.consumer.enable.auto.commit"));
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, config.getString("kafka.consumer.auto.commit.interval.ms"));
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, config.getString("kafka.consumer.session.timeout.ms"));
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, config.getString("kafka.consumer.key.deserializer"));
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, config.getString("kafka.consumer.value.deserializer"));
        return props;
    }

    @Bean
    public KafkaConsumerListener kafkaConsumerListener() {
        return new KafkaConsumerListener();
    }
}
Consuming messages:
  Same as with the XML configuration.