- Background:
The project needed to load-test confluent-kafka and look at consumer-side performance. The cluster also has SSL enabled and requires credentials, so JMeter's stock Kafka plugin can't be used as-is; the only option was to write a dedicated confluent-kafka plugin!!!
- Test scenario:
Simulated scenario: 1 concurrent thread sending 100 messages, all with identical content.
- Results first:
1. Result of the first JavaSampler plugin:
around 600 ms per message, TPS only 1.7/s
2. Result of the revised JavaSampler plugin:
around 7 ms per message, TPS up to 130/s (without capping the request count, TPS could go higher still, to roughly 1000)
The comparison makes it obvious: the first version was garbage. (A Spring-based test had already confirmed that confluent-kafka itself can handle around 1000 messages per second, so the middleware was not to blame.) That left only one conclusion: I had written something terrible myself, and so began the long road of troubleshooting!!!
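As a sanity check on those numbers: with a single thread, throughput is bounded by per-request latency, roughly TPS ≈ threads × 1000 / latency(ms). A tiny sketch (an illustration of the arithmetic, not part of the plugin):

```java
public class TpsCheck {
    // With N threads each blocking latencyMs per request,
    // the throughput ceiling is N * 1000 / latencyMs requests per second.
    public static double tps(int threads, double latencyMs) {
        return threads * 1000.0 / latencyMs;
    }

    public static void main(String[] args) {
        System.out.println(tps(1, 600)); // first version:  ~1.7/s
        System.out.println(tps(1, 7));   // second version: ~143/s ceiling
    }
}
```

The measured 1.7/s and 130/s line up with these ceilings, which is consistent with latency, not broker capacity, being the bottleneck.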
- On to the code:
public void product(Properties props, String topic, String key, String value) throws InterruptedException, InstantiationException, IllegalAccessException {
    // pick the Avro class based on the topic
    Class avroType = null;
    switch (topic) {
        case "staging-shareservice-masterdata-style":
            avroType = ProductStyle.class;
            break;
        case "staging-shareservice-masterdata-styleoption":
            avroType = ProductStyleOption.class;
            break;
        case "staging-shareservice-masterdata-sku":
            avroType = ProductSku.class;
            break;
        case "staging-shareservice-masterdata-price":
            avroType = Price.class;
            break;
        case "staging-shareservice-masterdata-location-standard":
            avroType = LocationStandard.class;
            break;
    }
    // serialize the value
    Object avroValue = avroValueSerializer.avroValue(value, avroType);
    // create the producer -- note: a brand-new KafkaProducer on every call
    KafkaProducer<String, Object> producer = new KafkaProducer<>(props);
    ProducerRecord<String, Object> record = new ProducerRecord<>(topic, key, avroValue);
    try {
        // 1. send the message
        producer.send(record);
    } catch (Exception e) {
        e.printStackTrace();
    }
    // producer.close();
}
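For context, the `props` passed in here carry the SSL credentials that ruled out the stock JMeter plugin. The original loads them via `myKafkaProducer.initNewConfig(broker)`, whose internals aren't shown; a minimal sketch of what such a config typically looks like for a SASL_SSL-secured Confluent cluster (the property names are standard Kafka/Confluent client settings, the credential values and method name are placeholders):

```java
import java.util.Properties;

public class ConfluentKafkaConfig {
    // Builds producer properties for a SASL_SSL-secured Confluent Kafka cluster.
    // apiKey/apiSecret are placeholders for the real credentials.
    public static Properties build(String broker, String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("bootstrap.servers", broker);
        // authenticated, encrypted connection
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required"
                        + " username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Avro values go through Confluent's serializer backed by Schema Registry
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        return props;
    }
}
```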
That send logic above was the problem, and it was amateurish. The JMeter log showed the producer configuration being printed on every single send, once per message, which clearly meant the configuration was being loaded on every send. Configuration should be loaded once at initialization and reused afterwards. With that lead found, the next step was to track down where the configuration was being loaded, and the slow, trial-and-error tuning began.
1. First, moved the config-class initialization into setupTest; predictably, no effect.
2. Then moved the KafkaProducer construction into setupTest as well; the effect was dramatic.
Ha, problem found, and the improvement speaks for itself. The final code:
myKafkaProducer myKafkaProducer = null;
Properties props = null;
// shared producer, created once per thread
KafkaProducer<String, Object> producer = null;
// record to send
ProducerRecord<String, Object> record = null;
// serialized Avro value
Object avroValue = null;

// initialization: runs once per thread, before any samples
@Override
public void setupTest(JavaSamplerContext context) {
    myKafkaProducer = new myKafkaProducer();
    String paramBroker = context.getParameter("broker");
    String paramTopic = context.getParameter("topic");
    String paramKey = context.getParameter("key");
    String paramValue = context.getParameter("value");
    // load the configuration once
    props = myKafkaProducer.initNewConfig(paramBroker);
    // create the producer once; runTest reuses it
    producer = new KafkaProducer<>(props);
    // pick the Avro class based on the topic
    Class avroType = null;
    switch (paramTopic) {
        case "staging-shareservice-masterdata-style":
            avroType = ProductStyle.class;
            break;
        case "staging-shareservice-masterdata-styleoption":
            avroType = ProductStyleOption.class;
            break;
        case "staging-shareservice-masterdata-sku":
            avroType = ProductSku.class;
            break;
        case "staging-shareservice-masterdata-price":
            avroType = Price.class;
            break;
        case "staging-shareservice-masterdata-location-standard":
            avroType = LocationStandard.class;
            break;
    }
    try {
        avroValue = avroValueSerializer.avroValue(paramValue, avroType);
    } catch (InstantiationException | IllegalAccessException e) {
        e.printStackTrace();
    }
}

@Override
public SampleResult runTest(JavaSamplerContext javaSamplerContext) {
    SampleResult result = this.newSampleResult();
    String paramTopic = javaSamplerContext.getParameter("topic");
    String paramKey = javaSamplerContext.getParameter("key");
    String paramValue = javaSamplerContext.getParameter("value");
    StringBuilder paramStr = new StringBuilder("topic:")
            .append(paramTopic).append(",\nkey:")
            .append(paramKey).append(", \nvalue:")
            .append(paramValue);
    sampleResultStart(result, paramStr.toString());
    record = new ProducerRecord<>(paramTopic, paramKey, avroValue);
    try {
        // 1. send the message
        producer.send(record);
        sampleResultSuccess(result, "async send submitted");
    } catch (Exception ex) {
        sampleResultFailed(result, "500", ex);
    }
    return result;
}