Kafka producer TimeoutException: Expiring 1 record(s)









I am using Kafka with Spring Boot.



Kafka Producer class:



@Service
public class MyKafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    private static Logger LOGGER = LoggerFactory.getLogger(NotificationDispatcherSender.class);

    // Send Message
    public void sendMessage(String topicName, String message) throws Exception {
        LOGGER.debug("========topic Name===== " + topicName + "=========message=======" + message);
        ListenableFuture<SendResult<String, String>> result = kafkaTemplate.send(topicName, message);
        result.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
            @Override
            public void onSuccess(SendResult<String, String> result) {
                LOGGER.debug("sent message='{}' with offset={}", message, result.getRecordMetadata().offset());
            }

            @Override
            public void onFailure(Throwable ex) {
                LOGGER.error(Constants.PRODUCER_MESSAGE_EXCEPTION.getValue() + " : " + ex.getMessage());
            }
        });
    }
}




Kafka-configuration:



spring.kafka.producer.retries=0
spring.kafka.producer.batch-size=100000
spring.kafka.producer.request.timeout.ms=30000
spring.kafka.producer.linger.ms=10
spring.kafka.producer.acks=0
spring.kafka.producer.buffer-memory=33554432
spring.kafka.producer.max.block.ms=5000
spring.kafka.bootstrap-servers=192.168.1.161:9092,192.168.1.162:9093


Let's say I have sent 1,000 messages to the topic my-test-topic, 10 times over.



8 out of 10 times all the messages reach my consumer successfully, but sometimes I get the error below:



2017-10-05 07:24:11, [ERROR] [my-service - LoggingProducerListener - onError:76] Exception thrown when sending a message with key='null' and payload='{"deviceType":"X","deviceKeys":["apiKey":"X-X-o"],"devices...' to topic my-test-topic



followed by org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for my-test-topic-4 due to 30024 ms has passed since batch creation plus linger time



apache-kafka kafka-consumer-api kafka-producer-api spring-kafka

asked Oct 9 '17 at 15:15 by Prakash Pandey, edited Dec 31 '17 at 11:09 by michalbrz

  • Is this error you are describing from the producer or the consumer?
    – adarshr
    Oct 9 '17 at 15:19










  • I am getting this error on the producer
    – Prakash Pandey
    Oct 9 '17 at 15:20










  • So, your batch is too slow for such a "low" request.timeout.ms. Try to make batch-size a bit lower
    – Artem Bilan
    Oct 9 '17 at 15:31










  • Isn't 30 seconds enough? (I am new to Kafka, please bear with me)
    – Prakash Pandey
    Oct 9 '17 at 15:38










  • I don't know, but according to your error you really are exceeding those 30 secs: due to 30024 ms has passed
    – Artem Bilan
    Oct 9 '17 at 16:35

2 Answers

There are 3 possibilities:



  1. Increase request.timeout.ms - this is how long the producer will wait for the whole batch to be ready in the buffer. So in your case, if fewer than 100,000 bytes' worth of records (your configured batch-size) accumulate in the buffer, the timeout can occur. More info here: https://stackoverflow.com/a/34794261/2707179 (a sketch of this and the next option follows the list).

  2. Decrease batch-size - related to the previous point: batches will be sent more often, but each will contain fewer messages.

  3. Depending on message size, maybe your network cannot keep up with the load? Check whether throughput is the bottleneck.
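
A minimal sketch of options 1 and 2, assuming the producer is configured through a Java ProducerFactory rather than the spring.kafka.* keys in the question; the bean layout and the concrete values are illustrative assumptions, not a drop-in replacement for the poster's setup:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.1.161:9092,192.168.1.162:9093");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Option 1: give slow batches more time before they are expired (value is an assumption).
        props.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, 60000);
        // Option 2: a smaller batch fills (and is therefore sent) sooner; 16384 is the Kafka default.
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        // linger.ms still caps how long a non-full batch waits before it is sent.
        props.put(ProducerConfig.LINGER_MS_CONFIG, 10);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

As an aside, of the keys in the question only batch-size, retries, acks, buffer-memory and bootstrap-servers are dedicated Spring Boot properties; settings such as request.timeout.ms, linger.ms and max.block.ms generally have to go under spring.kafka.properties.* (or spring.kafka.producer.properties.*, depending on the Spring Boot version), or into a factory like the sketch above, to actually reach the Kafka client.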





answered Dec 30 '17 at 12:49 by michalbrz

  • I have the same issue as OP ever since I enabled SSL on Kafka, and notice that, like me, he has set linger.ms. According to the documentation, batches are sent out after this linger time even if the batch is not full, so even with a high batch size it should not time out.
    – Jodiug
    Jun 4 at 9:08



















Give acks_config="1" and it will work
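
For context, the question's config uses acks=0, meaning the producer does not wait for any broker acknowledgement, while acks=1 waits for the partition leader. As a hedged sketch only, continuing the hypothetical properties map from the factory example under the other answer, the change in Java would be:

// Kafka expects acks as a String ("0", "1" or "all"), so pass "1", not the integer 1.
props.put(ProducerConfig.ACKS_CONFIG, "1");

// The equivalent line in the question's application.properties style:
// spring.kafka.producer.acks=1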






answered Aug 21 at 11:33 by user10254546, edited 20 hours ago by Giorgos Myrianthous