Kafka Streams application opening too many files on Kafka servers









I've been working on an application based on the Java kafka-streams API, whose goal is to process a stream of data from one Kafka topic and produce it into another topic.



Whenever I start producing messages with the kafka-streams application, file handles keep opening on the Kafka brokers I'm using and are never closed. Eventually the Kafka server hits the "too many open files" limit, and the Kafka and ZooKeeper daemons crash.



I'm using the kafka-streams-1.0.1 API jar for Java, running on JDK 11. The Kafka cluster itself is version 1.0.0.



My application's configuration includes the following Kafka producer configs:




  • batch.size: set to 100,000 (note that Kafka interprets this value as bytes, not a message count).


  • linger.ms: set to 1,000 milliseconds.


  • buffer.memory: set to the byte equivalent of 5 megabytes.
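As a minimal sketch, the settings above correspond to the following producer properties (the key names are the standard Kafka config strings; the class and method names here are just illustration):

```java
import java.util.Properties;

public class ProducerSettings {
    // Builds the producer-related settings described above.
    // Note: Kafka measures batch.size and buffer.memory in bytes.
    public static Properties build() {
        Properties props = new Properties();
        props.put("batch.size", "100000");                           // ~100 KB per batch
        props.put("linger.ms", "1000");                              // wait up to 1 s before sending
        props.put("buffer.memory", String.valueOf(5 * 1024 * 1024)); // 5 MiB send buffer
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("buffer.memory")); // prints 5242880
    }
}
```

These values only shape client-side batching; none of them should, by themselves, cause broker-side descriptor growth.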

The stream processing itself is very simple:



stream.map((k, v) -> handle(k, v)).filter((k, v) -> v != null).to(outgoingTopic);


I would appreciate any suggestions you might have.










  • Confluent's recommendation is to configure the host kernel to allow 100,000+ open file handles; the default is normally far below that.
    – daniu
    Nov 10 at 9:15
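For reference, that kernel limit is usually raised per user in /etc/security/limits.conf; the user name and value below are illustrative, assuming the broker runs as a dedicated `kafka` user:

```
# /etc/security/limits.conf — illustrative values
kafka  soft  nofile  128000
kafka  hard  nofile  128000
```

The effective limit for the broker process can be verified with `ulimit -n` in its environment, or by reading /proc/<broker-pid>/limits.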











  • And it is configured to that level. Still, files keep opening.
    – Stav Saad
    Nov 10 at 9:23










  • I don't think Java 11 is officially supported or tested by the Kafka community. Not sure if that's the problem, though.
    – cricket_007
    Nov 10 at 13:02










  • Java 11 is used as the runtime of my own application and should not affect what happens on the Kafka servers.
    – Stav Saad
    Nov 11 at 10:37










  • Your usage of Kafka Streams does seem pretty simple. I would suggest trying a more recent version of Kafka to see if that fixes the issue. 2.0.0 has been out for a few months already, and 2.1.0 is about to be released, probably in the next couple of weeks.
    – mjuarez
    Nov 17 at 5:33














java apache-kafka apache-kafka-streams






asked Nov 10 at 8:44









Stav Saad












1 Answer

















Use Java 8, or Java 10 or lower, and
use the latest Kafka: https://kafka.apache.org/quickstart



See the reports on the bug filed here: https://issues.apache.org/jira/browse/KAFKA-6855
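Since the linked bug is specific to newer JVMs, one cheap safeguard is to warn (or fail fast) when the Streams app starts on an unexpected runtime. This is only a sketch, not part of Kafka's API; the version-parsing helper and the threshold of 10 are my own:

```java
public class JavaVersionCheck {
    // Parses the major version from java.specification.version:
    // "1.8" -> 8 (pre-Java 9 scheme), "11" -> 11.
    static int major(String spec) {
        return spec.startsWith("1.")
                ? Integer.parseInt(spec.substring(2))
                : Integer.parseInt(spec);
    }

    public static void main(String[] args) {
        int v = major(System.getProperty("java.specification.version"));
        if (v > 10) {
            System.err.println("Warning: running on Java " + v
                    + "; Kafka 1.x clients were not validated there (see KAFKA-6855).");
        }
    }
}
```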






        answered Nov 25 at 22:31









        UnP
