Kafka Streams application opening too many files on Kafka servers
I've been working on an application based on the Java Kafka Streams API, whose goal is to process a stream of data from one Kafka topic and produce it into another topic.
Whenever I start producing messages with the Kafka Streams application, file handles keep opening on the Kafka brokers and are never closed, so the brokers eventually end up with too many open files and the Kafka and ZooKeeper daemons crash.
I'm using the kafka-streams-1.0.1 API jar for Java, running on JDK 11. The Kafka cluster runs Kafka version 1.0.0.
My application's configuration includes the following Kafka producer configs:
batch.size: set to 100,000 messages
linger.ms: set to 1,000 milliseconds
buffer.memory: set to the byte equivalent of 5 megabytes
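For reference, a sketch of how these settings might be passed to the embedded producer of a Streams application; the application id and broker address are placeholders, and the producer. prefix is how Streams forwards a setting to its internal producer:

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");   // placeholder id
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");  // placeholder broker
// "producer."-prefixed keys are forwarded to the internal producer;
// note that batch.size is measured in bytes, not messages:
props.put(StreamsConfig.producerPrefix(ProducerConfig.BATCH_SIZE_CONFIG), 100000);
props.put(StreamsConfig.producerPrefix(ProducerConfig.LINGER_MS_CONFIG), 1000);
props.put(StreamsConfig.producerPrefix(ProducerConfig.BUFFER_MEMORY_CONFIG), 5 * 1024 * 1024);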
The stream processing itself is very simple and consists of the following:
stream.map((k,v) -> handle(k,v)).filter((k,v) -> v != null).to(outgoingTopic);
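For context, a minimal sketch of what the surrounding topology might look like; the topic names, the default serdes, and the handle placeholder are assumptions, not the actual application code:

import java.util.Properties;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

public class StreamApp {
    // Placeholder for the question's handle(k, v); maps a record and
    // returns a null value when the record should be dropped downstream.
    static KeyValue<String, String> handle(String k, String v) {
        return (v == null || v.isEmpty()) ? KeyValue.pair(k, null) : KeyValue.pair(k, v.trim());
    }

    public static void main(String[] args) {
        Properties props = new Properties(); // application id, brokers, serdes, producer configs as above
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> stream = builder.stream("incoming-topic"); // placeholder topic
        stream.map(StreamApp::handle)
              .filter((k, v) -> v != null)  // drop records rejected by the handler
              .to("outgoing-topic");        // placeholder topic
        new KafkaStreams(builder.build(), props).start();
    }
}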
I would appreciate any suggestions.
java apache-kafka apache-kafka-streams
asked Nov 10 at 8:44
Stav Saad
Confluent's recommendation is to configure the host kernel to allow 100,000+ open file handles. The default is normally far below that.
– daniu
Nov 10 at 9:15
And it is configured to that level. Still, file handles keep opening.
– Stav Saad
Nov 10 at 9:23
I don't think Java 11 is officially supported or tested by the Kafka community. Not sure if that's the problem, though.
– cricket_007
Nov 10 at 13:02
Java 11 is used as the runtime of my own application and should not affect what happens on the Kafka servers.
– Stav Saad
Nov 11 at 10:37
Your usage of Kafka Streams does seem pretty simple. I would suggest trying a more recent version of Kafka to see if that fixes the issue; 2.0.0 has been out for a few months already, and 2.1.0 is about to be released, probably in the next couple of weeks.
– mjuarez
Nov 17 at 5:33
1 Answer
Use Java 10 or lower (Java 8, for example), and
use the latest Kafka: https://kafka.apache.org/quickstart
See the bug report filed at https://issues.apache.org/jira/browse/KAFKA-6855
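Independent of the version advice, it is also worth ruling out a client that is never closed: an abandoned Streams instance keeps its internal producer and consumer sockets, and therefore broker file handles, open. A minimal shutdown-hook sketch, assuming streams is the application's KafkaStreams instance built from the question's topology and config:

import java.util.concurrent.CountDownLatch;
import org.apache.kafka.streams.KafkaStreams;

// main should declare `throws InterruptedException` for the await() below
final CountDownLatch latch = new CountDownLatch(1);
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    streams.close();   // closes internal producers/consumers and their broker connections
    latch.countDown();
}));
streams.start();
latch.await();         // keep the main thread alive until shutdown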
answered Nov 25 at 22:31
UnP