How many events can socket.io handle?



I have been trying Socket.IO (server and client) for a personal project. Since this is my first attempt at Node.js, and even at JavaScript and MongoDB, I am a bit confused about the performance of my server.



I have created a complex real-time system with many events and many rooms. The server has very few events, but the clients have a large number of events, and these events are distributed across rooms.

For example:



  • Room R1 >>
    Event R1E1,
    Event R1E2,
    Event R1E3....
    Event R1EN


  • Room R2 >>
    Event R2E1,
    Event R2E2,
    Event R2E3....
    Event R2EN
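
Roughly, this is the kind of structure I mean. A minimal sketch with made-up room and event names (not my actual code):

// Server side
const io = require('socket.io')(3000);

io.on('connection', function (socket) {
  // The client tells the server which room it belongs to.
  socket.on('join', function (room) {
    socket.join(room);
  });

  // One generic server-side handler; the payload says which
  // room-scoped event ('R1E1', 'R1E2', ...) to fan out to.
  socket.on('data', function (msg) {
    io.to(msg.room).emit(msg.event, msg.payload);
  });
});

// Client side (each client listens for many room-scoped events):
// const socket = require('socket.io-client')('http://localhost:3000');
// socket.emit('join', 'R1');
// socket.on('R1E1', function (payload) { /* ... */ });
// socket.on('R1E2', function (payload) { /* ... */ });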


All the data is stored in MongoDB, and it works well.



But an issue arises when a few clients (5-8), each with 10-15 registered events, start sending data. The server initially works fine, but after a couple of minutes it stops responding. The clients stay connected even though the server is not responding, requests pile up, and sometimes the server receives requests from the previous session.



It all starts when the final device begins registering events. So I want to know: how many events can Socket.IO handle?



P.S. Here is what I mean by an "event":



io.on('event', function (msg) {
  console.log(msg);
});


Edit 2



From what I have studied about Node.js, a Node process basically runs your JavaScript on a single thread; when it needs to do other work (such as I/O), that work is handed off asynchronously and runs in the background while the main thread carries on. If we want to process a sequence of operations in order, we use async/await.



In my case I use async in only one place, when a client first connects: there I query three different MongoDB collections and return the data via an event.
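
Roughly like this — a simplified, self-contained sketch; the collection and event names are placeholders, and it assumes the official mongodb Node.js driver:

const { MongoClient } = require('mongodb');
const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI

async function main() {
  await client.connect();            // connect once at startup
  const db = client.db('mydb');      // placeholder database name
  const io = require('socket.io')(3000);

  io.on('connection', async function (socket) {
    try {
      // Query three collections concurrently, answer on a single event.
      const [users, rooms, settings] = await Promise.all([
        db.collection('users').find({}).toArray(),
        db.collection('rooms').find({}).toArray(),
        db.collection('settings').find({}).toArray()
      ]);
      socket.emit('init', { users, rooms, settings });
    } catch (err) {
      socket.emit('init_error', { message: err.message });
    }
  });
}

main().catch(console.error);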



My server currently runs on a MacBook Pro (16 GB RAM, 6th-generation quad-core i7). It should be able to handle at least 4-6 concurrent threads.



I created a load test: 100,000 different events distributed across 1,000 rooms, with 5 requests per second querying the DB. It worked fine; at maximum load it used roughly 40% RAM and 250% CPU.
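
The load-test client looked roughly like this — a simplified sketch with made-up event names, using socket.io-client:

const ioClient = require('socket.io-client');
const socket = ioClient('http://localhost:3000');

socket.on('connect', function () {
  socket.emit('join', 'R1');
  // Fire 5 requests per second; each one triggers a DB-backed
  // operation on the server.
  setInterval(function () {
    socket.emit('data', { room: 'R1', event: 'R1E1', payload: { ts: Date.now() } });
  }, 200);
});

socket.on('R1E1', function (payload) {
  console.log('got R1E1', payload);
});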



My connection to the DB is persistent, meaning I connect to the DB as soon as the server starts and keep that connection reference alive.
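
One common way to keep that single connection around is a small module like this (a minimal sketch; the URI and database name are placeholders):

// db.js - connect once at server startup, reuse the same client afterwards
const { MongoClient } = require('mongodb');

const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
let db = null;

async function connect() {
  await client.connect();   // one persistent connection pool for the whole process
  db = client.db('mydb');   // placeholder database name
  return db;
}

function getDb() {
  if (!db) throw new Error('call connect() at startup first');
  return db;
}

module.exports = { connect, getDb };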



So what is the issue?







javascript node.js mongodb socket.io iot






edited Apr 7 at 20:26 by grantnz

asked Nov 15 '18 at 16:02 by Harvant S. Choudhary

  • I don't have a definitive answer for you, but I would assume the bottleneck is probably somewhere in the handling of your events, rather than with Socket.io itself. That being said, if you expect your application to grow, I would start looking at how to set up multiple nodes with socket.io (socket.io/docs/using-multiple-nodes/…). That alone may help things by putting the event handlers into separate process loops.

    – Justin Heath
    Nov 15 '18 at 16:13











  • In my case the server and clients would then need multiple "redis" instances. I have run performance benchmark tests and they were good, and the current scenario is only a small fraction of that benchmark load, so technically it should work.

    – Harvant S. Choudhary
    Nov 15 '18 at 16:19




























1 Answer
"So I just want to know how many events a socket.io can handle?"




First off, it's not clear if you're talking about how many event handlers a socket.io server can have or whether you're asking about how many real-time events (as in events/sec) a socket.io server can handle.



On the first item, there is no coded limit to how many event handlers a socket.io server can handle. A socket derives from an EventEmitter and it uses the EventEmitter's listener capabilities to allow someone to listen for events. There are no coded limits for that infrastructure and there's not even really a practical limit either as it's a pretty lightweight system.
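
As a rough illustration (a minimal sketch of a plain Node.js EventEmitter, not socket.io internals), registering a very large number of listeners is cheap:

const EventEmitter = require('events');
const emitter = new EventEmitter();

// Register 10,000 listeners, one per event name. Node only warns
// (MaxListenersExceededWarning) when more than 10 listeners share the
// *same* event name; emitter.setMaxListeners() raises that threshold.
for (let i = 0; i < 10000; i++) {
  emitter.on('event' + i, function (msg) {
    // handle msg for this particular event name
  });
}

emitter.emit('event42', { hello: 'world' });
console.log(emitter.eventNames().length); // 10000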



In general, a system that requires thousands of separately coded event listeners could probably be designed more efficiently in other ways, but we'd have to see more detail on what you're doing to advise more specifically.




As for how many messages/sec a socket.io server can handle, that totally depends upon what the server is doing to process each message, how much bandwidth your server has, how fast your server is at processing each message and so on.




Unless you are flooding your server with tens of thousands of messages at once or doing heavy processing on each message, I would guess that your server difficulties probably have to do with other parts of your server code (likely in what you are doing when messages arrive and how you process them).



I would also wonder if you have created some sort of circular message loop: clientA emits msgA to the server; the server receives that message, does some processing on it and emits msgB back to clientA; clientA receives that message, does some processing on it, and some side effect of that processing causes it to emit msgA to the server again, so you end up with a never-ending message loop.
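
A contrived, runnable sketch of that kind of feedback loop (the event names and port are made up):

const io = require('socket.io')(3001);
const socket = require('socket.io-client')('http://localhost:3001');

// Server: every incoming 'msgA' triggers an outgoing 'msgB'.
io.on('connection', function (serverSocket) {
  serverSocket.on('msgA', function (data) {
    serverSocket.emit('msgB', { count: data.count + 1 });
  });
});

// Client: handling 'msgB' re-emits 'msgA', so one initial message
// bounces back and forth forever.
socket.on('connect', function () {
  socket.emit('msgA', { count: 0 });
});
socket.on('msgB', function (data) {
  console.log('round trip', data.count);
  socket.emit('msgA', data); // <-- this closes the loop
});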



Also, rooms in socket.io don't "have events" or "receive events" so that part of your description doesn't really make sense. You can send an event to all sockets within a room. But, that actually just causes the server to loop through all members of a given room and send them each a message individually.
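
For example (a minimal sketch using the standard socket.io room API):

io.on('connection', function (socket) {
  socket.join('R1'); // the socket becomes a member of room R1
});

// "Sending an event to room R1" just means the server loops over the
// sockets currently in R1 and emits the message to each one:
io.to('R1').emit('R1E1', { some: 'payload' });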




Per your edit, if an "event" is this:



io.on('event', function (msg) {
  console.log(msg);
});
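
(As an aside, socket.io event handlers are normally registered on each individual socket inside the connection handler, roughly like this:)

io.on('connection', function (socket) {
  socket.on('event', function (msg) {
    console.log(msg);
  });
});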


Then, the number of events that your server can handle per second depends upon all sorts of system configuration variables (bandwidth, CPU, database performance, etc...) and how much processing you do to handle each incoming event. Here's a list of things to do:



  1. Make absolutely sure you have no synchronous I/O anywhere in your server other than at server startup time, as that will instantly ruin your ability to have lots of "in process" events going at once (see the sketch right after this list).

  2. Make the code that processes each event as efficient as possible. If you're consulting the database on each event, the database will likely become your bottleneck.

  3. Design some tests to figure out where your first processing bottleneck is.

  4. Improve the performance characteristics of that first bottleneck.

  5. Rinse, lather, repeat until you've removed or improved the first N bottlenecks you ran into.
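
On point 1, the difference looks roughly like this (a minimal sketch using Node's fs module; the file path is a placeholder):

const fs = require('fs');
const io = require('socket.io')(3000);

io.on('connection', function (socket) {
  // Synchronous I/O: blocks the single JavaScript thread, so no other
  // event from any client can be processed until the read finishes.
  socket.on('bad', function () {
    const data = fs.readFileSync('/tmp/some-file.json', 'utf8');
    socket.emit('result', JSON.parse(data));
  });

  // Asynchronous I/O: the thread stays free to handle other events
  // while the read is in flight.
  socket.on('good', async function () {
    const data = await fs.promises.readFile('/tmp/some-file.json', 'utf8');
    socket.emit('result', JSON.parse(data));
  });
});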

Keep in mind that a single node.js instance has only one thread that is running your Javascript. So, if you wanted to be able to process 100 messages/sec, you can use no more than 10ms of CPU to process each message (1000ms/sec divided by 100 messages/sec = 10ms/message). You can fan out to multiple CPUs by implementing clustering or firing up multiple processes to process a work queue if CPU is your actual bottleneck, but you'd have to first determine that with testing.
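
If testing does show that CPU is the bottleneck, fanning out with Node's cluster module looks roughly like this (a minimal sketch; note that clustering socket.io in production also needs sticky sessions and an adapter such as socket.io-redis, which is not shown here):

const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; each worker runs its own event loop.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', function (worker) {
    console.log('worker ' + worker.process.pid + ' died, restarting');
    cluster.fork();
  });
} else {
  // Each worker runs its own socket.io server on the shared port.
  const io = require('socket.io')(3000);
  io.on('connection', function (socket) {
    // ... register event handlers here ...
  });
}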






answered Nov 15 '18 at 16:15, edited Nov 15 '18 at 18:06 – jfriend00

























  • I can't put all the code here, but I can explain how it's going to work. Think about an end-to-end encrypted system talking on just one channel (or in one room). Separately, the DB, the server and the clients work fine. The server was able to handle 100,000 events distributed across 1,000 rooms with 5 requests per second querying the DB, using about 40% RAM and 250% CPU.

    – Harvant S. Choudhary
    Nov 15 '18 at 16:26











  • Question updated.

    – Harvant S. Choudhary
    Nov 15 '18 at 16:33











  • @HarvantS.Choudhary - I'm not sure you've seen the edits to my answer. If you're really asking how many messages/sec your server can handle, that depends entirely upon things that are not disclosed here and could ultimately only be answered by devising an appropriate test for your server in your data center anyway since it depends upon a whole lot of environmental factors (server configuration, bandwidth, number of active clients, processing on each message, etc...). Also, if you have ANY synchronous I/O code in your server, that could also ruin scalability.

    – jfriend00
    Nov 15 '18 at 16:48











  • @HarvantS.Choudhary - Added some more to the end of my answer.

    – jfriend00
    Nov 15 '18 at 18:07











  • Yeah, it helped. I will update my load-test code and try those approaches.

    – Harvant S. Choudhary
    Nov 16 '18 at 2:45










