Subject: Who delivers the highest number of concurrent requests?
Today we need servers that are fast and can serve many requests with minimal resources. Here we compare two options, Node.js and Go, to choose the better server. Node.js is known around the world as a fast runtime, and its npm packages have made development easy for everyone. Go is also a fast language that is widely used behind the scenes. For Node.js, the server imports the Fastify package, which is built on the Node.js http module, to handle concurrent HTTP requests for our Google ad buffers.

All tests ran on an Intel Core i7-8550U CPU. First we benchmarked the Node.js server with Apache Bench at 100 concurrent connections: the server handled 12925.33 requests per second, with a mean time per request across all concurrent requests of 0.077 ms, which is normal. When we increased the concurrency to 500 connections, throughput dropped to 9673.37 requests per second and the mean time per request across all concurrent requests rose to 0.103 ms; the server hits its limit and Node.js starts to struggle.

For Go, we imported a fast HTTP server package that does not use the standard net/http library and is instead a pure HTTP protocol implementation. When we tested the Go server with 100 concurrent connections, it served 15847.80 requests per second with a mean time per request across all concurrent requests of 0.063 ms. When we increased the concurrency to 500 connections, it still served 14682.27 requests per second with a mean time per request of 0.068 ms, which is far better than Node.js and lets us handle more concurrent requests.

We can conclude that Go is better and faster than Node.js in this comparison because it serves a high number of concurrent requests in less time. Go delivers 18% more requests than Node.js at 100 concurrent connections (measuring the gap relative to Go's throughput), and its advantage grows to 34% when the concurrency reaches 500.