Handling multiple parallel HTTP requests in Node.js

Problem

I know that Node is non-blocking, but I just realized that the default behaviour of http.listen(8000) means that all HTTP requests are handled one-at-a-time. I know I shouldn't have been surprised at this (it's how ports work), but it does make me seriously wonder how to write my code so that I can handle multiple, parallel HTTP requests.

So what's the best way to write a server so that it doesn't hog port 80 and long-running responses don't result in long request queues?

To illustrate the problem, try running the code below and loading it up in two browser tabs at the same time.

var http = require('http');
http.createServer(function (req, res) {
    res.setHeader('Content-Type', 'text/html; charset=utf-8');
    res.write("<p>" + new Date().toString() + ": starting response</p>");
    setTimeout(function () {
        res.write("<p>" + new Date().toString() + ": completing response and closing connection</p>");
        res.end();
    }, 4000);
}).listen(8080);
Problem courtesy of: Andrew

Solution

You are misunderstanding how Node works. The above code can accept TCP connections from hundreds or thousands of clients, read the HTTP requests, wait out the 4000 ms timeout you have baked in there, and then send the responses. Each client will get a response in about 4000 ms plus a few milliseconds. During that setTimeout (and during any I/O operation) Node can continue processing, including accepting additional TCP connections. I tested your code and each browser gets a response in 4 s; the second one does NOT take 8 s, if that is how you think it works.

I ran curl -s localhost:8080 in 4 terminal tabs as quickly as I could via the keyboard, and the seconds in the timestamps were:

  1. 54 to 58
  2. 54 to 58
  3. 55 to 59
  4. 56 to 00

There's no issue here, although I can understand how you might think there is one. Node would be totally broken if it worked as your post suggested.

Here's another way to verify:

for i in 1 2 3 4 5 6 7 8 9 10; do curl -s localhost:8080 & done
Solution courtesy of: Peter Lyons

Discussion

I used the following code to test request handling:

var express = require('express');
var MOMENT = require('moment');

var app = express();

app.get('/', function(req, res) {
  console.log('time', MOMENT());
  setTimeout(function() {
    // Runs ~50 s later; by then data has been assigned below
    console.log(data, '          ', MOMENT());
    res.send(data);
    data = 'changing';
  }, 50000);
  var data = 'change first'; // hoisted; assigned before the log below runs
  console.log(data);
});

app.listen(8080);

Since this request doesn't take much processing time beyond the 50 s setTimeout, all the timeouts were processed concurrently, as usual.

Output for 3 requests issued together:

time moment("2017-05-22T16:47:28.893")
change first
time moment("2017-05-22T16:47:30.981")
change first
time moment("2017-05-22T16:47:33.463")
change first
change first            moment("2017-05-22T16:48:18.923")
change first            moment("2017-05-22T16:48:20.988")
change first            moment("2017-05-22T16:48:23.466")

After this I moved to the second phase: what if my request takes a long time to process, e.g. a synchronous file operation or something else time-consuming?

app.get('/second', function(req, res) {
    console.log(data); // hoisted but not yet assigned: logs undefined
    if (req.headers.data === '9') {
        res.status(200);
        res.send('response from api');
    } else {
        console.log(MOMENT());
        // Synchronous busy loop: blocks the event loop for a long time
        for (var i = 0; i < 9999999999; i++) {}
        console.log('Second MOMENT', MOMENT());
        res.status(400);
        res.send('wrong data');
    }

    var data = 'second test';
});

Because my first request was still being processed, my second was not handled by Node until the first had finished. Here is the output for 2 requests:

undefined
moment("2017-05-22T17:43:59.159")
Second MOMENT moment("2017-05-22T17:44:40.609")
undefined
moment("2017-05-22T17:44:40.614")
Second MOMENT moment("2017-05-22T17:45:24.643") 

Thus, for async functions (fs, mysql, or calling an API), the work is handed off and Node can accept other requests before a previous request's async work completes. But Node itself stays single-threaded, and it does not process other requests while a previous request's synchronous code is still running.

Discussion courtesy of: NAVIN COC

Your code can accept multiple connections because the job is done in the callback function of the setTimeout call.

But if, instead of setTimeout, you do a heavy synchronous job, then it is true that Node.js will not handle other connections in the meantime. setTimeout frees the event loop, so Node.js can accept other work while your callback waits to run (still on the same single thread, just later).

I don't know which is the correct way to implement this. But this is how it seems to work.
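One workable approach is to split the heavy job into chunks and yield back to the event loop between chunks with setImmediate, so other callbacks can run in between. A minimal sketch, again with a hypothetical summing job standing in for the heavy work:

```javascript
// Sum the integers 0 .. n-1, but yield to the event loop after every
// `chunk` iterations so pending I/O and other requests are not starved.
function chunkedSum(n, chunk = 1e6) {
  return new Promise((resolve) => {
    let sum = 0;
    let i = 0;
    function step() {
      const end = Math.min(i + chunk, n);
      for (; i < end; i++) sum += i;
      if (i < n) setImmediate(step); // yield, then continue with the next chunk
      else resolve(sum);
    }
    step();
  });
}

const result = chunkedSum(1e7).then((sum) => {
  console.log('sum:', sum);
  return sum;
});
```

The trade-off versus a worker thread is that the total job takes slightly longer (one event-loop turn per chunk), but no extra thread is needed.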

Discussion courtesy of: user3286817

The browser may hold back identical concurrent requests to the same URL. If you issue the requests from different browsers (or give each one a unique query string), they will run in parallel.

Discussion courtesy of: Amar T

This recipe can be found in its original form on Stack Overflow.