NodeJS: When handling http requests how do you determine which to use? (http, request, fetch, axios)

I’m just curious: who has tried different packages for HTTP requests? I’ve tried basic GET and POST with each of these (Node’s http, request, the fetch API, XMLHttpRequest, axios) in my projects just to get a feel for the different ways to make HTTP requests, but I’m not really sure which one fits which situation.

Thank you for shedding some light on this. :slight_smile:

  • fetch and XMLHttpRequest are browser APIs.
  • http is Node-specific (EDIT for clarity: http as in the Node module with functions relating to dealing with HTTP requests, not http in general)
  • fetch effectively replaces XMLHttpRequest (with some caveats regarding cancelling requests), as it is much more powerful.
  • fetch returns a promise, which makes things easier in many ways.
  • the node-fetch package provides the same API as fetch for use server-side and tries to deal with some Node-specific things (streams being the key thing).
  • fetch is very low-level and explicit - you don’t get anything unless you ask for it. For example, I’ve normally had to write a wrapper function that automatically adds in defaults needed by whatever app I’m building (CORS, cookie logic, XSRF protection logic, etc.).
  • http is also very low-level.
  • request is a library that wraps http and gives you some nice conveniences.
  • axios, when used server side, does the same thing.
  • axios and request, by providing useful defaults, can make things easier to code, with the normal cost of learning another API.

Hello Jonathan,

POST and GET are HTTP methods; there are others too (HEAD, PUT, OPTIONS, …), but in 99% of cases POST and GET do the business for you.
You may also want to read about RESTful APIs.

This is a generic question, comrade, and largely opinion-based. While I prefer low-level, CGI-style back-end coding, some devs code at a very high level, using JSF or ASP for example.
You are the dev here: just implement the essentials (privacy, security, …) and do the coding as you like. The end user just needs your code to work right.

HTTP is a protocol (RFC 7230), but I’m not sure it’s known as a kind of object in Node.js, Dan.

Dan stated good points. Since Node.js is based on JS, I also think it’s more targeted by XSS attacks, so keep an eye on that.

Also, building on Dan’s great points:

If you (Jonathan) start reading about IO and streaming, you will find out HTTP is nothing very special really, and when people start playing with socket IO, they often code a very simple HTTP server, since it’s very easy.

Node.js (and almost all other back-end technologies/languages) just wraps these client/server streams into objects so devs can use them more easily.

Happy coding.


Ah, haha, I should’ve been more specific there: http as in the Node stdlib module, rather than HTTP in general.


I see. So their main difference is in their structure and implementation, but all of them do the job.
Thank you @DanCouper and @NULL_dev. If I could mark all of your comments as the solution, I would. :slight_smile:


Yeah, that’s it. Like Dan already laid out in great detail, they may apply to different platforms and have differing levels of abstraction. Which one you use over another is mostly a matter of personal preference. In the browser, I always start with fetch and would only move to axios if I needed it (I never have). For Node, it’s a bit more complicated, as there is no native API that uses promises. In the past I’ve hand-rolled them in myself, but in the future I’ll stick to a library like node-fetch or axios. When you’re writing a full-stack app, it’s best to keep the API consistent, so I don’t think request would be a good solution: either use axios or fetch for both front- and back-end.


Yeah, definitely what @PortableStick says: the key advantage of fetch and axios is that you don’t have to mentally translate between front- and back-end code for HTTP requests. You only have to learn how they work once, then you can apply them everywhere.
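As a sketch of the wrapper-with-defaults idea mentioned earlier in the thread, here’s one hypothetical way to bake app-wide defaults into fetch so the same helper works front- and back-end. It assumes an environment with a global fetch (browsers, or node-fetch wired in server-side); every name and default here is invented for illustration.

```javascript
// App-wide defaults this hypothetical app always wants on requests.
const DEFAULTS = {
  credentials: 'same-origin',                        // send cookies to our own origin
  headers: { 'Content-Type': 'application/json' },   // we mostly talk JSON
};

// Pure helper: merge per-call options over the defaults, with per-call
// headers merged over (not replacing) the default headers.
function withDefaults(options = {}) {
  return {
    ...DEFAULTS,
    ...options,
    headers: { ...DEFAULTS.headers, ...(options.headers || {}) },
  };
}

// One thin wrapper usable in both the browser and Node (given a fetch).
function appFetch(url, options) {
  return fetch(url, withDefaults(options));
}

// The merge itself is easy to inspect without making any request:
console.log(withDefaults({ method: 'POST' }).credentials); // 'same-origin'
```

Call sites then just say `appFetch('/api/things')` and the defaults come along for free, which is roughly the convenience axios gives you out of the box via its instance defaults.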

Just as an addendum, all of them are just a way to map JavaScript data structures to the form expected by the HTTP protocol, then deal with the response: (a) without having to deal with the nitty-gritty of what the response actually looks like, and (b) giving you a nice response mapped back into something that can be dealt with in a way JS understands.

So like (pseudocode):

superAwesomeHTTP({
  url: 'http://www.example.com/index.html',
  method: 'GET',
  headers: {}
}).then(response => console.log(response));

Which will build into something like

GET /index.html HTTP/1.1
Host: www.example.com
Accept: image/gif, image/jpeg, */*
Accept-Language: en-us
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)

And then back comes something like

HTTP/1.1 200 OK
Date: Mon, 16 Jul 2018 14:54:53 GMT
Server: Apache/2.2.14 (Win32)
Last-Modified: Mon, 9 Jul 2018 10:25:26 GMT
ETag: "10000000565a5-2c-3e94b66c2e680"
Accept-Ranges: bytes
Content-Length: 44
Connection: close
Content-Type: text/html
  
<html><body><h1>It works!</h1></body></html>

And superAwesomeHTTP will parse that and then console log <html><body><h1>It works!</h1></body></html>.

Node has a relatively good overview of the process here, but it kinda expects you to know how it all works already. This is pretty good though:

https://robrich.org/slides/anatomy_of_a_web_request


Thank you for the advice! Now I have some insight into which one to use.

Thank you, Dan! The slides are great. I’m getting an in-a-nutshell idea of requests. :slight_smile:

<<please ignore this reply, probably wrong statement>>

I’m not a JS/Node expert, so please correct me if I’m wrong, but I think mixing non-blocking ops with blocking ops is not good practice! A promise (as far as I know, it waits until the async work(s) complete) could block the current thread, so it breaks the NIO, unless the promise (join) is handled by another async operation (sacrificing two threads to keep the NIO; maybe better to go for BIO then?).

Apart from some IO ops like file streaming that can be NIO, some IO streaming has to be done in a BIO manner, like database connectivity (not sure, but most drivers work like that).

I see some devs spin up an async work (a new thread) to run a blocking op like a database connection. Please note it’s just a tricky (temporary) layer of BIO over NIO: it works, but it’s BIO at heart. Also note (I’m not sure about JS/Node) that a thread needs its own context and stack, so it costs some CPU.
Some lower-level platforms like C (CGI), Java (web profile), etc. create a pool of threads, which works like recycling threads and like a semaphore too; this is a good move. I hope Node does this, but I’m not sure Node.js considers itself an enterprise-level server.


Interesting. Do you have any documentation on this?

<<please ignore this reply, probably wrong statement>>


This is really interesting stuff, thank you. I’ve kinda avoided Node on the back end for most things bar very lightweight and low-traffic (what you call) NIO stuff, basically following the recommendations in the docs when I’ve needed to spin up a JS solution. It’s quite nice and quick for that, but this is really useful info on its usability for other things :+1:

Dear Dan,

Please don’t take my statements about Node as completely correct; I’m not a JS/Node dev. But since I’ve read that Node.js is NIO by default, I don’t know whether it’s possible to make it BIO too or not.

But one thing I’m sure about is the enterprise level of Node.js, as you mentioned: it’s better suited to light and small workloads indeed.

Happy coding.

I see the confusion here. Node’s event loop allows it to have non-blocking I/O. setTimeout is not I/O. If you did the same test with network requests, you would see the asynchronicity you expect. Both of the sources you posted are resources teaching you why you shouldn’t do exactly what you just did, yet also advising you that writing asynchronous IO is totally possible:

[L]et’s consider a case where each request to a web server takes 50ms to complete and 45ms of that 50ms is database I/O that can be done asynchronously. Choosing non-blocking asynchronous operations frees up that 45ms per request to handle other requests. This is a significant difference in capacity just by choosing to use non-blocking methods instead of blocking methods.

Even if this were an example of Node being nasty with our I/O, keep in mind that promises are little more than syntactic sugar for callback functions and can’t change anything about JavaScript’s concurrency model. Using a promise for an HTTP request would be no more or less performant than any other API.


You are right here, but I think that in the instance you’re talking about, where you may actually need threads, the normal approach would be to delegate to a different language (say via a message queue) or spawn more instances of Node. You would avoid doing what you did there; that idiom doesn’t really work in JS due to its limitations. Yes, there’s technically only one thread, but Node uses the reactor pattern to avoid blocking on that thread. Parallelized or concurrent work it’s not so good at, but then that’s difficult in most languages; Node isn’t an exception here (as a comparison, basic Ruby and Python have a global lock), though it is in some ways particularly bad at it. NB it looks likely that threads will come soonish, due to them being useful when compiling C/C++ code that uses threading to JS; Firefox’s engine has them available behind a flag, afaik, for example. And browser-side you can use Web Workers right now to spawn any number of OS-level threads to do work (albeit the message passing to and from those workers is still done on the single main thread).

<<please ignore this reply, probably wrong statement>>

It’s not a simulation, it’s actually a blocking operation. It just has nothing to do with network I/O or promises.

I’m not sure why you’re bringing up anything else in your post, but I don’t have time for it. Write an example that uses a non-blocking network or database operation rather than setTimeout, which is blocking, and you’ll see.

Yes, I think you are right; maybe I’m wrong. Since I have zero knowledge of Node/JS, I should not have replied to this post. Sorry.

Please excuse my interruptions, code fellows. Happy coding.

You aren’t wrong, it’s just that the scenario you’re describing (of which a better example might be any of the sync versions of functions in the Node stdlib, say something from the fs module to read/write things on the local filesystem) involves deliberately writing blocking code in an asynchronous context. That isn’t extraordinary (it is necessary in some situations), it’s just not the way you would normally write JS code: because you only have that single thread, you need to allow the runtime to handle scheduling asynchronous calls rather than forcing it. Stuff like sleep (which I think you’re trying to emulate here?) doesn’t really work well because you only have that one central process - https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/

I think a comparable thing is that you can write synchronous AJAX calls in browser JS code (although this will cause the console to immediately start spitting out warnings telling you not to do it), and they will lock the browser until they complete.
