For yet another post on node.js and the many useful packages that can be installed via npm, I thought I would write about the npm package request, a popular http client for scripting http. This time around I will be focusing on streams. Out of the box request only gives two ways to work with incoming data: callbacks, and streams. Promise support can be added, but that is off topic for this post.
In this post I will be making heavy use of Buffers, as they come up a lot in node.js when working with streams. Mostly I will be working with chunks of data from requests made with request, as that is mainly what this post is about, but in some examples I will also be working with buffers when it comes to writing data to a file. If you are not too keen on buffers you might want to check out my post on buffers, though the node.js docs might very well be the best source on that subject.
This post, then, has to do with streams involving the http client request. If you do not know a great deal about streams you might want to check out the node.js docs on streams as well.
As chunks of data come in from a request there are many things that I might want to do with that stream of data, such as writing it to a file, running it through some kind of transformation, or just logging it out to the standard output in the command line interface. In this section I will be going over a few quick examples of doing just that for starters.
One way to log that data to the console is to pipe it to process.stdout. This works in a similar way to using console.log: the incoming chunks are logged out to the console.
If you really want to use console.log, one way to do so would be to listen to the data event.
This is also handy if I want to do something with the chunks as they come in, but it might not be a true replacement for stream.Transform, more on that later.
One thing that comes to mind when working with streams using request is that I will want to write some chunks of data to a file. For this there is the createWriteStream method in the fs module. In this section I will be covering a few examples of this using request.
So this will be a basic example of using fs.createWriteStream to write chunks of data from a request made with request. I can use the on error, and on data events to log any errors, and give a sense of progress, then pipe the incoming chunks to the stream created with createWriteStream.
If I want to use a write stream in a way in which I call a method each time I want to write a few bytes of data to the file that I have created with createWriteStream, that is not too hard to pull off. Doing so just requires that I save the instance of the write stream to a variable, and then call its write method, passing the chunks to that method.
This is useful for situations in which the writing process might have many pauses, and I just want to write to the file a few bytes at a time.
The Transform class in the built in node.js stream module can be used to define my own transform streams. That is, anything that I might want to do with an incoming stream of data, from simple things like converting text to upper case, to more complex things that could be considered a kind of compression, encryption, or obfuscation.
For a simple example of a transform, here is one that just converts every letter to upper case.
This is just a basic example of what can be done with stream.Transform, but I can define some kind of more complex way of transforming an incoming stream of data with this.
So maybe a good example of why streams are important is when dealing with very large files. The novel War and Peace has a reputation of being a very large book, at over five hundred thousand words. On a slow connection a file this large will take a little while to download.
In this example I am checking the headers of a url that holds a text file copy of War and Peace, via a head request first. Once the head request is done I look for a content-length header, and assume that this must be the byte length of the file. Once I have the byte length I can check if it is between some set limits for downloading the body of the file. If all goes well I then start the get request for the file, and for each incoming chunk of data I update a percent done variable, and log the current percentage done in the console. While this is happening I am also writing the chunks to a file as well.
Another thing that can be done with streams is to parse binary data, such as a png file, using the pngjs npm package. All I have to do is pipe the output of request to an instance of the PNG constructor that is available via what is exported when I bring pngjs into a project.
After piping the stream from request to PNG there is then a parsed event that I can use to work with the image data, in the form of an array of numbers that are the red, green, blue, and alpha color channels of the png file's pixel data.
It's a really cool project for handling this sort of thing. It can also be used to encode to png as well, but that is a matter for another post.
So streams can be a nice way to work with requests, and the callback system is tried and true as well. Also, if I want to use promises there is a wealth of options when it comes to adding support for that sort of thing, as compared to axios where that is just built in from the start. After spending some time playing around with request I am beginning to see why this is such a popular http client for node.js; there is a lot of versatility with this one.
Thank you for reading, I hope you got something out of this post. If you think something is missing please feel free to write about it in the comments.