All streams are instances of the EventEmitter class. We can use the stream module by requiring it in the following way :
var stream = require('stream');
The streams which are used to perform read operations are readable streams. A readable stream operates either in flowing mode, where data is emitted automatically, or in paused mode, where we must call stream.read() explicitly to read the chunks of data. All aspects of readable streams are explained below :
readable.pause() : This method is used to change the mode of the stream from flowing to paused; all the data available keeps residing in the internal buffer.
readable.resume() : This method is used to change the mode of the stream from paused to flowing, and the stream will resume emitting events.
readable.isPaused() : This method is used to check the current operating state of the readable stream. If it returns true, the readable stream is in paused mode.
readable.pipe() : This method is used to attach a writable stream to the readable, which makes the stream switch to flowing mode and start pushing data to the attached writable.
readable.unpipe() : This method is used to detach a writable stream previously attached to the readable stream.
readable.read() : This method is used to pull data out of the internal buffer. The data is returned in the form of Buffers unless another format is specified using readable.setEncoding(). If there is no data to pull, null is returned.
readable.setEncoding() : This method is used to set the encoding for the readable stream. By default the data is pulled in the form of Buffers.
readable.unshift() : This method is used to push a chunk of data back into the internal buffer.
readable.wrap() : This method is used to read data from sources that use the old-style streams.
readable.destroy() : This method is used to destroy the readable stream and release any resources it holds.
The streams which are used to perform write operations are writable streams. All aspects of writable streams are explained below :
'drain' event : If a call to stream.write(chunk) returns false, the 'drain' event is emitted to indicate when it is appropriate to resume writing data.
'pipe' event : This event is emitted when the stream.pipe() method is called on a readable stream, indicating the addition of this writable to the set of destinations of the readable.
'unpipe' event : This event is emitted when the stream.unpipe() method is called on a readable stream, indicating the removal of this writable from the set of destinations of the readable.
writable.cork() : This method is used to force all written data to be buffered in memory. The buffered data is flushed in either of the following scenarios : the stream.uncork() method is called, or the stream.end() method is called.
writable.uncork() : This method is used to flush all the data buffered since the stream.cork() method was called.
writable.write() : This method is used to write some data to the stream and invoke the given callback once the data has been handled successfully.
writable.setDefaultEncoding() : This method is used to set the default encoding for the writable stream.
writable.end() : This method is used to signify that no more data will be written to the writable stream.
writable.destroy() : This method is used to destroy the writable stream and release any resources it holds.
Duplex streams are the streams which implement both the readable and writable interfaces simultaneously. The most common example of a duplex stream is the net.Socket class of the net module.
A better explanation of how duplex streams work is as follows :
Suppose we build a socket in node.js to transmit and receive data simultaneously; that can be achieved using a duplex stream. We will have two independent channels in the network, where one channel is used for transmitting data and the other for receiving data.
Common use cases of duplex streams include :
duplex streams for implementing sockets.
duplex streams for gzip compression and decompression.
duplex streams for performing encryption, decryption and creating message digests.
Transform streams are duplex streams that can transform or modify data as it is read and written, where the output is in some way related to the input.
These streams read the input data and transform it using a manipulating function, outputting the new data.
We can also chain streams together to create complex processes by piping one to the next.
Common use cases of transform streams include :
transform streams for gzip compression and decompression, as in the zlib.createDeflate() method.
transform streams for performing encryption, decryption and creating message digests.
transform.destroy() : This method is used to destroy the stream and emit an 'error' event. Moreover, the transform stream releases all internal resources being used after this method call.
Following is a code snippet about how we can use streams in our code :
// require fs module for file system
var fs = require('fs');

// write data to a file using a writable stream
var wdata = "I am working with streams for the first time";
var myWriteStream = fs.createWriteStream('aboutMe.txt');

// write data
myWriteStream.write(wdata);

// done writing
myWriteStream.end();

// handler for the error event
myWriteStream.on('error', function(err) {
  console.log(err);
});

myWriteStream.on('finish', function() {
  console.log("data written successfully using streams.");
  console.log("Now trying to read the same file using read streams");

  var myReadStream = fs.createReadStream('aboutMe.txt');

  // add handlers for our read stream
  var rContents = ''; // to hold the read contents
  myReadStream.on('data', function(chunk) {
    rContents += chunk;
  });
  myReadStream.on('error', function(err) {
    console.log(err);
  });
  myReadStream.on('end', function() {
    console.log('read: ' + rContents);
  });

  console.log('performed write and read using streams');
});
> node filename_streams.js
In this chapter of the 30 days of node tutorial series, we learned about the basics of streams: what they are, the types of streams, readable streams, writable streams, duplex streams, transform streams, and lastly a coding example showing how we can use streams in our code.