
Node.js Streams: The Ultimate Guide to Handling Large Data
What are Streams?
A stream is a collection of data, much like an array or a string. Unlike an array or a string, however, a stream does not keep all of its contents in memory at once; instead, data is delivered into memory one chunk at a time. This is what makes streams so memory-efficient, and it is why Node.js is an ideal platform for building data-streaming applications.
Stream Types
Node.js handles four basic kinds of streams:
- Writable − streams to which data can be written.
- Readable − streams from which data can be read.
- Duplex − streams that are both readable and writable.
- Transform − duplex streams that can modify or transform the data as it is written and read (a short sketch follows this list).
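To make the Transform type concrete, here is a minimal sketch of a custom Transform stream that upper-cases whatever passes through it. The name upperCase is our own illustrative choice, not a Node.js built-in.
var Transform = require('stream').Transform;
// A Transform stream that upper-cases each chunk passing through.
// upperCase is an illustrative name, not part of this chapter's files.
var upperCase = new Transform({
   transform: function(chunk, encoding, callback) {
      // Pass the modified chunk downstream
      callback(null, chunk.toString().toUpperCase());
   }
});
// Example usage: pipe stdin through the transform to stdout
process.stdin.pipe(upperCase).pipe(process.stdout);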
Each type of stream is an EventEmitter instance and emits several events at different points in time. Some of the commonly used events are −
- data − This event is fired when there is data available to read.
- end − This event is fired when there is no more data to read.
- error − This event is fired when there is any error receiving or writing data.
- finish − This event is fired when all the data has been flushed to the underlying system.
The examples in this chapter use a file called input.txt that contains the following text.
Thefullstack is providing self-learning materials to educate the world in an easy-to-understand manner!
Readable Stream
A file object functions as a stream from which data can be read in chunks of a certain size. In the following example, we use the fs module's createReadStream() function to read data from a specified file. The readable stream's data event collects the file contents until the end event is triggered.
var fs = require("fs");
var data = '';
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Set the encoding to be utf8.
readerStream.setEncoding('UTF8');
// Handle stream events --> data, end, and error
readerStream.on('data', function(chunk) {
data += chunk;
});
readerStream.on('end', function() {
console.log(data);
});
readerStream.on('error', function(err) {
console.log(err.stack);
});
console.log("Program Ended");
Save the above script as main.js and run it with node main.js. It will display the contents of the input.txt file.
Writable Stream
The createWriteStream() function from the fs module creates a writable stream object. Its write() method writes data to the file whose name was passed to createWriteStream() as an argument.
Save the following code in a file named main.js.
var fs = require("fs");
var data = `thefullstack is giving self-learning content
to teach the world in a simple and easy way!!!!!`;
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Write the data to stream with encoding to be utf8
writerStream.write(data, 'UTF8');
// Mark the end of file
writerStream.end();
// Handle stream events --> finish, and error
writerStream.on('finish', function() {
console.log("Write completed.");
});
writerStream.on('error', function(err) {
console.log(err.stack);
});
console.log("Program Ended");
Open the output.txt file created in your current directory and check that it contains the data written above.
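One detail worth knowing about write(): it returns false when the stream's internal buffer is full, and the drain event fires when it is safe to write again. The following is a minimal sketch of that pattern; the file name bigoutput.txt and the line count are our own illustrative choices.
var fs = require("fs");
var writerStream = fs.createWriteStream('bigoutput.txt');
var i = 0;
var total = 100000;
function writeNext() {
   var ok = true;
   while (i < total && ok) {
      // write() returns false once the internal buffer is full
      ok = writerStream.write('line ' + i + '\n', 'utf8');
      i++;
   }
   if (i < total) {
      // Resume writing only after the buffer has drained
      writerStream.once('drain', writeNext);
   } else {
      writerStream.end();
   }
}
writeNext();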
Piping the Streams
Piping is the process of supplying the output of one stream as the input to another. It is normally used to get data from one stream and pass it on to another stream, and there is no limit on the number of piping operations. We will now demonstrate how to read data from one file and write it to another using a pipe.
Create a js file named main.js with the following code −
var fs = require("fs");
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);
console.log("Program Ended");
Now run the main.js to see the result −
node main.js
Verify the Output.
Program Ended
Open the output.txt file created in your current directory; it should contain the following:
Thefullstack is providing self-learning materials to educate the world in an easy-to-understand manner!
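As an aside, newer Node.js versions also provide stream.pipeline(), which connects streams just like pipe() but forwards errors from every stream in the chain to a single callback. Here is a minimal sketch of the same copy operation, assuming a reasonably recent Node.js release:
var fs = require("fs");
var pipeline = require('stream').pipeline;
// Copy input.txt to output.txt with centralized error handling
pipeline(
   fs.createReadStream('input.txt'),
   fs.createWriteStream('output.txt'),
   function(err) {
      if (err) {
         console.error('Pipeline failed:', err);
      } else {
         console.log('Pipeline succeeded.');
      }
   }
);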
Chaining the Streams
Chaining is a mechanism for connecting the output of one stream to another stream, creating a chain of multiple stream operations. It is normally used together with piping. We will now use piping and chaining to first compress a file and then decompress it.
Create a js file named main.js with the following code −
var fs = require("fs");
var zlib = require('zlib');
// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
   .pipe(zlib.createGzip())
   .pipe(fs.createWriteStream('input.txt.gz'))
   .on('finish', function() {
      // Log only after all compressed data has been flushed to disk
      console.log("File Compressed.");
   });
Now run the main.js to see the result −
node main.js
Verify the Output.
File Compressed.
You will find that input.txt has been compressed into a new file, input.txt.gz, in the current directory. Now let's decompress the same file using the following code −
var fs = require("fs");
var zlib = require('zlib');
// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
   .pipe(zlib.createGunzip())
   .pipe(fs.createWriteStream('input.txt'))
   .on('finish', function() {
      // Log only after all decompressed data has been flushed to disk
      console.log("File Decompressed.");
   });
Now run the main.js to see the result −
node main.js
Verify the Output.
File Decompressed.