
Streams

https://www.geeksforgeeks.org/what-is-stream-and-its-types-in-node-js/
The concept of streams and buffers is a core foundation of most
mid-to-low-level programming languages.

In computer science, a buffer temporarily allocates memory in RAM in
order to move data somewhere else. For example, a read buffer bridges data
from an HDD to a faster medium, be it a printer or a speaker, both of which
can consume incoming data chunks at a much greater pace than the access
speed of the HDD allows. A buffer follows a FIFO (first in, first out)
discipline, also known as a queue.
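In Node.js, this kind of temporary memory is exposed through the Buffer class. A minimal sketch, assuming a small filename.txt exists next to the script (the 16-byte size is arbitrary and purely illustrative):

const fs = require('fs');

// Allocate 16 bytes of temporary memory and fill it with data read from a file.
const fd = fs.openSync(__dirname + '/filename.txt', 'r');
const buf = Buffer.alloc(16);

// Read up to 16 bytes from the start of the file into the buffer.
const bytesRead = fs.readSync(fd, buf, 0, buf.length, 0);
console.log(buf.toString('utf8', 0, bytesRead)); // print what was buffered
fs.closeSync(fd);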

A stream, on the other hand, differs from a buffer in the way it
consumes data chunks. Streams can read/write data chunk by chunk,
without buffering the whole file at once. In other words, if you read a
textfile.txt that contains:
“My name is Doctor Watson”
it is possible to consume the file byte by byte (as 1 ASCII character is a
byte), which allows you to grab, transform and display the contents of the
file, byte by byte, progressively, like this:
“character: M”
“character: y”
“character: ”
“character: n”
(…)
Each character is progressively taken from the file and transformed,
without buffering the whole textfile.txt in an upfront fashion.
A stream can be progressively loaded with data of no definite, specified
length. In the context of the above example, it stores each chunk of data
in temporarily allocated memory, instead of in one whole, definitely
sized buffer.
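A minimal sketch of this byte-by-byte consumption in Node.js, assuming textfile.txt sits next to the script; the highWaterMark option, which caps each chunk at one byte, is a detail added here for illustration:

const fs = require('fs');

// Read textfile.txt one byte at a time by capping the internal buffer at 1 byte.
const rs = fs.createReadStream(__dirname + '/textfile.txt', { highWaterMark: 1 });

rs.on('data', (chunk) => {
  console.log('character: ' + chunk.toString()); // "character: M", "character: y", ...
});

rs.on('end', () => {
  console.log('(done)');
});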

Streams also (typically) use buffers to store the content streamed so far,
and when the stream is done, that buffer is freed somewhere (for example:
in the case of reading, emptied into the output device; in the case of
writing, written to disk).
Node.js itself is built mainly in C++ (with some of the code in C as well),
and its low-level stream and buffer handling lives in that native layer.

I. Node.js: example of setting up a buffer (buffers the whole file into
memory at once; not recommended for big files):

var fs = require('fs');

// Buffer the entire file into memory, then send it (res is an HTTP response object).
fs.readFile(__dirname + '/filename.txt', function (err, data) {
  res.end(data);
});

II. Node.js: example of setting up a stream, which will (later) take data
chunk by chunk:

var stream = fs.createReadStream(__dirname + '/filename.txt');
There are 4 types of stream:
1. Readable
2. Writable
3. Duplex
4. Transform (a minimal custom Transform sketch follows below)
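A Duplex stream is both readable and writable, and a Transform stream is a Duplex stream whose output is computed from its input. A minimal sketch of a custom Transform (an upper-casing transform; the example and its names are illustrative, not from the original slides):

const { Transform } = require('stream');

// A Transform stream that upper-cases every chunk that passes through it.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Usage: pipe stdin through the transform to stdout.
process.stdin.pipe(upperCase).pipe(process.stdout);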

Each type of stream is an “EventEmitter” instance and emits several
events at different points in time.

Commonly used events are:
1. data
2. end
3. error
4. finish
const fs = require("fs");
const http = require("http");

http.createServer(function (req, res) {
  // Stream myfile.txt to the HTTP response chunk by chunk.
  const rs = fs.createReadStream("myfile.txt");

  rs.on("data", (chunkdata) => {
    res.write(chunkdata);   // forward each chunk as it arrives
  });

  rs.on("end", () => {
    res.end();              // all chunks consumed, close the response
  });

  rs.on("error", (err) => {
    console.log(err);
    res.end("File not Found");
  });
}).listen(8000, "127.0.0.1");
var fs = require('fs');
var data = 'This is a code to learn' + ' about writing in a stream.';

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Write the data to the stream with utf8 encoding
writerStream.write(data, 'utf8');

// Mark the end of the file
writerStream.end();

// Handling the finish stream event
writerStream.on('finish', function () {
  console.log('Write completed.');
});

// Handling the error stream event
writerStream.on('error', function (err) {
  console.log(err.stack);
});
For this example, save a large amount of text in myfile.txt, then execute the code below.

The output will be myfile.txt copied to yourfile.txt.

On the console, ‘the end’ will print more than once, because the stream
reads a chunk of data and then writes it, chunk by chunk.

// Copy 'myfile.txt' to 'yourfile.txt'
var fs = require("fs");
const rs = fs.createReadStream("myfile.txt");
var ws = fs.createWriteStream('yourfile.txt');

rs.on('data', function (datachunk) {
  console.log("the end");   // printed once per chunk
  ws.write(datachunk);      // write the chunk that was just read
});
var fs = require('fs');

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations:
// read input.txt and write the data to output.txt
readerStream.pipe(writerStream);
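pipe() on its own does not forward errors between the streams in a chain; a sketch of the same copy using stream.pipeline (available in newer Node.js versions), which reports an error from any stage in a single callback:

var fs = require('fs');
var { pipeline } = require('stream');

// Copy input.txt to output.txt and get notified when the whole chain finishes or fails.
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('output.txt'),
  function (err) {
    if (err) {
      console.log('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);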
var fs = require('fs');
var zlib = require('zlib');

// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));

// Note: this logs immediately; the compression itself finishes asynchronously.
console.log('File Compressed.');
var fs = require('fs');
var zlib = require('zlib');

// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('input.txt'));

// Note: this logs immediately; the decompression itself finishes asynchronously.
console.log('File Decompressed.');
