Stream Module
Akash Pundir
System Programming –I
School of Computer Science and Engineering
Whaaat is this, Stream?
Streams are sequences of data made
available over time. Rather than reading or
writing all the data at once, streams allow
you to process data piece by piece, which is
particularly useful when dealing with large
datasets or when real-time processing is
required.
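For example, a readable stream lets you work on each chunk as it arrives. The sketch below is a minimal illustration (the file name 'large-file.txt' is just a placeholder): it counts the total bytes of a file without ever holding the whole file in memory.
const fs = require('fs');

// Create a readable stream; data will arrive in small chunks
const readable = fs.createReadStream('large-file.txt');

let totalBytes = 0;

// Each 'data' event delivers one chunk (a Buffer by default)
readable.on('data', (chunk) => {
  totalBytes += chunk.length;
});

// 'end' fires once every chunk has been delivered
readable.on('end', () => {
  console.log(`Read ${totalBytes} bytes in total.`);
});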
So, Stream Module….
Reading Data
const fs = require('fs');

// Create a readable stream from a file ('example.txt' is a placeholder name)
const readableStream = fs.createReadStream('example.txt');

// Listen for the 'data' event, which indicates that a chunk of data is available
readableStream.on('data', (chunk) => {
  console.log('Received chunk of data:');
  console.log(chunk);
});

// Listen for the 'end' event, which indicates that all data has been read
readableStream.on('end', () => {
  console.log('Finished reading data from the file.');
});

// Listen for the 'error' event, in case of any errors during reading
readableStream.on('error', (err) => {
  console.error('Error reading data:', err);
});
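By default each chunk is a Buffer. If you want strings instead, you can pass an encoding when creating the stream (a small sketch, reusing the same placeholder file name):
// Create a readable stream that emits UTF-8 strings instead of Buffers
const utf8Stream = fs.createReadStream('example.txt', { encoding: 'utf8' });

utf8Stream.on('data', (chunk) => {
  console.log('Received string chunk:', chunk);
});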
Writing Data
const fs = require('fs');

// Data to be written
const data = ['Hello, world!\n', 'This is a test.\n'];

// Create a writable stream to a file ('output.txt' is a placeholder name)
const writableStream = fs.createWriteStream('output.txt');

// Write each piece of data to the stream
data.forEach((chunk) => writableStream.write(chunk));

// End the writable stream to indicate that no more data will be written
writableStream.end();

// Listen for the 'finish' event, which indicates that all data has been written
writableStream.on('finish', () => {
  console.log('Finished writing data to the file.');
});
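As with the readable stream, it is good practice to listen for the 'error' event on the writable stream as well, so that problems such as a missing directory or a permission failure do not go unhandled:
// Listen for the 'error' event, in case of any errors during writing
writableStream.on('error', (err) => {
  console.error('Error writing data:', err);
});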
Piping Data
// Pipe the data from the readable stream to the writable stream
readableStream.pipe(writableStream);
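The pipe call above assumes both streams already exist. A self-contained sketch (the file names are placeholders) that copies one file into another looks like this:
const fs = require('fs');

// Create a source and a destination stream
const readableStream = fs.createReadStream('input.txt');
const writableStream = fs.createWriteStream('copy-of-input.txt');

// pipe() forwards each chunk from the readable stream to the writable stream
// and manages backpressure automatically
readableStream.pipe(writableStream);

// 'finish' fires once the destination has flushed all the data
writableStream.on('finish', () => {
  console.log('Finished copying the file.');
});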