100 Node.js Interview Questions
Node.js uses an event-driven, non-blocking I/O model, making it ideal for building scalable
and high-performance applications, such as web servers, APIs, and real-time applications. It
works by using a single-threaded event loop to handle multiple concurrent requests without
creating separate threads for each one.
Runtime: browser JavaScript is limited to browser-specific tasks, while Node.js is built for I/O-heavy tasks like file access or network communication.
The package.json file is a configuration file for a Node.js project. It serves several purposes:
Metadata: Contains project details like name, version, author, and description.
Example:
json
{
"name": "my-app",
"version": "1.0.0",
"scripts": {
"start": "node index.js",
"test": "jest"
},
"dependencies": {
"express": "^4.18.2"
}
}
1. Open a terminal.
5. How do you initialize a new Node.js project?
1. Open a terminal and navigate to the project directory.
2. Run:

bash

npm init
You’ll be prompted to fill out details like project name, version, etc.
3. Alternatively, use npm init -y to skip the prompts and create a package.json file with
default values.
Types of modules:
javascript
// CommonJS
module.exports = { greet: () => console.log("Hello") };
// ES Modules
export const greet = () => console.log("Hello");
Importing a module:
javascript
// CommonJS
const myModule = require('./myModule');
myModule.greet();
// ES Modules
import { greet } from './myModule.js';
greet();
Example:
bash

node -v
Key features:
Example:
javascript
Feature comparison: readFile vs readFileSync

Execution: fs.readFile uses callbacks or Promises to signal completion; fs.readFileSync pauses execution until the file is fully read.
Common Uses:
Example:
javascript
You can use the built-in http module to create an HTTP server.
Example:
javascript
const http = require('http');
const server = http.createServer((req, res) => res.end('Hello, World!'));
server.listen(3000, () => {
  console.log('Server is running on http://localhost:3000');
});
How It Works:

1. Incoming requests and completed operations are placed in an event queue.

2. The event loop processes these events sequentially, executing the associated callback
functions.
3. Long-running tasks (e.g., I/O) are offloaded to the worker threads, and their results are
pushed back to the event loop when ready.
The fs (File System) module in Node.js provides methods to interact with the file system,
such as reading, writing, deleting, and updating files or directories.
Common Methods:
Example:
javascript
const fs = require('fs');
fs.readFile('example.txt', 'utf8', (err, data) => {
if (err) throw err;
console.log(data);
});
Example:
javascript
const fs = require('fs');
const data = fs.readFileSync('example.txt', 'utf8');
console.log(data);
This method blocks the execution of subsequent code until the file is fully read.
Error handling in Node.js can be done using error-first callbacks, Promise .catch() handlers, try/catch with async/await , and 'error' event listeners.
Example (Callback):
javascript
const fs = require('fs');
fs.readFile('example.txt', 'utf8', (err, data) => {
if (err) {
console.error('Error reading file:', err.message);
} else {
console.log(data);
}
});
To update a package:
1. Run:
bash
bash
Execution: setTimeout executes its callback once after the specified delay; setInterval executes it repeatedly until cleared.

Example:

javascript

setTimeout(() => console.log('Hello'), 1000);   // runs once after 1 second
setInterval(() => console.log('Hello'), 1000);  // runs every second until cleared
21. What is the difference between npm install and npm install --
save ?
npm install : Installs all dependencies listed in package.json . It does not modify package.json ; it simply ensures the required packages are installed.

npm install --save : Installs a specific package and automatically adds it to the dependencies section of package.json .
Note: As of npm 5 (released in 2017), --save is the default behavior for npm install .
Explicitly using --save is no longer necessary.
Example:
bash
2. Run:
bash
3. To remove it globally:

bash

npm uninstall -g <package-name>

4. To remove it from the project and drop it from package.json :

bash

npm uninstall <package-name> --save
javascript

const fs = require('fs');

fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});
In this example:
fs.readFile is asynchronous.
The callback function ( (err, data) => {} ) runs when the file is read or if an error
occurs.
Example:
javascript
const http = require('http');
const server = http.createServer((req, res) => res.end('Hello, World!'));
server.listen(3000, () => {
  console.log('Server running at http://localhost:3000');
});
I/O-heavy operations (e.g., database calls) are offloaded to the event loop or worker
threads.
Common Methods:
console.time and console.timeEnd : Measures execution time for code.
Example:
javascript
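// Reconstructed sketch of console.time / console.timeEnd (label is illustrative)
console.time('loop');
for (let i = 0; i < 1e6; i++) {} // some work to measure
console.timeEnd('loop'); // prints the elapsed time, e.g. "loop: 2.1ms"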
Syntax: CommonJS uses require to import and module.exports to export; ES6 modules use the import and export keywords.

Default Support: CommonJS is the default module system in Node.js before v13; ES6 modules are supported in modern Node.js (v12+ with a flag, v13+ without a flag).

File Extension: CommonJS files typically use .js ; ES6 module files must use .mjs (or set "type": "module" in package.json ).

Interoperability: CommonJS can import ES6 modules with dynamic import() ; ES6 modules can use require via the createRequire function.
Example:
CommonJS:
javascript
// math.js
module.exports.add = (a, b) => a + b;
// app.js
const math = require('./math');
console.log(math.add(2, 3)); // 5
ES6 Modules:
javascript
// math.mjs
export const add = (a, b) => a + b;
// app.mjs
import { add } from './math.mjs';
console.log(add(2, 3)); // 5
makefile
DB_HOST=localhost
DB_USER=root
DB_PASS=securepassword
2. Load Variables: Use the dotenv package to load these into process.env .
bash
javascript
require('dotenv').config();
console.log(process.env.DB_HOST); // Outputs: localhost
console.log(process.env.DB_USER); // Outputs: root
4. Best Practices:
Types of Streams:
4. Transform Streams: Duplex streams that modify data (e.g., zlib for compression).
Key Events:
Example:
javascript
const fs = require('fs');
const readable = fs.createReadStream('example.txt', { encoding: 'utf8' });

readable.on('data', (chunk) => console.log('Received chunk:', chunk));

readable.on('end', () => {
  console.log('Stream ended.');
});
Key Characteristics:
javascript

// Express setup reconstructed around the original middleware example
const express = require('express');
const app = express();

// Middleware function
function logger(req, res, next) {
  console.log(`${req.method} ${req.url}`);
  next();
}

app.use(logger);

app.listen(3000, () => {
  console.log('Server running on http://localhost:3000');
});
30. How do you use the crypto module to hash data in Node.js?
The crypto module in Node.js provides cryptographic functionality, including creating
hashes, which are fixed-size strings generated from input data. Hashes are commonly used
for checksums and password storage.
2. Use the crypto.createHash method with an algorithm like sha256 , md5 , etc.
3. Update the hash object with data and output the result.
Example:
javascript
const crypto = require('crypto');

const hash = crypto.createHash('sha256').update('Hello World').digest('hex');

console.log('Hash:', hash);
// Example output: "Hash:
// a591a6d40bf420404a011733cfb7b190d62c65bf0bcda32b92dbb8a62d93d799"
Notes:
Avoid md5 or sha1 for sensitive data as they are considered weak.
Rather than waiting for a slow operation (such as a file read or network request) to complete, Node.js moves on to execute other tasks, enhancing performance and scalability.
Key Features:
Non-Blocking I/O: Tasks like file operations or database queries are handled in the
background, freeing the event loop for other operations.
Promises & async/await : Modern syntax for managing asynchronous flows more
readably.
Example:
javascript
const fs = require('fs');
1. Readable Streams: For reading data (e.g., file reading, HTTP request body).
Example: fs.createReadStream()
Events: data , end , error
2. Writable Streams: For writing data (e.g., file writing, HTTP response).
Example: fs.createWriteStream()
Example: net.Socket
4. Transform Streams: A type of Duplex stream that modifies data as it passes through
(e.g., compression, encryption).
Example: zlib.createGzip()
javascript
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.pipe(writable);
console.log('Data is being copied from input.txt to output.txt.');
javascript
const { spawn } = require('child_process');
const ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', (data) => console.log(`stdout: ${data}`));
javascript
javascript
Node.js can handle file uploads using the built-in http module (parsing the request body manually) or third-party libraries.
Steps:
Example:
javascript
const http = require('http');

http.createServer((req, res) => {
  if (req.method === 'POST') {
    let body = '';
    req.on('data', (chunk) => {
      body += chunk; // collect the uploaded data
    });
    req.on('end', () => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('File uploaded successfully.');
    });
  } else {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(`
      <form method="POST" enctype="multipart/form-data">
        <input type="file" name="file" />
        <button type="submit">Upload</button>
      </form>
    `);
  }
}).listen(3000, () => {
  console.log('Server listening on http://localhost:3000');
});
The cluster module in Node.js enables the creation of multiple processes (workers) to
utilize multiple CPU cores effectively. Each worker runs an instance of the application,
sharing the same server port.
How It Works:
3. The master manages worker processes and restarts them if they fail.
Use Case: Clusters are ideal for improving performance in CPU-bound tasks.
Example:
javascript
const cluster = require('cluster');
const os = require('os');
const http = require('http');
if (cluster.isMaster) {
  const numCPUs = os.cpus().length;
  // Fork workers
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
} else {
  http.createServer((req, res) => res.end('Handled by a worker')).listen(3000);
}
Key Notes:
36. Explain how to use the http and https modules in Node.js.
The http and https modules allow Node.js to create HTTP and HTTPS servers and make
HTTP(S) requests.
The http module is used to create a server that listens for requests on a specified port.
Example:
javascript
const http = require('http');
const server = http.createServer((req, res) => res.end('Hello over HTTP'));
server.listen(3000, () => {
  console.log('HTTP server running at http://localhost:3000');
});
The https module is used to create secure servers. It requires an SSL certificate and private
key.
Example:
javascript
const fs = require('fs');

const options = {
  key: fs.readFileSync('key.pem'),          // path to the private key (illustrative)
  cert: fs.readFileSync('certificate.pem'),
};
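A minimal sketch (reusing the options object above; the handler and port are illustrative) of starting the secure server:

javascript

const https = require('https');

https.createServer(options, (req, res) => {
  res.end('Hello over HTTPS');
}).listen(443, () => {
  console.log('HTTPS server running on port 443');
});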
Example:
javascript
const https = require('https');

https.get('https://example.com', (res) => { // URL is illustrative
  let data = '';
  res.on('data', (chunk) => (data += chunk));
  res.on('end', () => {
    console.log('Response:', data);
  });
}).on('error', (err) => {
  console.error('Error:', err.message);
});
Steps for Token-Based Authentication:
1. Generate a Token:
Use libraries like jsonwebtoken to create a secure token after verifying user
credentials.
bash
javascript
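// Reconstructed sketch: issue a token after verifying credentials
// (payload, secret, and expiry values are illustrative)
const jwt = require('jsonwebtoken');

const token = jwt.sign({ userId: 123 }, 'secret-key', { expiresIn: '1h' });
console.log('Issued token:', token);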
2. Verify Token:
javascript
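// Reconstructed sketch: verify an incoming token with the same secret
jwt.verify(token, 'secret-key', (err, decoded) => {
  if (err) {
    console.error('Invalid or expired token');
  } else {
    console.log('Decoded payload:', decoded);
  }
});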
javascript
const http = require('http');
const jwt = require('jsonwebtoken');

const server = http.createServer((req, res) => {
  const token = (req.headers.authorization || '').replace('Bearer ', '');
  if (token) {
    jwt.verify(token, 'secret-key', (err) => { // 'secret-key' is illustrative
      res.end(err ? 'Invalid token.' : 'Protected content accessed.');
    });
  } else {
    res.end('Public content.');
  }
});
server.listen(3000);
Basic Usage:
javascript
const EventEmitter = require('events');
const myEmitter = new EventEmitter();

myEmitter.on('event', () => {
  console.log('An event occurred!');
});
javascript
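// Emitting the event registered above
myEmitter.emit('event'); // Output: An event occurred!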
javascript
Example:
javascript
Best Practices:
Readable Streams:
javascript
const fs = require('fs');
const readable = fs.createReadStream('example.txt', { encoding: 'utf8' });

readable.on('data', (chunk) => console.log('Chunk:', chunk));
readable.on('end', () => {
  console.log('No more data.');
});
Writable Streams:
javascript
const fs = require('fs');
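// Reconstructed writable-stream example (file name is illustrative)
const writable = fs.createWriteStream('output.txt');

writable.write('Hello, ');
writable.write('World!\n');
writable.end(() => console.log('Finished writing.'));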
Example:
javascript
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.pipe(writable);
console.log('File copied successfully.');
Express is a popular, minimal web framework for Node.js. It simplifies the process of building robust web applications and APIs. Its benefits include:
1. Simplified Routing:
javascript
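// Illustrative Express routing sketch (routes and data are assumptions)
const express = require('express');
const app = express();

app.get('/users', (req, res) => res.json([{ id: 1, name: 'Alice' }]));
app.post('/users', (req, res) => res.status(201).send('User created'));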
2. Middleware Support:
3. Extensibility:
Compatible with many plugins for extended functionality (e.g., body parsing, cookies).
4. Scalability:
5. Community Support:
Extensive documentation and a large community offer reliable solutions for common
challenges.
42. How do you create a RESTful API in Node.js?
To create a RESTful API, follow these steps:
Install Express:
bash
Example:
javascript
const express = require('express');
const app = express();
app.use(express.json()); // parse JSON request bodies

let items = [
  { id: 1, name: 'Item 1' },
  { id: 2, name: 'Item 2' },
];

// Read all items
app.get('/items', (req, res) => res.json(items));

// Update an item
app.put('/items/:id', (req, res) => {
  const item = items.find(i => i.id == req.params.id);
  if (item) {
    item.name = req.body.name;
    res.json(item);
  } else {
    res.status(404).send('Item not found');
  }
});
// Start server
app.listen(3000, () => console.log('API running on port 3000'));
javascript
2. Streaming Data:
Use spawn to handle large data streams between parent and child processes.
javascript
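// Sketch: stream a child process's output instead of buffering it (command is illustrative)
const { spawn } = require('child_process');

const child = spawn('cat', ['large-file.txt']);
child.stdout.pipe(process.stdout); // data is streamed as it arrives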
javascript
Creating a Promise:
javascript
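// Sketch: creating a Promise (names and outcome are illustrative)
const myPromise = new Promise((resolve, reject) => {
  const success = true;
  success ? resolve('Operation succeeded') : reject(new Error('Operation failed'));
});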
javascript
myPromise
.then(result => console.log(result)) // Output: Operation succeeded
.catch(error => console.error(error));
Using async/await :
javascript
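// Sketch: consuming the promise above with async/await
async function asyncFunction() {
  try {
    const result = await myPromise;
    console.log(result); // Output: Operation succeeded
  } catch (error) {
    console.error(error);
  }
}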
asyncFunction();
javascript
const fs = require('fs').promises;
fs.readFile('example.txt', 'utf8')
.then(data => console.log('File content:', data))
.catch(err => console.error('Error reading file:', err));
1. Install ws :
bash
npm install ws
javascript
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (msg) => console.log('Received:', msg.toString()));
  // Handle disconnection
  socket.on('close', () => {
    console.log('Client disconnected');
  });
});
3. Client Implementation:
javascript
// In a browser (or with the ws package in Node.js):
const socket = new WebSocket('ws://localhost:8080');

socket.onopen = () => {
  console.log('Connected to server');
  socket.send('Hello, server!');
};

socket.onmessage = (event) => console.log('Message from server:', event.data);
socket.onclose = () => console.log('Disconnected from server');
Summary:
Express Framework: Simplifies building web apps and APIs.
RESTful API: Use Express with HTTP verbs ( GET , POST , etc.) to manage resources.
Use Cases:
1. Compressing a File:
javascript
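// Setup for the pipe below (file names are illustrative)
const zlib = require('zlib');
const fs = require('fs');
const input = fs.createReadStream('input.txt');
const output = fs.createWriteStream('input.txt.gz');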
input.pipe(zlib.createGzip()).pipe(output);
console.log('File compressed successfully.');
2. Decompressing a File:
javascript
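// Setup for decompression (reverses the example above; file names are illustrative)
const zlib = require('zlib');
const fs = require('fs');
const input = fs.createReadStream('input.txt.gz');
const output = fs.createWriteStream('output.txt');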
input.pipe(zlib.createGunzip()).pipe(output);
console.log('File decompressed successfully.');
bash
Open Chrome DevTools and connect to the debugging URL printed in the terminal.
2. Using console.log :
Insert console.log statements to print variable values and track execution flow.
Insert the debugger keyword in your code where you want to pause execution.
javascript
let x = 10;
debugger; // Execution stops here
x += 20;
console.log(x);
4. Using a Debugger Tool (e.g., VS Code):
Set breakpoints and step through the code using an IDE like Visual Studio Code.
bash
Key Methods:
javascript
const os = require('os');
console.log('Platform:', os.platform());
console.log('Architecture:', os.arch());
console.log('Uptime (seconds):', os.uptime());
javascript
3. Memory Usage:
Retrieve total and free memory.
javascript
4. Network Interfaces:
javascript
5. Home Directory:
javascript
Example:
javascript
const fs = require('fs').promises;
async function readFileContent() {
  try {
    const data = await fs.readFile('example.txt', 'utf8');
    console.log('File content:', data);
  } catch (err) {
    console.error('Error reading file:', err);
  }
}
readFileContent();
javascript
1. Using setTimeout :
javascript
setTimeout(() => {
console.log('Task executed after 2 seconds');
}, 2000);
2. Using setInterval :
Executes a function repeatedly at a fixed interval.
javascript
setInterval(() => {
console.log('Task executed every 3 seconds');
}, 3000);
javascript
const timer = setTimeout(() => console.log('This never runs'), 5000);
clearTimeout(timer); // cancels the scheduled callback
bash
javascript
Summary:
zlib : Handles compression and decompression (e.g., Gzip).
Debugging: Use --inspect , debugger , or IDE tools like VS Code.
Both process.nextTick and setImmediate are used to schedule callbacks in Node.js, but
they differ in their execution timing within the event loop.
process.nextTick :
Executes callbacks before the next iteration of the event loop begins.
It's a part of the microtask queue and has higher priority than the timers phase.
Suitable for deferring tasks to execute as soon as the current operation is complete.
setImmediate :
Executes callbacks in the check phase of the event loop, after I/O events.
It's part of the macrotask queue and will run after microtasks are completed.
Example:
javascript

console.log('Start');

process.nextTick(() => console.log('NextTick'));
setImmediate(() => console.log('SetImmediate'));

console.log('End');

Output:

Start
End
NextTick
SetImmediate
Key Difference:
Use Case for process.nextTick : Critical operations to execute immediately after the
current function.
1. The producer generates data faster than the consumer can process it.
2. The writable buffer fills up, and the writable stream signals the producer to stop sending
more data.
Managing Backpressure:
javascript
const fs = require('fs');
const writable = fs.createWriteStream('output.txt');

const canWrite = writable.write('Some data'); // returns false when the internal buffer is full
if (!canWrite) {
  writable.once('drain', () => {
    console.log('Drain event fired, resuming writes.');
    writable.write('More data');
  });
}
javascript
Why It Matters:
Properly handling backpressure ensures efficient resource usage and prevents system
crashes due to memory overload.
3. Use Clustering:
Distribute workload across multiple CPU cores using the cluster module.
javascript
const cluster = require('cluster');
const http = require('http');

if (cluster.isMaster) {
  const numCPUs = require('os').cpus().length;
  for (let i = 0; i < numCPUs; i++) cluster.fork();
} else {
  http.createServer((req, res) => {
    res.end('Hello World');
  }).listen(3000);
}
4. Use Caching:
Implement caching for repeated requests (e.g., in-memory caching with Redis ).
Use the zlib module for compressing responses to reduce bandwidth usage.
54. What are worker threads in Node.js, and how are they used?
Worker threads in Node.js provide a way to execute JavaScript code in parallel, leveraging
multiple threads in a single Node.js process. This is useful for CPU-intensive tasks.
Node.js is single-threaded for JavaScript execution. Worker threads allow offloading heavy
tasks, preventing them from blocking the main thread.
Example:
javascript
const { Worker, isMainThread } = require('worker_threads');

if (isMainThread) {
const worker = new Worker(__filename);
Use Cases:
Memory Structure:
1. Stack:

Stores function calls and primitive values. Limited in size.

2. Heap:

Stores objects, strings, and closures; managed by V8's garbage collector.

3. C++ Objects:

Memory allocated by native code (for example, Buffer data), tracked outside the V8 heap.
Garbage Collection:
Garbage collection in Node.js is automatic but can cause performance issues during large
sweeps.
Mark-and-Sweep Algorithm: V8 marks objects that are still reachable and then sweeps (frees) the unmarked ones.

Incremental GC: Collection work is split into small steps interleaved with program execution to reduce pause times.
2. Use Streams:
Avoid loading large files or data sets into memory; use streams to process data in
chunks.
bash
4. Monitoring:
javascript
console.log(process.memoryUsage());
Summary:
process.nextTick vs. setImmediate : Microtask vs. macrotask execution.
Backpressure: Prevents memory overflow in streams by pausing data flow.
Memory Management: Uses V8's garbage collector; monitor and optimize usage to
prevent leaks.
56. What are the main differences between Node.js and other server-
side technologies like PHP?
Node.js and PHP are both popular server-side technologies, but they differ in several
fundamental aspects.
Node.js:
PHP:
2. Performance:
PHP: PHP is generally better suited for typical server-rendered web applications but can
struggle with highly concurrent or real-time applications due to its synchronous
processing.
3. Concurrency Model:
Node.js: Uses a single-threaded event loop and non-blocking I/O to handle multiple
requests simultaneously without the need for multiple threads. It scales well with tasks
that are I/O-bound but not CPU-intensive.
PHP: Each incoming request is handled by a new process or thread. It is often paired
with web servers like Apache or Nginx, which spawn multiple worker processes to
handle requests.
Node.js: Uses npm (Node Package Manager), which has a large ecosystem of libraries
and modules for all sorts of tasks (from web frameworks to data manipulation).
PHP: Uses Composer for managing dependencies, and while its ecosystem is strong, it’s
not as modern or as large as Node.js's npm.
5. Learning Curve:
PHP: PHP has a relatively low learning curve and is designed specifically for web
development, making it easy for beginners to get started with server-side development.
2. If a module has already been loaded, Node.js returns the cached version, even if it is still
in the process of loading.
3. As a result, if a circular dependency exists, the first module will receive an incomplete
version of the second module until the module has finished loading.
Refactor the Code: Break the circular dependency by restructuring the code. Move
shared functionality into a separate module or separate concerns to avoid
interdependence.
Lazy Loading: Delay the require statement until it's absolutely necessary. This can
sometimes prevent circular dependency issues by ensuring the modules are only loaded
when they are needed.
javascript
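// Sketch of lazy loading to break a circular dependency (module name is illustrative)
function doWork() {
  const other = require('./other'); // required only when actually needed
  return other.helper();
}

module.exports = { doWork };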
Use exports Carefully: Be cautious about modifying exports after the module has
been loaded, as it can result in partial exports being used.
Heap Snapshot: Allows taking a snapshot of the heap for memory usage analysis.
Memory Allocation: Helps inspect and manage memory allocated to the V8 heap.
Customizable Flags: Allows for setting custom V8 flags to influence the behavior of the
engine (e.g., to enable or disable specific optimizations).
Example Usage:
You can use the v8 module to access heap statistics or take heap snapshots for debugging:
javascript
const v8 = require('v8');
console.log(v8.getHeapStatistics());
59. Explain how the cluster module can be used for scaling applications.
The cluster module in Node.js allows you to create child processes (workers) that can
share the same server port. It is useful for scaling applications to take advantage of multi-
core systems.
How It Works:
Node.js is single-threaded, which means it can only use one CPU core. The cluster
module allows you to fork multiple processes to use multiple cores, enabling you to
handle more traffic efficiently.
Each worker is an independent process, but they share the same server socket, making it
easier to scale the application horizontally.
Basic Example:
javascript
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork workers based on the number of CPU cores
  const numCPUs = os.cpus().length;
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
} else {
  // Workers share the same server port
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello from Node.js cluster!');
  }).listen(8000);
}
Benefits of Clustering:
Fault Tolerance: If one worker crashes, other workers can continue to handle requests.
Optimal CPU Utilization: Clustering allows Node.js to make use of all available CPU
cores.
1. Process:
A process is an independent execution unit that has its own memory space, system
resources, and execution context.
Processes are isolated from each other and cannot directly access each other’s memory.
2. Thread:
A thread is a smaller unit of execution within a process. Threads share the same
memory space and resources as their parent process.
Node.js is single-threaded by default, meaning the event loop and JavaScript execution
happen in a single thread.
However, Node.js can spawn additional threads for certain tasks, such as using the
worker threads module for parallel processing.
Key Differences:
Memory: Processes have their own memory, while threads share memory with other
threads in the same process.
Performance: Processes are more isolated, leading to higher overhead, whereas threads
are more lightweight but can lead to concurrency issues if not properly managed.
Concurrency: Node.js uses a single thread for JavaScript execution but employs
additional threads for I/O operations and background tasks (e.g., the worker threads
module).
Summary:
Node.js vs. PHP: Node.js is event-driven, non-blocking, and uses JavaScript, while PHP is
synchronous and uses a different model for processing requests.
Processes vs. Threads: Processes are independent units of execution with separate
memory, while threads share memory within a process and are lighter weight.
Key Concepts:
Event Loop: The event loop in Node.js constantly monitors the event queue and
processes I/O operations as they complete. The event loop runs in a single thread, and
when an I/O operation is requested (e.g., reading from a file or making an HTTP
request), Node.js offloads this task to the operating system, freeing up the event loop to
process other tasks.
Callbacks: When an I/O operation is complete, Node.js invokes a callback function that
was registered to handle the result. This callback is added to the event loop’s queue, and
once the current operation finishes, the callback is executed.
Libuv Library: Node.js uses libuv, a multi-platform support library that handles
asynchronous I/O, to manage operations such as file system access, networking, and
child process management. It is the foundation behind the non-blocking behavior of
Node.js.
Example:
javascript
const fs = require('fs');
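// Reconstructed example matching the explanation below
fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});

console.log("File reading in progress...");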
Here, fs.readFile reads a file asynchronously. While the file is being read, Node.js
continues executing the console.log("File reading in progress...") statement.
Advantages:
Efficiency: Node.js does not wait for I/O operations to finish before continuing with other
tasks, leading to efficient use of system resources.
While event delegation is commonly discussed in the context of DOM manipulation (e.g., in
browsers), in Node.js, the concept can be applied in several ways:
How it Works:
Instead of attaching event listeners to each individual child object, you attach a listener
to a parent or container object that listens for events that bubble up.
When an event is triggered on a child, the parent can delegate the handling of the event
based on the event's properties (like event type or target).
In a scenario with many routes in an HTTP server, instead of assigning individual listeners for
each route, we can use a single request event listener on the server object:
javascript
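// Reconstructed sketch: a single 'request' listener delegates handling by URL
const http = require('http');

const server = http.createServer((req, res) => {
  if (req.url === '/') res.end('Home');
  else if (req.url === '/about') res.end('About');
  else res.end('Not found');
});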
server.listen(3000, () => {
console.log('Server is listening on port 3000');
});
Here, event delegation is used by attaching a single listener to the server that handles
different requests, rather than creating individual handlers for each route.
1. Avoid Shared Mutable State: The most important strategy in ensuring thread safety is
to avoid shared mutable state between concurrent tasks. If shared state is needed,
consider using locks or other synchronization techniques.
2. Worker Threads: If you use the worker threads module to run CPU-intensive tasks,
ensure thread safety by passing messages instead of sharing objects directly between
threads.
You can use atomic operations (where possible) to ensure consistency when
multiple threads are involved.
Pass data between threads using postMessage() and handle the result through the
message event.
3. Child Processes: For parallel processing, use child processes in conjunction with the
cluster module. Each child process has its own memory space, avoiding shared state
issues.
4. Libraries: For thread-safe data structures, consider using libraries like async or
immutable.js for managing state in an asynchronous environment.
5. Asynchronous Operations: For asynchronous I/O, Node.js naturally avoids thread safety
issues by not allowing multiple operations to interfere with each other on the main
thread.
javascript
const { Worker, isMainThread, parentPort } = require('worker_threads');

if (isMainThread) {
const worker = new Worker(__filename);
worker.on('message', (message) => console.log('Worker says:', message));
worker.postMessage('Hello Worker');
} else {
parentPort.on('message', (message) => {
parentPort.postMessage(`Received: ${message}`);
});
}
In this example, communication between the main thread and worker thread is done via
messages, ensuring data safety and no shared memory issues.
64. What is the difference between blocking and non-blocking code in
Node.js?
The distinction between blocking and non-blocking code is crucial in understanding how
Node.js works, especially with respect to asynchronous operations.
Blocking Code:
Blocking refers to operations that stop the execution of subsequent code until the
current operation is finished.
In a blocking operation, Node.js waits for a task to complete before moving on to the
next task, effectively blocking the event loop.
javascript
const fs = require('fs');
console.log('Start');
const data = fs.readFileSync('example.txt', 'utf8'); // Blocking
console.log(data);
console.log('End');
In this example, fs.readFileSync is a blocking call. The program waits for the file to be read
before continuing, blocking the event loop and delaying further execution.
Non-blocking Code:
Non-blocking operations allow Node.js to continue executing other code while waiting
for a task to complete. These operations use callbacks, promises, or async/await to
handle the result once the operation finishes.
javascript
const fs = require('fs');
console.log('Start');
fs.readFile('example.txt', 'utf8', (err, data) => { // Non-blocking
if (err) throw err;
console.log(data);
});
console.log('End');
Here, the fs.readFile function is non-blocking. While the file is being read, Node.js
continues to execute the console.log('End') statement, and once the file is read, the
callback is triggered.
Key Difference:
Blocking: Delays the execution of the program until the operation completes.
Non-blocking: Allows other tasks to execute while waiting for the operation to finish.
1. Streams:
Node.js streams allow you to read and write data piece-by-piece, which is
particularly useful for large files or datasets. Instead of loading the entire dataset
into memory, streams read and process data in chunks, helping to conserve
memory.
Writable Streams for output (e.g., writing to files or sending HTTP responses).
Example:
javascript
const fs = require('fs');
const readableStream = fs.createReadStream('largefile.txt', { encoding: 'utf8' });

readableStream.on('data', (chunk) => {
  console.log(chunk); // Process data chunk by chunk
});
2. Pagination:
When working with large datasets from a database or API, implement pagination to
load and process data in smaller chunks.
Example:
javascript
function fetchData(page) {
// Simulate a data fetch
return new Promise(resolve => {
setTimeout(() => resolve(`Data for page ${page}`), 500);
});
}
3. Batch Processing:
For operations like database writes or API requests, split the large dataset into
smaller batches. This prevents overwhelming the system with too much data at
once and ensures smooth handling.
Use buffering techniques when dealing with binary data or large files. Buffers allow
you to work with raw binary data more efficiently than regular strings.
Consider using tools like Redis or MongoDB to store large datasets and offload them from memory.
5. Compression:
If you're dealing with large datasets over a network, compressing the data before
sending it can reduce I/O time and memory usage. Node.js has built-in support for
compression via the zlib module.
By using these techniques, Node.js applications can efficiently handle large datasets without
consuming excessive resources or causing performance bottlenecks.
1. In-Memory Rate-Limiting:
Track the number of requests from a user (typically using the user's IP address or
session identifier).
Store the timestamps of the user's requests and check if the user has exceeded the
limit within the specified time window.
2. Example:
javascript
const http = require('http');

const rateLimit = new Map();   // ip -> array of request timestamps
const limit = 100;             // max requests per window (illustrative)
const timeWindow = 60 * 1000;  // 1 minute

const server = http.createServer((req, res) => {
  const ip = req.socket.remoteAddress;
  const currentTime = Date.now();

  if (!rateLimit.has(ip)) {
    rateLimit.set(ip, []);
  }

  // Filter out requests that are older than the time window
  const requestTimes = rateLimit.get(ip);
  rateLimit.set(ip, requestTimes.filter(time => currentTime - time <= timeWindow));

  if (rateLimit.get(ip).length >= limit) {
    res.writeHead(429);
    return res.end('429 Too Many Requests');
  }

  rateLimit.get(ip).push(currentTime);
  res.end('Request accepted');
});

server.listen(3000, () => {
  console.log('Server is running');
});
This example tracks requests per IP address, and if a user exceeds the limit, it returns a 429
Too Many Requests response.
For more complex rate-limiting needs (e.g., using Redis to persist rate-limit data),
you can use libraries like express-rate-limit .
bash
javascript
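// Reconstructed setup for express-rate-limit (window and max values are assumptions)
const express = require('express');
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 60 * 1000, // 1 minute
  max: 100,            // limit each IP to 100 requests per window
});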
const app = express();
app.use(limiter);
app.listen(3000, () => {
console.log('Server running on port 3000');
});
You can use the built-in --inspect flag in Node.js to start a debugging session and
monitor memory usage.
Example:
bash
Then, open Chrome DevTools to monitor memory usage and inspect heap
snapshots.
2. Heap Snapshots:
You can take heap snapshots to analyze memory allocations over time. Tools like
Chrome DevTools or clinic.js can help you visualize memory usage patterns.
3. Using process.memoryUsage() :
Example:
javascript
setInterval(() => {
console.log(process.memoryUsage());
}, 1000);
4. Third-Party Libraries:
Libraries like memwatch-next or heapdump can help monitor memory usage and
detect leaks by generating memory dumps or alerts when memory usage is
unusually high.
bash
javascript
68. What is the purpose of the domain module, and why is it
deprecated?
The domain module in Node.js was introduced to manage uncaught exceptions in
asynchronous callbacks. It allowed you to group multiple I/O operations and handle errors
for all operations in that group (domain). It provided a mechanism to catch errors that were
otherwise difficult to handle, like those in callbacks.
Purpose:
Error Handling: It was used for catching exceptions in asynchronous code, where the
normal try-catch block would not work.
Why is it deprecated?:
The domain module was deprecated in Node.js because its usage often led to
unintended side effects and unpredictable error handling behavior. The asynchronous
model in Node.js (using callbacks, promises, and async/await) makes error handling
more straightforward without requiring special constructs like domains.
Modern JavaScript patterns, such as async/await and global error handlers like
process.on('uncaughtException') , are now preferred for error handling.
Recommendation:
Instead of using domain , handle errors in asynchronous code using try-catch with
async/await or handle event-driven errors with proper error event listeners.
Example:
javascript
Example:
javascript
Example:
javascript
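// Illustrative dns module usage (domain name is an example)
const dns = require('dns');

dns.lookup('example.com', (err, address) => {
  if (err) throw err;
  console.log('Address:', address);
});

dns.resolve4('example.com', (err, addresses) => {
  if (err) throw err;
  console.log('A records:', addresses);
});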
These functions can be used to interact with DNS servers and resolve domains as part of
network operations in Node.js.
Creating Buffers:
1. From a String:
javascript
2. Allocating Buffers:
javascript
3. From an Array:
javascript
Manipulating Buffers:
You can read and write to buffers using different methods such as .toString() ,
.write() , or .slice() .
Example:
javascript
Use Cases:
File Handling: Buffers are used extensively when working with file systems to handle
binary data (e.g., reading/writing images, videos, or other binary files).
Networking: Buffers are useful in network operations where raw binary data needs to
be processed or transferred.
These are the main concepts related to handling DNS queries and working with the buffer
module in Node.js, along with detailed explanations on various aspects of the Node.js
environment.
1. Event Loop:
The event loop is the heart of Node.js, and libuv implements this loop, which
allows Node.js to handle multiple concurrent operations efficiently without blocking
the thread.
It enables non-blocking I/O, allowing Node.js to execute I/O operations (like reading
files or querying databases) asynchronously while continuing to process other tasks.
2. Asynchronous I/O:
3. Thread Pool:
libuv uses a thread pool to handle I/O operations that are blocking in nature (e.g.,
file system operations, DNS resolution). While JavaScript runs on a single thread,
libuv offloads some blocking tasks to the thread pool, allowing the main thread
(event loop) to remain free for other operations.
4. Cross-Platform Compatibility:
across various operating systems like Windows, macOS, and Linux without requiring
special handling for OS-specific I/O mechanisms.
In summary, libuv provides the foundation for asynchronous, non-blocking I/O and
concurrency, ensuring that Node.js can handle high levels of traffic and complex operations
in an efficient and scalable way.
In-Memory Caching:
You can use a JavaScript object or a package like node-cache for basic in-memory
caching.
bash
javascript
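// Sketch using node-cache (keys and TTL are illustrative)
const NodeCache = require('node-cache');
const cache = new NodeCache({ stdTTL: 60 }); // entries expire after 60 seconds

cache.set('user:1', { name: 'Alice' });
console.log(cache.get('user:1')); // { name: 'Alice' }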
Redis is a popular in-memory data store that can be used to cache data outside the
application process. It is commonly used in distributed systems for high-
performance caching.
bash
javascript
HTTP Caching: Cache HTTP responses using caching headers ( Cache-Control , ETag ,
etc.).
Content Delivery Networks (CDNs): For static assets, CDNs can be used for caching
content globally, reducing latency and server load.
A reverse proxy is a server that sits between client devices and a backend server (like a
Node.js application), forwarding client requests to the appropriate backend service. It acts as
an intermediary, ensuring proper request handling, load balancing, security, and caching.
1. Load Balancing:
A reverse proxy can distribute incoming traffic across multiple instances of a Node.js
application, improving performance and ensuring better resource utilization. This is
particularly important for handling high volumes of requests.
2. SSL Termination:
The reverse proxy can handle SSL/TLS encryption and decryption (SSL termination),
reducing the computational load on the Node.js server by offloading the encryption
task to the proxy.
3. Caching:
A reverse proxy can cache static content (e.g., HTML, images, etc.) and reduce the
load on your Node.js application by serving cached data for repeated requests.
4. Security:
A reverse proxy can act as an additional security layer, filtering out malicious
requests, preventing DDoS attacks, or protecting sensitive endpoints.
5. API Gateway:
Reverse proxies can also serve as an API Gateway, managing requests to multiple
microservices, directing them to different backend services based on routing rules.
6. Scaling:
Popular reverse proxy tools include NGINX and HAProxy. NGINX can be configured to
forward requests to a Node.js app running on a different port.
nginx
server {
listen 80;
server_name example.com;
location / {
proxy_pass http://localhost:3000; # Forward requests to Node.js app
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}
74. How do you use a message queue with Node.js (e.g., RabbitMQ)?
A message queue allows asynchronous communication between services by sending
messages that can be processed later. RabbitMQ is a popular open-source message broker
that supports various messaging patterns, including publish/subscribe, work queues, and
routing.
To use RabbitMQ with Node.js, you can use the amqplib library, which provides an interface
to interact with RabbitMQ.
1. Install amqplib :
bash
javascript
const amqp = require('amqplib/callback_api');

amqp.connect('amqp://localhost', (err, conn) => {
  if (err) throw err;
  conn.createChannel((err, channel) => {
    if (err) throw err;
    const queue = 'task_queue';
    const msg = 'Hello, RabbitMQ!';

    channel.assertQueue(queue, { durable: true });
    channel.sendToQueue(queue, Buffer.from(msg), { persistent: true });
    console.log('Sent:', msg);
  });
});
3. Consumer (Receiving Messages): A consumer listens for messages from the queue and
processes them.
javascript
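// Consumer sketch (mirrors the producer above; queue name assumed to match)
const amqp = require('amqplib/callback_api');

amqp.connect('amqp://localhost', (err, conn) => {
  if (err) throw err;
  conn.createChannel((err, channel) => {
    if (err) throw err;
    const queue = 'task_queue';
    channel.assertQueue(queue, { durable: true });
    channel.consume(queue, (msg) => {
      console.log('Received:', msg.content.toString());
      channel.ack(msg);
    });
  });
});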
4. Benefits:
Decoupling: Services are decoupled, making them easier to scale and maintain.
Reliability: RabbitMQ ensures that messages are not lost (durable queues).
Asynchronous Processing: RabbitMQ can handle heavy or slow tasks
asynchronously, freeing up resources for other operations.
Each microservice exposes a REST API for communication with other services. You
can use Express or other frameworks to create APIs.
2. Service Discovery:
Microservices need to discover and communicate with each other. This can be
achieved using tools like Consul or Eureka, which register microservices and provide
dynamic service discovery.
3. Inter-Service Communication:
4. API Gateway:
Use an API Gateway (such as NGINX or Kong) to route incoming requests to the
correct microservice and handle cross-cutting concerns like authentication, rate-
limiting, and logging.
Each microservice should have its own database to ensure loose coupling. This can
be SQL, NoSQL, or a combination, depending on the service's needs.
Example of Simple Microservice Architecture:
In this architecture, each microservice can be scaled independently based on its load.
These answers provide an overview of key topics related to Node.js and microservices,
helping you better understand how to optimize and scale your applications.
76. Explain the Internals of the Node.js Event Loop and Its Phases.
The Node.js event loop is the core mechanism behind non-blocking, asynchronous
execution in Node.js. It allows Node.js to handle multiple operations (such as I/O, network
requests, and timers) concurrently without blocking the execution thread.
The event loop operates in multiple phases, each with specific tasks. Understanding these
phases is essential to grasp how Node.js processes tasks.
1. Timers Phase:

Executes callbacks scheduled by setTimeout() and setInterval() whose timers have expired.

2. Pending (I/O) Callbacks Phase:

Executes callbacks for completed I/O operations (like reading files, network
requests, etc.) that were queued in the previous cycle.

3. Idle, Prepare Phase:

This phase is used internally for housekeeping and to prepare for the next cycle. It
does not typically execute application-level code.
4. Poll Phase:
The poll phase is where most I/O events are handled. If there are no timers to
execute, the event loop will block and wait for I/O events. This phase processes I/O
tasks (such as database queries) that are ready for execution. If there are callbacks
to execute, they are processed here.
5. Check Phase:
Executes callbacks for setImmediate() calls. This phase happens immediately after
the poll phase, before any new timers are triggered.
6. Close Callbacks Phase:

Handles close events, such as when a socket or handle is closed. This includes
events like socket.on('close') or process.on('exit') .
The event loop continuously cycles through these phases. The order of execution is:
Timers -> I/O Callbacks -> Idle/Prepare -> Poll -> Check -> Close Callbacks.
The event loop provides non-blocking concurrency by allowing I/O operations to be handled
without pausing the execution of the program.
1. Asynchronous I/O:
2. Clustering:
The cluster module allows you to create multiple child processes (workers) running in
parallel. Each worker runs on a separate CPU core, leveraging multi-core systems and
improving concurrency.
javascript
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
// Fork workers for each CPU core
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
} else {
// Worker code
http.createServer((req, res) => {
res.writeHead(200);
res.end('Hello, world!');
}).listen(8000);
}
3. Load Balancing:
Use a reverse proxy (like NGINX) or a load balancer to distribute incoming requests to
multiple instances of the Node.js application, ensuring no single process is
overwhelmed.
4. Caching:
Implement caching mechanisms (e.g., Redis) to reduce the load on the application by
storing frequently accessed data in memory.
5. Rate Limiting:
Implement rate limiting to prevent clients from overwhelming your server with too
many requests. Tools like Redis and middleware like express-rate-limit can be useful.
For CPU-intensive tasks, consider using worker threads. These allow you to run
operations on separate threads, freeing the main event loop for I/O-bound tasks.
78. What Are Some Advanced Patterns for Error Handling in Node.js?
Error handling in Node.js is a crucial aspect of ensuring that the application runs reliably,
especially in an asynchronous environment. Here are some advanced patterns for handling
errors:
For frameworks like Express, you can use a centralized error-handling middleware. This
pattern helps manage all application errors in a single place, improving maintainability.
Example:
javascript
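// Sketch of centralized error-handling middleware (assumes an Express app)
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).json({ error: 'Internal Server Error' });
});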
The domain module was used for handling uncaught errors across asynchronous
callbacks. While it's deprecated, it was a useful tool for catching unhandled errors in
multiple callbacks and preventing crashes. Consider using try/catch blocks or other
patterns instead.
In asynchronous code using Promises, use the .catch() method to catch errors at the
end of the chain, ensuring you don't miss exceptions thrown at any point in the chain.
Example:
javascript
someAsyncFunction()
.then(result => { /* process result */ })
.catch(error => { console.error('Error:', error); });
Example:
javascript
5. Graceful Shutdown:
Handle uncaught exceptions and unhandled promise rejections to allow for a graceful
shutdown.
Example:
javascript
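// Sketch: catching fatal errors so the process can shut down gracefully
process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err);
  process.exit(1); // exit after any necessary cleanup
});

process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
});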
Use the WebAssembly API to load and instantiate WebAssembly modules in Node.js.
This allows you to run compiled code (e.g., C, C++, Rust) within a Node.js process.
Example:
javascript
const fs = require('fs');
const wasmBuffer = fs.readFileSync('module.wasm'); // path to a compiled .wasm file (illustrative)

WebAssembly.instantiate(wasmBuffer)
.then(wasmModule => {
const result = wasmModule.instance.exports.add(5, 3);
console.log('Result from WebAssembly:', result); // Output: 8
})
.catch(err => {
console.error('Failed to load WebAssembly module', err);
});
2. Interoperability:
WebAssembly code in Node.js can interact with JavaScript through imports and
exports. You can pass values between JavaScript and WebAssembly modules,
enabling high-performance operations.
3. Use Cases:
Heavy computation: Use WebAssembly for tasks requiring high performance (e.g.,
cryptographic operations).
You can create a custom stream by extending the Readable stream class and implementing
the _read method, which dictates how data is pushed to the stream.
javascript
const { Readable } = require('stream');

class MyReadable extends Readable {
  constructor(data) {
    super();
    this.data = data;
  }

  _read(size) {
    const chunk = this.data.shift();
    if (chunk) {
      this.push(chunk);
    } else {
      this.push(null); // End the stream
    }
  }
}
To create a custom writable stream, extend the Writable class and implement the _write
method, which determines how data is written to the stream.
javascript
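// Reconstructed custom writable stream (class name is illustrative)
const { Writable } = require('stream');

class MyWritable extends Writable {
  _write(chunk, encoding, callback) {
    console.log('Writing:', chunk.toString());
    callback(); // signal that the chunk has been handled
  }
}

const writableStream = new MyWritable();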
writableStream.write('Hello');
writableStream.write('World');
writableStream.end();
By creating custom streams, you can handle data in unique ways, such as processing data on
the fly or writing to non-standard outputs.
81. What is the Difference Between C++ Addons and Native Modules in
Node.js?
In Node.js, both C++ Addons and Native Modules allow you to extend the functionality of
Node.js using native code (C++), but there are subtle differences in their context and use.
C++ Addons:
C++ Addons are a way to write native code to extend Node.js using the V8 JavaScript
engine directly. These addons are compiled into binary modules that can be loaded in
Node.js just like regular JavaScript modules.
They allow you to interact with Node.js's internal components and V8 directly, which can
be more efficient for performance-intensive tasks (e.g., computational algorithms or
accessing system-level APIs).
C++ Addons are usually created using Node's N-API (Native API) or nan (a C++ header
file) to facilitate the communication between JavaScript and native C++ code.
Native Modules:
Native Modules generally refer to any modules that involve native bindings or external
dependencies written in a language like C or C++. These modules may include a C++
Addon, but the term "native module" is broader and may also include compiled C++ code
wrapped using bindings like node-gyp.
Node.js's native modules often use binding.gyp files (which are part of the node-gyp
tool) to specify the build process for native code that can be used in Node.js.
Key Difference:
C++ Addons specifically refer to extensions written in C++ that interact with Node.js at
the V8 engine level. Native Modules could refer to any native code bindings to Node.js,
which may include C++, but also C or other languages.
82. How Do You Implement a Custom Event Emitter in Node.js?
Node.js has a built-in EventEmitter class that allows you to handle events and listeners. To
create a custom event emitter, you can extend this class and define your own events and
behavior.
javascript
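// Sketch of a custom event emitter (class and event names follow the explanation below)
const EventEmitter = require('events');

class MyEmitter extends EventEmitter {}

const myEmitter = new MyEmitter();

myEmitter.on('event', (msg) => {
  console.log('Received:', msg);
});

myEmitter.emit('event', 'Hello from MyEmitter');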
Explanation:
1. Inherit from EventEmitter : You create a custom class (e.g., MyEmitter ) that extends
the EventEmitter class.
3. Listen to Events: Use .on() or .once() to listen for the emitted events.
The above example demonstrates how to create and emit a custom event, event , and
handle it using an event listener.
Node.js uses V8, Google's open-source JavaScript engine, which includes an automatic
garbage collection (GC) mechanism for memory management. Garbage collection ensures
that memory used by objects no longer in use is released, preventing memory leaks.
1. Mark-and-Sweep Algorithm:
Mark phase: V8 identifies all live objects that are referenced (reachable) and
marks them as "in use."
Sweep phase: V8 then clears all objects that are not marked as in use, freeing
their memory.
V8 divides the heap into multiple regions (generations). Objects that survive multiple
garbage collection cycles are moved to the old generation, while new objects are
initially allocated in the new generation.
3. Triggering GC:
Managing GC in Node.js:
Use tools like --inspect to monitor the heap and analyze memory usage.
Operating System Threads:
Operating system threads are managed by the OS kernel and are used by processes to
execute multiple tasks concurrently. These threads have their own stack and memory
and can be scheduled by the OS for execution on different CPU cores.
Operating system threads are heavy-weight, meaning they come with their own
memory and resources.
Node.js Threads:
Node.js uses a single-threaded event loop model for handling I/O-bound tasks.
However, it also provides several mechanisms (like worker threads) to handle CPU-
bound tasks in parallel.
Worker Threads: In Node.js, worker threads are a way to run multiple threads
(background threads) in parallel, each with its own JavaScript execution environment.
This allows you to offload CPU-intensive tasks without blocking the event loop.
javascript
const { Worker, isMainThread, parentPort } = require('worker_threads');

if (isMainThread) {
const worker = new Worker(__filename);
worker.on('message', (message) => console.log(message));
worker.postMessage('Hello from main thread');
} else {
parentPort.on('message', (message) => {
console.log(message);
parentPort.postMessage('Hello from worker thread');
});
}
Key Difference:
Operating system threads are managed by the OS and are typically used by native
applications for parallelism. In contrast, Node.js threads (via worker threads) are a
feature that allows CPU-bound tasks to run in parallel without blocking the event loop,
but they still share the same process.
85. Explain the tick and microtask Queues in Node.js.
The tick and microtask queues are part of Node.js's event loop mechanism and help
manage the execution order of asynchronous operations.
Tick Queue:

The tick queue holds callbacks registered with process.nextTick() . It is drained as soon as the current operation completes, before the event loop moves on.
Microtask Queue:
The microtask queue is where Promised-based callbacks (i.e., .then() , .catch() , and
async functions) are placed for execution.
Microtasks have a higher priority than other events in the event loop and are executed
after the current operation completes but before the next event loop phase begins.
Execution Order:
2. I/O Callbacks
Key Difference:
nextTick queue has the highest priority and is executed first, even before microtasks.
Microtasks are executed before the event loop continues to the next phase.
86. How Do You Analyze and Optimize a Node.js Application's CPU
Usage?
To analyze and optimize CPU usage in a Node.js application, you can employ the following
techniques:
Node.js Profiling:

Use the built-in profiler ( node --prof ) or Chrome DevTools via --inspect to record CPU profiles and identify hot functions.
Process Monitoring:
Use system monitoring tools like top , htop , or pm2 to track CPU usage in real-
time.
Use worker threads to offload CPU-bound tasks, ensuring that the event loop isn't
blocked.
Use child_process to fork separate processes for tasks that are CPU-heavy.
Code Splitting and Caching:
For CPU-bound tasks, use a cache layer like Redis or an in-memory cache.
This allows Node.js to remain responsive while CPU-bound tasks run in parallel.
Use exec or spawn to run external programs, shell scripts, or command-line utilities
like FFmpeg, ImageMagick, etc., from within a Node.js application.
Example:
javascript
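// Sketch: running an external command with exec (command is illustrative)
const { exec } = require('child_process');

exec('ls -lh', (err, stdout, stderr) => {
  if (err) return console.error('Error:', err.message);
  console.log(stdout);
});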
Use fork for inter-process communication (IPC) between Node.js processes. For
example, you can use child processes for parallel data processing across multiple CPU
cores.
Node.js's cluster module can be used in combination with child processes to scale
applications across multiple CPU cores for better performance.
Use spawn to initiate background processes for tasks like downloading large files,
managing queues, or interacting with databases in parallel, without affecting the event
loop.
5. Managing Microservices:
Use child processes to run microservices in separate Node.js processes. This allows
different services to be isolated and scaled independently.
Example:
bash
Graceful restart ensures that old instances finish their ongoing requests before
being terminated, while new instances start processing requests.
2. Blue-Green Deployment:
Maintain two separate environments: one for the current version ( blue ) and one for the
new version ( green ).
Switch traffic from the old version to the new version once it's fully deployed and tested.
3. Rolling Deployment:
Deploy new versions of your application incrementally, ensuring that only a subset of the
application is updated at any time.
This can be managed through Kubernetes, Docker, or a load balancer like NGINX.
Use a load balancer to distribute traffic across multiple instances of the application.
Perform canary releases by deploying the new version to a small subset of users and
monitoring performance before a full rollout.
1. Parsing:
The V8 engine first parses the JavaScript code into an Abstract Syntax Tree (AST).
This step checks for syntax errors and converts the code into a data structure that
can be easily manipulated.
2. Compilation:
V8 then compiles the parsed code into machine code (native code) using just-in-
time ( JIT) compilation. This process helps to optimize performance by converting
the code into executable machine instructions that can be executed directly by the
CPU.
3. Execution:
The generated machine code is executed by the V8 engine. This process also
includes managing scopes, variables, and the event loop.
4. Event Loop:
While the code executes, asynchronous callbacks are managed by the event loop.
This allows Node.js to handle multiple I/O-bound tasks (like reading from a file or
handling HTTP requests) concurrently without blocking the main thread.
90. What Are Advanced Patterns for Implementing Middleware in
Node.js?
Middleware in Node.js refers to functions that have access to the request and response
objects in an application. They are used for handling common tasks like authentication,
logging, error handling, and data parsing.
1. Composing Middleware:
You can create composable middleware functions that are organized in pipelines,
allowing you to separate concerns and create reusable middleware logic.
Example:
javascript
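// Sketch: composable middleware registered on an Express app (names are illustrative)
const express = require('express');
const app = express();

const middleware1 = (req, res, next) => { console.log('First'); next(); };
const middleware2 = (req, res, next) => { console.log('Second'); next(); };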
app.use(middleware1);
app.use(middleware2);
2. Asynchronous Middleware:
Example:
javascript
const asyncMiddleware = async (req, res, next) => {
  try {
    const result = await someAsyncOperation(); // placeholder for real async work
    req.someData = result;
    next();
  } catch (error) {
    next(error);
  }
};
Example:
javascript
4. Conditional Middleware:
Example:
javascript
Middleware functions can return promises, and they can be chained. This is useful for
operations like database queries or external API calls.
Example:
javascript
By using these patterns, you can create flexible, scalable, and reusable middleware for
handling various aspects of your Node.js application.
Native Addons:
Node.js Native Addons are dynamically linked shared objects that extend the
functionality of Node.js with native code. These addons allow Node.js to call native
methods, access system libraries, or perform CPU-intensive operations efficiently.
You can write native code (C or C++) and compile it into a shared library (e.g., .node
files), which can be loaded into your Node.js application.
2. Write a C++ binding file that exposes functions from the native code.
3. Use require() to load the compiled addon into your JavaScript code.
javascript
N-API:
N-API is an API for building native Node.js modules that abstracts the complexities of
different Node.js versions. It helps ensure that native modules work across different
versions of Node.js without needing recompilation.
Example:
cpp
#include <node_api.h>
napi_value Add(napi_env env, napi_callback_info info) {
// Implementation of native function
}
NAPI_MODULE(NODE_GYP_MODULE_NAME, Init);
By using native addons or N-API, you can achieve high-performance, low-level system
interaction with Node.js.
1. Load Balancing:
2. Clustering:
Use the cluster module to take advantage of multiple CPU cores. This allows you to
run multiple Node.js processes that share the same server port, balancing the load
across them.
Example:
javascript
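// Required modules for the cluster example below
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;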
if (cluster.isMaster) {
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
} else {
http.createServer((req, res) => {
res.writeHead(200);
res.end('Hello, Node.js!');
}).listen(8000);
}
3. Microservices Architecture:
Split your application into smaller, independent microservices that can be scaled
individually. Each service can be deployed on a separate server or container.
4. Asynchronous Processing:
Offload time-consuming tasks (like image processing, data analysis, etc.) to background
queues using Worker Threads or external tools like Redis or RabbitMQ.
5. Caching:
Use caching layers like Redis or Memcached to reduce load on your database and
improve response times for frequently requested data.
Cache heavy computational results or static resources (e.g., HTML pages, images).
Move static assets (e.g., images, JavaScript files) to CDNs or edge servers closer to the
users, which helps reduce latency and offloads your primary server.
7. Rate Limiting:
Vertical Scaling: Increase resources (CPU, memory) on a single server to handle higher
loads.
Steps:
1. Create Worker Thread File: Create a file (e.g., worker.js ) that performs a task. This will
be executed by each worker.
javascript
// worker.js
const { parentPort } = require('worker_threads');
parentPort.on('message', (task) => {
const result = task * 2; // Example task
parentPort.postMessage(result);
});
2. Create Worker Pool: Use the worker_threads module to spawn multiple workers and
manage them.
javascript
const { Worker } = require('worker_threads');

class WorkerPool {
  constructor(size, workerFile) {
    this.size = size;
    this.workerFile = workerFile;
    this.workers = [];
    this.queue = [];
    this.busyWorkers = 0;
    // Pre-spawn the workers so _dispatchTask below has workers to hand tasks to
    for (let i = 0; i < size; i++) {
      this.workers.push(new Worker(workerFile));
    }
  }
execute(task, callback) {
if (this.busyWorkers < this.size) {
this._dispatchTask(task, callback);
} else {
this.queue.push({ task, callback });
}
}
_dispatchTask(task, callback) {
this.busyWorkers++;
const worker = this.workers[this.busyWorkers - 1];
worker.once('message', (result) => {
callback(null, result);
this.busyWorkers--;
if (this.queue.length > 0) {
const { task, callback } = this.queue.shift();
this._dispatchTask(task, callback);
}
});
worker.postMessage(task);
}
}
// Usage
const pool = new WorkerPool(4, './worker.js');
pool.execute(10, (err, result) => console.log(result)); // Output: 20
By implementing a worker pool, you can efficiently handle concurrent tasks without
overwhelming the main event loop.
Steps:
1. Set Up Project:
javascript
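#!/usr/bin/env node
// cli.js — minimal sketch of the CLI entry script mapped by the "bin" field below
const name = process.argv[2] || 'world';
console.log(`Hello, ${name}!`);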
Add a shebang line (#!/usr/bin/env node ) at the top of the script to make it
executable as a CLI tool.
Set the "bin" field in package.json to map the command to the script.
Example package.json :
json
"bin": {
"greet": "./cli.js"
}
Optionally, you can publish your CLI tool to npm for others to use.
Run npm link to test locally.
95. What Is the Role of the Repl Module in Node.js, and How Do You
Use It?
The repl (Read-Eval-Print Loop) module in Node.js provides an interactive shell where you
can execute JavaScript code and see the results immediately. It is primarily used for testing
small code snippets, debugging, or experimenting with APIs in a live environment.
Role:
It allows you to interact with your Node.js environment in a simple, interactive console. It
can evaluate expressions, execute code, and print results in real-time.
How to Use:
1. Basic REPL:
In a terminal, you can run node without any arguments to start the REPL.
Example:
bash
$ node
> console.log("Hello, Node.js!");
Hello, Node.js!
2. Custom REPL in Code: You can also create a custom REPL using the repl module to
define specific behavior.
Example:
javascript
const repl = require('repl');

// The prompt string is illustrative
const server = repl.start({
  prompt: 'app> ',
});
server.on('exit', () => {
  console.log('REPL exited');
});
The REPL module is a powerful tool for experimentation and debugging, especially when
learning or prototyping in Node.js.
96. How Do You Write Tests for Node.js Using Advanced Mocking
Techniques?
In Node.js, writing tests typically involves using testing frameworks such as Mocha, Jest, or
Jasmine. Mocking is used to simulate the behavior of external dependencies (e.g., databases,
APIs, or other services) to test units of your code in isolation.
1. Use a Mocking Library: Libraries like Sinon.js, jest.mock() (see the sketch at the end of
this answer), or proxyquire are commonly used for mocking.
Sinon.js allows for creating mocks, stubs, and spies, which you can use to control
and track function calls during tests.
2. Mocking Dependencies:
Use Sinon’s spies and stubs to replace real functions with mock functions in your
unit tests.
javascript
// Module and function names here are illustrative
const sinon = require('sinon');
const api = require('./api');

describe('fetchData', () => {
  it('should return mocked data', () => {
    const fakeApi = sinon.stub(api, 'getData').returns('mocked data');
    const result = api.getData();
    expect(result).toEqual('mocked data');
    fakeApi.restore(); // Restore original behavior
  });
});
3. Mocking Modules:
Proxyquire allows you to mock the dependencies of the module being tested.
javascript
// Module paths are illustrative; proxyquire swaps the './api' dependency for a stub
const proxyquire = require('proxyquire');
const myModule = proxyquire('./myModule', {
  './api': { fetchData: () => 'mocked response' }
});

describe('myModule', () => {
  it('should use the mocked API', () => {
    const result = myModule.fetchData();
    expect(result).toBe('mocked response');
  });
});
4. Mocking Time:
Use fake timers (e.g., Jest's jest.useFakeTimers() ) to control time-dependent code such as
setTimeout or setInterval without actually waiting.
Example:
javascript
jest.useFakeTimers();
const callback = jest.fn();
setTimeout(callback, 1000);
jest.runAllTimers(); // Fast-forward the timers
expect(callback).toHaveBeenCalled();
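For module-level mocking with Jest, jest.mock() replaces a module's exports with automatic
mocks. The module names ./api and ./myModule below are illustrative, and myModule is assumed
to delegate to api.getData():
javascript
// jest.mock() replaces every export of ./api with a jest.fn()
jest.mock('./api');
const api = require('./api');
const myModule = require('./myModule'); // module under test

test('uses the mocked API', () => {
  api.getData.mockReturnValue('mocked response');
  expect(myModule.fetchData()).toBe('mocked response');
  expect(api.getData).toHaveBeenCalledTimes(1);
});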
97. What Are Best Practices for Securing a Node.js Application?
Securing a Node.js application is critical to prevent unauthorized access, data leaks, and
other vulnerabilities. Here are some best practices:
1. Keep Dependencies Up to Date:
Regularly update dependencies to avoid known security vulnerabilities. Use tools like
npm audit or Snyk to detect vulnerabilities in your dependencies.
2. Validate and Sanitize Input:
Always validate and sanitize user inputs to prevent injection attacks (e.g., SQL injection,
NoSQL injection, Cross-Site Scripting (XSS)).
3. Use HTTPS:
Always use HTTPS instead of HTTP to encrypt data in transit. Tools like Let’s Encrypt can
provide free SSL certificates.
4. Protect Sensitive Data:
Do not store sensitive data like passwords or API keys in plaintext. Use hashing (e.g.,
bcrypt for passwords; see the sketch after this list) and encryption to secure sensitive
information.
5. Use Secure Authentication and Authorization:
Use secure methods like JWT ( JSON Web Tokens) or OAuth for authentication and
authorization.
6. Use Helmet for Security Headers:
Helmet is a middleware that helps set various HTTP headers to secure your application
(e.g., X-Content-Type-Options , Strict-Transport-Security ).
Example:
javascript
const express = require('express');
const helmet = require('helmet');

const app = express();
app.use(helmet());
7. Error Handling:
Avoid revealing stack traces or detailed error messages to users in production. Use a
logging library (e.g., Winston) to record detailed logs for debugging.
8. Secure Session Management:
Use secure cookies, and implement session expiration to prevent session fixation or
hijacking attacks.
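A minimal password-hashing sketch with bcrypt, illustrating point 4 above; the salt-round count
of 10 is a common default, not a requirement:
javascript
const bcrypt = require('bcrypt');

async function hashPassword(plaintext) {
  // 10 salt rounds; store the resulting hash, never the plaintext
  return bcrypt.hash(plaintext, 10);
}

async function verifyPassword(plaintext, storedHash) {
  // compare() hashes the candidate and checks it against the stored hash
  return bcrypt.compare(plaintext, storedHash);
}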
98. How Does Node.js Interact with the Operating System’s Network
Stack?
Node.js interacts with the operating system's network stack primarily through its net and
http modules. It uses the libuv library, which provides a cross-platform abstraction over the
operating system's low-level networking APIs (sockets, DNS resolution, etc.).
1. TCP/UDP Sockets: The net module allows Node.js to create TCP servers and clients,
while UDP sockets are handled by the dgram module. When you create a socket (e.g.,
net.createServer() ), libuv interacts with the OS's network stack to establish the
connection (see the TCP sketch after this list).
2. HTTP/HTTPS Requests: Node.js’s http and https modules internally handle the
creation and parsing of HTTP/HTTPS requests, utilizing underlying OS network
functionality to manage connections, headers, and data transfers.
3. Event Loop: Node.js uses its event-driven architecture and libuv’s event loop to handle
network events asynchronously. When a network event occurs (e.g., data received on a
socket), libuv pushes that event into the event loop, and Node.js processes it
asynchronously.
4. DNS Resolution: Node.js uses the operating system’s DNS resolver to handle domain
name lookups when making HTTP requests or when establishing socket connections.
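A minimal TCP echo server sketch using the net module (the port number is arbitrary). Each
incoming connection is surfaced as a socket that libuv maps onto the operating system's TCP
stack:
javascript
const net = require('net');

const server = net.createServer((socket) => {
  // 'data' events fire as the OS delivers bytes from the TCP connection
  socket.on('data', (chunk) => {
    socket.write(chunk); // echo the bytes back to the client
  });
});

server.listen(4000, () => {
  console.log('TCP echo server listening on port 4000');
});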
99. How Do You Use the inspector Module for Advanced Debugging in
Node.js?
The inspector module in Node.js allows you to perform advanced debugging using the
Chrome DevTools or any other debugging tools that support the V8 Inspector Protocol.
How to Use:
1. Start the Inspector: You can start the Node.js application with the --inspect or --
inspect-brk flag to enable debugging.
--inspect : Enables the debugger without pausing execution.
--inspect-brk : Starts the debugger and pauses at the first line of code.
Example:
bash
node --inspect index.js
2. Connect Chrome DevTools:
Open chrome://inspect in Chrome.
Click on "Inspect" under "Remote Targets" to connect to your Node.js process and
start debugging.
3. Remote Debugging:
You can also use the Node.js inspector remotely (e.g., through VS Code or another
IDE).
4. Programmatic Access: You can programmatically control the inspector using the
inspector module.
Example:
javascript
const inspector = require('inspector');
// Open the inspector on port 9229 and wait for a debugger client to attach
inspector.open(9229, '127.0.0.1', true);
console.log(inspector.url()); // prints the WebSocket debugger URL
This allows you to open the debugging port and make API calls to interact with the
debugger.
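Beyond opening the port, the inspector.Session API can drive inspector protocol domains such
as the V8 profiler directly from code. The sketch below collects a CPU profile; the profiled
work and the output file name are illustrative:
javascript
const inspector = require('inspector');
const fs = require('fs');

const session = new inspector.Session();
session.connect();

session.post('Profiler.enable', () => {
  session.post('Profiler.start', () => {
    // ...run the code you want to profile here...
    session.post('Profiler.stop', (err, { profile }) => {
      if (!err) fs.writeFileSync('profile.cpuprofile', JSON.stringify(profile));
      session.disconnect();
    });
  });
});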
100. How Do You Implement Distributed Systems with Node.js?
Node.js can be used to implement distributed systems by breaking the application into
smaller, independent services (microservices) that communicate with each other.
Key Components:
1. Message Queues:
Use message queues (e.g., RabbitMQ or Kafka) for asynchronous communication between
services (see the sketch after this list). Queue-based systems can ensure reliable delivery
of messages even in the event of failure.
2. Service Discovery:
Consul or Eureka can be used for service discovery, allowing services to register and
discover each other dynamically.
3. Cluster Module:
Use the cluster module to scale a Node.js application horizontally across multiple
processes and cores. This helps handle high concurrency.
4. API Gateway:
Implement an API gateway (e.g., Kong, API Gateway in AWS) to provide a single
entry point for the client, routing requests to the appropriate microservices.
5. Stateful Services:
Distributed systems often require managing state. Use distributed databases (e.g.,
Cassandra, MongoDB) and shared cache systems (e.g., Redis) to handle distributed
data storage and caching.
6. Containerization:
Package each service as a container (e.g., with Docker) and orchestrate deployments with
tools like Kubernetes for consistent, repeatable scaling.
7. Fault Tolerance:
Implement strategies for fault tolerance, such as circuit breakers (e.g., Hystrix) and
retry logic to ensure that your system can handle service failures gracefully.
8. Load Balancing:
Use load balancers (e.g., NGINX, HAProxy) to distribute traffic evenly across multiple
service instances.
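As a sketch of the message-queue pattern from point 1, the following uses the amqplib client
for RabbitMQ; the connection URL and queue name are illustrative:
javascript
const amqp = require('amqplib');

async function publish(task) {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue('tasks', { durable: true }); // queue survives broker restarts

  channel.sendToQueue('tasks', Buffer.from(JSON.stringify(task)), { persistent: true });
  await channel.close();
  await connection.close();
}

async function consume() {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue('tasks', { durable: true });

  channel.consume('tasks', (msg) => {
    console.log('Received:', JSON.parse(msg.content.toString()));
    channel.ack(msg); // acknowledge only after successful processing
  });
}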
By leveraging Node.js's asynchronous I/O, event-driven architecture, and external tools like
message queues and service discovery, you can build efficient and scalable distributed
systems.