Global vs Local Installation: npm install express


Node:

Global vs Local Installation


npm install express

By default, npm installs dependencies in local mode. Here, local mode refers to the folder where the Node
application is present. For example, when you install the express module, npm creates a node_modules directory
inside the current directory and installs express there.

You can use the npm ls command to list all locally installed modules.
Open the Node.js command prompt and execute "npm ls":

Globally installed packages/dependencies are stored in a system directory. Let's install the express module
using global installation. It produces the same result, but the module is installed globally.
Open the Node.js command prompt and execute the following command:
npm install express -g  

Uninstalling a Module

To uninstall a Node.js module, use the following command:


npm uninstall express 

The "npm search express" command is used to search for the express module (or any other module):


npm search express 

Callbacks and their drawbacks:

Using callbacks can be quite "messy" if you have a sequence of dependent asynchronous operations that
must be performed in order, because this results in multiple levels of nested callbacks. This problem is
commonly known as "callback hell". This problem can be reduced by good coding practices (see
http://callbackhell.com/), using a module like async, or even moving to ES6 features like Promises.
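For illustration, here is a minimal sketch of three dependent asynchronous operations (the step1/step2/step3 names and delays are invented for this example); each dependent call adds one level of nesting:

```javascript
// Hypothetical asynchronous steps, each reporting its result through an
// error-first callback (names and delays invented for illustration).
function step1(cb) { setTimeout(() => cb(null, 1), 10) }
function step2(x, cb) { setTimeout(() => cb(null, x + 1), 10) }
function step3(x, cb) { setTimeout(() => cb(null, x * 2), 10) }

// Each dependent operation adds one more nesting level -- "callback hell".
step1(function (err, a) {
  if (err) return console.error(err)
  step2(a, function (err, b) {
    if (err) return console.error(err)
    step3(b, function (err, c) {
      if (err) return console.error(err)
      console.log(c) // prints 4
    })
  })
})
```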

What is an error-first callback?

Error-first callbacks are used to pass errors and data as well. You have to pass the error as the first
parameter, and it has to be checked to see if something went wrong. Additional arguments are used to
pass data.

var fs = require('fs')

fs.readFile(filePath, function (err, data) {
  if (err) {
    // handle the error; the return is important here
    // so execution stops at this point
    return console.log(err)
  }
  // use the data object
  console.log(data)
})

How can you avoid callback hells?

There are lots of ways to solve the issue of callback hells:

modularization: break callbacks into independent functions

use a control flow library, like async

use generators with Promises

use async/await (note that native async/await requires a recent Node.js version; at the time of writing it was only in the latest v7 release and not in the LTS version)
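For comparison, the same kind of dependent sequence written with async/await reads top-to-bottom with no nesting (a sketch; the step functions are invented for illustration):

```javascript
// The same dependent steps as Promise-returning functions
// (names invented for illustration).
const step1 = () => Promise.resolve(1)
const step2 = (x) => Promise.resolve(x + 1)
const step3 = (x) => Promise.resolve(x * 2)

async function run() {
  // Sequential awaits replace nested callbacks; errors can be
  // handled with an ordinary try/catch around these lines.
  const a = await step1()
  const b = await step2(a)
  const c = await step3(b)
  return c
}

run().then(console.log) // prints 4
```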

What are Promises?

Promises are a concurrency primitive, first described in the 80s. Now they are part of most modern
programming languages to make your life easier. Promises can help you better handle async operations.

An example can be the following snippet, which after 100ms prints out the result string to the standard
output. Also, note the catch, which can be used for error handling. Promises are chainable.

new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve('result')
  }, 100)
})
  .then(console.log)
  .catch(console.error)

What's a stub? Name a use case!

Stubs are functions/programs that simulate the behaviors of components/modules. Stubs provide
canned answers to function calls made during test cases.

An example can be writing a file, without actually doing so.


var fs = require('fs')
var sinon = require('sinon')

var writeFileStub = sinon.stub(fs, 'writeFile', function (path, data, cb) {
  return cb(null)
})

// ... run the code under test, which calls fs.writeFile ...

expect(writeFileStub).to.be.called
writeFileStub.restore()

When are background/worker processes useful? How can you handle worker tasks?

Worker processes are extremely useful if you'd like to do data processing in the background, like
sending out emails or processing images.

There are lots of options for this like RabbitMQ or Kafka.

How can you secure your HTTP cookies against XSS attacks?

XSS occurs when the attacker injects executable JavaScript code into the HTML response.

To mitigate these attacks, you have to set flags on the set-cookie HTTP header:

HttpOnly - this attribute helps prevent attacks such as cross-site scripting, since it does not allow
the cookie to be accessed via JavaScript.

Secure - this attribute tells the browser to only send the cookie if the request is being sent over HTTPS.

So it would look something like this: Set-Cookie: sid=<cookie-value>; HttpOnly. If you are using Express
with the cookie-session middleware, HttpOnly is enabled by default.

How can you make sure your dependencies are safe?

When writing Node.js applications, ending up with hundreds or even thousands of dependencies can
easily happen.

For example, if you depend on Express, you depend on 27 other modules directly, and of course on
their dependencies as well, so manually checking all of them is not an option!

The only option is to automate the update / security audit of your dependencies. For that there are free
and paid options:

npm outdated
Trace by RisingStack
NSP
GreenKeeper
Snyk
What's wrong with the code snippet?

new Promise((resolve, reject) => {
  throw new Error('error')
}).then(console.log)

The Solution

There is no catch after the then, so the error becomes a silent one: there will be no indication that
an error was thrown.

To fix it, you can do the following:

new Promise((resolve, reject) => {
  throw new Error('error')
}).then(console.log).catch(console.error)

If you have to debug a huge codebase, and you don't know which Promise can potentially hide an issue,
you can use the unhandledRejection hook. It will print out all unhandled Promise rejections.

process.on('unhandledRejection', (err) => {
  console.log(err)
})

What's wrong with the following code snippet?

function checkApiKey (apiKeyFromDb, apiKeyReceived) {
  if (apiKeyFromDb === apiKeyReceived) {
    return true
  }
  return false
}
The Solution

When you compare security credentials it is crucial that you don't leak any information, so you have to
make sure that you compare them in fixed time. If you fail to do so, your application will be vulnerable
to timing attacks.

But why does it work like that?

V8, the JavaScript engine used by Node.js, optimizes the code you run for performance. It compares the
strings character by character, and once a mismatch is found, it stops the comparison. So the more leading
characters of the secret the attacker has right, the longer the comparison takes, and that timing difference
leaks information.

To solve this issue, you can use the npm module called cryptiles.

var cryptiles = require('cryptiles')

function checkApiKey (apiKeyFromDb, apiKeyReceived) {
  return cryptiles.fixedTimeComparison(apiKeyFromDb, apiKeyReceived)
}

What's the output of the following code snippet?


Promise.resolve(1)
  .then((x) => x + 1)
  .then((x) => { throw new Error('My Error') })
  .catch(() => 1)
  .then((x) => x + 1)
  .then((x) => console.log(x))
  .catch(console.error)

The Answer

The short answer is 2 - however with this question I'd recommend asking the candidates to explain what
will happen line-by-line to understand how they think. It should be something like this:

A new Promise is created that will resolve to 1.

The resolved value is incremented by 1 (so it is 2 now) and returned instantly.

The resolved value is discarded, and an error is thrown.

The error is caught and discarded, and a new value (1) is returned.

Execution does not stop after the catch; the chain continues with the recovery value, and a new,
incremented value (2) is returned.

The value is printed to the standard output.

This line won't run, as there was no unhandled exception.

How does Node.js handle child threads?

Node.js, in its essence, is a single thread process. It does not expose child threads and thread
management methods to the developer. Technically, Node.js does spawn child threads for certain tasks
such as asynchronous I/O, but these run behind the scenes and do not execute any application
JavaScript code, nor block the main event loop.

If threading support is desired in a Node.js application, there are tools available to enable it, such as the
child_process module.

Difference between Events and Callbacks:

Although events and callbacks look similar, the difference lies in the fact that callback functions are
called when an asynchronous function returns its result, whereas event handling works on the observer
pattern. Whenever an event gets fired, its listener function starts executing. Node.js has multiple built-in
events available through the events module and the EventEmitter class, which is used to bind events and event
listeners.

// Import events module
var events = require('events');
// Create an eventEmitter object
var eventEmitter = new events.EventEmitter();
// Create an event handler as follows
var connectHandler = function connected() {
  console.log('connection successful.');
  // Fire the data_received event
  eventEmitter.emit('data_received');
}
// Bind the connection event with the handler
eventEmitter.on('connection', connectHandler);
// Bind the data_received event with the anonymous function
eventEmitter.on('data_received', function () {
  console.log('data received successfully.');
});
// Fire the connection event
eventEmitter.emit('connection');
console.log("Program Ended.");

Node.js Global Objects


Node.js global objects are available in all modules. You don't need to
include these objects in your application; they can be used directly. These objects include
modules, functions, strings, objects, etc. Some of them aren't actually in the
global scope but in the module scope.

A list of Node.js global objects are given below:

o __dirname
o __filename
o console
o process
o Buffer
o setImmediate(callback[, arg][, ...])
o setInterval(callback, delay[, arg][, ...])
o setTimeout(callback, delay[, arg][, ...])
o clearImmediate(immediateObject)
o clearInterval(intervalObject)
o clearTimeout(timeoutObject)

console.log(), console.error(), console.warn()
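A quick sketch exercising a few of these globals; none of them needs a require() call:

```javascript
// console, process, Buffer and the timer functions need no require.
console.log(process.pid > 0)          // true
console.log(Buffer.from('hi').length) // 2

// Timers are available globally; clearTimeout cancels a pending timer.
const t = setTimeout(() => console.log('never runs'), 1000)
clearTimeout(t)

// __dirname and __filename are module-scoped, not true globals, so they
// exist inside a CommonJS module file.
if (typeof __dirname !== 'undefined') console.log(__dirname)
```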

Node.js Creating Buffers


There are many ways to construct a Node buffer. Following are the three most used
methods:
1. Create an uninitialized buffer: Following is the syntax for creating an uninitialized
buffer of 10 octets:
var buf = new Buffer(10);  
2. Create a buffer from an array: Following is the syntax to create a Buffer from a given
array:
var buf = new Buffer([10, 20, 30, 40, 50]);   
3. Create a buffer from a string: Following is the syntax to create a Buffer from a
given string and an optional encoding type:
var buf = new Buffer("Simply Easy Learning", "utf-8");   
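Note that the new Buffer() constructor is deprecated in current Node.js versions; the same three buffers can be created with the safer Buffer.alloc() and Buffer.from():

```javascript
// Zero-filled buffer of 10 octets (alloc initializes memory,
// unlike the deprecated new Buffer(10)).
var buf1 = Buffer.alloc(10)

// Buffer from an array of octets.
var buf2 = Buffer.from([10, 20, 30, 40, 50])

// Buffer from a string with an explicit encoding.
var buf3 = Buffer.from("Simply Easy Learning", "utf-8")

console.log(buf1.length)     // 10
console.log(buf2[1])         // 20
console.log(buf3.toString()) // Simply Easy Learning
```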

Node.js Writing to buffers


Following is the method to write into a Node buffer:
Syntax: buf.write(string[, offset][, length][, encoding])  
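A minimal example of buf.write(); it returns the number of bytes written:

```javascript
var buf = Buffer.alloc(16) // zero-filled buffer of 16 octets

// Write an ASCII string at the default offset 0; returns bytes written.
var written = buf.write("Node.js")

console.log(written)                          // 7
console.log(buf.toString('utf8', 0, written)) // Node.js
```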

Node.js Child Process

The Node.js child process module provides the ability to spawn child processes in a similar
manner to popen(3).

There are three major ways to create a child process:

o child_process.exec() method: This method runs a command in a shell and buffers the output.
o child_process.spawn() method: This method launches a new process with a given command.
o child_process.fork() method: This method is a special case of the spawn() method for creating
Node child processes.

Node.js child_process.exec() method

The child_process.exec() method runs a command in a shell and buffers the output.

Syntax:

child_process.exec(command[, options], callback)  

Parameters:

1) command: It specifies the command to run, with space-separated arguments.

2) options: It may contain one or more of the following options:

o cwd: It specifies the current working directory of the child process.
o env: It specifies environment key-value pairs.
o encoding: String (Default: 'utf8')
o shell: It specifies the shell to execute the command with (Default: '/bin/sh' on
UNIX, 'cmd.exe' on Windows). The shell should understand the -c switch on UNIX
or /s /c on Windows. On Windows, command-line parsing should be compatible with
cmd.exe.
o timeout: Number (Default: 0)
o maxBuffer: Number (Default: 200*1024)
o killSignal: String (Default: 'SIGTERM')
o uid: Number. Sets the user identity of the process.
o gid: Number. Sets the group identity of the process.

3) callback: The callback function is called with three arguments (error, stdout and stderr)
when the process terminates.

Node.js child_process.exec() example 1


Let's see a simple example that executes a batch file and prints its output.
File: child_process_example1.js
const exec = require('child_process').exec;  
exec('my.bat', (err, stdout, stderr) => {  
  if (err) {  
    console.error(err);  
    return;  
  }  
  console.log(stdout);  
});  
Create a batch file named my.bat having the following code:
File: my.bat
dir  
mkdir child  
Open Node.js command prompt and run the following code:
node child_process_example1.js  

It will execute two commands: dir and mkdir child. The dir command displays the listing of the
current directory and the mkdir command creates a new directory. On Linux, you can use the ls
command to display the current directory listing.

Node.js child_process.exec() example 2


Create two js files named support.js and master.js, having the following code:
File: support.js
console.log("Child Process " + process.argv[2] + " executed." );  
File: master.js
const fs = require('fs');  
const child_process = require('child_process');  
for (var i = 0; i < 3; i++) {  
  var workerProcess = child_process.exec('node support.js ' + i,  
    function (error, stdout, stderr) {  
      if (error) {  
        console.log(error.stack);  
        console.log('Error code: ' + error.code);  
        console.log('Signal received: ' + error.signal);  
      }  
      console.log('stdout: ' + stdout);  
      console.log('stderr: ' + stderr);  
    });  
  workerProcess.on('exit', function (code) {  
    console.log('Child process exited with exit code ' + code);  
  });  
}  
Open Node.js command prompt and run the following code:
node master.js  
Node.js child_process.spawn() method
The child_process.spawn() method launches a new process with a given command. This
method returns streams (stdout & stderr) and is generally used when the process returns a
large amount of data.
Syntax:
child_process.spawn(command[, args][, options])   
Parameters:
1) command: It specifies the command to run.
2) args: It specifies an array of string arguments.
3) options: It may contain one or more of the following options:
o cwd: It specifies the current working directory of the child process.
o env: It specifies environment key-value pairs.
o stdio: Array|String. Child's stdio configuration.
o customFds: Array. Deprecated: file descriptors for the child to use for stdio.
o detached: Boolean. The child will be a process group leader.
o uid: Number. Sets the user identity of the process.
o gid: Number. Sets the group identity of the process.

Node.js child_process.spawn() example


Create two js files named support.js and master.js, having the following code:
File: support.js
console.log("Child Process " + process.argv[2] + " executed." );  

File: master.js
const fs = require('fs');  
const child_process = require('child_process');  
for (var i = 0; i < 3; i++) {  
  var workerProcess = child_process.spawn('node', ['support.js', i]);  
  workerProcess.stdout.on('data', function (data) {  
    console.log('stdout: ' + data);  
  });  
  workerProcess.stderr.on('data', function (data) {  
    console.log('stderr: ' + data);  
  });  
  workerProcess.on('close', function (code) {  
    console.log('child process exited with code ' + code);  
  });  
}  
Open Node.js command prompt and run the following code:
node master.js  

Node.js child_process.fork() method


The child_process.fork() method is a special case of spawn() for creating Node processes.
This method returns an object with a built-in communication channel in addition to all
the methods of a normal ChildProcess instance.
Syntax:
child_process.fork(modulePath[, args][, options])   
Parameters:
1) modulePath: A string specifying the module to run in the child.
2) args: It specifies an array of string arguments.
3) options: It may contain one or more of the following options:
o cwd: It specifies the current working directory of the child process.
o env: It specifies environment key-value pairs.
o execPath: A string specifying the executable used to create the child process.
o execArgv: An array of string arguments passed to the executable
(Default: process.execArgv).
o silent: Boolean. If true, stdin, stdout, and stderr of the child will be piped
to the parent; otherwise they will be inherited from the parent. See the "pipe" and
"inherit" options for spawn()'s stdio for more details (default is false).
o uid: Number. Sets the user identity of the process.
o gid: Number. Sets the group identity of the process.

Node.js child_process.fork() example


Create two js files named support.js and master.js, having the following code:
File: support.js
console.log("Child Process " + process.argv[2] + " executed." );  
File: master.js
const child_process = require('child_process');  
for (var i = 0; i < 3; i++) {  
  var worker_process = child_process.fork("support.js", [i]);    
  worker_process.on('close', function (code) {  
    console.log('child process exited with code ' + code);  
  });  
}  
Open the Node.js command prompt and run the following command:
node master.js  

Node.js Streams
Streams are objects that facilitate reading data from a source and writing data to a
destination. There are four types of streams in Node.js:
o Readable: This stream is used for read operations.
o Writable: This stream is used for write operations.
o Duplex: This stream can be used for both read and write operations.
o Transform: A type of duplex stream where the output is computed from the input.
Each type of stream is an EventEmitter instance and emits several events at different
times. Following are some commonly used events:
o data: This event is fired when there is data available to read.
o end: This event is fired when there is no more data available to read.
o error: This event is fired when there is any error receiving or writing data.
o finish: This event is fired when all data has been flushed to the underlying system.

Node.js Reading from stream

Create a text file named input.txt having the following content:

Javatpoint is one of the best online tutorial websites to learn different technologies in a very easy and efficient manner.

File: main.js

var fs = require("fs");  
var data = '';  
// Create a readable stream  
var readerStream = fs.createReadStream('input.txt');  
// Set the encoding to be utf8  
readerStream.setEncoding('UTF8');  
// Handle stream events --> data, end, and error  
readerStream.on('data', function (chunk) {  
  data += chunk;  
});  
readerStream.on('end', function () {  
  console.log(data);  
});  
readerStream.on('error', function (err) {  
  console.log(err.stack);  
});  
console.log("Program Ended");  

Now, open the Node.js command prompt and run main.js:
node main.js  

Node.js Writing to stream

Create a JavaScript file named main.js having the following code:

File: main.js

var fs = require("fs");  
var data = 'A Solution of all Technology';  
// Create a writable stream  
var writerStream = fs.createWriteStream('output.txt');  
// Write the data to the stream with utf8 encoding  
writerStream.write(data, 'UTF8');  
// Mark the end of file  
writerStream.end();  
// Handle stream events --> finish and error  
writerStream.on('finish', function () {  
  console.log("Write completed.");  
});  
writerStream.on('error', function (err) {  
  console.log(err.stack);  
});  
console.log("Program Ended");  

Now open the Node.js command prompt and run main.js.
You can see that a text file named "output.txt" is created in the same directory where you saved
"input.txt" and "main.js" (in this case, the desktop).

Open "output.txt" and you will see the content written by the program: A Solution of all Technology

Node.js Piping Streams

Piping is a mechanism where the output of one stream is used as input to another stream.
There is no limit on piping operations.

Let's take a piping example for reading from one file and writing it to another file.

File: main.js

var fs = require("fs");  
// Create a readable stream  
var readerStream = fs.createReadStream('input.txt');  
// Create a writable stream  
var writerStream = fs.createWriteStream('output.txt');  
// Pipe the read and write operations  
// read input.txt and write data to output.txt  
readerStream.pipe(writerStream);  
console.log("Program Ended");   

Open the Node.js command prompt and run main.js.

Now, you can see that a text file named "output.txt" is created in the same directory as the
"main.js" file (in this case, the desktop).

Open "output.txt" and you will see that it contains a copy of the content of "input.txt".
Node.js Chaining Streams

Chaining streams is a mechanism of creating a chain of multiple stream operations by
connecting the output of one stream to another stream. It is generally used with piping
operations.

Let's take an example of piping and chaining to compress a file and then decompress the
same file.

File: main.js

var fs = require("fs");  
var zlib = require('zlib');  
// Compress the file input.txt to input.txt.gz  
fs.createReadStream('input.txt')  
  .pipe(zlib.createGzip())  
  .pipe(fs.createWriteStream('input.txt.gz'));  
console.log("File Compressed.");  

Open the Node.js command prompt and run main.js

Now you will see that the file "input.txt" is compressed and a new file named
"input.txt.gz" is created in the current directory.

To Decompress the same file: put the following code in the js file "main.js"

File: main.js

var fs = require("fs");  
var zlib = require('zlib');  
// Decompress the file input.txt.gz to input.txt  
fs.createReadStream('input.txt.gz')  
  .pipe(zlib.createGunzip())  
  .pipe(fs.createWriteStream('input.txt'));  
console.log("File Decompressed.");  

Open the Node.js command prompt and run main.js.

