Global Vs Local Installation: NPM Install Express
By default, npm installs dependencies in local mode. Here, local mode refers to the folder where the Node
application is present. For example, when you install the express module, npm creates a node_modules
directory in the current directory and installs express into it.
You can use the npm ls command to list all locally installed modules.
Open the Node.js command prompt and execute "npm ls":
Globally installed packages/dependencies are stored in a system directory. Let's install the express module
using a global installation. It produces the same result, but the module is installed globally.
Open Node.js command prompt and execute the following code:
npm install express -g
Uninstalling a Module
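To remove a locally installed module, use npm uninstall (add the -g flag to remove a globally installed one):
npm uninstall express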
Using callbacks can be quite "messy" if you have a sequence of dependent asynchronous operations that
must be performed in order, because this results in multiple levels of nested callbacks. This problem is
commonly known as "callback hell". This problem can be reduced by good coding practices (see
http://callbackhell.com/), using a module like async, or even moving to ES6 features like Promises.
Error-first callbacks are used to pass errors and data as well. You have to pass the error as the first
parameter, and it has to be checked to see if something went wrong. Additional arguments are used to
pass data.
var fs = require('fs')
fs.readFile('./file.json', function (err, data) { // './file.json' is just an example path
  if (err) {
    return console.error(err) // handle the error first
  }
  console.log(data) // use the data only when there was no error
})
To avoid deeply nested callbacks you can also use async/await (note that at the time this was written it was only available in the latest v7 release, not in the LTS version).
Promises are a concurrency primitive, first described in the 80s. Now they are part of most modern
programming languages to make your life easier. Promises can help you better handle async operations.
An example can be the following snippet, which after 100ms prints out the result string to the standard
output. Also, note the catch, which can be used for error handling. Promises are chainable.
new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve('result')
  }, 100)
})
  .then(console.log)
  .catch(console.error)
Stubs are functions/programs that simulate the behaviors of components/modules. Stubs provide
canned answers to function calls made during test cases.
// a stub for fs.writeFile, using Sinon for the stub and a Chai-style assertion
var writeFileStub = sinon.stub(fs, 'writeFile', function (path, data, cb) {
  return cb(null)
})
expect(writeFileStub).to.be.called
writeFileStub.restore()
When are background/worker processes useful? How can you handle worker tasks?
Worker processes are extremely useful if you'd like to do data processing in the background, like
sending out emails or processing images.
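For example, a minimal sketch of handing work off to a background process with child_process.fork; the worker file name ('image-worker.js') and the message format are illustrative:
const child_process = require('child_process');
// 'image-worker.js' is a hypothetical worker script that would do the heavy lifting
const worker = child_process.fork('image-worker.js');
// hand the task to the worker so the main event loop is not blocked
worker.send({ task: 'resize', file: 'photo.jpg' });
worker.on('message', function (result) {
  console.log('worker finished: ' + result);
});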
How can you secure your HTTP cookies against XSS attacks?
XSS occurs when the attacker injects executable JavaScript code into the HTML response.
To mitigate these attacks, you have to set flags on the set-cookie HTTP header:
HttpOnly - this attribute is used to help prevent attacks such as cross-site scripting since it does not allow
the cookie to be accessed via JavaScript.
secure - this attribute tells the browser to only send the cookie if the request is being sent over HTTPS.
So it would look something like this: Set-Cookie: sid=<cookie-value>; HttpOnly. If you are using Express
with cookie-session, this works by default.
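A minimal sketch using Express and res.cookie; the cookie name and value are illustrative:
const express = require('express');
const app = express();

app.get('/login', function (req, res) {
  res.cookie('sid', 'some-session-id', {
    httpOnly: true, // the cookie cannot be read via JavaScript in the browser
    secure: true    // the cookie is only sent over HTTPS
  });
  res.send('cookie set');
});

app.listen(3000);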
How can you make sure your dependencies are safe?
When writing Node.js applications, you can easily end up with hundreds or even thousands of dependencies.
For example, if you depend on Express, you depend on 27 other modules directly, and of course on
those modules' dependencies as well, so manually checking all of them is not an option!
The only option is to automate the update / security audit of your dependencies. For that there are free
and paid options:
npm outdated
Trace by RisingStack
NSP
GreenKeeper
Snyk
What's wrong with the code snippet?
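A minimal snippet that shows the problem described below (the exact error is illustrative):
new Promise((resolve, reject) => {
  throw new Error('error')
}).then(console.log)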
The Solution
There is no catch after the then, so the error will be a silent one: there will be no indication that
an error was thrown. Adding a catch at the end of the chain fixes it:
new Promise((resolve, reject) => {
  throw new Error('error')
}).then(console.log).catch(console.error)
If you have to debug a huge codebase, and you don't know which Promise can potentially hide an issue,
you can use the unhandledRejection hook. It will print out all unhandled Promise rejections.
process.on('unhandledRejection', (err) => {
  console.log(err)
})
When you compare security credentials it is crucial that you don't leak any information, so you have to
make sure that you compare them in fixed time. If you fail to do so, your application will be vulnerable
to timing attacks.
V8, the JavaScript engine used by Node.js, tries to optimize the code you run from a performance point
of view. It compares the strings character by character, and once a mismatch is found, it stops the
comparison operation. So the more leading characters of the secret the attacker gets right, the longer the
comparison takes, and that timing difference can be used to recover the secret.
To solve this issue, you can use the npm module called cryptiles.
function checkApiKey (apiKeyFromDb, apiKeyReceived) {
  return cryptiles.fixedTimeComparison(apiKeyFromDb, apiKeyReceived)
}
What's the output of the following code snippet?
Promise.resolve(1)
  .then((x) => x + 1)
  .then((x) => { throw new Error('My Error') })
  .catch(() => 1)
  .then((x) => x + 1)
  .then((x) => console.log(x))
  .catch(console.error)
The Answer
The short answer is 2 - however with this question I'd recommend asking the candidates to explain what
will happen line-by-line to understand how they think. It should be something like this:
A new Promise resolves with the value 1. The first then increments it to 2 and returns instantly.
The next then throws, so the returned 2 is discarded and the chain rejects.
The catch handles the error and returns 1, so execution does not stop after the catch; it continues with that value.
The following then increments it to 2 again, and the last then prints 2 to the standard output.
The final catch never runs, because no further exception was thrown.
Node.js, in its essence, is a single thread process. It does not expose child threads and thread
management methods to the developer. Technically, Node.js does spawn child threads for certain tasks
such as asynchronous I/O, but these run behind the scenes and do not execute any application
JavaScript code, nor block the main event loop.
If threading support is desired in a Node.js application, there are tools available to enable it, such as the
child_process module.
Although events and callbacks look similar, the difference lies in the fact that callback functions are
called when an asynchronous function returns its result, whereas event handling works on the observer
pattern. Whenever an event gets fired, its listener function starts executing. Node.js has multiple built-in
events available through the events module and the EventEmitter class, which is used to bind events and event
listeners.
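A minimal sketch that binds a listener and fires an event with EventEmitter; the event name and payload are illustrative:
const EventEmitter = require('events');
const emitter = new EventEmitter();

// bind a listener to the 'connection' event
emitter.on('connection', function (id) {
  console.log('connection event fired for ' + id);
});

// fire the event; the listener above starts executing
emitter.emit('connection', 42);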
Node.js provides a number of objects and functions that are available in every module (globals):
o __dirname
o __filename
o Console
o Process
o Buffer
o setImmediate(callback[, arg][, ...])
o setInterval(callback, delay[, arg][, ...])
o setTimeout(callback, delay[, arg][, ...])
o clearImmediate(immediateObject)
o clearInterval(intervalObject)
o clearTimeout(timeoutObject)
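A short sketch exercising some of these globals; the delays are arbitrary:
console.log(__dirname);  // directory of the current module
console.log(__filename); // full path of the current module file

setTimeout(() => console.log('runs once, after 100 ms'), 100);
setImmediate(() => console.log('runs on the next iteration of the event loop'));

const interval = setInterval(() => console.log('tick'), 50);
setTimeout(() => clearInterval(interval), 300); // stop the interval after 300 ms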
The Node.js child_process module provides the ability to spawn child processes in a similar
manner to popen(3).
The child_process.exec() method runs a command in a console and buffers the output.
Syntax:
child_process.exec(command[, options], callback)
Parameters:
callback: the callback function gets three arguments (error, stdout, stderr); it is called with the
output when the process terminates.
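For example, a sketch along these lines (the combined command assumes the Windows shell; the details are illustrative):
const child_process = require('child_process');

// run two commands: list the current directory, then create a "child" directory
child_process.exec('dir & mkdir child', function (error, stdout, stderr) {
  if (error !== null) {
    return console.log('exec error: ' + error);
  }
  console.log('stdout: ' + stdout);
  console.log('stderr: ' + stderr);
});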
It will execute two commands, dir and mkdir child. The dir command displays the contents of the
current directory, and the mkdir command creates a new directory named child. On Linux, you can use
the ls command instead of dir to list the current directory.
File: master.js
const fs = require('fs');
const child_process = require('child_process');

for (var i = 0; i < 3; i++) {
  var workerProcess = child_process.spawn('node', ['support.js', i]);

  workerProcess.stdout.on('data', function (data) {
    console.log('stdout: ' + data);
  });

  workerProcess.stderr.on('data', function (data) {
    console.log('stderr: ' + data);
  });

  workerProcess.on('close', function (code) {
    console.log('child process exited with code ' + code);
  });
}
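The worker script support.js spawned above can be as simple as the following sketch, which only reports which child ran:
File: support.js
console.log('Child Process ' + process.argv[2] + ' executed.');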
Open the Node.js command prompt and run the following command:
node master.js
Node.js Streams
Streams are objects that let you read data from a source and write data to a
destination. There are four types of streams in Node.js:
o Readable: This stream is used for read operations.
o Writable: This stream is used for write operations.
o Duplex: This stream can be used for both read and write operations.
o Transform: It is a type of duplex stream where the output is computed from the
input.
Each type of stream is an EventEmitter instance and emits several events at different
times. The following are some commonly used events:
o data: This event is fired when there is data available to read.
o end: This event is fired when there is no more data available to read.
o error: This event is fired when there is any error receiving or writing data.
o finish: This event is fired when all data has been flushed to the underlying system.
Create a text file named input.txt with the following content:
Javatpoint is one of the best online tutorial websites to learn different technologies in a very easy and efficient manner.
File: main.js
var fs = require("fs");
var data = '';
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Set the encoding to be utf8.
readerStream.setEncoding('UTF8');
// Handle stream events --> data, end, and error
readerStream.on('data', function (chunk) {
  data += chunk;
});

readerStream.on('end', function () {
  console.log(data);
});

readerStream.on('error', function (err) {
  console.log(err.stack);
});
console.log("Program Ended");
Now, open the Node.js command prompt and run main.js:
Run: node main.js
"Program Ended" is printed first, because the synchronous code runs before the stream callbacks; the contents of input.txt are printed afterwards, once the end event fires.
File: main.js
var fs = require("fs");
var data = 'A Solution of all Technology';
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Write the data to stream with encoding to be utf8
writerStream.write(data,'UTF8');
// Mark the end of file
writerStream.end();
// Handle stream events --> finish, and error
writerStream.on('finish', function () {
  console.log("Write completed.");
});

writerStream.on('error', function (err) {
  console.log(err.stack);
});
console.log("Program Ended");
Now open the Node.js command prompt and run main.js.
Now, you can see that a text file named "output.txt" has been created in the same directory where you
saved the "input.txt" and "main.js" files. In my case, it is on the desktop.
Open "output.txt" and you will see the content that was written ("A Solution of all Technology").
Piping is a mechanism where the output of one stream is used as the input of another stream.
There is no limit on piping operations; streams can be piped together in long chains.
Let's take a piping example for reading from one file and writing it to another file.
File: main.js
var fs = require("fs");
// Create a readable stream
var readerStream = fs.createReadStream('input.txt');
// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');
// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);
console.log("Program Ended");
Now, you can see that a text file named "output.txt" has been created in the directory where you saved the
"main.js" file. In my case, it is on the desktop.
Open "output.txt" and you will see that the contents of "input.txt" have been copied into it.
Node.js Chaining Streams
Let's take an example of piping and chaining to compress a file and then decompress the
same file.
File: main.js
var fs = require("fs");
var zlib = require('zlib');
// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));
console.log("File Compressed.");
Now you will see that the file "input.txt" has been compressed and a new file named
"input.txt.gz" has been created in the current directory.
To decompress the same file, put the following code in "main.js":
File: main.js
var fs = require("fs");
var zlib = require('zlib');
// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('input.txt'));
console.log("File Decompressed.");