Problem
We are required to write a JavaScript function that takes in an array of integers, arr, as its first and only argument.
Consider two indices, i and j, in the array that satisfy the following conditions −
i < j, and
arr[i] <= arr[j]
Out of all such index pairs (i, j), our function should return the maximum difference j - i.
For example, if the input to the function is −
const arr = [6, 0, 8, 2, 1, 5];
Then the output should be −
const output = 4;
Output Explanation
The maximum difference is achieved at (i, j) = (1, 5), since arr[1] = 0 <= arr[5] = 5 and j - i = 5 - 1 = 4.
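Before the optimized solution, the problem definition can be checked directly with a brute-force sketch that tries every pair (i, j). The helper name maxWidthBruteForce is hypothetical; this O(n^2) version is only a reference for verifying results, not the approach used below.

```javascript
// Hypothetical brute-force reference: examine every pair (i, j) with i < j
// and keep the widest pair where arr[i] <= arr[j]. Runs in O(n^2) time.
const maxWidthBruteForce = (arr = []) => {
   let max = 0;
   for (let i = 0; i < arr.length; i++) {
      for (let j = i + 1; j < arr.length; j++) {
         if (arr[i] <= arr[j]) {
            max = Math.max(max, j - i);
         }
      }
   }
   return max;
};

console.log(maxWidthBruteForce([6, 0, 8, 2, 1, 5])); // 4
```

For the example array, the only pair of width 5 is (0, 5), but arr[0] = 6 > arr[5] = 5, so the widest valid pair is (1, 5) with width 4.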
Example
The code for this will be −
const arr = [6, 0, 8, 2, 1, 5];
const maximumDifference = (arr = []) => {
   let max = 0;
   // Build a stack of indices whose values are strictly decreasing;
   // only these indices can serve as the left end i of a widest pair.
   const stack = [0];
   for (let i = 1; i < arr.length; i++) {
      if (arr[i] < arr[stack[stack.length - 1]]) {
         stack.push(i);
      }
   }
   // Scan from the right; for each j, pop every candidate i with
   // arr[i] <= arr[j], recording the width j - i.
   for (let j = arr.length - 1; j >= 0; j--) {
      while (stack.length && arr[j] >= arr[stack[stack.length - 1]]) {
         max = Math.max(max, j - stack.pop());
      }
   }
   return max;
};
console.log(maximumDifference(arr));
Output
And the output in the console will be −
4