Problem
We are required to write a JavaScript function that takes in an array of numbers, arr, as its first and only argument.
The array, arr, of length N contains all the integers from 0 to N-1. Our function is supposed to find and return the length of the longest set S[i], where S[i] = {A[i], A[A[i]], A[A[A[i]]], ... }, subject to the rule below.
Suppose S[i] starts with the element A[i] at index i; the next element in S is then A[A[i]], followed by A[A[A[i]]], and so on. We stop adding elements right before a duplicate would appear in S.
For example, if the input to the function is −
const arr = [5, 4, 0, 3, 1, 6, 2];
Then the output should be −
const output = 4;
Output Explanation
A[0] = 5, A[1] = 4, A[2] = 0, A[3] = 3, A[4] = 1, A[5] = 6, A[6] = 2.
One of the longest sets is S[0]:
S[0] = {A[0], A[5], A[6], A[2]} = {5, 6, 2, 0}
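To see how the chain terminates, here is a minimal sketch (assuming the same arr as above) that traces the set starting at index 0; it follows index → arr[index] and stops right before an index would repeat −
const A = [5, 4, 0, 3, 1, 6, 2];
const seen = new Set();
const chain = [];
let i = 0;
// keep following the chain until the next index has already been visited
while (!seen.has(i)) {
   seen.add(i);
   chain.push(A[i]);
   i = A[i];
}
console.log(chain); // [ 5, 6, 2, 0 ], length 4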
Example
Following is the code −
const arr = [5, 4, 0, 3, 1, 6, 2];
const arrayNesting = (arr = []) => {
   const visited = {};
   // follows the chain index -> arr[index], counting steps until a visited index is reached
   const aux = (index) => {
      if (visited[index]) {
         return 0;
      };
      visited[index] = true;
      return aux(arr[index]) + 1;
   };
   let max = 0;
   arr.forEach((n, index) => {
      if (!visited[index]) {
         max = Math.max(max, aux(index));
      };
   });
   return max;
};
console.log(arrayNesting(arr));
Output
Following is the console output −
4
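The recursive helper above can run into the call-stack limit for very large arrays. As a possible alternative, the following is a sketch of the same idea written iteratively (the name arrayNestingIterative is illustrative, not part of the original code) −
const arrayNestingIterative = (arr = []) => {
   const visited = new Array(arr.length).fill(false);
   let max = 0;
   for (let start = 0; start < arr.length; start++) {
      let count = 0;
      let i = start;
      // walk the chain until we reach an index that has already been counted
      while (!visited[i]) {
         visited[i] = true;
         i = arr[i];
         count++;
      };
      max = Math.max(max, count);
   };
   return max;
};
console.log(arrayNestingIterative([5, 4, 0, 3, 1, 6, 2])); // 4
Because every index is visited at most once across all starting points, this version also runs in O(N) time.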