Monday, 5 February 2024 - Amsterdam

Recursive JS function

Validating a long list of websites can take a lot of time, and it's not clear up front how long any given site will take to respond. This short post looks at one approach to the problem: a recursive function that works through the list one site at a time.

Starting with a list of sites to validate, each site is checked individually. Let's first look at a basic recursion example that takes a starting index and an array.

const recursiveLoop = function (i = 0, arr) {
    // stop when the array is missing or empty, or the index has run past the end
    if (!(arr && arr.length) || i >= arr.length) return;

    console.log(arr[i]);

    // recurse with the next index
    return recursiveLoop(i + 1, arr);
};

recursiveLoop(0, [1, 2, 3, 4]); // logs 1, 2, 3, 4

The following example shows how this works when actually fetching data from the network.

import fetch from "node-fetch";

const sites = [
    "https://mytoori.com",
    "https://bbc.co.uk",
    "https://news.ycombinator.com/",
];

const validateXFrameHeaders = function (siteName, headers) {
    // headers is a Headers object, so read values with .get()
    // (header name lookup is case-insensitive)
    const xFrameOptions = headers.get("x-frame-options");
    let msg;
    if (
        xFrameOptions &&
        ["deny", "sameorigin"].includes(xFrameOptions.toLowerCase())
    ) {
        msg = "blocks iframe through x-frame-options";
    } else msg = "allows iframes";

    console.table({ [siteName]: msg });
};

// recursive container function using closure
const fetchHeaders = function (sites, action) {
    // sites and action are captured by the closure below,
    // so recursiveLoop only needs the current index
    return async function recursiveLoop(i) {
        // stop the loop once the end has been reached
        if (!sites?.length || i >= sites.length) {
            console.log(
                `------------ \nFinished looping over ${sites?.length ?? 0} sites`
            );
            return;
        }

        // validation (this is very crude for the sake of the example)
        const response = await fetch(sites[i]);
        action(sites[i], response.headers);

        // continue with the next iteration
        return recursiveLoop(i + 1);
    };
};

// sites to go over; action to take
const start = fetchHeaders(sites, validateXFrameHeaders);
(async () => await start(0))(); // starting point is 0

The benefit of this approach is that it doesn't matter how long any given site takes to respond: the loop only continues once the response has been received, so requests never pile up.
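
For contrast, if one-at-a-time processing weren't important, the same work could be done concurrently. The sketch below (a hypothetical validateAllConcurrently, reusing sites and validateXFrameHeaders from above) fires all requests at once with Promise.all; the recursive version deliberately gives that up in exchange for sequential pacing.

// for comparison only: all requests start at the same time,
// so slow sites no longer delay the others, but nothing limits
// how many connections are open at once
const validateAllConcurrently = async function () {
    await Promise.all(
        sites.map(async (site) => {
            const response = await fetch(site);
            validateXFrameHeaders(site, response.headers);
        })
    );
};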

There is one caveat with the code as written: if a request fails, the await throws and the loop stops rather than continuing with the remaining sites. Note also that fetch has no built-in timeout of its own, so a slow site can hold the loop up indefinitely. Both problems can be addressed with a try/catch and, to cap the wait time, setTimeout combined with an AbortController.
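
A minimal sketch of that idea, assuming node-fetch v3 (which accepts an AbortController signal); it replaces the fetch step inside recursiveLoop, and the 5000 ms limit is an arbitrary choice:

// abort the request if it takes longer than 5 seconds
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000);

try {
    const response = await fetch(sites[i], { signal: controller.signal });
    action(sites[i], response.headers);
} catch (err) {
    // timed out or failed: report it and let the loop move on
    console.error(`${sites[i]} failed: ${err.message}`);
} finally {
    clearTimeout(timer);
}

// the next iteration runs whether or not this request succeeded
return recursiveLoop(i + 1);

With this in place, a slow or unreachable site costs at most five seconds before the loop moves on to the next one.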