web / ErrorScanner / 0.1.3

Introduction

This algorithm crawls your website using the web/SiteMap algorithm and returns all of the broken links it finds.

Note: Due to the nature of the algorithm, it may take considerably longer than 5 minutes to run on larger websites. To increase the timeout, please refer to the official Algorithmia documentation.
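
As a minimal sketch, the timeout can be raised when calling the algorithm through the Algorithmia Python client (the API key and the 3000-second value below are placeholders, not part of this algorithm):

import Algorithmia

# Placeholder API key -- substitute your own.
client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("web/ErrorScanner/0.1.3")

# Allow up to 3000 seconds (an example value) instead of the default call timeout.
algo.set_options(timeout=3000)

result = algo.pipe({"url": "https://website.com"}).result
print(result["brokenLinks"])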

Input:

  • url (required): The web page URL to start crawling from.
  • depth (optional): Depth of crawling. (default = 2)

Output:

  • A list of broken links, each with the referring web page.

Example(s)

Example 1.

  • Parameter 1: Web page.
{
  "url": "https://website.com"
}

Output:

{
  "brokenLinks": [
    "https://website.com/link1.html",
    "https://website.com/link2.html",
    ...
    "https://website.com/linkN.html"
  ]
}

Example 2.

  • Parameter 1: Web page
  • Parameter 2: Depth of 3
{
  "url": "https://website.com",
  "depth": 3
}

Output:

{
  "brokenLinks": [
    "https://website.com/link1.html",
    "https://website.com/link2.html",
    ...
    "https://website.com/linkN.html"
  ]
}
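
For reference, the request in Example 2 could be issued with the Algorithmia Python client roughly as follows (the API key and URL are placeholders; the "brokenLinks" field matches the output shown above):

import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")  # placeholder API key
algo = client.algo("web/ErrorScanner/0.1.3")

# Crawl three levels deep, as in Example 2.
response = algo.pipe({"url": "https://website.com", "depth": 3})

# The result mirrors the JSON output above: a list under "brokenLinks".
for link in response.result["brokenLinks"]:
    print(link)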