safetyChecks
Type: object
Parameter syntax
safetyChecks: {
  beforeIndexPublishing: {
    maxLostRecordsPercentage: number,
    maxFailedUrls: number
  }
}

About this parameter

Checks to ensure the crawl was successful.

For more information, see Safety checks.

Examples

{
  safetyChecks: {
    beforeIndexPublishing: {
      maxLostRecordsPercentage: 10,
      maxFailedUrls: 15
    }
  }
}

Parameters

beforeIndexPublishing
type: object
Optional

Checks that run after the crawl finishes, before the records are pushed to the final index on Algolia.

beforeIndexPublishing ➔ maxLostRecordsPercentage

maxLostRecordsPercentage
type: number
Optional

Maximum allowed difference (inclusive) between the number of records produced by the new crawl and by the previous crawl, expressed as a percentage of the previous crawl's total records.

Default: 10

Minimum: 0
Maximum: 100

If the new number of records is less than last number of records * (1 - maxLostRecordsPercentage / 100), the process throws a SafeReindexingError and blocks the Crawler until it is manually restarted.
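The formula above can be sketched as a small helper (a hypothetical illustration, not the Crawler's actual implementation):

```javascript
// Sketch of the maxLostRecordsPercentage safety check.
// Hypothetical helper, not the Crawler's internal code.
function passesLostRecordsCheck(newCount, lastCount, maxLostRecordsPercentage) {
  // The new crawl may lose at most maxLostRecordsPercentage percent
  // of the previous crawl's records (inclusive).
  const threshold = lastCount * (1 - maxLostRecordsPercentage / 100);
  return newCount >= threshold;
}
```

With the default of 10, a previous crawl of 1,000 records means the new crawl passes with 900 records or more; at 899 the check fails and the Crawler raises SafeReindexingError.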

beforeIndexPublishing ➔ maxFailedUrls

maxFailedUrls
type: number
Optional

Stops the crawl if the specified number of pages fail to crawl. If undefined (null), the crawler doesn't stop when it encounters failed URLs.

Default: null
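Assuming the check simply compares the failure count against the limit (the exact boundary condition is an assumption, as is the helper itself), it could be sketched as:

```javascript
// Sketch of the maxFailedUrls safety check (hypothetical helper,
// not the Crawler's internal code).
function shouldStopCrawl(failedUrlCount, maxFailedUrls) {
  // null (the default) disables the check entirely.
  if (maxFailedUrls === null) return false;
  // Assumption: the limit is inclusive, i.e. the crawl stops once
  // the number of failed URLs reaches maxFailedUrls.
  return failedUrlCount >= maxFailedUrls;
}
```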
