This package provides a powerful, easy-to-use class to crawl the links on a website. Under the hood, Guzzle promises are used to crawl multiple URLs concurrently.

Because the crawler can execute JavaScript, it can also crawl JavaScript-rendered sites. Under the hood, headless Chrome and Puppeteer are used to power this feature.
Here's a quick example:
```php
use Spatie\Crawler\Crawler;
use Spatie\Crawler\CrawlResponse;

Crawler::create('https://example.com')
    ->onCrawled(function (string $url, CrawlResponse $response) {
        echo "{$url}: {$response->status()}\n";
    })
    ->start();
```
Or collect all URLs on a site:
```php
$urls = Crawler::create('https://example.com')
    ->internalOnly()
    ->depth(3)
    ->foundUrls();
```
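Since the crawler both runs requests concurrently and can render JavaScript, you may want to tune those behaviors per crawl. Here is a minimal sketch in the same fluent style as the examples above; the `executeJavaScript()` and `setConcurrency()` method names are assumptions for illustration, not confirmed parts of this package's public API:

```php
use Spatie\Crawler\Crawler;
use Spatie\Crawler\CrawlResponse;

// Hypothetical sketch: executeJavaScript() and setConcurrency()
// are assumed method names, shown here only to illustrate the
// features described above.
Crawler::create('https://spa-example.com')
    ->executeJavaScript()  // render each page with headless Chrome before parsing links
    ->setConcurrency(10)   // cap the number of simultaneous Guzzle requests
    ->onCrawled(function (string $url, CrawlResponse $response) {
        echo "{$url}: {$response->status()}\n";
    })
    ->start();
```

Lowering the concurrency is useful when crawling sites that rate-limit aggressively; enabling JavaScript execution is slower, so reserve it for sites that render their content client-side.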
## We got badges
