NodeJS robots.txt parser with support for wildcard (*) matching.
A pure-Python robots.txt parser with support for modern conventions.
An extensible robots.txt parser and client library, with full support for every directive and specification.
🤖 robots.txt as a service. Crawls, downloads, and parses robots.txt files so their rules can be checked through an API
Go robots.txt parser
Alternative robots parser module for Python
Robots.txt parser and fetcher for Elixir
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
A lightweight and simple robots.txt parser in Node.js
Visual App for Testing URLs and User-agents blocked by robots.txt Files
🤖 Ruby gem wrapper around Google Robotstxt Parser C++ library
RFC 9309 spec compliant robots.txt builder and parser. 🦾 No dependencies, fully typed.
Robots.txt parser and generator - Work in progress
A parser for robots.txt with support for wildcards. See also RFC 9309.
💧 Test your robots.txt with this testing tool. Check if a URL is blocked, which statement is blocking it, and for which user agent. You can also check whether the page's resources (CSS and JavaScript) are disallowed. Robots.txt files help you guide how search engines crawl your site and can be an integral part of your SEO strategy.
Robots.txt parser / generator
A small, tested, no-frills parser of robots.txt files in Swift.
Parse robots.txt and traverse sitemaps.
Python binding for Google's robots.txt parser C++ library
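
Several of the parsers listed above advertise wildcard (*) matching and RFC 9309 compliance. As a rough illustration of what those directives involve, here is a minimal TypeScript sketch, not drawn from any particular library above: it translates robots.txt path patterns into regular expressions (where "*" matches any character sequence and "$" anchors the end of the path) and picks the longest matching rule, preferring Allow on ties as RFC 9309 recommends.

```typescript
// Illustrative sketch only; real parsers also handle user-agent groups,
// percent-encoding, and octet-based length comparison.
type Rule = { allow: boolean; pattern: string };

// Convert a robots.txt path pattern into a RegExp:
// '*' matches any sequence of characters, a trailing '$' anchors the end.
function patternToRegExp(pattern: string): RegExp {
  const anchored = pattern.endsWith("$");
  const body = anchored ? pattern.slice(0, -1) : pattern;
  const escaped = body
    .split("*")
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp("^" + escaped + (anchored ? "$" : ""));
}

// RFC 9309: the most specific (longest) matching rule wins;
// if an Allow and a Disallow rule are equally specific, Allow is used.
function isAllowed(rules: Rule[], path: string): boolean {
  let best: { length: number; allow: boolean } | null = null;
  for (const rule of rules) {
    if (!patternToRegExp(rule.pattern).test(path)) continue;
    const length = rule.pattern.length;
    if (
      best === null ||
      length > best.length ||
      (length === best.length && rule.allow && !best.allow)
    ) {
      best = { length, allow: rule.allow };
    }
  }
  return best ? best.allow : true; // no matching rule: crawling is allowed
}

// Example: "Disallow: /private/*" blocks most of /private/, but the longer
// "Allow: /private/readme.txt" rule overrides it for that one path.
const rules: Rule[] = [
  { allow: false, pattern: "/private/*" },
  { allow: true, pattern: "/private/readme.txt" },
];
console.log(isAllowed(rules, "/private/notes.html")); // false
console.log(isAllowed(rules, "/private/readme.txt")); // true
console.log(isAllowed(rules, "/public/index.html"));  // true
```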