WebRunner is a reconnaissance scanner for web security assessments. It crawls target sites and extracts URLs, email addresses, and custom data matched by regular expressions, with a configurable crawl depth that scales from quick, shallow overviews to deep, exhaustive analysis. WebRunner can also clone entire websites for offline analysis, and it includes dedicated scanners for issues such as path traversal, 403 bypasses, and SQL injection.
git clone https://github.com/sp34rh34d/WebRunner.git
cd WebRunner
python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
chmod +x WebRunner.py
Or, as a one-liner:
git clone https://github.com/sp34rh34d/WebRunner.git && cd WebRunner && python3 -m venv env && source env/bin/activate && pip3 install -r requirements.txt && chmod +x WebRunner.py
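
To verify the setup, you can print a mode's help text (the -h flag is listed under Global Flags below):
python3 WebRunner.py scraping -h
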
- Web Scraping/Crawler [url | email | RegEx Query]
- Web Cloner
- Path Traversal Scanner
- 403 Bypass Scanner
- SQLi Scanner
Scraping mode
Usage:
python3 WebRunner.py scraping [flags]
Flags:
--url       Set target URL (single-target mode)
--url-file  Load target URLs from a txt file
--depth     Set depth level to scan
Global Flags:
--user-agent             Set user-agent header, 'DirRunner v1.0' by default
-c, --cookie             Set cookies to use for every HTTP request
-k, --no-tls-validation  Skip TLS certificate verification
-r, --follow-redirect    Follow redirects
--timeout                HTTP timeout (default 10s)
--proxy                  Set proxy for every HTTP request [<https://proxy:port> or <https://username:passwd@proxy:port>]
-h, --help               Show this message
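
For example, a shallow scrape of a single target (the URL and depth value are illustrative) might look like:
python3 WebRunner.py scraping --url https://www.example.com/ --depth 2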

Email extractor mode
Usage:
python3 WebRunner.py email-extractor [flags]
Flags:
--url       Set target URL (single-target mode)
--url-file  Load target URLs from a txt file
--depth     Set depth level to scan
Global Flags:
--user-agent             Set user-agent header, 'DirRunner v1.0' by default
-c, --cookie             Set cookies to use for every HTTP request
-k, --no-tls-validation  Skip TLS certificate verification
-r, --follow-redirect    Follow redirects
--timeout                HTTP timeout (default 10s)
--proxy                  Set proxy for every HTTP request [<https://proxy:port> or <https://username:passwd@proxy:port>]
-h, --help               Show this message
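
For example, to extract email addresses from a list of targets through a proxy (targets.txt and the proxy address are hypothetical):
python3 WebRunner.py email-extractor --url-file targets.txt --depth 2 --proxy https://127.0.0.1:8080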

RegEx mode
Usage:
python3 WebRunner.py regx [flags]
Flags:
--url       Set target URL (single-target mode)
--url-file  Load target URLs from a txt file
--regx      Set RegEx query to search for in every HTTP response
--depth     Set depth level to scan
Global Flags:
--user-agent             Set user-agent header, 'DirRunner v1.0' by default
-c, --cookie             Set cookies to use for every HTTP request
-k, --no-tls-validation  Skip TLS certificate verification
-r, --follow-redirect    Follow redirects
--timeout                HTTP timeout (default 10s)
--proxy                  Set proxy for every HTTP request [<https://proxy:port> or <https://username:passwd@proxy:port>]
-h, --help               Show this message
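
As a sketch, you could search every response for AWS-style access key IDs (the pattern and target are illustrative):
python3 WebRunner.py regx --url https://www.example.com/ --regx 'AKIA[0-9A-Z]{16}' --depth 3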

Clone mode
Usage:
python3 WebRunner.py clone [flags]
Flags:
--url       Set target URL (single-target mode)
--url-file  Load target URLs from a txt file
--name      Set project name
Global Flags:
--user-agent             Set user-agent header, 'DirRunner v1.0' by default
-c, --cookie             Set cookies to use for every HTTP request
-k, --no-tls-validation  Skip TLS certificate verification
-r, --follow-redirect    Follow redirects
--timeout                HTTP timeout (default 10s)
--proxy                  Set proxy for every HTTP request [<https://proxy:port> or <https://username:passwd@proxy:port>]
-h, --help               Show this message
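
For instance, to clone a site into a project named example-site (both values are illustrative):
python3 WebRunner.py clone --url https://www.example.com/ --name example-site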

When scanning a single URL, it should end with a trailing slash, e.g. https://www.example.com/. You can target a specific GET parameter in the URL using the FUZZ string, e.g. https://www.example.com/image?filename=FUZZ. When you specify a URL such as https://www.example.com/path/javascript.js, the URL scanned will be https://www.example.com/path/.
Path Traversal mode
Usage:
python3 WebRunner.py traversal [flags]
Flags:
--url                      Set target URL (single-target mode)
--url-file                 Load target URLs from a txt file
--threads                  Set number of threads
--depth                    Set depth level to scan
--min-depth                Set the minimum traversal depth for payloads, e.g. start at ../../../ instead of ../
--os                       Set target operating system (windows/linux/all)
--custom-path              Set a custom path prefix for payloads, e.g. "cgi-bin/"; every payload will start as "cgi-bin/../../../etc/passwd"
--custom-traversal-string  Set a custom traversal string for payloads, e.g. "....//"; every payload will start as "....//....//etc/passwd"
-v, --verbose              Show all requested URLs with the payload used
Global Flags:
--user-agent             Set user-agent header, 'DirRunner v1.0' by default
-c, --cookie             Set cookies to use for every HTTP request
-k, --no-tls-validation  Skip TLS certificate verification
-r, --follow-redirect    Follow redirects
--timeout                HTTP timeout (default 10s)
--proxy                  Set proxy for every HTTP request [<https://proxy:port> or <https://username:passwd@proxy:port>]
-h, --help               Show this message
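
For example, to probe a GET parameter with Linux payloads and verbose output, using the FUZZ placeholder described above (thread count and minimum depth are illustrative, assuming --min-depth takes a number):
python3 WebRunner.py traversal --url 'https://www.example.com/image?filename=FUZZ' --os linux --threads 10 --min-depth 3 -v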

The --custom-path and --custom-traversal-string arguments can also be combined to shape every payload.
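A hypothetical invocation combining both flags (target and values are illustrative) might be:
python3 WebRunner.py traversal --url https://www.example.com/ --custom-path 'cgi-bin/' --custom-traversal-string '....//' --os all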
