Commit: update

epi052 committed Nov 25, 2023
1 parent f305bb2 commit 3dbfa14
Showing 3 changed files with 100 additions and 25 deletions.
96 changes: 73 additions & 23 deletions content/docs/Configuration/command-line.md
```
Target selection:
--stdin
Read url(s) from STDIN
--resume-from <STATE_FILE>
State file from which to resume a partially complete scan (ex. --resume-from ferox-1606586780.state)
Composite settings:
--burp
Set --proxy to http://127.0.0.1:8080 and set --insecure to true
--burp-replay
Set --replay-proxy to http://127.0.0.1:8080 and set --insecure to true
--smart
Set --auto-tune, --collect-words, and --collect-backups to true
--thorough
Use the same settings as --smart and set --collect-extensions to true
Request settings:
-a, --user-agent <USER_AGENT>
Sets the User-Agent (default: feroxbuster/2.10.2)
-A, --random-agent
Use a random User-Agent
-x, --extensions <FILE_EXTENSION>...
File extension(s) to search for (ex: -x php -x pdf js); reads values (newline-separated) from file if input starts with an @ (ex: @ext.txt)
-m, --methods <HTTP_METHODS>...
Which HTTP request method(s) should be sent (default: GET)
Response filters:
-S, --filter-size <SIZE>...
Filter out messages of a particular size (ex: -S 5120 -S 4927,1970)
-X, --filter-regex <REGEX>...
Filter out messages via regular expression matching on the response's body (ex: -X '^ignore me$')
-W, --filter-words <WORDS>...
Filter out messages of a particular word count (ex: -W 312 -W 91,82)
-C, --filter-status <STATUS_CODE>...
Filter out status codes (deny list) (ex: -C 200 -C 401)
--filter-similar-to <UNWANTED_PAGE>...
Filter out pages that are similar to the given page (ex. --filter-similar-to http://site.xyz/soft404)
-s, --status-codes <STATUS_CODE>...
Status Codes to include (allow list) (default: All Status Codes)
Client settings:
-k, --insecure
Disables TLS certificate validation in the client
--server-certs <PEM|DER>...
Add custom root certificate(s) for servers with unknown certificates
--client-cert <PEM>
Add a PEM encoded certificate for mutual authentication (mTLS)
--client-key <PEM>
Add a PEM encoded private key for mutual authentication (mTLS)
Scan settings:
-t, --threads <THREADS>
Number of concurrent threads (default: 50)
--force-recursion
Force recursion attempts on all 'found' endpoints (still respects recursion depth)
--dont-extract-links
Don't extract links from response body (html, javascript, etc...)
-L, --scan-limit <SCAN_LIMIT>
Limit total number of concurrent scans (default: 0, i.e. no limit)
--time-limit <TIME_SPEC>
Limit total run time of all scans (ex: --time-limit 10m)
-w, --wordlist <FILE>
Path or URL of the wordlist
--auto-tune
Automatically lower scan rate when an excessive amount of errors are encountered
Dynamic collection settings:
-E, --collect-extensions
Automatically discover extensions and add them to --extensions (unless they're in --dont-collect)
-B, --collect-backups [<collect_backups>...]
Automatically request likely backup extensions for "found" urls (default: ~, .bak, .bak2, .old, .1)
-g, --collect-words
Automatically discover important words from within responses and add them to the wordlist
-I, --dont-collect <FILE_EXTENSION>...
File extension(s) to Ignore while collecting extensions (only used with --collect-extensions)
Output settings:
-v, --verbosity...
Increase verbosity level (use -vv or more for greater effect. [CAUTION] 4 -v's is probably too much)
--silent
Only print URLs (or JSON w/ --json) + turn off logging (good for piping a list of urls to other commands)
-q, --quiet
Hide progress bars and banner (good for tmux windows w/ notifications)
--no-state
Disable state output file (*.state)
Update settings:
-U, --update
Update feroxbuster to the latest version
NOTE:
Options that take multiple values are very flexible. Consider the following ways of specifying
extensions:
./feroxbuster -u http://127.1 -x pdf -x js,html -x php txt json,docx
The command above adds .pdf, .js, .html, .php, .txt, .json, and .docx to each url
All of the methods above (multiple flags, space separated, comma separated, etc...) are valid
and interchangeable. The same goes for urls, headers, status codes, queries, and size filters.
EXAMPLES:
Multiple headers:
./feroxbuster -u http://127.1 -H Accept:application/json "Authorization: Bearer {token}"
IPv6, non-recursive scan with INFO-level logging enabled:
./feroxbuster -u http://[::1] --no-recursion -vv
Read urls from STDIN; pipe only resulting urls out to another tool
cat targets | ./feroxbuster --stdin --silent -s 200 301 302 --redirects -x js | fff -s 200 -o js-files
Proxy traffic through Burp
./feroxbuster -u http://127.1 --burp
Proxy traffic through a SOCKS proxy
./feroxbuster -u http://127.1 --proxy socks5://127.0.0.1:9050
Pass auth token via query parameter
./feroxbuster -u http://127.1 --query token=0123456789ABCDEF
Ludicrous speed... go!
./feroxbuster -u http://127.1 --threads 200
Limit to a total of 60 active requests at any given time (threads * scan limit)
./feroxbuster -u http://127.1 --threads 30 --scan-limit 2
Send all 200/302 responses to a proxy (only proxy requests/responses you care about)
./feroxbuster -u http://127.1 --replay-proxy http://localhost:8080 --replay-codes 200 302 --insecure
Abort or reduce scan speed to individual directory scans when too many errors have occurred
./feroxbuster -u http://127.1 --auto-bail
./feroxbuster -u http://127.1 --auto-tune
Examples and demonstrations of all features
https://epi052.github.io/feroxbuster-docs/docs/examples/
```
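The `@`-file form of `-x/--extensions` noted in the help above keeps long extension lists out of the command line. A minimal sketch, assuming a hypothetical `ext.txt` with one extension per line:

```
# ext.txt contains newline-separated extensions, e.g. php, bak, json
./feroxbuster -u http://127.1 -x @ext.txt
```

This mirrors the inline forms shown in the NOTE above (`-x pdf -x js,html`); only the source of the values changes.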
1 change: 1 addition & 0 deletions content/docs/Configuration/default-values.md
Configuration begins with the following built-in default values baked into the binary:
- auto-filter wildcards - `true`
- output: `stdout`
- save_state: `true` (create a state file in cwd when `Ctrl+C` is received)
- backup_extensions: `["~", ".bak", ".bak2", ".old", ".1"]`
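
As a quick sketch of how the `backup_extensions` default interacts with `--collect-backups` (commands are illustrative, based on the help text earlier on this page and the dynamic collection examples below):

```
# uses the built-in defaults: ~, .bak, .bak2, .old, .1
./feroxbuster -u http://127.1 --collect-backups

# replaces the defaults with a custom list
./feroxbuster -u http://127.1 --collect-backups .save .save~
```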
28 changes: 26 additions & 2 deletions content/docs/Examples/dynamic-collections.md

## Collect Backups

Using `--collect-backups` means that for every file found during a scan, `feroxbuster` sends additional queries for the following default extensions:

- `~`
- `.bak`
- `.bak2`
- `.old`
- `.1`

Any url that wasn't filtered out via status code, size, similarity, etc... is considered _found_.

Version 2.10.2 added the ability to specify a custom list of extensions to query:

```
feroxbuster -u https://some-example-site.com --collect-backups .bak .save .save~
```

## Collect Extensions

Using `--collect-extensions` means that any found url that has an extension will be added as an extension to try for the remainder of the scan.

## Examples

### Collect Backups

#### Default extensions

```
feroxbuster -u https://some-example-site.com --collect-backups
```

Assuming a found file of:

`200 GET 127l 292w 4161c https://some-example-site.com/index.php`

The following **additional** requests are made to `some-example-site.com`:
- `index.php~` (append extension)
- `index.php.bak`
- `index.php.bak2`
- `index.php.old`
- `index.php.1`
- `index.bak` - (replace original extension)
- `.index.php.swp` - (vim swap, prefix `.` and append `.swp`)


#### Custom extension list

```
feroxbuster -u https://some-example-site.com --collect-backups .save .save~
```

Assuming a found file of:

`200 GET 127l 292w 4161c https://some-example-site.com/index.php`

The following **additional** requests are made to `some-example-site.com`:
- `index.php.save` (append extension)
- `index.php.save~`
- `index.bak` - (replace original extension)
- `.index.php.swp` - (vim swap, prefix `.` and append `.swp`)

### Collect Extensions

> most useful with `--extract-links`; may cause number of expected requests to explode
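
A minimal invocation sketch pairing extension collection with an ignore list via `-I/--dont-collect` (the ignored extensions here are only an example):

```
feroxbuster -u https://some-example-site.com --collect-extensions --dont-collect jpg png svg
```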