## Examples 💡

cariddi -examples (Print the examples)

-ot NAME.txt :: output in a text file

-oh NAME.html :: output in an HTML file
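
A minimal sketch combining the output flags with a scan, assuming a urls file with one URL per line and that the name passed to -ot / -oh becomes the base name of the output files (target_name is just an illustrative label):

```bash
# Hunt for endpoints and secrets, saving the results as both text and HTML.
# target_name is an illustrative label; change it per engagement.
cat urls | cariddi -e -s -ot target_name -oh target_name
```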

QUICK SCAN: juicy files (level 2 of 7), secrets, endpoints

cat urls | cariddi -e -ext 2 -s -d 1 -c 10

DO FUCKING EVERYTHING SLOWLY FOREVER

cat urls | cariddi -e -ext 6 -s -intensive -c 50 -ef ~/wordlist/crucial.txt

cat urls | cariddi -s (Hunt for secrets)
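
The urls file piped into cariddi throughout these examples is assumed to be plain text with one URL per line; a minimal sketch of preparing one by hand (the URLs are placeholders, and any URL source works the same way):

```bash
# Build a one-URL-per-line input file by hand.
cat > urls << 'EOF'
https://example.com
https://app.example.com/login
EOF

# Then run one of the scans from this page.
cat urls | cariddi -e -ext 2 -s -d 1 -c 10
```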

cat urls | cariddi -d 2 (Wait 2 seconds between one crawled page and the next)

cat urls | cariddi -c 200 (Set the concurrency level to 200)

cat urls | cariddi -e (Hunt for juicy endpoints)

cat urls | cariddi -plain (Print only the results)

cat urls | cariddi -ot target_name (Save results to a txt file)

cat urls | cariddi -oh target_name (Save results to an HTML file)

cat urls | cariddi -ext 2 (Hunt for juicy (level 2 of 7) files)

cat urls | cariddi -e -ef endpoints_file (Hunt for custom endpoints)

cat urls | cariddi -s -sf secrets_file (Hunt for custom secrets)
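
A sketch of the external files, assuming they are plain text with one entry per line: endpoint/parameter names for -ef and regular expressions for -sf (the entries shown are hypothetical):

```bash
# Hypothetical custom endpoints file: one parameter/endpoint name per line.
printf 'admin\ndebug\ntoken\n' > endpoints_file

# Hypothetical custom secrets file: one regular expression per line.
printf 'internal_api_key=[A-Za-z0-9]{32}\n' > secrets_file

cat urls | cariddi -e -ef endpoints_file -s -sf secrets_file
```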

cat urls | cariddi -i forum,blog,community,open (Ignore URLs containing any of these words)

cat urls | cariddi -it ignore_file (Ignore URLs containing at least one of the lines in the given file)
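
A sketch of the ignore file, assuming one term per line; URLs containing any of these substrings are skipped (the terms are illustrative):

```bash
# Hypothetical ignore list: one substring per line.
printf 'logout\ntracking\nstatic.\n' > ignore_file

cat urls | cariddi -e -it ignore_file
```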

cat urls | cariddi -cache (Use the .cariddi_cache folder as cache)

cat urls | cariddi -t 5 (Set the timeout for the requests)

cat urls | cariddi -intensive (Crawl searching for resources matching 2nd level domain)
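
As a sketch, the tuning flags above can be combined in a single run; the concurrency, delay, and timeout values here are illustrative only:

```bash
# Intensive crawl with the local cache, a request timeout of 5,
# a 2-second delay between pages, and concurrency set to 100.
cat urls | cariddi -e -s -intensive -cache -t 5 -d 2 -c 100
```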

On Windows, use: powershell.exe -Command "cat urls | .\cariddi.exe"