CLI: Command Line Interface
Credential Digger offers some base functionalities through a command line interface.
You need to install the credentialdigger package first. You can refer to the README.md for this.
All the commands support both sqlite and postgres databases. In order to use sqlite, you need to set the path of the db as an argument (--sqlite /path/to/data.db), whereas for postgres you can either export all the credentials as environment variables or pass a .env file as an argument (more on this later).
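As a reference, here is a minimal sketch of such a .env file. POSTGRES_USER is the only variable shown explicitly in the examples of this page; the other variable names follow the project's default .env template, and all values are placeholders to be adapted to your setup.
# .env - minimal sketch of a postgres configuration
# NOTE: variable names other than POSTGRES_USER are assumptions based on
# the default .env template; adapt names and values to your deployment
POSTGRES_USER=postgres
POSTGRES_PASSWORD=my_secret_password
POSTGRES_DB=credential_digger
DBHOST=localhost
DBPORT=5432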
- Download models (deprecated from v4.4.0)
- Add rules
- Scan a repository
- Scan the snapshot of a repository
- Scan a pull request
- Scan all the repositories of a user
- Scan wiki page
- Scan local files and directories
- Get discoveries
If you use a version of credentialdigger <4.7, please prepend python -m to the CLI commands.
Download models (deprecated from v4.4.0)
This feature has been deprecated in v4.4.0, when we implemented the automatic download of models.
Download and link a machine learning model.
Refer to Machine Learning Models for the complete explanation of how machine learning models work.
python -m credentialdigger download model_name
Add rules
Add the rules contained in a file; they will be used to scan repositories.
path_to_rules <Required> The path of the file that contains the rules.
--dotenv DOTENV <Optional> The path to the .env file which will be used in all
commands. If not specified, the one in the current
directory will be used (if present).
--sqlite SQLITE <Optional> If specified, scan the repo using the sqlite client
passing as argument the path of the db. Otherwise, use postgres
(must be up and running)
Sqlite:
# Add the rules to the database using sqlite
credentialdigger add_rules /path/to/rules.yml --sqlite /path/to/mydata.db
Postgres:
# Add the rules to the database using postgres
export POSTGRES_USER=...
export ...
credentialdigger add_rules /path/to/rules.yml
or
# Add the rules to the database using postgres and an environment file
credentialdigger add_rules /path/to/rules.yml --dotenv /path/to/.env
TIP: if your env file is in the current directory and it's named .env, you don't need to specify the --dotenv parameter.
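For reference, a rules file is a yaml document listing the regular expressions to be matched during scans, following the structure of the default rules.yml shipped with the project. A minimal sketch (the regex, category, and description values here are just illustrative):
# rules.yml - minimal sketch of a rules file (illustrative values)
rules:
  - regex: sshpass|password|pwd|passwd|pass
    category: password
    description: password keywords
  - regex: (api|secret)[_-]?key
    category: token
    description: generic api key keywords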
Scan a repository
The scan command allows you to scan a git repository directly from the command line. It accepts multiple arguments:
repo_url <Required> The URL of the git repository to be
scanned.
--dotenv DOTENV The path to the .env file which will be used in all
commands. If not specified, the one in the current
directory will be used (if present).
--sqlite SQLITE If specified, scan the repo using the sqlite client
passing as argument the path of the db. Otherwise, use
postgres (must be up and running)
--category CATEGORY If specified, scan the repo using all the rules of
this category, otherwise use all the rules in the db
--models MODELS [MODELS ...]
A list of models for the ML false positives detection.
Cannot accept empty lists.
--debug Flag used to decide whether to visualize the
progressbars during the scan (e.g., during the
insertion of the detections in the db)
--git_username GIT_USER
Username to be used to authenticate to the git server.
It is not required for GitHub (.com and Enterprise),
but it is required for Bitbucket.
--git_token GIT_TOKEN
Git personal access token to authenticate to the git
server
--local If True, get the repository from a local directory
instead of the web
--force Force a complete re-scan of the repository, in case it
has already been scanned previously
--similarity Build and use the similarity model to compute
embeddings and allow for automatic update of similar
snippets
Sqlite:
credentialdigger scan https://github.com/user/repo --sqlite /path/to/mydata.db --models PathModel PasswordModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger scan https://github.com/user/repo [--dotenv /path/to/my/.env] --models PathModel PasswordModel
The user can authenticate to the git server in order to run a scan. This is not mandatory, but it is needed for some use cases (e.g., to scan private repositories).
There are two options that can be set to this aim:
- the git_token (i.e., a GitHub personal access token or a Bitbucket app password)
- the git_username used to authenticate the tool (it can differ from the repo owner); this option is optional and, if not set, oauth2 is used (which is the value adopted by GitHub). Please note that a git_username has to be set to scan private Bitbucket repos.
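As an example, a private Bitbucket repository could be scanned as follows (the username, app password, and repo URL are placeholders):
# Scan a private Bitbucket repo (placeholder credentials)
credentialdigger scan https://bitbucket.org/user/private-repo --git_username my_username --git_token my_app_password --sqlite /path/to/mydata.db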
The scan command also returns an exit status equal to the number of discoveries it has made during the scan. Here are two examples of how to make use of the exit status.
credentialdigger scan https://github.com/user/repo
# $? = exit status
if [ $? -gt 0 ]; then
echo "This repo contains leaks"
else
echo "This repo contains no leaks"
fi
public class CredentialDigger {
    public static void main(String[] args) {
        String command = "python -m credentialdigger scan https://github.com/user/repo";
        try {
            // Run the scan and wait for it to terminate
            Process p = Runtime.getRuntime().exec(command);
            p.waitFor();
            // The exit status is the number of discoveries made during the scan
            int numberOfDiscoveries = p.exitValue();
            if (numberOfDiscoveries > 0) {
                System.out.println("This repo contains leaks.");
            } else {
                System.out.println("This repo contains no leaks.");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Scan the snapshot of a repository
Scan the snapshot of a repository, i.e., scan the repository at a given commit id, or at the last commit id of a given branch. The arguments are the same as in scan, plus the following:
--snapshot <Required> The name of the branch, or the commit id
Sqlite:
credentialdigger scan_snapshot https://github.com/user/repo --snapshot my_branch_name --sqlite /path/to/mydata.db --models PathModel PasswordModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger scan_snapshot https://github.com/user/repo --snapshot my_branch_name [--dotenv /path/to/my/.env] --models PathModel PasswordModel
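Since --snapshot also accepts a commit id, a snapshot scan can target a specific commit (the commit id below is a placeholder):
# Scan the repository at a specific commit (placeholder commit id)
credentialdigger scan_snapshot https://github.com/user/repo --snapshot 1a2b3c4d --sqlite /path/to/mydata.db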
Scan a pull request
Scan a pull request opened (or closed) in a repository, i.e., scan all the new lines introduced in the commits referenced in the pull request. The arguments are the same as in scan (with the exception of git_username and local, which are not supported), plus the following:
--pr <Required> The id of the pull request (its number)
--api_endpoint API_ENDPOINT
<Optional> API endpoint of the git server
Sqlite:
credentialdigger scan_pr https://github.com/user/repo --pr PR_NUMBER --sqlite /path/to/mydata.db --models PathModel PasswordModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger scan_pr https://github.com/user/repo --pr PR_NUMBER [--dotenv /path/to/my/.env] --models PathModel PasswordModel
Scan all the repositories of a user
Scan all the public repositories of a user. The arguments are the same as in scan, plus the following:
--forks <Optional> Scan also repositories forked by this user
--api_endpoint API_ENDPOINT
<Optional> API endpoint of the git server
Sqlite:
credentialdigger scan_user username --sqlite /path/to/mydata.db --models PathModel PasswordModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger scan_user username [--dotenv /path/to/my/.env] --models PathModel PasswordModel
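The optional flags can be combined, e.g., to scan a user of a self-hosted git server, forks included. A sketch, assuming a GitHub Enterprise instance (the endpoint URL is a placeholder; use your server's actual API endpoint):
# Scan a user on a self-hosted git server, including forks (placeholder endpoint)
credentialdigger scan_user username --forks --api_endpoint https://github.example.com --sqlite /path/to/mydata.db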
Scan wiki page
Scan the wiki page of a project. All the arguments are the same as in scan.
Sqlite:
credentialdigger scan_wiki https://github.com/user/repo --sqlite /path/to/mydata.db
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger scan_wiki https://github.com/user/repo [--dotenv /path/to/my/.env]
Scan local files and directories
The scan_path command can be used to scan a local directory or file on the fly from the terminal.
credentialdigger scan_path path/to/scan
credentialdigger scan_path path/to/scan --max_depth 10
A scan can also be run with the --max_depth argument, which sets the maximum number of nested subdirectories to be scanned. If it is set to -1 or not specified, all subdirectories will be scanned.
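As with all the other commands, the target database can be selected explicitly; for instance, with sqlite:
# Scan a local directory and store the discoveries in a sqlite db
credentialdigger scan_path path/to/scan --sqlite /path/to/mydata.db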
Get discoveries
Get the discoveries after a repository has been scanned.
repo_url <Required> The url of the repo we want to retrieve the discoveries
from. Please make sure it has been scanned beforehand.
--dotenv DOTENV The path to the .env file which will be used in all
commands. If not specified, the one in the current
directory will be used (if present)
--sqlite SQLITE If specified, get the discoveries using the sqlite
client, passing as argument the path of the db.
Otherwise, use postgres (must be up and running)
--filename FILENAME Show only the discoveries contained in this file
--state STATE The state to filter discoveries on. Possible options:
[new, false_positive, addressing, not_relevant, fixed]
--with_rules If specified, add the rule details to the discoveries
--save SAVE If specified, export the discoveries to the path passed
as an argument instead of showing them on the terminal
Sqlite:
credentialdigger get_discoveries https://github.com/user/repo --sqlite /path/to/mydata.db --save /path/to/report.csv --state new
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
credentialdigger get_discoveries https://github.com/user/repo [--dotenv /path/to/my/.env] --save /path/to/report.csv --state new
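The filters can be combined; for instance, to inspect only the discoveries found in a single file, together with the details of the rules that triggered them (the file name is a placeholder):
# Show the discoveries of a single file, with rule details (placeholder file name)
credentialdigger get_discoveries https://github.com/user/repo --sqlite /path/to/mydata.db --filename src/config.py --with_rules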