-
Notifications
You must be signed in to change notification settings - Fork 11
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Support new Prompt Injection
scan test
#417
Labels
Type: enhancement
New feature or request.
Comments
aborovsky
added a commit
that referenced
this issue
Jul 28, 2023
aborovsky
added a commit
that referenced
this issue
Jul 28, 2023
aborovsky
changed the title
Support new scan test
Support new Jul 29, 2023
Prompt Injection Test
Prompt Injection
scan test
denis-maiorov-brightsec
pushed a commit
that referenced
this issue
Jul 31, 2023
aborovsky
added a commit
that referenced
this issue
Jul 31, 2023
@orubin, please refrain from closing the issues manually. They are supposed to be closed once the next version is released, and this will be done automatically. Otherwise, we'll lose the link to the development. |
Sign up for free
to join this conversation on GitHub.
Already have an account?
Sign in to comment
Prompt Injection Vulnerabilities in LLMs involve crafty inputs leading to undetected manipulations. The impact ranges from data exposure to unauthorized actions, serving attackers goals.
The text was updated successfully, but these errors were encountered: