
Support new Prompt Injection scan test #417

Closed
aborovsky opened this issue Jul 28, 2023 · 1 comment · Fixed by #418
Assignees
aborovsky
Labels
Type: enhancement New feature or request.

Comments

@aborovsky Contributor

Prompt injection vulnerabilities in LLMs involve crafted inputs that lead to undetected manipulations. The impact ranges from data exposure to unauthorized actions that serve an attacker's goals.
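For context, here is a minimal sketch of how the new test might be invoked once supported, written against the SecTester-style runner API. The package names, the `TestType.PROMPT_INJECTION` identifier, and the target endpoint are assumptions for illustration, not confirmed by this issue or PR #418:

```ts
// Hypothetical sketch: TestType.PROMPT_INJECTION, the hostname, and the
// target URL are assumptions; adjust to whatever identifier ships in #418.
import { SecRunner } from '@sectester/runner';
import { Severity, TestType } from '@sectester/scan';

async function main(): Promise<void> {
  const runner = new SecRunner({ hostname: 'app.brightsec.com' });
  await runner.init();

  try {
    // Run the prompt injection test against an LLM-backed endpoint,
    // failing if a finding of medium severity or higher is reported.
    await runner
      .createScan({ tests: [TestType.PROMPT_INJECTION] })
      .threshold(Severity.MEDIUM)
      .timeout(300_000) // allow up to 5 minutes
      .run({
        method: 'POST',
        url: 'https://example.com/api/chat',
        headers: { 'Content-Type': 'application/json' },
        body: { message: 'Hello!' },
      });
  } finally {
    await runner.clear();
  }
}

main().catch(console.error);
```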

@aborovsky aborovsky added the Type: enhancement New feature or request. label Jul 28, 2023
@aborovsky aborovsky self-assigned this Jul 28, 2023
aborovsky added a commit that referenced this issue Jul 28, 2023
@aborovsky aborovsky changed the title from "Support new scan test Prompt Injection Test" to "Support new Prompt Injection scan test" Jul 29, 2023
aborovsky added a commit that referenced this issue Jul 31, 2023
@orubin orubin closed this as completed Aug 2, 2023
@derevnjuk Member

@orubin, please refrain from closing issues manually. They are supposed to be closed automatically once the next version is released. Otherwise, we'll lose the link to the development work.

@derevnjuk derevnjuk reopened this Aug 2, 2023
@derevnjuk derevnjuk linked a pull request Aug 2, 2023 that will close this issue

3 participants