Awesome_GPT_Super_Prompting
Uncategorized · CyberAlbSecOP
ChatGPT jailbreaks, GPT Assistants prompt leaks, GPTs prompt injection, LLM prompt security, super prompts, prompt hacks, AI prompt engineering, adversarial machine learning.
Stars: 2.2k · Forks: 287 · Issues: 0 · Contributors: 1 · Watchers: 47
Topics: adversarial-machine-learning, chatgpt, gpt, gpt-3, gpt-4, hacking, jailbreak, leaks, llm, prompt-engineering, prompt-injection, prompts, agent, ai, assistant, prompt-security, system-prompt
License: GNU General Public License v3.0 (GPL-3.0)
Project Description
A comprehensive, continuously updated GitHub collection of super prompts for ChatGPT, gathering jailbreak prompts, GPT agent prompts, prompt-injection and protection techniques, assorted GPT prompts, and prompt-engineering learning materials.