2 Years of Service
70%
• Prompt Jailbreak: An attack that attempts to bypassthe LLMs’ alignment to produce restricted content bymanipulating the input prompt.
• Prompt Injection: a prompt attack that aims to overridethe original prompts by using untrusted input to produceundesired or malicious output.
• Prompt Leaking: An attack aiming to extract the system prompt by carefully crafting prompts that reveal theoriginal system prompt.
download ebook :
• Prompt Injection: a prompt attack that aims to overridethe original prompts by using untrusted input to produceundesired or malicious output.
• Prompt Leaking: An attack aiming to extract the system prompt by carefully crafting prompts that reveal theoriginal system prompt.
download ebook :
Last edited by a moderator: