LLMs can make mistakes, leak data, or be tricked into doing things they were not meant to do. Garak is a free, open-source tool designed to test for these weaknesses. It checks for problems such as hallucinations, prompt injection, jailbreaks, and toxic outputs. By running a battery of probes against a model, it helps developers understand where the model might fail and how to make it safer. Garak works with a wide range of models and platforms. It supports Hugging Face Hub …
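For readers who want to see what a scan looks like in practice, the sketch below launches garak from Python against a Hugging Face Hub model. This is an illustrative example rather than official documentation: it assumes garak has been installed with pip, and that the --model_type, --model_name, and --probes flags and the "promptinject" probe name match the installed release, so verify them with `python -m garak --help` and `python -m garak --list_probes` before relying on them.

import subprocess

# Minimal sketch: run a garak scan against a Hugging Face Hub model.
# Assumes `pip install garak` has been run; flag names and the probe
# name are taken from garak's documented CLI and may differ between
# versions, so check `python -m garak --help` first.
scan = [
    "python", "-m", "garak",
    "--model_type", "huggingface",   # platform/backend to test
    "--model_name", "gpt2",          # model identifier on Hugging Face Hub
    "--probes", "promptinject",      # which vulnerability probes to run
]
subprocess.run(scan, check=True)     # garak writes its findings to report files when the run finishes

Each probe sends a set of adversarial or stress-test prompts to the target model, and garak's detectors flag responses that indicate a failure, which is how the tool surfaces issues like the jailbreaks and prompt injections mentioned above.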