OWASP Top 10 for LLMs (2025) - Vulnerability Simulation Lab

A Python-based laboratory for demonstrating and testing the OWASP Top 10 vulnerabilities for Large Language Models using Ollama and local models.

Prerequisites

  1. Install Ollama: Download from ollama.ai
  2. Pull a model: ollama pull llama3
  3. Install Python dependencies: pip install -r requirements.txt

Usage

Run the lab:

python owasp_llm_lab.py

Menu Options:

  • 0: Run all vulnerability tests
  • 1-10: Run individual vulnerability tests
  • 11: Interactive testing mode (manual prompt testing)
  • 12: Test Ollama connection
  • 13: Start network server (0.0.0.0:4444)
  • 14: Exit
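A minimal sketch of how a menu like this might be dispatched. The handler names below are hypothetical illustrations, not taken from owasp_llm_lab.py:

```python
# Hypothetical menu dispatcher; handler names are illustrative only and
# do not come from the actual owasp_llm_lab.py script.

def run_all_tests():
    print("running all 10 vulnerability tests")

def run_single_test(n):
    print(f"running test LLM{n:02d}")

def dispatch(choice):
    """Map one menu choice string to an action; return False when exiting."""
    if choice == "0":
        run_all_tests()
    elif choice.isdigit() and 1 <= int(choice) <= 10:
        run_single_test(int(choice))
    elif choice == "14":
        return False  # exit the menu loop
    else:
        # options 11-13 (interactive mode, connection test, server) omitted here
        print("option not covered in this sketch")
    return True

if __name__ == "__main__":
    dispatch("3")
```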

Vulnerabilities Covered

| ID    | Vulnerability                    | Description                                  |
|-------|----------------------------------|----------------------------------------------|
| LLM01 | Prompt Injection                 | Malicious inputs override system instructions |
| LLM02 | Sensitive Information Disclosure | Model reveals private information             |
| LLM03 | Supply Chain                     | Compromised components in the LLM ecosystem   |
| LLM04 | Data and Model Poisoning         | Malicious training data affects behavior      |
| LLM05 | Improper Output Handling         | Unsafe handling of model outputs              |
| LLM06 | Excessive Agency                 | LLM given too much autonomy                   |
| LLM07 | System Prompt Leakage            | Model reveals system instructions             |
| LLM08 | Vector and Embedding Weaknesses  | Vulnerabilities in vector databases           |
| LLM09 | Misinformation                   | Model generates false information             |
| LLM10 | Unbounded Consumption            | Excessive resource usage                      |

Configuration

Change the model by modifying the OWASPLLMLab initialization:

lab = OWASPLLMLab(model="llama3")  # or any other Ollama model
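For context, a minimal sketch of how such a lab class might talk to a local Ollama instance over its HTTP API (`POST /api/generate` on port 11434). The class and method names here are illustrative stand-ins, not the actual OWASPLLMLab implementation:

```python
import json
import urllib.request

class OllamaLab:
    """Illustrative stand-in for OWASPLLMLab; only the model is configurable."""

    def __init__(self, model="llama3", host="http://localhost:11434"):
        self.model = model
        self.host = host

    def build_payload(self, prompt, system=None):
        """Assemble the JSON body for Ollama's /api/generate endpoint."""
        payload = {"model": self.model, "prompt": prompt, "stream": False}
        if system:
            payload["system"] = system
        return payload

    def generate(self, prompt, system=None):
        """Send one non-streaming generation request (needs a running Ollama)."""
        body = json.dumps(self.build_payload(prompt, system)).encode()
        req = urllib.request.Request(
            f"{self.host}/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

if __name__ == "__main__":
    lab = OllamaLab(model="llama3")  # or any other Ollama model
    print(lab.build_payload("Hello"))
```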

Output

Each test displays:

  • Vulnerability description
  • Test prompt sent to the model
  • Model's response
  • Explanation of why the response demonstrates the vulnerability

Interactive Testing Mode

Use option 11 to enter interactive mode where you can:

  • Test custom prompts manually
  • Set system prompts with system <your prompt>
  • Experiment with different vulnerability scenarios
  • Type exit to return to main menu
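The interactive loop above amounts to a small input parser. A sketch of one way to classify each line of input (this function is illustrative, not the actual implementation):

```python
def parse_interactive(line):
    """Classify one line of interactive input as ('exit'|'system'|'prompt', arg)."""
    stripped = line.strip()
    if stripped.lower() == "exit":
        return ("exit", None)          # return to main menu
    if stripped.lower().startswith("system "):
        return ("system", stripped[7:].strip())  # set a system prompt
    return ("prompt", stripped)        # anything else is sent to the model

if __name__ == "__main__":
    print(parse_interactive("system You are a bank assistant."))
```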

Network Server Mode

Use option 13 to start a network server on 0.0.0.0:4444 that accepts multiple simultaneous client connections.

Connect via netcat or telnet:

nc <server_ip> 4444
# or
telnet <server_ip> 4444

Network Commands:

  • help - Show available commands
  • list - Show all vulnerabilities
  • test <1-10> - Run specific vulnerability test
  • prompt <text> - Send custom prompt to LLM
  • system <text> - Set system prompt
  • exit - Disconnect
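A rough sketch of how such a multi-client server could be structured, with one thread per connection. The command handler below is illustrative (the model calls are stubbed out); only the commands listed above are assumed:

```python
import socket
import threading

def handle_command(line, state):
    """Return a reply string for one command, or None to disconnect.

    state holds per-client settings such as the system prompt.
    """
    cmd, _, arg = line.strip().partition(" ")
    cmd = cmd.lower()
    if cmd == "help":
        return "commands: help, list, test <1-10>, prompt <text>, system <text>, exit"
    if cmd == "list":
        return "LLM01..LLM10 (see the vulnerabilities table)"
    if cmd == "test" and arg.isdigit() and 1 <= int(arg) <= 10:
        return f"running test LLM{int(arg):02d} (stub)"
    if cmd == "prompt":
        return f"[model reply to: {arg}] (stub)"
    if cmd == "system":
        state["system"] = arg
        return "system prompt set"
    if cmd == "exit":
        return None  # caller closes the connection
    return "unknown command; try 'help'"

def serve(host="0.0.0.0", port=4444):
    """Accept clients and serve each one in its own daemon thread."""
    def client_loop(conn):
        state = {"system": None}
        with conn, conn.makefile("rw") as f:
            for line in f:
                reply = handle_command(line, state)
                if reply is None:
                    break
                f.write(reply + "\n")
                f.flush()

    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=client_loop, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    print(handle_command("help", {}))
```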

Troubleshooting

  • Use option 12 to test your Ollama connection
  • Ensure Ollama is running: ollama serve
  • Check available models: ollama list
  • If requests time out, the model may still be loading into memory; the first request after a restart is typically slow
  • For network mode, ensure port 4444 is not blocked by firewall
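A connection check like option 12 can be done against Ollama's `GET /api/tags` endpoint on port 11434, which lists installed models. The helper names below are illustrative:

```python
import json
import urllib.request

def model_names(tags_json):
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in tags_json.get("models", [])]

def check_ollama(host="http://localhost:11434"):
    """Return the list of installed models, or None if Ollama is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
            return model_names(json.loads(resp.read()))
    except OSError:
        return None

if __name__ == "__main__":
    models = check_ollama()
    print("Ollama unreachable" if models is None else models)
```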

Notes

  • Responses may vary based on the model used
  • Some vulnerabilities may not be fully demonstrated depending on model safety measures
  • This is for educational purposes only
