No bugs can survive the test of fire; not even the ones you wrote into your codebase 🪲.
🔥 flamethrower is an open-source, multi-agent, context-intelligent debugger that uses AI superpowers to automate the painful task of debugging. Think: a combination of GitHub Copilot's context-awareness and the agency of KillianLucas' Open Interpreter, packed into a beautiful shell that works out of the box with any existing repo.
Automate: [ Write Code → Run Action → Check Logs → Repeat ] 🚀🚀
Main Differentiators
- 🔥 Automate the most painful part of writing code: print statements & error logs
- ⚙️ Specialized context agent for operating within an existing repo
- 🤖 Debugging agent optimized to iteratively brute-force locate and fix bugs
- 📦 Out-of-the-box support for any unix machine (no VS Code or VS Code alternatives required)
- 🎨 Seamless integration into any existing repo; just type `flamethrower`
```
pip install flamethrower
```

Or, to upgrade an existing installation to the latest version:

```
pip install --upgrade flamethrower
```
Navigate to your current workspace, and simply run `flamethrower`, or `ft` for the pros.

```
cd ./unbelievably/complicated/workspace
flamethrower
```
Use lowercase letters for commands you run in the shell, like `python main.py` or `node server.ts`.
🔥 flamethrower: Debugging on Autopilot
Instructions:
- ⌨️ Regular shell: Use commands like `ls`, `cd`, `python hello.py`
- 🤖 LLM assistance: Start a command with a Capital letter, try `Who are you?`
- 📚 Context: Intelligent context-awareness from commands, files, and stdout logs
- 🪵 Terminal logs: All conversation & code output inside 🔥 flamethrower is logged
...
```
$ python main.py -> SOME_ERROR
$ Wtf???? # Literally type this in the terminal
```
An implementation run is initiated with a natural language query that begins with an uppercase letter.
If you say 'Yes', 🔥 flamethrower will debug in the background while you focus on other tasks at hand. It acts much like any other human engineer: adding print statements to find the root cause of the issue (which, as we know, is the most annoying part). We find this pattern strikingly effective, and it is where we believe LAMs (Large Action Models) have their strongest use case.
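The run-inspect-patch loop described above can be sketched conceptually as follows. This is an illustrative sketch, not the tool's actual implementation; `analyze_and_patch` is a hypothetical stand-in for the LLM-driven step that reads error logs, adds print statements, and edits files:

```python
import subprocess
import sys


def analyze_and_patch(stderr: str) -> None:
    """Hypothetical stand-in: in the real tool, an LLM reads the error
    logs, inserts print statements, and edits files before the next run."""
    pass


def debug_loop(run_cmd: list[str], max_iterations: int = 5) -> bool:
    """Re-run the target command, checking its logs each time,
    until it exits cleanly or the iteration budget is spent."""
    for _ in range(max_iterations):
        result = subprocess.run(run_cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return True  # command succeeded: the bug appears fixed
        analyze_and_patch(result.stderr)  # inspect logs, patch, retry
    return False


if __name__ == "__main__":
    # A command that exits cleanly terminates the loop immediately.
    debug_loop([sys.executable, "-c", "print('hello')"])
```

The key design point is that each iteration feeds the captured stderr back into the patching step, mirroring how an engineer re-reads logs after every change.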
If it looks like 🔥 flamethrower is obviously headed in the direction of doom, simply press CTRL+C and give it more suggestions or context.
As long as any shell command or natural language query happens within the context of 🔥 flamethrower, it is by default captured in the conversation history. That means you can:
- ask about an error that just happened, or happened 2 dialogues ago
- follow up on a previous response provided by 🔥 flamethrower
Prompts sent to the LLM are transparent and easy to observe. All 🔥 flamethrower metadata is neatly kept in a `.flamethrower` subdirectory, including prompts, conversations, logs, directory info, and summaries.
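As a rough illustration, the metadata directory might look like the sketch below. The subdirectory names are hypothetical, derived only from the categories listed above; the actual layout may differ between versions:

```
.flamethrower/
├── prompts/     # prompts sent to the LLM
├── conv/        # conversation history
├── logs/        # terminal logs
├── dir_info/    # workspace directory metadata
└── summaries/   # file and conversation summaries
```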
Every time you send a query, the latest version of your files is sent over, meaning 🔥 flamethrower understands that you have changed your files and is ready to process those changes.
The closed-source GitHub Copilot draws context very effectively, and Quick Fix is a neat feature that explains errors from stdout logs if the last command returned a non-zero exit code.
The Open Interpreter, an open-source gem, specializes in crafting new code from the ground up. It's a favorite among data scientists and those needing sophisticated chart plotting, thanks to its iterative approach to achieving desired results.
🔥 flamethrower combines the agency afforded by Large Action Models (LAMs) with the workspace awareness of Copilot, allowing it to act on context-specific suggestions and iterate until it reaches a successful outcome. 🔥 flamethrower is workspace-first, and aims to serve software engineers on complex tasks that need a lot of context management.
🔥 flamethrower is everyone's debugger. Fork it for your own use case, and, one PR at a time, we can make the world a more bug-free place ✨. Just ping me at scottsus@usc.edu and I'll help you get started.
- 🧪 Better testing
- 📊 Telemetry and the ability to opt in/out
- 🥽 LLM Vision to debug visual elements
- 🦙 Running CodeLlama locally
- 🤖 Other models besides OpenAI
- 🦾 Default model finetuned on telemetry data
- 🖥️ VS Code integration
- 💻 Browser interface