This project will eventually use Azure AI services, Cosmos DB, and other Azure cloud services. For now, it's all local dev.
- Create a Python virtual environment: make sure Python 3.12 is installed and active before you create your venv. To verify the version and put 3.12 on your `PATH`:

```shell
brew list | grep python
export PATH="$(brew --prefix)/opt/python@3.12/libexec/bin:$PATH"
```

With Python 3.12 selected in your terminal, create the venv:

```shell
python -m venv venv
```

- Activate the virtual environment:
  - On macOS/Linux:

```shell
source venv/bin/activate
```

- Install required packages:

```shell
pip install -r requirements.txt
```
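Before continuing, it can help to confirm the interpreter inside the venv really is 3.12. A small stdlib-only check (nothing here is specific to this project):

```python
import sys

def check_python(required=(3, 12)):
    """Return True when the running interpreter matches the required (major, minor)."""
    return sys.version_info[:2] == required

if __name__ == "__main__":
    major, minor = sys.version_info[:2]
    print(f"Python {major}.{minor}", "OK" if check_python() else "is not 3.12")
```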
```shell
docker compose up
```

- Setup ODBC Driver: follow all instructions in `odbc/README.md`.
- Run the ingestion script, which takes the SQLite-stored data and loads it into SQL Server:

```shell
export PYTHONPATH=./src
./src/loadin/run_data_importer.sh
```

At this point our data has been ingested into the SQL Server database and relationships/keys have been created.
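A quick way to sanity-check the load is a row-count query over pyodbc. The server address, database name, credentials, and table name below are placeholders, not this project's actual config; substitute the values from your Docker/ODBC setup:

```python
def build_conn_str(server, database, user, password,
                   driver="ODBC Driver 18 for SQL Server"):
    """Assemble a SQL Server ODBC connection string.

    TrustServerCertificate=yes suits a typical local-dev container setup.
    """
    return (f"DRIVER={{{driver}}};SERVER={server};DATABASE={database};"
            f"UID={user};PWD={password};TrustServerCertificate=yes")

def row_count(conn_str, table):
    """Run a COUNT(*) against one table and return the number of rows."""
    import pyodbc  # imported lazily so build_conn_str works without the driver
    with pyodbc.connect(conn_str) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

if __name__ == "__main__":
    # Placeholder values -- replace with your actual local-dev settings.
    cs = build_conn_str("localhost,1433", "mydb", "sa", "YourPassword123")
    print(cs)
```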
- Make Metadata: now we want to get the metadata about the tables.
```shell
export PYTHONPATH=./src
python src/metadata/get_database_ddl.py
```

That was mostly informational; now we can use the metadata to generate the augmented metadata that will drive the graph DB schema:
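The internals of `enrich_metadata.py` aren't shown here; conceptually, the enrichment step sends each table's DDL to the model and asks for descriptions. A purely illustrative sketch of that prompt assembly (the function name and prompt wording are made up, not the project's actual code):

```python
def build_enrichment_messages(table_ddl):
    """Chat messages asking an LLM to describe a table's columns (illustrative)."""
    return [
        {"role": "system",
         "content": "You are a data analyst. Given a CREATE TABLE statement, "
                    "return a short business description for the table and each column."},
        {"role": "user", "content": table_ddl},
    ]

# The messages would then go to a chat-completion call, e.g. via the openai
# client: client.chat.completions.create(model=..., messages=messages)
```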
```shell
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY
export PYTHONPATH=./src
python src/metadata/enrich_metadata.py
```

- Extract the Metadata and Use It in Chat
This time when we extract the metadata, we should see details about the tables and columns that will be useful for SQL/DAX generation:
```shell
export PYTHONPATH=./src
python src/metadata/get_database_ddl.py
```

- Run the "Plain" LLM Chat
OK, we should be ready to run the SQL chatbot.
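Under the hood, a prompt-chain SQL chat typically puts the extracted DDL into a system prompt and sends the user's question alongside it. A hypothetical sketch of that step (the function name and prompt wording are assumptions, not the app's actual code):

```python
def build_sql_chat_messages(question, ddl):
    """Chat messages asking the model for a T-SQL answer grounded in the schema."""
    system = ("You are a SQL assistant. Using only the schema below, answer "
              "with a single T-SQL query.\n\n" + ddl)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]
```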
```shell
pip install -r src/app/chat/plain_llm/requirements.txt
export OPENAI_API_KEY='your-api-key-here'
export PYTHONPATH=./src
python -m streamlit run src/app/chat/plain_llm/prompt_chain_app.py
```

- Extract the Metadata and put it in Graphdb ?? TODO