DORA (Draft Outline Research Assistant) is an AI-driven tool that streamlines drafting academic papers and related documents.

Minimum system requirements:

- CPU: 4 cores
- RAM: 8 GB
- Storage: 20 GB
- Network: high-speed internet access for API calls
The easiest way to start the DORA services is with Docker Compose. Make sure Docker and Docker Compose are installed on your machine.
- Clone the repository:

  ```
  git clone git@github.com:insilicomedicine/DORA.git
  ```

- Navigate to the DORA directory:

  ```
  cd DORA/
  ```

- Copy the environment configuration file:

  ```
  cp backend/.env.example backend/.env
  ```

- Configure the required settings in `backend/.env`. DORA uses a BYOK (Bring Your Own Key) approach, meaning you need to provide your own API keys for AI services. Edit the `backend/.env` file with your API credentials:

  - `OPENAI_API_KEY` - your OpenAI API key (essential for document generation and all LLM-related features)
  - `EMBEDDING_OPENAI_API_CONFIGS` - needed for custom bibliography and web search functionality
Choose your AI provider:

**Option A: Using OpenAI directly**

```
OPENAI_API_TYPE=openai
OPENAI_API_KEY=<your_openai_api_key>
EMBEDDING_OPENAI_API_CONFIGS=[{"model": "text-embedding-3-small", "api_key": "<your_openai_api_key>"}]
```

**Option B: Using Azure OpenAI**

```
OPENAI_API_TYPE=azure
OPENAI_API_BASE_URL=https://<your_base_url>.openai.azure.com/
OPENAI_API_VERSION=2024-10-21
OPENAI_API_DEPLOYMENT_NAME=<your_deployment>
OPENAI_API_KEY=<your_azure_openai_key>
EMBEDDING_OPENAI_API_CONFIGS=[{"model": "text-embedding-3-small", "base_url": "https://<your_base_url>.openai.azure.com", "version": "2023-05-15", "deployment_name": "<your_deployment>", "api_key": "<your_api_key>"}]
```

**Option C: Using another OpenAI-compatible LLM deployment (self-hosted or third-party)**

```
OPENAI_API_TYPE=openai
OPENAI_API_BASE_URL=https://<your-private-llm-endpoint>
OPENAI_API_KEY=<your_private_api_key>
# Additional settings may be required depending on your deployment
# OPENAI_API_VERSION=<your_api_version> (if needed)
# OPENAI_API_DEPLOYMENT_NAME=<your_model_name> (if needed)
EMBEDDING_OPENAI_API_CONFIGS=[{"model": "text-embedding-3-small", "base_url": "https://<your-private-llm-endpoint>", "version": "<your_api_version>", "deployment_name": "<your_deployment>", "api_key": "<your_api_key>"}]
```

💡 **New to AI APIs?** You can get an OpenAI API key at platform.openai.com. The free tier is sufficient for testing DORA's capabilities.
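Before starting the stack, it can help to confirm that the required keys actually made it into `backend/.env`. The sketch below is a minimal shell check, assuming the key names documented above; the temporary file it creates is a stand-in purely for illustration, so point `ENV_FILE` at `backend/.env` in a real checkout.

```shell
# Sanity-check the BYOK settings (sketch; key names as documented above).
# The temp file is only a placeholder -- use ENV_FILE=backend/.env for real.
ENV_FILE="$(mktemp)"
cat > "$ENV_FILE" <<'EOF'
OPENAI_API_TYPE=openai
OPENAI_API_KEY=sk-placeholder
EMBEDDING_OPENAI_API_CONFIGS=[{"model": "text-embedding-3-small", "api_key": "sk-placeholder"}]
EOF

# Report any key that is absent from the file
missing=0
for key in OPENAI_API_TYPE OPENAI_API_KEY EMBEDDING_OPENAI_API_CONFIGS; do
  grep -q "^${key}=" "$ENV_FILE" || { echo "missing: ${key}"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "env looks complete"
```

A check like this catches a common failure mode where the stack starts but all LLM calls fail because a key was never copied into the file.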
- You can use the example `docker-compose.yml` and modify it based on your own requirements. Start the services using Docker Compose:

  ```
  docker compose up -d
  ```

- Wait for the backend to be ready (usually within 30 seconds):

  ```
  docker logs -f dora_backend
  ```

  Press `Ctrl+C` when the backend is ready.

- Load the initial data. The database needs to be populated with initial data, including a sample account, templates, prompts, documents, etc. Run the command below to load it. This step is only necessary the first time you start the services:

  ```
  docker compose exec dora_backend python manage.py loaddata initial_data.json
  ```

- Open your browser and navigate to http://localhost
- Log in with the default credentials:

  - Username: `dora@test.com`
  - Password: `dora`

  Alternatively, you can create your own admin user:

  ```
  docker compose exec dora_backend python manage.py createsuperuser
  ```

- You can configure the templates, tools, users, etc. using the Django Admin Panel at http://localhost/admin
For more detailed instructions on generating a document or configuring templates, please refer to the User Manual.
If you need to modify the code and build the images locally instead of using the pre-built images, follow these steps:
The frontend uses a two-stage build process with a base image containing dependencies:
- Build the base image (contains Node.js, nginx, pnpm, and dependencies):

  ```
  docker build -f frontend/Base.Dockerfile -t dora-frontend-base:local frontend/
  ```

- Build the frontend application image:

  ```
  docker build --build-arg BASE_IMAGE=dora-frontend-base:local \
    -t dora-frontend:local frontend/
  ```

- Update `docker-compose.yml` to use your local image:

  ```
  frontend:
    image: dora-frontend:local
    # ... rest of the configuration
  ```
- Build the backend image:

  ```
  docker build -t dora-backend:local backend/
  ```

- Update `docker-compose.yml` to use your local images:

  ```
  dora_backend:
    image: dora-backend:local
    # ... rest of the configuration
  dora_worker:
    image: dora-backend:local
    # ... rest of the configuration
  dora_backend_ws:
    image: dora-backend:local
    # ... rest of the configuration
  ```
You can customize the frontend build with these arguments:
- `BASE_IMAGE`: the base image to use (should be `dora-frontend-base:local` for local builds)
- `GENERATE_SOURCEMAP`: set to `true` to generate source maps for debugging
- `VITE_ENVIRONMENT`: set to `development` or `production`
- `VITE_GOOGLE_CLIENT_ID`: Google OAuth client ID (optional)
- `VITE_SENTRY_DSN`: Sentry DSN for error tracking (optional)
- `SENTRY_AUTH_TOKEN`: Sentry auth token for uploading source maps (optional)
- `VITE_GA_MEASUREMENT_ID`: Google Analytics measurement ID (optional)
- `VITE_JIRA_WIDGET_KEY`: Jira Service Desk key (optional)
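As an illustration, several of these arguments can be combined into one build invocation. The sketch below uses placeholder values and prints the assembled command so you can review it before running it from the repository root; it is not a recommended configuration.

```shell
# Assemble a frontend build command from some of the optional build arguments.
# All values below are placeholders; adjust them to your environment.
BUILD_CMD="docker build \
  --build-arg BASE_IMAGE=dora-frontend-base:local \
  --build-arg GENERATE_SOURCEMAP=true \
  --build-arg VITE_ENVIRONMENT=development \
  -t dora-frontend:local frontend/"

# Print the command for review instead of executing it directly
echo "$BUILD_CMD"
```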
After building the images locally, make sure to update the `image:` references in your `docker-compose.yml` file to point to your local images instead of the pre-built ones.
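One way to do this without editing the upstream `docker-compose.yml` is a `docker-compose.override.yml`, which Docker Compose merges automatically. This is a sketch assuming the service names shown in the snippets above:

```
# docker-compose.override.yml (sketch; service names as in the snippets above)
services:
  frontend:
    image: dora-frontend:local
  dora_backend:
    image: dora-backend:local
  dora_worker:
    image: dora-backend:local
  dora_backend_ws:
    image: dora-backend:local
```

With this file in place, `docker compose up -d` picks up the local tags while the original compose file stays unchanged.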
| Feature | OPEN DORA FREE Community Edition | BASIC DORA SaaS Edition | ULTIMATE DORA Enterprise Edition |
|---|---|---|---|
| Use your own LLM API keys | ✅ | ✅ | ✅ |
| Generate documents | ✅ | ✅ | ✅ |
| Pre-built basic templates included | ✅ | ✅ | ✅ |
| Web search with DuckDuckGo or Serper API | ✅ * | ✅ | ✅ |
| Use your custom tools & templates | ✅ | ✅ | ✅ |
| AI review | ✅ | ✅ | ✅ |
| Add citations & find references via web search | ✅ | ✅ | ✅ |
| AI-powered text editing | ✅ | ✅ | ✅ |
| Visual summary | ✅ | ✅ | ✅ |
| Export documents to .docx and PDF files | ✅ | ✅ | ✅ |
| Precious3GPT integration | ✅ ** | ✅ | ✅ |
| 20+ advanced templates | | ✅ | ✅ |
| Built-in advanced AI models (no key needed) | | ✅ | ✅ |
| 10+ AI agents and resources | | ✅ | ✅ |
| Automatic citation formatting | | ✅ | ✅ |
| Access to Insilico curated databases: | | ✅ | ✅ |
| &nbsp;&nbsp;&nbsp;&nbsp;a. Full-text publication database | | ✅ | ✅ |
| &nbsp;&nbsp;&nbsp;&nbsp;b. PandaOmics multiomics data | | ✅ | ✅ |
| &nbsp;&nbsp;&nbsp;&nbsp;c. Clinical trials database | | ✅ | ✅ |
| &nbsp;&nbsp;&nbsp;&nbsp;d. Biomedical Knowledge graph | | ✅ | ✅ |
| On-premise deployment | | | ✅ |
| Custom AI models | | | ✅ |
| Custom agents & tools by request | | | ✅ |
| Priority support | | | ✅ |
**Note**

\* Web search is supported in limited regions. You can configure your own Serper API key in the `.env` file.

\*\* Requires a deployment of the Precious3GPT model.
