An autonomous AI system that takes raw data and problem descriptions, then automatically develops, trains, and deploys machine learning models using Google's Gemini and Vertex AI.
This is a simplified version focused on core ML pipeline functionality:
- ✅ Automated problem analysis
- ✅ Data processing and labeling
- ✅ Model selection and training
- ✅ Real-time progress tracking
- ✅ Model deployment and testing
```
.
├── backend/                  # FastAPI backend
│   ├── app/
│   │   ├── api/              # API endpoints
│   │   ├── core/             # Config and core utilities
│   │   ├── schemas/          # Pydantic data models
│   │   ├── services/         # Business logic
│   │   │   ├── agent/        # AI agent components
│   │   │   └── cloud/        # GCP integrations
│   │   └── utils/            # Helper utilities
│   ├── tests/                # Backend tests
│   └── pyproject.toml        # Python dependencies
│
├── frontend/                 # React frontend
│   ├── src/
│   │   ├── components/       # React components
│   │   ├── pages/            # Page components
│   │   ├── hooks/            # Custom hooks
│   │   ├── services/         # API client
│   │   └── contexts/         # React contexts
│   └── package.json          # Node dependencies
│
├── docs/                     # Documentation
│   └── guides/               # Setup guides
│
└── docker-compose.yml        # Local development setup
```
- Python 3.11+
- Node.js 18+
- Docker & Docker Compose
- Google Cloud account with billing enabled
Follow the detailed guide: docs/guides/google-cloud-setup.md
Quick version:
```bash
# Enable required APIs
gcloud services enable aiplatform.googleapis.com storage.googleapis.com

# Create service account and download key
gcloud iam service-accounts create agentic-platform-sa
gcloud iam service-accounts keys create gcp-key.json \
  --iam-account=agentic-platform-sa@YOUR_PROJECT_ID.iam.gserviceaccount.com

# Move key to credentials directory
mkdir -p gcp-credentials
mv gcp-key.json gcp-credentials/
```

```bash
# Copy example env file
cp .env.example .env

# Edit .env and update:
# - GOOGLE_CLOUD_PROJECT
# - GCS_BUCKET_NAME
# - GEMINI_API_KEY
```

```bash
# Start all services
docker-compose up

# Backend will be available at http://localhost:8000
# Frontend will be available at http://localhost:3000
```
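Startup fails if any of these values is missing, so it is worth checking the file before `docker-compose up`. Below is a minimal stdlib sketch of such a check; `load_env`, `check_env`, and the exact parsing rules are illustrative only, not project code (the backend's Pydantic layer presumably does the real validation):

```python
# Hypothetical .env sanity check; the variable names match those
# listed above, but the helpers themselves are illustrative.
REQUIRED_VARS = ["GOOGLE_CLOUD_PROJECT", "GCS_BUCKET_NAME", "GEMINI_API_KEY"]

def load_env(path):
    """Parse simple KEY=VALUE lines, ignoring blanks and comments."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

def check_env(values):
    """Return the required variables that are missing or empty."""
    return [name for name in REQUIRED_VARS if not values.get(name)]
```

An empty result from `check_env(load_env(".env"))` means all required values are set.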
```bash
cd backend

# Install dependencies
poetry install

# Run development server
poetry run uvicorn app.main:app --reload

# Run tests
poetry run pytest

# Format code
poetry run black .
poetry run ruff check .
```

```bash
cd frontend

# Install dependencies
npm install

# Run development server
npm run dev

# Run tests
npm test

# Build for production
npm run build
```

See .kiro/specs/agentic-model-training-platform/tasks.md for the detailed task list.
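The `poetry run pytest` step above picks up test functions from `backend/tests/`. As a shape reference, a pytest-style test might look like the following; both `slugify_job_name` and the test are hypothetical illustrations, not actual project code:

```python
import re

def slugify_job_name(name: str) -> str:
    """Illustrative helper: normalize a user-supplied job name."""
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return slug or "job"

def test_slugify_job_name():
    # pytest collects any function named test_*; plain asserts suffice.
    assert slugify_job_name("My Training Job!") == "my-training-job"
    assert slugify_job_name("***") == "job"
```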
Completed:
- ✅ Project infrastructure setup
- ✅ Docker configurations
- ✅ Environment configuration
- ✅ GCS storage structure

In Progress:
- 🚧 GCS storage manager and Pydantic schemas
- 🚧 Core agent components
- 🚧 API endpoints
- 🚧 Frontend UI
- Automated Problem Analysis: Upload data and describe your problem; the system analyzes it and determines the best ML approach
- Smart Data Processing: Automatic data validation, cleaning, and feature engineering
- Intelligent Labeling: Zero-shot labeling for unlabeled datasets using Gemini
- Model Selection: AI-powered model architecture and hyperparameter recommendations
- Automated Training: Hands-off training on Vertex AI with progress monitoring
- Real-time Updates: WebSocket-based live progress tracking
- Easy Deployment: One-click model deployment with API endpoints
- Interactive Testing: Test your models directly in the UI
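The real-time updates above could carry JSON messages along these lines. This is a hypothetical schema sketch; the field names are assumptions, not the platform's actual wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ProgressUpdate:
    # Illustrative fields for a live training-progress message.
    job_id: str
    stage: str      # e.g. "data_processing", "training", "deployment"
    percent: float  # 0.0 - 100.0
    message: str

def encode(update: ProgressUpdate) -> str:
    """Serialize an update for sending over the WebSocket."""
    return json.dumps(asdict(update))

def decode(raw: str) -> ProgressUpdate:
    """Rebuild an update from a received JSON message."""
    return ProgressUpdate(**json.loads(raw))
```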
Backend:
- FastAPI (Python web framework)
- Google Cloud Storage (Metadata and file storage)
- Celery + Redis (Task queue)
- Google Vertex AI (Model training)
- Google Gemini (AI agent)
- Pydantic (Data validation)
Frontend:
- React 18 + TypeScript
- Material-UI (Components)
- React Query (Data fetching)
- React Router (Routing)
- Vite (Build tool)
Infrastructure:
- Docker + Docker Compose
- Google Cloud Storage
- Google Cloud Logging
Once the backend is running, visit:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
```bash
# Check if services are running
docker-compose ps

# View logs
docker-compose logs backend
docker-compose logs redis
```

```bash
# Verify credentials file exists
ls -la gcp-credentials/gcp-key.json

# Test authentication
gcloud auth activate-service-account --key-file=gcp-credentials/gcp-key.json
```

```bash
# Stop all containers
docker-compose down

# Check what's using the port
lsof -i :8000  # or :3000 for frontend
```

This is a hackathon project. Feel free to fork and experiment!
MIT License - feel free to use this for your own projects.