I am creating this application as a practice project to learn more about
- big data
- neural networks / machine learning
- web frameworks
This project aims to build a web app that visualizes personal finance data saved in Ledger format. The application will use neural networks to predict the next transactions.
The application has two main pages: a "submit" page and a "view" page.

- the submit page sends the data of a transaction to the server
- the view page shows all the data in an organized way

The view page must be accessible only after authentication and authorization.
The view page will show a prediction of the next transactions.
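To make the data model concrete, here is a minimal sketch of parsing a Ledger-format entry into the kind of structure the submit page would send to the server. This is an assumption-laden illustration, not the project's actual parser: it uses only the stdlib and handles a simplified two-posting transaction (real Ledger files also support comments, elided amounts, and multiple currencies).

```python
import re
from datetime import date

def parse_ledger_entry(text: str) -> dict:
    """Parse one simplified Ledger transaction.

    Assumes the common layout: a 'YYYY/MM/DD payee' header line
    followed by indented 'account  amount' posting lines.
    """
    lines = [l for l in text.strip().splitlines() if l.strip()]
    m = re.match(r"(\d{4})/(\d{2})/(\d{2})\s+(.*)", lines[0])
    if not m:
        raise ValueError("bad header line")
    y, mo, d, payee = m.groups()
    postings = []
    for line in lines[1:]:
        # split into account name and trailing amount, e.g. '$-42.50'
        account, amount = line.strip().rsplit(None, 1)
        postings.append({"account": account, "amount": float(amount.lstrip("$"))})
    return {"date": date(int(y), int(mo), int(d)), "payee": payee, "postings": postings}

entry = """
2024/05/01 Grocery Store
    Expenses:Food        $42.50
    Assets:Checking     $-42.50
"""
tx = parse_ledger_entry(entry)
print(tx["payee"])  # Grocery Store
```

A balanced transaction's postings sum to zero, which is a cheap validation the server could run on submission.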
The application uses the following technologies:

- backend: Django
- backend ASGI: Uvicorn
- API query: Graphene (GraphQL)
- frontend: Angular
- Nginx:
  - cache
  - static content server for the frontend
  - proxy
- database: Redis
- task management: Celery
- structured logging: python-json-logger
- linter: Prospector + ESLint
- unit tests: Django
- documentation: Sphinx
- stream processing: Spark
- stream producer / consumer: Kafka
- full GitHub workflow with issues, roadmap and milestones
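Structured logging means every log line is a machine-parsable JSON object. The project uses python-json-logger for this; as a sketch of the idea only, here is a stdlib-only formatter that produces similar output (the field names here are illustrative, not python-json-logger's actual defaults):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Minimal stand-in for a JSON log formatter:
    emit each record as one JSON object per line."""
    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "level": record.levelname,
            "name": record.name,
            "message": record.getMessage(),
        }
        return json.dumps(payload)

logger = logging.getLogger("ledger_board")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("transaction received")
# emits: {"level": "INFO", "name": "ledger_board", "message": "transaction received"}
```

JSON-per-line logs are what downstream tools (and a Kafka/Spark pipeline) can consume without fragile regex parsing.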
Deploy / Infrastructure
- Containerized with Docker
- Production infrastructure
- Deploy with kubernetes
I strongly encourage using Nix to have a consistent development environment across devices. You can enter the development environment with the following command:

```shell
nix develop
```

If you don't have Nix, you need Python 3.11 and Node 20, and you need to install all the dependencies yourself.
You can use a Python virtual environment for the backend dependencies; you can place the environment in the backend/ folder, but it's up to your preference:

```shell
python3 -m venv backend/
```

You can install the Python dependencies via pip:
```shell
cd backend && pip install -r requirements.txt
```

You can install the Node dependencies via npm:
```shell
cd frontend && npm i
```

You can run the backend development server with the following command inside backend/:
```shell
python3 manage.py runserver
```

For production, use Gunicorn + Uvicorn:
```shell
gunicorn -c gunicorn.conf.py backend.asgi -k uvicorn.workers.UvicornWorker
```

You will also need a Celery worker to handle tasks:
```shell
celery -A celeryApp worker -l INFO
```

Prospector is used as a static code analyzer and linter for Python; simply run:
```shell
prospector
```

Run the unit tests with:
```shell
python3 manage.py test
```

You can run the frontend in dev mode with the following commands:
```shell
cd frontend
npx ng serve --open
```

Or build for production with:
```shell
npx ng build --configuration=production
```
You also need Docker to run nginx in production.
You can (and should) run the linter with:

```shell
npx ng lint
```

You can read the documentation in docs/_build/html/index.html.
First, make sure you have the correct npm modules installed:

```shell
cd frontend && npm i
```

Docker is very handy to set up our dev environment.
Run the containers with Docker Compose:

```shell
sudo docker compose up --build
```

This will create and run 4 containers:
- frontend image, running on $LEDGER_BOARD_FRONTEND:4200
- backend image, running on $LEDGER_BOARD_BACKEND:8000
- nginx image, running on $LEDGER_BOARD_NGINX:80
- redis image, running on $LEDGER_BOARD_REDIS:6379
You should use the nginx server to test the application. It's also the only one connected to the host network when running in production.
The environment values are located in .env
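For orientation, a .env file for this setup might look like the sketch below. The variable names are taken from the container list above; the values are purely illustrative assumptions, so check the repository's actual .env:

```shell
# .env -- illustrative values only, not the repo's actual configuration
LEDGER_BOARD_FRONTEND=localhost
LEDGER_BOARD_BACKEND=localhost
LEDGER_BOARD_NGINX=localhost
LEDGER_BOARD_REDIS=localhost
```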
These containers use volumes, so they don't copy any data inside: both the frontend and the backend automatically restart after you make a change in the code. This is very handy, but it is only suitable for development; we need a different infrastructure for production.
First, make sure you have built the frontend for production:

```shell
cd frontend && npm i
npx ng build --configuration=production
```

For production we can't use shared volumes (Kubernetes doesn't allow them, and they are not a good choice anyway). To run the production infrastructure with Docker, run:

```shell
sudo docker compose -f docker-compose.production.yaml up --build
```

There is a separate Docker config for production for each container; these contain the string .production in their names.
- the backend uses gunicorn
- the frontend uses nginx to serve static files
- nginx is used as cache and proxy. The cache is only effective when running in production.
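To illustrate the proxy-plus-cache role, here is a minimal nginx fragment (to be included in the `http` block). The paths, upstream name, and cache zone are illustrative assumptions, not the project's actual config:

```nginx
# illustrative sketch -- not the project's actual nginx config
proxy_cache_path /var/cache/nginx keys_zone=api_cache:10m max_size=100m;

server {
    listen 80;

    # serve the static frontend build
    location / {
        root /usr/share/nginx/html;
        try_files $uri $uri/ /index.html;
    }

    # proxy API calls to the backend, caching successful responses briefly
    location /api/ {
        proxy_pass http://backend:8000;
        proxy_cache api_cache;
        proxy_cache_valid 200 1m;
    }
}
```

Note that nginx only caches GET/HEAD requests by default, which is one reason the cache mostly matters for the production setup.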
You need kind installed for a local test deploy. First, create a cluster with the following command:

```shell
cd kubernetes
sudo kind create cluster --name ledger-board-cluster --config kubernetes-config.yaml
```

Then load the Docker images:
```shell
sudo kind load docker-image ledger-board-nginx --name ledger-board-cluster
sudo kind load docker-image ledger-board-backend --name ledger-board-cluster
sudo kind load docker-image ledger-board-frontend --name ledger-board-cluster
```

In actual production, you would have to publish the images to some registry. This also enables easy CD.
Load the pod and a service with:

```shell
sudo kubectl apply -f ledger-board-nginx-service.yaml
sudo kubectl apply -f ledger-board-pod.yaml
```

Check the pod status:

```shell
sudo kubectl get pods
```

Test by connecting to nginx:

```shell
sudo kubectl port-forward ledger-board-application 80:80
```

A more verbose explanation can be found in kubernetes/README.md.
Note: we are not using a database inside the local cluster, since it's much more stable to use a well-known distributed DB provider; the connection to the DB shall be made to that provider. So far I haven't set up any online instance, so the Kubernetes cluster won't be able to connect to the DB.
You can use other useful tools like:

- pre-commit, to run checks automatically before committing
- act, to simulate GitHub Actions
- jupyter lab, to open an interactive web notebook for Python
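As an example of the pre-commit setup, a minimal .pre-commit-config.yaml could look like this; the hooks chosen here are illustrative, not the repo's actual configuration:

```yaml
# .pre-commit-config.yaml -- illustrative sketch
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
```

After adding the file, run `pre-commit install` once to register the git hook.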