Inspiration
People spend a lot of time designing and fine-tuning their models, and at the end of the process many want to move their work to production. However, serving a model as an API is often a hassle and requires a lot of boilerplate code. This library aims to streamline that process and make it extremely easy (one line!) to serve and deploy your PyTorch model as an API.
What it does
torch-deploy is a minimalist Python package that lets you serve a PyTorch model as an API in just one line of code! Install it with pip, import it, and call deploy with your model as an argument.
Example
import torch
import torchvision.models as models
from torch_deploy import deploy

# Load a pretrained ResNet-18 and switch it to inference mode
resnet18 = models.resnet18(pretrained=True)
resnet18.eval()

# Serve the model as an API; torch.tensor converts incoming data into tensors
deploy(resnet18, pre=torch.tensor)
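Once deploy is called, the model is reachable over HTTP. The minimal client sketch below shows what a request might look like; the host, port, /predict route, and JSON payload shape are assumptions for illustration and may differ from torch-deploy's actual interface.

import requests

# Build a dummy input with the shape ResNet-18 expects: (1, 3, 224, 224).
dummy_image = [[[[0.0] * 224 for _ in range(224)] for _ in range(3)]]

# Hypothetical request: host, port, route, and payload format are assumptions,
# not torch-deploy's documented interface.
response = requests.post("http://localhost:8000/predict", json={"inputs": dummy_image})
print(response.json())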
How we built it
We used FastAPI and Uvicorn to create a robust and scalable API endpoint that serves a PyTorch model. PyTorch is used for the custom pre- and post-processing functions, and the torch.nn.Module interface is leveraged for introspection, giving the inference pipeline maximum flexibility.
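To give a sense of how the pieces fit together, here is a rough sketch of wrapping a PyTorch model in a FastAPI endpoint with optional pre- and post-processing hooks. The names here (serve, ModelInput, /predict) are hypothetical and do not mirror torch-deploy's internals.

import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

# Hypothetical input schema: a nested list that will be converted to a tensor.
class ModelInput(BaseModel):
    inputs: list

def serve(model, pre=None, post=None, host="0.0.0.0", port=8000):
    app = FastAPI()

    @app.post("/predict")
    def predict(payload: ModelInput):
        x = payload.inputs
        if pre is not None:   # e.g. torch.tensor, to turn JSON lists into tensors
            x = pre(x)
        y = model(x)
        if post is not None:  # e.g. a softmax or argmax over the raw outputs
            y = post(y)
        # Tensors are not JSON-serializable, so convert them back to lists.
        return {"output": y.tolist() if hasattr(y, "tolist") else y}

    uvicorn.run(app, host=host, port=port)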
Challenges we ran into
It was hard to work with teammates remotely since we are all in different time zones around the world and have different day-to-day schedules. Without being physically together, it was difficult to stay motivated and to help each other with problems and with learning new APIs and libraries.
Accomplishments that we're proud of
We iterated on our product and steadily made the package more comprehensive. We came up with a simple solution to a common problem and finished building the core functionality we envisioned for this tool:
- Simple and intuitive to use
- Offers flexibility with custom pre and post processing functions
- Works with any PyTorch model
- A variety of sample code to showcase usage
What we learned
The process of serving a model as an API was something all team members were unfamiliar with at the start. Fiona learned how to build a package, the general structure of packages, and how to upload a package to PyPI so it can be installed with pip. Owen (Chang Heng) learned a lot about building an API with the FastAPI library. Hulbert learned how to use FastAPI's security and OAuth2 utilities, as well as how to use JSON Web Tokens for better security.
What's next for torch-deploy
We are still working on an OAuth2 login system that requires valid user credentials to use torch-deploy, with secure password encryption and temporary JWT tokens. In the future we want to expand our model-usage analytics features and make them more comprehensive.
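As a rough sketch of the planned authentication flow, the snippet below shows how FastAPI's OAuth2 utilities and JWTs (via PyJWT) could gate an endpoint. The secret key, token route, and credential check are placeholders for illustration, not torch-deploy code.

from datetime import datetime, timedelta

import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm

# Placeholder values: a real deployment would load these from configuration
# and verify credentials against a user store.
SECRET_KEY = "change-me"
ALGORITHM = "HS256"

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

@app.post("/token")
def login(form: OAuth2PasswordRequestForm = Depends()):
    # Credential check is stubbed out; replace with real password verification.
    if form.username != "demo" or form.password != "demo":
        raise HTTPException(status_code=401, detail="Incorrect username or password")
    claims = {"sub": form.username, "exp": datetime.utcnow() + timedelta(minutes=30)}
    return {"access_token": jwt.encode(claims, SECRET_KEY, algorithm=ALGORITHM),
            "token_type": "bearer"}

def current_user(token: str = Depends(oauth2_scheme)):
    try:
        return jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])["sub"]
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")

@app.post("/predict")
def predict(user: str = Depends(current_user)):
    # Inference would run here once the caller is authenticated.
    return {"user": user, "output": []}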