# Getting Started with Forge Demos
This document walks you through setting up your environment to run demo models with TT-Forge. It covers choosing and setting up a frontend, then running a demo.
NOTE: If you encounter issues, please request assistance on the TT-Forge Issues page.
NOTE: If you plan to do development work, please see the build instructions for the repo you want to work with.
## Setting up a Frontend to Run a Demo
Requirements: Ubuntu 24.04, Python 3.12.
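Because the demos expect Python 3.12, a quick sanity check can catch an interpreter mismatch before you get deeper into setup. The helper below is a hypothetical convenience for illustration, not part of TT-Forge:

```python
import sys

def meets_requirement(version=None, required=(3, 12)):
    """Return True when the interpreter's major.minor matches `required`.

    `version` defaults to the running interpreter; pass a tuple to check
    another version. Illustrative helper only, not part of TT-Forge.
    """
    if version is None:
        version = sys.version_info
    return tuple(version[:2]) == required

if __name__ == "__main__":
    ok = meets_requirement()
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}: "
          f"{'OK' if ok else 'expected 3.12'}")
```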
This section explains how to set up your frontend so you can run models from the TT-Forge repo.
Before running one of the demos in TT-Forge, you must:

1. Determine which frontend you want to use:
   - TT-XLA - for use with JAX, TensorFlow, and PyTorch
   - TT-Forge-ONNX - for use with ONNX and PaddlePaddle
2. Decide what setup you want to use for the frontend:
   - Wheel
   - Docker
   - Build from source

   NOTE: At this time, if you want to use TT-Forge-ONNX, you must use the Docker or build-from-source option.
3. Follow the installation instructions from the repo for your selected setup method.
4. Return to this repo and follow the instructions in the Running a Demo section.
## Running a Demo
To run a demo, do the following:
1. Clone the TT-Forge repo (alternatively, you can download the script for the model you want to try):

   ```shell
   git clone https://github.com/tenstorrent/tt-forge.git
   ```

2. Navigate into TT-Forge and initialize the submodules:

   ```shell
   cd tt-forge
   git submodule update --init --recursive
   ```

3. Navigate to the folder for the frontend you want. This walkthrough uses resnet_demo.py from the TT-XLA folder.

4. From the main folder of the TT-Forge repository, run the resnet_demo.py script:

   ```shell
   export PYTHONPATH=.
   python demos/tt-xla/cnn/resnet_demo.py
   ```
If all goes well, an image of a cat is displayed, and the terminal output shows the model's prediction for the image along with a score indicating how confident it is in that prediction.
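The confidence score reported by image-classification demos like this one typically comes from a softmax over the model's output logits: the highest-probability class is the prediction, and its probability is the score. A minimal stdlib sketch of that post-processing step (the logit values and class labels below are made up for illustration, not taken from resnet_demo.py):

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_prediction(logits, labels):
    """Return the most likely label and its confidence score."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

# Hypothetical logits for three classes; a real demo uses the model's output.
labels = ["tabby cat", "golden retriever", "goldfish"]
label, score = top_prediction([4.1, 1.2, -0.5], labels)
print(f"Predicted {label} with confidence {score:.2f}")
```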