Configure OpenAI and Postman for your FastAPI Python Application
I'll walk you through setting up the Postman client to send requests to OpenAI for your FastAPI Python application.
Set up your OpenAI account.
You'll need to register an account to get started with OpenAI's API. Once your account is set up, you must create an API token.
Visit the Quickstart for more information.
Add Required Packages
In the last lesson, we set up PDM to manage all the dependencies and development workflow. So let’s get started.
You will need to install the following packages:
- `openai`: hopefully, for apparent reasons.
- `pydantic`: for defining models and data validation.
- `python-dotenv`: to manage your environment variables.
```
$> pdm add openai pydantic python-dotenv
Adding packages to default dependencies: openai, pydantic, python-dotenv
🔒 Lock successful
Changes are written to pyproject.toml.
Synchronizing working set with resolved packages: 7 to add, 0 to update, 0 to remove
  ✔ Install distro 1.8.0 successful
  ✔ Install python-dotenv 1.0.0 successful
  ✔ Install certifi 2023.11.17 successful
  ✔ Install httpx 0.25.2 successful
  ✔ Install httpcore 1.0.2 successful
  ✔ Install tqdm 4.66.1 successful
  ✔ Install openai 1.3.9 successful
Installing the project as an editable package...
  ✔ Update the-full-stack 0.0.1 -> 0.0.1 successful
🎉 All complete!
```

Set up a .env file for managing secrets
Setting up a dotenv account isn't required. However, you may want to do it to get used to the workflow. For now, we are just focused on the `.env` file and its contents. Visit their documentation if you'd like to learn more about dotenv.org. Later in this series, I will cover deployment and dig deeper into managing the portability of your secrets with dotenv vault.
For now, you can add the OpenAI Token to your application.
Create a new file in the root of the application named `.env`.
By default, the `openai` client library searches for an environment variable named `OPENAI_API_KEY`. If you decide to name this variable differently, you must tell the client the new name when constructing it. To add the API token, open the `.env` file in your text editor and paste the token into the appropriate location.
This step is required for your API calls to be authenticated and authorized.
```
OPENAI_API_KEY=[YOUR_TOKEN_PROVIDED_BY_OPENAI]
```

⚠️⚠️⚠️ This API token should be considered sensitive. It is directly connected to your OpenAI account and billing. Be sure to add the `.env` file to your `.gitignore` configuration to avoid committing the secret. You can use the following command to append it to your project. ⚠️⚠️⚠️

```
echo -e ".env" >> .gitignore
```

Add our feature to interact with OpenAI
We’ll be modifying the `main.py` file in `the-full-stack` application. See the Step 3 branch for all the code from this lesson.
Add additional imports
At the top of the `main.py` file, we need to import some libraries from the packages installed above.
```python
from dotenv import load_dotenv
from openai import OpenAI
from pydantic import BaseModel, Field, validator
from typing import List, Optional, Union
```

Load secrets into the environment
The `python-dotenv` package provides a simple function, `load_dotenv()`, that makes variables defined in the `.env` file available as if they came from the system's environment. More details can be found in their GitHub repository.
```python
load_dotenv()  # take environment variables from .env
```

Instantiate an OpenAI Client
We then use the `OpenAI` constructor to create a client. The previous step should make the `OPENAI_API_KEY` available to the application.
Note: If you named your token something different in your `.env` file, this is where you'd specify the environment variable to use. See the documentation for more information.
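For example, if you stored the key under a hypothetical name like `MY_OPENAI_TOKEN` (an assumption for illustration, not something from this project), you would read it yourself and pass it to the constructor via the `api_key` parameter. A sketch:

```python
import os


def resolve_api_key(var_name="MY_OPENAI_TOKEN"):
    # Hypothetical variable name for illustration; adjust to match your .env
    key = os.environ.get(var_name)
    if key is None:
        raise RuntimeError(f"Environment variable {var_name} is not set")
    return key


# Then pass it explicitly when constructing the client:
# client = OpenAI(api_key=resolve_api_key())
```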
```python
client = OpenAI()
```

Add an endpoint to manage this request
Typically, a request like this would be an HTTP POST request. I am going to call this endpoint `/gpt`.
We need to consider what the request should look like. For now, we can use a similar structure to what is defined in the Quickstart documentation.
```python
# Taken from OpenAI Quickstart
model="gpt-3.5-turbo",
messages=[
    {"role": "system", "content": "You are a poetic assistant, skilled in explaining complex programming concepts with creative flair."},
    {"role": "user", "content": "Compose a poem that explains the concept of recursion in programming."}
]
```

This is where Pydantic comes into play. We first define the root model for the request.
```python
# gpt_request.py
from pydantic import BaseModel, Field, validator
from typing import List, Optional

from .message import Message


# Define the class for the GPT request
class GPTRequest(BaseModel):
    gpt_model: Optional[str] = "gpt-3.5-turbo"
    messages: List[Message]

    @validator("gpt_model", pre=True, always=True)
    def set_gpt_model(cls, v):
        return v.lower() if v is not None else "gpt-3.5-turbo"

    @validator("messages")
    def validate_roles(cls, messages):
        if not any(msg.role.lower() == "system" for msg in messages):
            raise ValueError("At least one message with the 'system' role is required")
        if not any(msg.role.lower() == "user" for msg in messages):
            raise ValueError("At least one message with the 'user' role is required")
        return messages
```

You'll notice that the `messages` field accepts a `List` of `Message`s. We need to define what the `Message` model looks like.
```python
# message.py
from pydantic import BaseModel, Field, validator


# Define the class for individual messages
class Message(BaseModel):
    role: str = Field(default=None, examples=["system", "user"])
    content: str = Field(
        default=None,
        examples=[
            "You are a poetic assistant, skilled in explaining complex programming concepts with creative flair.",
            "Compose a poem that explains the concept of recursion in programming.",
        ],
    )

    @validator("content")
    def content_must_not_be_empty(cls, value):
        if not value or value.strip() == "":
            raise ValueError("Content must not be empty")
        return value
```

Now, we can add the endpoint to accept the JSON payload.
```python
# main.py
@app.post("/gpt")
def post_gpt(request: GPTRequest):
    try:
        completion = client.chat.completions.create(
            model=request.gpt_model,
            messages=[
                {"role": msg.role, "content": msg.content} for msg in request.messages
            ],
        )
        return completion.choices[0].message
    except Exception as e:
        return {"error": str(e)}
```
Test it all out
Although it is possible to use `curl`, POSTing a JSON request with it can get a bit fiddly from a workflow perspective. I suggest using Postman for development.
NOTE: Ensure the Terms of Service agreement aligns with your company's compliance rules before using Postman commercially.
Start your application.
Use the `run` command configured in the last lesson to start your server.
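If you didn't follow the previous lesson, the command below assumes a script entry in `pyproject.toml` along these lines (an assumption for illustration; your exact module path and flags may differ):

```toml
[tool.pdm.scripts]
uvicorn = "uvicorn main:app --reload"
```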
```
$> pdm run uvicorn
INFO:     Will watch for changes in these directories: ['/Users/tom/Workspace/the-full-stack']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [12579] using StatReload
INFO:     Started server process [12581]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
```

Craft a request from Postman to your server.
From the collection view, add a new collection for “The Full Stack” and define the endpoints we have coded for our application.
```
GET  {{base_url}}:{{port}}/
POST {{base_url}}:{{port}}/gpt
```
Postman allows us to define variables scoped to a collection for reuse. I've set `{{base_url}}` and `{{port}}` to the values provided by uvicorn.
This makes it easier to make changes or add additional endpoints later. See Fig 1.
For the `POST /gpt` request configuration, we can add a raw JSON block to the Body of the request.
```json
{
    "gpt_model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "system",
            "content": "You are a Dungeon Master and an expert at 5e"
        },
        {
            "role": "user",
            "content": "I need a monster for an encounter with 5 party members at level 5. The monster should be very challenging. The party is in a winter mountain setting, and the night watch has just started. Please provide the stat block for this monster and any attacks it may have, as well as any weaknesses or resistance."
        }
    ]
}
```

Now click the Send button.
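If you'd rather drive this from Python instead of Postman, here is a minimal sketch using only the standard library. It assumes the server is running locally on port 8000 as shown above, and uses a shortened user prompt for brevity.

```python
import json
import urllib.request

# The same shape of payload we configured in Postman (shortened user prompt)
payload = {
    "gpt_model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a Dungeon Master and an expert at 5e"},
        {"role": "user", "content": "I need a challenging monster for five level 5 characters in a winter mountain setting."},
    ],
}


def post_gpt(url="http://127.0.0.1:8000/gpt"):
    """POST the JSON payload to the local FastAPI server and return the parsed response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    print(post_gpt())
```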
GPT will process your request and should respond with something like this.
```json
{
    "content": "For a challenging encounter in a winter mountain setting, I suggest using the Frost Giant Everlasting as the monster. Since this creature is formidable, it would be best to use it as the main boss of the encounter. Here is the stat block for the Frost Giant Everlasting:\n\nFrost Giant Everlasting\nGargantuan giant, neutral evil\n\nAC: 15 (natural armor)\nHP: 180 (19d20 + 76)\nSpeed: 40 ft.\n\nSTR: 25 (+7)\nDEX: 9 (-1)\nCON: 19 (+4)\nINT: 12 (+1)\nWIS: 14 (+2)\nCHA: 16 (+3)\n\nSaving Throws: Str +12, Con +8, Wis +6\nSkills: Athletics +12, Perception +6, Survival +6\nDamage Resistances: Cold\nDamage Immunities: Cold (from its Rime Skin ability)\nSenses: Darkvision 60 ft., passive Perception 16\nLanguages: Giant\n\nLegendary Resistance (3/Day): If the Frost Giant Everlasting fails a saving throw, it can choose to succeed instead.\n\nSiege Monster: The Frost Giant Everlasting deals double damage to objects and structures.\n\nActions:\nMultiattack: The Frost Giant Everlasting makes two greataxe attacks or two rock attacks.\n\nGreataxe: Melee Weapon Attack: +12 to hit, reach 10 ft., one target.\nHit: 34 (6d6 + 12) slashing damage.\n\nRock: Ranged Weapon Attack: +12 to hit, range 60/240 ft., one target.\nHit: 30 (4d10 + 7) bludgeoning damage.\n\nRime Skin: Whenever the Frost Giant Everlasting takes cold damage, it regains hit points equal to half the cold damage taken.\n\nFreezing Glare (Recharge 5-6): The Frost Giant Everlasting targets one creature it can see within 60 ft. The target must succeed on a DC 16 Wisdom saving throw or be paralyzed until the end of its next turn. If the target fails by 5 or more, it is also restrained for the same duration. The target can retry the saving throw at the end of each of its turns, ending the effect on itself on a success.\n\nWeakness: Fire. The Frost Giant Everlasting is vulnerable to fire damage and takes double damage from it.\n\nWith this stat block, the Frost Giant Everlasting should prove to be a formidable challenge for the party of five level 5 adventurers. Remember to adjust the HP as necessary to match the challenge level you desire for your encounter. Good luck and have fun!",
    "role": "assistant",
    "function_call": null,
    "tool_calls": null
}
```

w00t! We now have our application working with the OpenAI API, complete with Frost Giants!!!
Summary
- We set up our environment to use the `.env` file for secrets.
- We learned how to use Pydantic to define models for validating data in the request.
- Added a `POST /gpt` endpoint to the application to handle the request from a client.
- Used the Postman app as an API client for our application.
All code is provided on the Step 3 branch.



