Getting Started
In Part 1 we walked through setting up a board game recommendation system with FastAPI and PostgreSQL. In Part 2 we continue the project and show how to deploy it to a cloud service, in this case Render, to make it accessible to users.
To make this a reality, we’ll set up our PostgreSQL database on Render, populate it with our data, Dockerize our FastAPI application, and finally deploy it as a Render Web Application.
Table of Contents
- Deploying a PostgreSQL database on Render
- Deploying a FastAPI app as a Render Web Application
  - Dockerizing our application
  - Pushing our Docker image to Docker Hub
  - Pulling from Docker Hub to Render
Tooling Used
- Render
- Docker Desktop
- Docker Hub
Deploying on Render
Now we have a PostgreSQL database and a FastAPI application that work locally, and it’s time to deploy on a cloud service that can be accessed by a front-end application or end user (via Swagger). For this project, we’ll use Render; Render is a cloud platform that, for small projects, offers a more straightforward setup experience than larger cloud providers like AWS and Azure.
To get started, navigate to Render and create a new account; then create a new project by selecting the ‘New Project’ button shown below. Note that, as of this writing, Render has a trial period that should allow you to follow along at zero cost for the first month. We’re calling this project fastapi-test, and we navigate into that project after it’s created.
Each project contains everything required for that project to work in a self-contained environment. In this case, we need two components: a database and a web server for our FastAPI application. Let’s start with creating our Database.
This is straightforward: we select ‘Create New Service’ as shown in Figure 3 and then select ‘Postgres’. We’re then taken to the form shown in Figure 4 to set up the database. We name our database “fastapi-database” and select the free tier to get started. Render only allows you to use the free-tier database for a limited time, but that’s fine for this example, and if you need to maintain a database longer term, the pricing is very reasonable.
After inputting our database information and selecting ‘Create’, it will take a minute to set up the database, and you’ll then be presented with the screen shown in Figure 5. We’ll save the Internal Database URL and External Database URL values in our .env file, as we’ll need them to connect from our FastAPI application. We can then test our connection to the database using the External Database URL (we’re connecting from our local machine, which is outside the Render environment) and create the tables from our local machine before moving on to setting up our FastAPI application.
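For reference, the two values land in our .env file roughly like this (a sketch with placeholder credentials; your actual URLs come from the Render database dashboard):

```
# .env -- values below are placeholders, copy yours from Render
External_Database_Url=postgresql://user:password@dpg-xxxx.oregon-postgres.render.com/fastapi_database
Internal_Database_Url=postgresql://user:password@dpg-xxxx/fastapi_database
```

The external URL includes a public hostname for connections from outside Render; the internal one only resolves inside Render’s private network.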
We then run our test database connection script, which attempts to connect to the database using the External_Database_Url variable as the connection string and create a test table. Note that External_Database_Url is the complete connection string for the database, so we can pass it as a single input. A successful run should produce the output shown in Figure 6.
import os
import socket
import sys
from urllib.parse import urlparse

import psycopg2
from dotenv import load_dotenv

# Load environment variables from .env file (override=True reloads changed values)
load_dotenv(override=True)

# Loading the external database URL
database_url = os.environ.get("External_Database_Url")
if not database_url:
    print("❌ External_Database_Url not found in environment variables")
    print("Please check your .env file contains: External_Database_Url=your_render_postgres_url")
    sys.exit(1)
print(f"Database URL loaded: {database_url[:50]}...")

# Parse the database URL to extract components for testing
def parse_database_url(url):
    """Parse database URL to extract connection components"""
    parsed = urlparse(url)
    return {
        'host': parsed.hostname,
        'port': parsed.port or 5432,
        'database': parsed.path.lstrip('/'),
        'username': parsed.username,
        'password': parsed.password
    }

db_params = parse_database_url(database_url)

def test_network_connectivity():
    """Test network connectivity to Render PostgreSQL endpoint"""
    print("\n=== Network Connectivity Tests ===")
    # 1. Test DNS resolution
    try:
        ip_address = socket.gethostbyname(db_params['host'])
        print("✅ DNS Resolution successful")
    except socket.gaierror as e:
        print(f"❌ DNS Resolution failed: {e}")
        return False
    # 2. Test port connectivity
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(10)  # 10 second timeout
        result = sock.connect_ex((db_params['host'], int(db_params['port'])))
        sock.close()
        if result == 0:
            print(f"✅ Port {db_params['port']} is accessible")
            return True
        else:
            print(f"❌ Port {db_params['port']} is NOT accessible")
            print("   This might indicate a network connectivity issue")
            return False
    except Exception as e:
        print(f"❌ Port connectivity test failed: {e}")
        return False

# Run connectivity tests
network_ok = test_network_connectivity()
if not network_ok:
    print("\n🔍 TROUBLESHOOTING STEPS:")
    print("1. Check your internet connection")
    print("2. Verify the Render PostgreSQL URL is correct")
    print("3. Ensure your Render PostgreSQL instance is active")
    print("4. Check if there are any Render service outages")
    sys.exit(1)

print("\n=== Attempting Database Connection ===")
# Connect to the database using psycopg2
try:
    conn = psycopg2.connect(
        host=db_params['host'],
        database=db_params['database'],
        user=db_params['username'],
        password=db_params['password'],
        port=db_params['port'],
        connect_timeout=30  # 30 second timeout
    )
    # If the connection is successful, we can perform database operations
    cursor = conn.cursor()
    # Example: Execute a simple query
    cursor.execute("SELECT version();")
    db_version = cursor.fetchone()
    print(f"✅ PostgreSQL Database Version: {db_version[0]}")
    # Test creating a simple table to verify permissions
    cursor.execute("CREATE TABLE IF NOT EXISTS connection_test (id SERIAL PRIMARY KEY, test_time TIMESTAMP DEFAULT NOW());")
    conn.commit()
    print("✅ Database permissions verified - can create tables")
    cursor.close()
    conn.close()
    print("✅ psycopg2 connection successful!")
except psycopg2.OperationalError as e:
    print(f"❌ Database connection failed: {e}")
    if "timeout" in str(e).lower():
        print("\n🔍 TIMEOUT TROUBLESHOOTING:")
        print("- Check your internet connection")
        print("- Verify the Render PostgreSQL URL is correct")
        print("- Check if Render service is experiencing issues")
    elif "authentication" in str(e).lower():
        print("\n🔍 AUTHENTICATION TROUBLESHOOTING:")
        print("- Verify the database URL contains correct credentials")
        print("- Check if your Render PostgreSQL service is active")
        print("- Ensure the database URL hasn't expired or changed")
    sys.exit(1)
except Exception as e:
    print(f"❌ Unexpected error: {e}")
    sys.exit(1)

# If we get here, the connection was successful
print("\n✅ All tests passed! Render PostgreSQL connection is working.")
print(f"✅ Connected to database: {db_params['database']}")
print("✅ Ready for use in your application!")
Loading Database
Now that we’ve verified we can connect to the database from our local machine, it’s time to create our tables and populate them. To load the database, we’ll use our src/load_database.py file, which we walked through piece by piece at the start of this article, so we won’t go into further detail here. The only notable points are that we’re again using External_Database_Url as our connection string, and that at the end we use the test_table function defined in our DatabaseHandler class, which connects to the table name passed to it and returns the number of rows in that table.
Running this script should produce the output shown in Figure 11: each of the tables is created, and at the end we verify that we can return data from them and that the output row counts match the input rows.
import os
import uuid

import pandas as pd
from dotenv import load_dotenv
from utils.db_handler import DatabaseHandler

# Load environment variables from .env file
load_dotenv(override=True)

# Read the Render PostgreSQL connection URL
URL_database = os.environ.get("External_Database_Url")
# Initialize DatabaseHandler with the connection URL
engine = DatabaseHandler(URL_database)
# loading initial user data
users_df = pd.read_csv("Data/steam_users.csv")
games_df = pd.read_csv("Data/steam_games.csv")
user_games_df = pd.read_csv("Data/steam_user_games.csv")
user_recommendations_df = pd.read_csv("Data/user_recommendations.csv")
game_tags_df = pd.read_csv("Data/steam_game_tags.csv")
# Defining queries to create tables
user_table_creation_query = """CREATE TABLE IF NOT EXISTS users (
    id UUID PRIMARY KEY,
    username VARCHAR(255) UNIQUE NOT NULL,
    password VARCHAR(255) NOT NULL,
    email VARCHAR(255) NOT NULL,
    role VARCHAR(50) NOT NULL
)
"""
game_table_creation_query = """CREATE TABLE IF NOT EXISTS games (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) UNIQUE NOT NULL,
    name VARCHAR(255) NOT NULL,
    type VARCHAR(255),
    is_free BOOLEAN DEFAULT FALSE,
    short_description TEXT,
    detailed_description TEXT,
    developers VARCHAR(255),
    publishers VARCHAR(255),
    price VARCHAR(255),
    genres VARCHAR(255),
    categories VARCHAR(255),
    release_date VARCHAR(255),
    platforms TEXT,
    metacritic_score FLOAT,
    recommendations INTEGER
)
"""
user_games_query = """CREATE TABLE IF NOT EXISTS user_games (
    id UUID PRIMARY KEY,
    username VARCHAR(255) NOT NULL,
    appid VARCHAR(255) NOT NULL,
    shelf VARCHAR(50) DEFAULT 'Wish_List',
    rating FLOAT DEFAULT 0.0,
    review TEXT
)
"""
recommendation_table_creation_query = """CREATE TABLE IF NOT EXISTS user_recommendations (
    id UUID PRIMARY KEY,
    username VARCHAR(255),
    appid VARCHAR(255),
    similarity FLOAT
)
"""
game_tags_creation_query = """CREATE TABLE IF NOT EXISTS game_tags (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) NOT NULL,
    category VARCHAR(255) NOT NULL
)
"""
# Dropping any existing tables (in dependency order) so the script can be rerun
engine.delete_table('user_recommendations')
engine.delete_table('user_games')
engine.delete_table('game_tags')
engine.delete_table('games')
engine.delete_table('users')
# Create tables
engine.create_table(user_table_creation_query)
engine.create_table(game_table_creation_query)
engine.create_table(user_games_query)
engine.create_table(recommendation_table_creation_query)
engine.create_table(game_tags_creation_query)
# Ensuring each row of each dataframe has a unique ID
if 'id' not in users_df.columns:
    users_df['id'] = [str(uuid.uuid4()) for _ in range(len(users_df))]
if 'id' not in games_df.columns:
    games_df['id'] = [str(uuid.uuid4()) for _ in range(len(games_df))]
if 'id' not in user_games_df.columns:
    user_games_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_games_df))]
if 'id' not in user_recommendations_df.columns:
    user_recommendations_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_recommendations_df))]
if 'id' not in game_tags_df.columns:
    game_tags_df['id'] = [str(uuid.uuid4()) for _ in range(len(game_tags_df))]
# Populate the five tables with data from the dataframes
engine.populate_table_dynamic(users_df, 'users')
engine.populate_table_dynamic(games_df, 'games')
engine.populate_table_dynamic(user_games_df, 'user_games')
engine.populate_table_dynamic(user_recommendations_df, 'user_recommendations')
engine.populate_table_dynamic(game_tags_df, 'game_tags')
# Testing if the tables were created and populated correctly
print(engine.test_table('users'))
print(engine.test_table('games'))
print(engine.test_table('user_games'))
print(engine.test_table('user_recommendations'))
print(engine.test_table('game_tags'))
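For readers who skipped Part 1, the DatabaseHandler interface used above can be sketched roughly as follows. This is an illustrative approximation based on the calls in load_database.py, not the repo’s actual utils/db_handler.py; the method bodies are assumptions.

```python
import pandas as pd
from sqlalchemy import create_engine, text

class DatabaseHandler:
    """Minimal sketch of the handler interface used by load_database.py."""

    def __init__(self, connection_url: str):
        # connection_url is a full SQLAlchemy URL, e.g. External_Database_Url
        self.engine = create_engine(connection_url)

    def create_table(self, creation_query: str) -> None:
        # Executes a CREATE TABLE IF NOT EXISTS statement and commits
        with self.engine.begin() as conn:
            conn.execute(text(creation_query))

    def delete_table(self, table_name: str) -> None:
        # Drops the table if it exists so the load script can be rerun cleanly
        with self.engine.begin() as conn:
            conn.execute(text(f"DROP TABLE IF EXISTS {table_name}"))

    def populate_table_dynamic(self, df: pd.DataFrame, table_name: str) -> None:
        # Appends the dataframe rows into the existing table
        df.to_sql(table_name, self.engine, if_exists="append", index=False)

    def test_table(self, table_name: str) -> str:
        # Returns a row-count summary to confirm the table is populated
        with self.engine.connect() as conn:
            count = conn.execute(text(f"SELECT COUNT(*) FROM {table_name}")).scalar()
        return f"{table_name}: {count} rows"
```

Swapping the PostgreSQL URL for `sqlite:///:memory:` lets you exercise the same interface locally without touching Render.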
Deploying a FastAPI Application on Render
We now have the first half of our project deployed on Render, and it’s time to set up our FastAPI application. To do this, we’ll use Render’s web service hosting, which lets us deploy our FastAPI app as a web application accessible to external services. If we wanted to build a full-stack application, our front end could then send requests to the FastAPI application on Render and return data to the user. Since we’re not building a front-end component at this time, we’ll instead interact with our app through the Swagger docs.
Containerizing our Application with Docker
We’ve set up our FastAPI project in a local environment, but now we need to transfer it, with all its code, dependencies, and environment variables, to a container on Render. This could be a daunting challenge. Fortunately, Docker handles all the complicated pieces and lets us do just that with a simple configuration file and a couple of commands. For those who haven’t used Docker, there is a great tutorial here. In brief, Docker is a tool that simplifies deploying and managing applications by letting us package an application with all its dependencies as an image and then deploy that image to a service like Render. In this project, we use Docker Hub as our image repository, a central version-controlled store for our image that we can then pull into Render.
Our overall flow for this project looks like this: FastAPI app running locally → a ‘snapshot’ is taken with Docker and stored as a Docker image → that image is pushed to Docker Hub → Render pulls the image and uses it to spin up a container that runs the application on a Render server. Getting started with this process, which we’ll walk through next, requires having Docker Desktop installed. Docker has a straightforward installation process, which you can start here: https://www.docker.com/products/docker-desktop/
Additionally, if you don’t have one already, you’ll need a Docker Hub account, as this will serve as the repository we push Docker images to and pull them from into Render. You can create a Docker Hub account here: https://hub.docker.com/.
Building a Docker Image
To create a Docker Image for our project, first make sure Docker Desktop is running; if it isn’t, you’ll likely get an error when trying to create a Docker image. To ensure it’s running, open the Docker Desktop application from your search bar or desktop, click on the three dots in the bottom left as shown below, and ensure you see the Green dot followed by ‘Docker Desktop is running’.
Next, we need to tell Docker how to build our image, which is done by defining a Dockerfile. Our Dockerfile can be seen in Figure 9. We save it in our top-level directory, and it provides the instructions that tell Docker how to package our application into an image that can be deployed on a different piece of hardware. Let’s walk through this file to understand what it’s doing.
- FROM: Choosing the base image: The first line in our Dockerfile specifies the base image we want to extend for our application. In this case, we use the python:3.13-slim-bullseye image, a lightweight Debian-based image that serves as the base for our application.
- WORKDIR: Changing the working directory: Here we set the default directory inside our container to /app
- RUN: Checking for updates to system dependencies
- COPY: Copying the requirements.txt file; it’s critical that requirements.txt is up to date and contains all libraries required for the project, or the image won’t run correctly when we try to spin it up
- RUN: Installing the libraries in our requirements.txt file
- COPY: Copying our entire project from our local directory into /app, which we set as the working directory in step 2
- RUN: Creating a logs directory at /app/logs
- EXPOSE: Documenting that the port we’ll be exposing is port 8000
- ENV: Setting our Python path to /app
- CMD: Running our FastAPI app with Uvicorn, pointing at the app defined in src.main:app, on port 8000
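Putting those steps together, the Dockerfile looks roughly like this. This is a reconstruction from the bullets above, not a copy of the repo’s exact file, so treat the specifics (particularly the RUN and CMD lines) as a sketch:

```dockerfile
# Lightweight Debian-based Python base image
FROM python:3.13-slim-bullseye

# Default directory inside the container
WORKDIR /app

# Check for updates to system dependencies
RUN apt-get update && apt-get upgrade -y && rm -rf /var/lib/apt/lists/*

# Install Python dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the project into /app
COPY . .

# Create a logs directory
RUN mkdir -p /app/logs

# Document the port the app listens on
EXPOSE 8000

# Make our modules importable from /app
ENV PYTHONPATH=/app

# Run the FastAPI app with Uvicorn on port 8000
CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Copying requirements.txt before the rest of the project is a common layer-caching trick: dependency installation only reruns when requirements.txt changes, not on every code edit.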
With our Dockerfile defined, we now have a set of instructions we can give Docker to containerize our application into an image we can push to Docker Hub. We do this with a couple of commands from the VS Code terminal, shown below. Each line needs to be run separately from the top-level directory of your project.
- First, we build our Docker image, which will likely take a minute or two. In this case, we name our image ‘recommendersystem’
- Next, we tag our image; the syntax is: docker tag image_name user_name/docker_hub_repository:image_name_on_dockerhub
- Finally, we push our image to Docker Hub, again specifying user_name/docker_hub_repository:image_name_on_dockerhub
docker build -t recommendersystem .
docker tag recommendersystem seelucas/fastapi_tutorial:fastapi_on_render
docker push seelucas/fastapi_tutorial:fastapi_on_render
After this is done, we should be able to log in to Docker Hub, navigate to our repository, and see an image whose name matches what we gave it in the previous three commands, in this case fastapi_on_render.
Pulling Docker Image to Render
Now we have our Docker image on Docker Hub, and it’s time to deploy that image on Render. This can be done by navigating to the same project we created our database in, “fastapi-test”, selecting “New” in the top right, and then selecting “Web Service”, as our FastAPI app will be deployed as a web application.
Because we’re deploying our image from Docker Hub, we specify that our source code is an Existing Image and, as shown in Figure 11, paste the Docker Hub path of the image we want to deploy into ‘Image URL’ in Render. We then get a notification that this is a private image, which means we’ll need to create a Docker Hub access token that we can use to securely pull the image from Docker Hub into Render.
Fortunately, creating a Docker Hub access token is straightforward: we navigate to our Docker Hub account → Settings → Personal Access Tokens. The screen should look like Figure 12. We provide an access token name, expiration date, and permissions. Since we’re only pulling the image into Render, we need read access rather than write or delete, so we select that.
Finally, selecting ‘Generate’ creates our token, which we then copy over to Render and enter as shown in Figure 13.
Once we’ve selected ‘Add Credential’ as shown above, it will load for a minute while the credentials are saved. We’re then taken back to the previous screen, where we can select the credentials to use to connect to Docker Hub. In this case, we use the tutorial credentials we just created and select Connect. We’ve now established a connection we can use to pull our Docker image from Docker Hub to Render for deployment.
On the next page, we proceed with setting up our Render web application by selecting the free option and then, importantly, under Environment Variables, copying and pasting our .env file. While we don’t use all the variables in this file, we do use Internal_Database_Url, which is the URL that FastAPI looks for in our main.py file. Without it, we can’t connect to our database, so it’s critical to provide it. Note: for testing, we previously used External_Database_Url because we were running the script from our local machine, which is external to our Render environment; here, both the database and the web server are in the same Render environment, so we use Internal_Database_Url in main.py.
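The internal-versus-external distinction can be captured with a small helper. This is a hedged sketch of the idea only; the function name is ours, and the actual wiring in src/main.py may differ:

```python
import os

def resolve_database_url():
    """Prefer the internal URL when running inside Render's private network,
    falling back to the external URL for local development and testing."""
    return os.environ.get("Internal_Database_Url") or os.environ.get("External_Database_Url")
```

On Render, Internal_Database_Url is set in the service’s environment variables, so the internal URL wins; locally, only External_Database_Url resolves.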
After entering our environmental variables, we then choose ‘Deploy Web Service’.
The service will take a couple of minutes to deploy, but you should then get a notification like the one below that the service has deployed, with a Render link at the top where we can access it.
Navigating to this link takes us to the Hello World method; if we add /docs to the end of the URL, we’re taken to the Swagger docs in Figure 17. Here we can test that our FastAPI web application is connected to our database by using the Fetch All Users method. We can see below that this does indeed return data.
Finally, we want to check that our user recommendation system updates dynamically. In our previous API call, we can see there’s a user ‘user_username’ in our database. Using the Fetch Recommended Game method with this username, we see the top match is appid = B08BHHRSPK.
We then update this user’s liked games by choosing a random game from our games table, appid = B0BHTKGN7F, which turns out to be ‘The Elder Scrolls: Skyrim Boardgame’, and adding it via our user_games POST method.
Adding a game to our user_games table should automatically trigger the recommender pipeline to rerun for that user and generate new recommendations. If we navigate to our console, we can see that this appears to have happened, as we get the ‘new user recommendations generated’ message shown below.
If we navigate back to our Swagger docs and try the fetch recommendation method again, we see in Figure 21 that we indeed have a different list of recommendations than before. Our recommender pipeline now updates automatically as users add more data and is accessible beyond our local environment.
Wrapping Up:
In this project, we’ve shown how to set up and deploy a recommendation system that leverages a FastAPI interaction layer over a PostgreSQL database to generate intelligent board game recommendations for our users. There are further steps we could take to make this system more robust, like implementing a hybrid recommendation system as we gain more user data or enabling user tagging to capture more features. Additionally, although we didn’t cover it here, we did use a GitHub workflow to rebuild and push our Docker image whenever there’s a new update to our main branch; this code is available in .github/workflows. It greatly sped up development, as we didn’t have to manually rebuild our Docker image for every small change.
I hope you enjoyed reading and that this helps you build and deploy your projects with FastAPI.
LinkedIn: https://www.linkedin.com/in/lucas-see-6b439188/
Email: [email protected]
Figures: All images, unless otherwise noted, are by the author.
Links:
- Github Repository for Project: https://github.com/pinstripezebra/recommender_system
- FastAPI Docs: https://fastapi.tiangolo.com/tutorial/
- Docker Tutorial: https://www.youtube.com/watch?v=b0HMimUb4f0
- Docker Desktop Download: https://www.docker.com/products/docker-desktop/
- Docker Hub: https://hub.docker.com/