Introduction
To deploy an application means to perform the necessary steps to make it available to its users.
For a web API, that normally involves putting it on a remote machine, behind a server program that provides good performance and stability, so that your users can access the application efficiently and without interruptions.
This is in contrast to the development stage, where you are constantly changing the code, breaking and fixing it, and stopping and restarting the development server.
Serving a FastAPI application behind Nginx is the “gold standard” for production. Nginx acts as a reverse proxy, handling SSL termination, buffering, and load balancing, while an ASGI server like Uvicorn (often managed by Gunicorn) runs your actual Python code.
Get Started
Here is your step-by-step guide to getting it live.
Prepare Your FastAPI Application
First, ensure you have your application ready and the necessary production servers installed.
Create a simple main.py for testing:
```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"message": "Hello World"}
```

Configure Gunicorn with Uvicorn Workers
In production, we use Gunicorn to manage multiple Uvicorn worker processes. This makes your app more resilient.
Test it manually first:

```shell
gunicorn -w 4 -k uvicorn.workers.UvicornWorker main:app
```

- `-w 4`: Runs 4 worker processes.
- `-k uvicorn.workers.UvicornWorker`: Tells Gunicorn to use Uvicorn as the worker class.
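The guide does not say how to pick the worker count. A common community rule of thumb (an assumption here, not part of this guide) is (2 × CPU cores) + 1, which you can compute like this:

```python
import multiprocessing

# Rule-of-thumb worker count: (2 x CPU cores) + 1.
# This is a common heuristic, not a hard requirement; tune for your workload.
workers = multiprocessing.cpu_count() * 2 + 1
print(f"gunicorn -w {workers} -k uvicorn.workers.UvicornWorker main:app")
```

On a 2-core VPS this suggests 5 workers; measure under real load before settling on a number.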
Create a Systemd Service
You don’t want to manually start your app every time the server reboots. Let’s automate it.
Create a service file: sudo nano /etc/systemd/system/fastapi_app.service
Paste the following configuration:
```ini
[Unit]
Description=Gunicorn instance to serve FastAPI
After=network.target

[Service]
User=your-user
Group=www-data
WorkingDirectory=/home/your-user/app
ExecStart=/home/your-user/app/venv/bin/gunicorn \
    -w 4 \
    -k uvicorn.workers.UvicornWorker \
    -b unix:app.sock main:app

[Install]
WantedBy=multi-user.target
```

Note: We are using a Unix socket (`app.sock`) instead of an IP port. It's slightly faster and more secure for local communication between Nginx and Gunicorn.
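To see why a socket file can stand in for host:port, here is a self-contained stdlib sketch (not Gunicorn or Nginx; `UnixHTTPServer` and `UnixHTTPConnection` are names made up for this demo) that serves HTTP over a Unix socket and queries it the same way Nginx queries `app.sock`:

```python
import http.client
import http.server
import os
import socket
import socketserver
import tempfile
import threading

# Illustration only: HTTP over a Unix socket file, mimicking Nginx <-> Gunicorn.
sock_path = os.path.join(tempfile.mkdtemp(), "app.sock")

class UnixHTTPServer(socketserver.ThreadingUnixStreamServer):
    def get_request(self):
        conn, _ = super().get_request()
        return conn, ("unix-socket", 0)  # fake address so the HTTP handler is happy

class HelloHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        payload = b'{"message": "Hello World"}'
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

server = UnixHTTPServer(sock_path, HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

class UnixHTTPConnection(http.client.HTTPConnection):
    """http.client connection that dials a socket file instead of host:port."""
    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(sock_path)

conn = UnixHTTPConnection("localhost")
conn.request("GET", "/")
resp = conn.getresponse()
body = resp.read()
print(resp.status, body.decode())
server.shutdown()
```

No TCP port is ever opened: permissions on the socket file (hence the `www-data` group above) control who may connect.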
Start and enable the service:

```shell
sudo systemctl daemon-reload
sudo systemctl start fastapi_app
sudo systemctl enable fastapi_app
```
Configure Nginx as a Reverse Proxy
Now, let’s tell Nginx to route external traffic (port 80) to our internal FastAPI socket.
Create a new Nginx configuration: sudo nano /etc/nginx/sites-available/fastapi_app
The Configuration:
```nginx
server {
    listen 80;
    server_name your_domain_or_ip;

    location / {
        include proxy_params;
        proxy_pass http://unix:/home/your-user/app/app.sock;
    }
}
```

Activate the config:

```shell
sudo ln -s /etc/nginx/sites-available/fastapi_app /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl restart nginx
```

Summary Checklist
| Component | Role | Connection Point |
|---|---|---|
| Nginx | Reverse Proxy | Listens on Port 80/443 |
| Gunicorn | Process Manager | Listens on Unix Socket |
| Uvicorn | ASGI Worker | Executes Python Code |
| Systemd | Background Service | Keeps everything running |
Troubleshooting Tips
- Permissions: Ensure the `www-data` group has access to your application folder so Nginx can read the socket file.
- Logs: If something breaks, check the logs immediately:
  - Nginx: `sudo tail -f /var/log/nginx/error.log`
  - FastAPI: `sudo journalctl -u fastapi_app`
Ansible
Moving from manual configuration to Ansible is a pro move. It ensures your deployment is idempotent (run it once or 100 times, the result is the same) and easily repeatable across multiple servers.
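Idempotency in miniature: a hypothetical helper (plain Python, not Ansible code) in the spirit of Ansible's `lineinfile` module. Running it twice leaves the file exactly as it was after the first run:

```python
import tempfile
from pathlib import Path

def ensure_line(path: Path, line: str) -> bool:
    """Append `line` only if it is missing. Returns True if the file changed."""
    existing = path.read_text().splitlines() if path.exists() else []
    if line in existing:
        return False  # desired state already holds: do nothing
    with path.open("a") as f:
        f.write(line + "\n")
    return True

hosts_file = Path(tempfile.mkdtemp()) / "hosts"
first = ensure_line(hosts_file, "203.0.113.10 web1")
second = ensure_line(hosts_file, "203.0.113.10 web1")
print(first, second)  # True False: run 1 changed the file, run 2 was a no-op
```

Every Ansible module follows this describe-the-state pattern, which is why a playbook reports "changed" on the first run and "ok" on every run after.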
Project Structure
For a clean setup, we’ll use a standard Ansible structure. Create a folder for your project:
```
fastapi-deploy/
├── playbook.yml
├── templates/
│   ├── fastapi_app.service.j2
│   └── nginx_config.j2
└── vars.yml
```

The Playbook (playbook.yml)
This playbook handles the entire lifecycle: installing system packages, setting up a virtual environment, and configuring services.
```yaml
---
- name: Deploy FastAPI with Nginx and Gunicorn
  hosts: webservers
  become: yes
  vars_files:
    - vars.yml

  tasks:
    - name: Update cache and install system dependencies
      apt:
        name:
          - python3-pip
          - python3-venv
          - nginx
        state: present
        update_cache: yes

    - name: Create application directory
      file:
        path: "{{ app_path }}"
        state: directory
        owner: "{{ ansible_user }}"
        mode: '0755'

    - name: Set up virtual environment and install packages
      pip:
        name:
          - fastapi
          - uvicorn
          - gunicorn
        virtualenv: "{{ app_path }}/venv"
        virtualenv_command: python3 -m venv

    - name: Deploy Systemd service file
      template:
        src: templates/fastapi_app.service.j2
        dest: "/etc/systemd/system/{{ app_name }}.service"
      notify: Restart FastAPI

    - name: Deploy Nginx configuration
      template:
        src: templates/nginx_config.j2
        dest: "/etc/nginx/sites-available/{{ app_name }}"
      notify: Restart Nginx

    - name: Enable Nginx site
      file:
        src: "/etc/nginx/sites-available/{{ app_name }}"
        dest: "/etc/nginx/sites-enabled/{{ app_name }}"
        state: link

    - name: Remove default Nginx config
      file:
        path: /etc/nginx/sites-enabled/default
        state: absent
      notify: Restart Nginx

  handlers:
    - name: Restart FastAPI
      systemd:
        name: "{{ app_name }}"
        state: restarted
        enabled: yes
        daemon_reload: yes

    - name: Restart Nginx
      service:
        name: nginx
        state: restarted
```

Configuration Templates
Using Jinja2 templates (.j2) allows you to inject variables dynamically.
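To make the idea concrete without pulling in Jinja2 itself, here is a tiny regex stand-in for what the `template` module does: substitute `{{ var }}` placeholders with values from `vars.yml`. Real Jinja2 supports far more (filters, loops, conditionals); this sketch only handles simple variables:

```python
import re

# Toy Jinja2-style substitution: replace {{ name }} with values[name].
template = "Description=Gunicorn instance for {{ app_name }}\nUser={{ ansible_user }}"
values = {"app_name": "my_fastapi_app", "ansible_user": "ubuntu"}

rendered = re.sub(r"\{\{\s*(\w+)\s*\}\}", lambda m: values[m.group(1)], template)
print(rendered)
```

Ansible performs the real version of this substitution server-side, then copies the rendered result to `dest`.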
Systemd Template (templates/fastapi_app.service.j2)
```ini
[Unit]
Description=Gunicorn instance for {{ app_name }}
After=network.target

[Service]
User={{ ansible_user }}
Group=www-data
WorkingDirectory={{ app_path }}
ExecStart={{ app_path }}/venv/bin/gunicorn \
    -w 4 \
    -k uvicorn.workers.UvicornWorker \
    -b unix:{{ app_path }}/app.sock main:app

[Install]
WantedBy=multi-user.target
```

Nginx Template (templates/nginx_config.j2)
```nginx
server {
    listen 80;
    server_name {{ domain_name }};

    location / {
        include proxy_params;
        proxy_pass http://unix:{{ app_path }}/app.sock;
    }
}
```

Define Your Variables (vars.yml)
Keep your configuration separate from your logic.
```yaml
app_name: my_fastapi_app
app_path: /home/ubuntu/fastapi_app
domain_name: 203.0.113.10  # or your domain
ansible_user: ubuntu
```

How to Run It
- Ensure your target server is in your Ansible inventory file (`/etc/ansible/hosts`).
- Run the playbook:

```shell
ansible-playbook playbook.yml
```
Why this works better
- Handlers: The `notify` system ensures Nginx and Gunicorn only restart if the configuration files actually change.
- Unix Sockets: We continue using `.sock` files for performance, managed dynamically by the `app_path` variable.
- Cleanliness: It removes the Nginx `default` site automatically to prevent conflicts.
Certbot
Adding SSL with Let’s Encrypt is the final step to making your application production-ready. It transitions your site from http:// to https://, encrypting all traffic.
Updated vars.yml
You’ll need an email address for Let’s Encrypt to send you renewal alerts (though we’ll automate the renewal anyway).
```yaml
app_name: my_fastapi_app
app_path: /home/ubuntu/fastapi_app
domain_name: yourdomain.com # Must be a real domain pointing to your IP
cert_email: your-email@example.com
ansible_user: ubuntu
```

Updated playbook.yml
We will add a task to install python3-certbot-nginx and a command to request the certificate. The Certbot Nginx plugin is smart—it will automatically modify your Nginx config to include the SSL lines.
Add these tasks to your existing playbook:
```yaml
- name: Install Certbot and Nginx plugin
  apt:
    name:
      - certbot
      - python3-certbot-nginx
    state: present

- name: Obtain SSL Certificate
  command: >
    certbot --nginx
    -d {{ domain_name }}
    --non-interactive
    --agree-tos
    -m {{ cert_email }}
    --redirect
  args:
    creates: "/etc/letsencrypt/live/{{ domain_name }}/fullchain.pem"
  notify: Restart Nginx
```

Why we use creates:
The creates argument makes the task idempotent. Ansible will check if the certificate file already exists; if it does, it skips the task. This prevents you from hitting Let’s Encrypt’s rate limits by requesting a new cert every time you run the playbook.
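The logic behind `creates` can be sketched in a few lines of Python (a hypothetical helper, not Ansible internals):

```python
import subprocess
from pathlib import Path

def run_unless_exists(cmd: list[str], creates: str) -> str:
    """Mimic Ansible's `creates` guard: skip the command if the file exists."""
    if Path(creates).exists():
        return "skipped"  # e.g. certificate already present: don't re-request it
    subprocess.run(cmd, check=True)
    return "changed"

# Conceptually: run_unless_exists(
#     ["certbot", "--nginx", "-d", "yourdomain.com"],
#     "/etc/letsencrypt/live/yourdomain.com/fullchain.pem")
```

The guard runs before the command, so a re-run never touches Let's Encrypt at all.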
Handling Automatic Renewal
Modern Certbot installations on Ubuntu/Debian automatically create a systemd timer or cron job for renewals. However, to be safe, you can add a task to ensure the timer is running:
```yaml
- name: Ensure Certbot renewal timer is enabled
  systemd:
    name: certbot.timer
    state: started
    enabled: yes
```

The Complete Flow Summary
| Step | Ansible Module | Action |
|---|---|---|
| Dependencies | apt | Installs Python, Nginx, and Certbot. |
| App Setup | pip | Creates a virtualenv and installs FastAPI/Gunicorn. |
| Service | template | Creates the .service file to keep the app alive. |
| Networking | template | Sets up Nginx as a bridge to the app socket. |
| Security | command | Requests a free SSL cert and forces HTTPS redirects. |
Important Note on Firewalls
If you have ufw enabled on your server, ensure you allow HTTPS traffic:

```shell
sudo ufw allow 'Nginx Full'
```

(You can also automate this in Ansible using the community.general.ufw module!)
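A sketch of that automation, assuming the `community.general` collection is installed and using the stock Ubuntu "Nginx Full" application profile (which opens ports 80 and 443):

```yaml
# Sketch only: allow HTTP + HTTPS via the 'Nginx Full' ufw app profile.
- name: Allow Nginx traffic through ufw
  community.general.ufw:
    rule: allow
    name: Nginx Full
```

Add it alongside the Certbot tasks so the firewall is opened before the certificate challenge runs.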