# Cloud SQL Setup Guide

> ⚠️ **DEPRECATED**: This document is archived and refers to the old `flowpos-db` instances.
>
> **Current Architecture**: Database instances have been consolidated. The application now uses:
>
> - Staging: `metabase-db-staging` (hosts both `flowpos_staging` and `metabase` databases)
> - Production: `metabase-db-production` (hosts both `flowpos_production` and `metabase` databases)
>
> See `deploy/gcp/cloud-sql/README.md` for current setup instructions.
This guide will help you create a Cloud SQL Development tier instance with two databases (staging and production).
## Prerequisites

- GCP Project ID
- GCP Region (default: `us-central1`)
- VPC Network (for private IP connection)
- VPC Connector name (default: `cloudrun-vpc-connector`)
## Step 1: Create Cloud SQL Instance

### Option A: Using GCP Console (UI)
1. Go to Cloud SQL Instances
2. Click "Create Instance"
3. Select "PostgreSQL"
4. Choose "Development" tier (Enterprise Plus edition)
5. Fill in the details:
   - Instance ID: `flowpos-db`
   - Password: Generate a strong password for the `postgres` user (save this!)
   - Database Version: PostgreSQL 17
   - Region: `us-central1` (or your preferred region)
   - Zone: Select any zone (single-zone is fine for Development tier)
6. Under "Machine Configuration":
   - Cloud SQL Edition: Enterprise Plus (Development tier)
   - Should show: 4 vCPU, 32 GB RAM, 250 GB SSD storage
   - Features: 35-day PITR, 99.99% SLA, sub-second maintenance
7. Under "Connections":
   - Public IP: Disable (we'll use private IP)
   - Private IP: Enable
   - VPC Network: Select your VPC network
   - Private Service Connection: This will be created automatically
8. Under "Backup":
   - Automated backups: Enable
   - Backup retention: 7 days (default, or up to 35 days)
9. Click "Create Instance"
10. Wait 5-10 minutes for the instance to be created
### Option B: Using gcloud CLI
```bash
# Set variables
PROJECT_ID="your-project-id"
REGION="us-central1"
INSTANCE_NAME="flowpos-db"
VPC_NETWORK="default"  # or your VPC network name
DB_PASSWORD="your-secure-password-here"

# Create the instance
gcloud sql instances create $INSTANCE_NAME \
  --project=$PROJECT_ID \
  --database-version=POSTGRES_17 \
  --edition=ENTERPRISE_PLUS \
  --tier=db-custom-4-16384 \
  --region=$REGION \
  --network=projects/$PROJECT_ID/global/networks/$VPC_NETWORK \
  --no-assign-ip \
  --enable-google-private-path \
  --root-password=$DB_PASSWORD \
  --backup-start-time=03:00 \
  --enable-point-in-time-recovery \
  --storage-type=SSD \
  --storage-size=250GB \
  --storage-auto-increase \
  --maintenance-window-day=SUN \
  --maintenance-window-hour=4 \
  --database-flags=max_connections=100
```
**Note**: The `--tier=db-custom-4-16384` flag creates a custom instance with 4 vCPU and 16 GB RAM (Development tier equivalent). Adjust if needed.
## Step 2: Get Connection Information

After the instance is created, get the private IP address:

### Using Console

1. Go to your instance details
2. Note the "Private IP address" (e.g., `10.x.x.x`)

### Using CLI

```bash
gcloud sql instances describe $INSTANCE_NAME \
  --project=$PROJECT_ID \
  --format="value(ipAddresses[0].ipAddress)"
```
## Step 3: Create Databases

### Option A: Using GCP Console

1. Go to your Cloud SQL instance
2. Click on the "Databases" tab
3. Click "Create Database"
4. Create `flowpos_staging`:
   - Database name: `flowpos_staging`
   - Click "Create"
5. Create `flowpos_production`:
   - Database name: `flowpos_production`
   - Click "Create"
### Option B: Using gcloud CLI

```bash
# Create staging database
gcloud sql databases create flowpos_staging \
  --instance=$INSTANCE_NAME \
  --project=$PROJECT_ID

# Create production database
gcloud sql databases create flowpos_production \
  --instance=$INSTANCE_NAME \
  --project=$PROJECT_ID
```
## Step 4: Create Database Users

### Option A: Using GCP Console

1. Go to your Cloud SQL instance
2. Click on the "Users" tab
3. Click "Add User Account"
4. Create the staging user:
   - Username: `flowpos_staging`
   - Password: Generate a strong password (save this!)
   - Click "Add"
5. Create the production user:
   - Username: `flowpos_production`
   - Password: Generate a strong password (save this!)
   - Click "Add"
### Option B: Using gcloud CLI

```bash
# Set passwords (use strong passwords!)
STAGING_PASSWORD="your-staging-password"
PRODUCTION_PASSWORD="your-production-password"

# Create staging user
gcloud sql users create flowpos_staging \
  --instance=$INSTANCE_NAME \
  --password=$STAGING_PASSWORD \
  --project=$PROJECT_ID

# Create production user
gcloud sql users create flowpos_production \
  --instance=$INSTANCE_NAME \
  --password=$PRODUCTION_PASSWORD \
  --project=$PROJECT_ID
```
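Rather than inventing the passwords by hand, you can generate them locally. A minimal sketch, assuming `openssl` is available; hex output avoids characters that would need escaping in the connection URLs used later:

```bash
# Generate URL-safe random passwords (32 hex characters each)
STAGING_PASSWORD=$(openssl rand -hex 16)
PRODUCTION_PASSWORD=$(openssl rand -hex 16)
echo "$STAGING_PASSWORD"
```

Store the generated values in your password manager before running the `gcloud sql users create` commands above.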
## Step 5: Grant Database Permissions

You need to connect to the database to grant permissions. Use the Cloud SQL Proxy or connect via private IP.

### Using psql (via Cloud SQL Proxy or private IP)

```bash
# Install the Cloud SQL Proxy if needed:
# https://cloud.google.com/sql/docs/postgres/connect-instance-auth-proxy

# Connect to the instance
cloud_sql_proxy -instances=$PROJECT_ID:$REGION:$INSTANCE_NAME=tcp:5432 &

# In another terminal, connect to the database
psql -h 127.0.0.1 -U postgres -d postgres
```

Then run these SQL commands:

```sql
GRANT ALL PRIVILEGES ON DATABASE flowpos_staging TO flowpos_staging;
GRANT ALL PRIVILEGES ON DATABASE flowpos_production TO flowpos_production;

-- Connect to each database and grant schema permissions
\c flowpos_staging
GRANT ALL ON SCHEMA public TO flowpos_staging;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO flowpos_staging;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO flowpos_staging;

\c flowpos_production
GRANT ALL ON SCHEMA public TO flowpos_production;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO flowpos_production;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO flowpos_production;

\q
```
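The interactive grants above can also be scripted. A hedged sketch: `make_grants` (a helper name invented here) emits the schema-level statements for one environment, which you can pipe into `psql` yourself:

```bash
# Emit the schema-permission grants for flowpos_staging or flowpos_production
make_grants() {
  local db="flowpos_$1"
  cat <<SQL
GRANT ALL ON SCHEMA public TO ${db};
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO ${db};
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO ${db};
SQL
}

# Usage sketch (assumes the proxy is running):
#   make_grants staging | psql -h 127.0.0.1 -U postgres -d flowpos_staging
make_grants staging
```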
## Step 6: Configure VPC Connector (if not already done)

Your Cloud Run services need to connect via the VPC connector:

```bash
# Check if the VPC connector exists
gcloud compute networks vpc-access connectors describe cloudrun-vpc-connector \
  --region=$REGION \
  --project=$PROJECT_ID

# If it doesn't exist, create it:
gcloud compute networks vpc-access connectors create cloudrun-vpc-connector \
  --region=$REGION \
  --network=$VPC_NETWORK \
  --range=10.8.0.0/28 \
  --project=$PROJECT_ID
```
## Step 7: Connection Strings

After setup, you'll have these connection strings:

### Staging Database

```
postgresql://flowpos_staging:STAGING_PASSWORD@PRIVATE_IP:5432/flowpos_staging
```

### Production Database

```
postgresql://flowpos_production:PRODUCTION_PASSWORD@PRIVATE_IP:5432/flowpos_production
```

Replace:

- `PRIVATE_IP` with your instance's private IP address
- `STAGING_PASSWORD` with the staging user password
- `PRODUCTION_PASSWORD` with the production user password
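To avoid typos when substituting, the strings can be assembled from shell variables. A sketch with placeholder values (the IP and password shown are illustrative, not real):

```bash
PRIVATE_IP="10.20.0.3"           # from Step 2 (illustrative value)
STAGING_PASSWORD="staging-pass"  # from Step 4 (illustrative value)

DATABASE_URL_STAGING="postgresql://flowpos_staging:${STAGING_PASSWORD}@${PRIVATE_IP}:5432/flowpos_staging"
echo "$DATABASE_URL_STAGING"
```

Note that a password containing `@`, `/`, or `:` would need URL-encoding before being embedded this way.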
## Step 8: Update GitHub Secrets

Add these secrets to your GitHub repository:

1. Go to your GitHub repo → Settings → Secrets and variables → Actions
2. Add the following secrets:

```
DATABASE_URL_STAGING=postgresql://flowpos_staging:PASSWORD@PRIVATE_IP:5432/flowpos_staging
DATABASE_URL_PRODUCTION=postgresql://flowpos_production:PASSWORD@PRIVATE_IP:5432/flowpos_production
```

Or, if using a single `DATABASE_URL` that changes per environment:

- Update the `DATABASE_URL` secret in the staging environment
- Update the `DATABASE_URL` secret in the production environment
## Step 9: Test Connection

Test the connection from your local machine (using the Cloud SQL Proxy):

```bash
# Start the Cloud SQL Proxy
cloud_sql_proxy -instances=$PROJECT_ID:$REGION:$INSTANCE_NAME=tcp:5432

# In another terminal, test the connection
psql -h 127.0.0.1 -U flowpos_staging -d flowpos_staging
psql -h 127.0.0.1 -U flowpos_production -d flowpos_production
```
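For a non-interactive check, each database can be probed with a trivial query. A sketch assuming the proxy above is listening on `127.0.0.1:5432` and `PGPASSWORD` is exported for the user being tested (`check_db` is a helper name invented here):

```bash
# Probe a database with SELECT 1 and report the result
check_db() {
  if psql -h 127.0.0.1 -U "$1" -d "$1" -tAc "SELECT 1" >/dev/null 2>&1; then
    echo "$1: OK"
  else
    echo "$1: FAILED"
  fi
}

check_db flowpos_staging
check_db flowpos_production
```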
## Troubleshooting

### Connection Issues

- **Private IP not accessible**: Ensure the VPC connector is configured and the Cloud Run service was deployed with the `--vpc-connector` flag
- **Authentication failed**: Verify the username and password
- **Database doesn't exist**: Check that the database names match exactly
### Common Commands

```bash
# List instances
gcloud sql instances list --project=$PROJECT_ID

# Describe instance
gcloud sql instances describe $INSTANCE_NAME --project=$PROJECT_ID

# List databases
gcloud sql databases list --instance=$INSTANCE_NAME --project=$PROJECT_ID

# List users
gcloud sql users list --instance=$INSTANCE_NAME --project=$PROJECT_ID

# Reset user password
gcloud sql users set-password flowpos_staging \
  --instance=$INSTANCE_NAME \
  --password=NEW_PASSWORD \
  --project=$PROJECT_ID
```
## Next Steps

1. Run migrations on staging:

   ```bash
   MIGRATOR_DATABASE_URL="postgresql://flowpos_staging:PASSWORD@PRIVATE_IP:5432/flowpos_staging" \
     pnpm migration:push
   ```

2. Run migrations on production:

   ```bash
   MIGRATOR_DATABASE_URL="postgresql://flowpos_production:PASSWORD@PRIVATE_IP:5432/flowpos_production" \
     pnpm migration:push
   ```

3. Update the deployment workflows to use the new connection strings