# Next Steps: Cloud SQL Integration
> ⚠️ **DEPRECATED**: This document is archived and refers to the old `flowpos-db` instance.
>
> **Current architecture**: Database instances have been consolidated. The application now uses:
>
> - Staging: `metabase-db-staging` (hosts both the `flowpos_staging` and `metabase` databases)
> - Production: `metabase-db-production` (hosts both the `flowpos_production` and `metabase` databases)
>
> See `deploy/gcp/cloud-sql/README.md` for current setup instructions.
## ✅ Completed

- Cloud SQL instance created (`flowpos-db`)
- Databases created (`flowpos_staging`, `flowpos_production`)
- Users created (`flowpos_staging`, `flowpos_production`)
- Permissions granted
## 📋 Next Steps

### Step 1: Get Connection Strings

Your Cloud SQL instance details:

- Instance: `flowpos-db`
- Private IP: `34.60.215.167`
- Project: `barto-dev`
- Region: `us-central1`
**Connection strings:**

Staging:

```
postgresql://flowpos_staging:YOUR_STAGING_PASSWORD@34.60.215.167:5432/flowpos_staging
```

Production:

```
postgresql://flowpos_production:YOUR_PRODUCTION_PASSWORD@34.60.215.167:5432/flowpos_production
```

Replace `YOUR_STAGING_PASSWORD` and `YOUR_PRODUCTION_PASSWORD` with the actual passwords you set.
### Step 2: Add to GitHub Secrets

- Go to your GitHub repository
- Navigate to: Settings → Secrets and variables → Actions
- For the staging environment:
  - Click on the "staging" environment (or create it if it doesn't exist)
  - Add/update the secret `DATABASE_URL`
    - Value: `postgresql://flowpos_staging:PASSWORD@34.60.215.167:5432/flowpos_staging`
- For the production environment:
  - Click on the "production" environment (or create it if it doesn't exist)
  - Add/update the secret `DATABASE_URL`
    - Value: `postgresql://flowpos_production:PASSWORD@34.60.215.167:5432/flowpos_production`
### Step 3: Update Deployment Workflows

The workflows have been updated to:

- Add the Cloud SQL instance connection (`--add-cloudsql-instances`)
- Use the `DATABASE_URL` from secrets (already configured)
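For reference, the relevant deploy flags look roughly like this; the image path is a placeholder and the exact flag set in your workflow may differ:

```shell
# Sketch of the Cloud Run deploy flags the workflows pass (illustrative only).
# IMAGE_URL is a placeholder for the image built by the workflow.
gcloud run deploy flowpos-backend \
  --image=IMAGE_URL \
  --region=us-central1 \
  --project=barto-dev \
  --vpc-connector=cloudrun-vpc-connector \
  --add-cloudsql-instances=barto-dev:us-central1:flowpos-db
```

The `--add-cloudsql-instances` value uses the `PROJECT:REGION:INSTANCE` form, matching the instance details from Step 1.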
### Step 4: Test Connection (Optional)

Test the connection from your local machine:

```shell
# Install Cloud SQL Proxy (if not already installed)
brew install cloud-sql-proxy

# Authenticate
gcloud auth application-default login

# Start proxy
/opt/homebrew/bin/cloud-sql-proxy barto-dev:us-central1:flowpos-db --port=5432 &

# Test connection
psql -h 127.0.0.1 -U flowpos_staging -d flowpos_staging
```
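To script the check rather than open an interactive session, assuming the proxy from the step above is still running:

```shell
# Verify the proxied server is accepting connections.
pg_isready -h 127.0.0.1 -p 5432

# Run a trivial query non-interactively (placeholder password).
PGPASSWORD='YOUR_STAGING_PASSWORD' psql -h 127.0.0.1 -U flowpos_staging \
  -d flowpos_staging -c 'SELECT 1;'
```

A zero exit status from both commands confirms connectivity and credentials in one shot, which is handy in CI or a pre-deploy checklist.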
### Step 5: Run Migrations

After deployment, migrations run automatically. To run them manually:

Staging:

```shell
MIGRATOR_DATABASE_URL="postgresql://flowpos_staging:PASSWORD@34.60.215.167:5432/flowpos_staging" \
pnpm migration:push
```

Production:

```shell
MIGRATOR_DATABASE_URL="postgresql://flowpos_production:PASSWORD@34.60.215.167:5432/flowpos_production" \
pnpm migration:push
```
### Step 6: Deploy

- Push your changes to trigger the deployment workflow
- The workflow will:
  - Build the Docker image
  - Deploy to Cloud Run
  - Connect to Cloud SQL via the VPC connector
  - Use the `DATABASE_URL` from secrets
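The push and the resulting run can both be driven from the terminal; a sketch that assumes the `gh` CLI is installed and that deploys trigger on pushes to `main` (branch name is an assumption):

```shell
# Trigger the deployment workflow and follow its progress.
git push origin main
gh run watch
```

`gh run watch` streams the status of the most recent workflow run, so you can see the build, deploy, and migration steps complete without opening the Actions tab.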
## 🔍 Verification

After deployment, verify:

- Check the Cloud Run logs:

  ```shell
  gcloud run services logs read flowpos-backend --region=us-central1 --project=barto-dev
  ```

- Test the API endpoint:

  ```shell
  curl https://your-backend-url/api/health
  ```

- Check the database connection in the logs: look for successful database connection messages
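A stricter variant of the endpoint check that fails loudly on non-2xx responses (`your-backend-url` is left as a placeholder, as above):

```shell
# -f: exit non-zero on HTTP errors; -w: print the status code after the request.
curl -fsS -o /dev/null -w '%{http_code}\n' https://your-backend-url/api/health
```

A `200` on stdout and a zero exit code together make this safe to drop into a post-deploy smoke test.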
## 🚨 Troubleshooting

### Connection Issues

- Verify the VPC connector:

  ```shell
  gcloud compute networks vpc-access connectors describe cloudrun-vpc-connector \
    --region=us-central1 --project=barto-dev
  ```

- Check the Cloud SQL instance:

  ```shell
  gcloud sql instances describe flowpos-db --project=barto-dev
  ```

- Verify the private IP:

  ```shell
  gcloud sql instances describe flowpos-db --project=barto-dev \
    --format="value(ipAddresses[0].ipAddress)"
  ```
### Permission Issues

If you get permission errors, re-run the grant statements:

```sql
GRANT ALL PRIVILEGES ON DATABASE flowpos_staging TO flowpos_staging;
GRANT ALL PRIVILEGES ON DATABASE flowpos_production TO flowpos_production;
```
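Note that database-level grants do not cover objects inside the database, and since PostgreSQL 15 the `public` schema is no longer writable by every role by default. If permission errors persist, schema-level grants may also be needed; a sketch run through the Step 4 proxy as a privileged user:

```shell
# Schema-level grants; run as the postgres superuser against each database.
psql -h 127.0.0.1 -U postgres -d flowpos_staging \
  -c 'GRANT ALL ON SCHEMA public TO flowpos_staging;'
psql -h 127.0.0.1 -U postgres -d flowpos_production \
  -c 'GRANT ALL ON SCHEMA public TO flowpos_production;'
```

This is the usual fix when migrations fail with "permission denied for schema public" despite the database grants above.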
## 📝 Notes

- The connection uses the private IP via the VPC connector
- SSL is required for production connections (the application code already handles this)
- Staging and production share the same Cloud SQL instance but use separate databases
- You can split them into separate instances later when budget allows