# Metabase Export/Import Guide

This guide explains how to export dashboards and SQL queries from your local Metabase instance (Docker) and import them into staging and production environments.
## Overview

The export/import workflow consists of three main steps:

1. **Export** - Extract dashboards and questions from local Metabase
2. **Transform** - Update database IDs and remove local-specific metadata
3. **Import** - Import to staging/production and update the backend database
## Prerequisites

- `jq` installed (for JSON processing)
  - macOS: `brew install jq`
  - Linux: `apt-get install jq` or `yum install jq`
- Access to local Metabase (http://localhost:3002)
- Access to staging/production Metabase instances
- Metabase session tokens for authentication
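As a quick sanity check, the prerequisites above can be verified from the shell before running any of the scripts. A minimal sketch; the port and health endpoint assume the local setup described in this guide:

```shell
#!/usr/bin/env sh
# Preflight check: confirm required tools exist and local Metabase is up.
check_tool() {
  # Prints "ok: <tool>" if found on PATH, "missing: <tool>" otherwise
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

check_tool jq
check_tool curl

# Metabase exposes a health endpoint that returns {"status":"ok"} when up
curl -fsS http://localhost:3002/api/health || echo "Metabase not reachable on :3002"
```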
## Step 1: Export from Local Metabase

Export all dashboards and questions from your local Metabase instance:

```bash
./scripts/metabase/export-metabase-content.sh [output-dir] [--creator-email EMAIL] [--creator-id ID]
```
Examples:

```bash
# Export all content
./scripts/metabase/export-metabase-content.sh ./metabase-exports/local

# Export only content created by a specific user (by email)
./scripts/metabase/export-metabase-content.sh ./metabase-exports/local --creator-email luisrangelc@gmail.com

# Export only content created by a specific user (by ID)
./scripts/metabase/export-metabase-content.sh ./metabase-exports/local --creator-id 1
```
Filtering options:

- `--creator-email EMAIL`: Filter by creator's email address (slower for questions, as it needs to check each question's details)
- `--creator-id ID`: Filter by creator's user ID (faster; recommended if you know the ID)
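If you don't know your user ID for `--creator-id`, the Metabase API can tell you via `GET /api/user/current`. A minimal sketch; `MB_SESSION` is a placeholder for your session token, and the parsing helper is illustrative:

```shell
# extract_user_id reads a /api/user/current response on stdin and prints the id
extract_user_id() { jq -r '.id'; }

# Live usage against local Metabase (MB_SESSION holds your session token):
#   curl -s -H "X-Metabase-Session: $MB_SESSION" \
#     http://localhost:3002/api/user/current | extract_user_id

# Offline demonstration with a sample response:
echo '{"id": 1, "email": "user@example.com"}' | extract_user_id
```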
What it does:

- Connects to local Metabase (http://localhost:3002)
- Exports all dashboards to `output-dir/dashboards/` (individual files)
- Exports all questions and models (SQL queries) to `output-dir/questions/` (individual files)
  - Questions are saved as `question-{id}-{name}.json`
  - Models are saved as `model-{id}-{name}.json`
- Exports all collections to `output-dir/collections/` (individual files)
- Exports database information for reference
- Creates a summary file with export details
**Note:** Models are saved questions that can be used as data sources. They're exported alongside regular questions and can be filtered by creator just like questions.
Output structure:

```text
metabase-exports/local/
├── dashboards/
│   ├── all-dashboards.json
│   ├── dashboard-1-Sales_Dashboard.json
│   └── dashboard-2-Inventory_Report.json
├── questions/
│   ├── all-questions.json
│   ├── question-1-Daily_Sales.json
│   └── question-2-Product_Performance.json
├── collections/
│   ├── all-collections.json
│   ├── collection-1-Sales_Reports.json
│   └── collection-2-Inventory_Reports.json
├── databases.json
└── export-summary.txt
```
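Once the export finishes, `jq` makes it easy to spot-check what was written. A sketch, assuming the combined `all-*.json` files are JSON arrays of objects with a `name` field (matching the per-file naming above); verify against your actual export:

```shell
# list_names prints the name of each item in a combined export file (stdin)
list_names() { jq -r '.[].name'; }

# Live usage:
#   list_names < ./metabase-exports/local/dashboards/all-dashboards.json

# Offline demonstration:
echo '[{"id":1,"name":"Sales Dashboard"},{"id":2,"name":"Inventory Report"}]' | list_names
```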
Getting a session token:

The script will prompt you to enter the session token when you run it. To get the token:

1. Log in to the Metabase UI: http://localhost:3002
2. Open browser DevTools (F12)
3. Go to Application/Storage → Cookies
4. Copy the value of the `metabase.SESSION` cookie
5. When the script prompts "Enter your Metabase session token:", paste the value

**Note:** You don't need to save the token anywhere. The script will ask for it interactively when you run it. Session tokens expire when you log out, so you may need to get a fresh one if the script fails to authenticate.
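If you prefer not to dig through DevTools, Metabase can also issue a session token over its API (`POST /api/session`). A sketch with placeholder credentials; the parsing helper is illustrative:

```shell
# extract_session_id pulls the token out of a /api/session response on stdin
extract_session_id() { jq -r '.id'; }

# Live usage (credentials are placeholders):
#   MB_SESSION=$(curl -s -X POST http://localhost:3002/api/session \
#     -H "Content-Type: application/json" \
#     -d '{"username": "you@example.com", "password": "your-password"}' \
#     | extract_session_id)

# Offline demonstration with a sample response:
echo '{"id": "38f4939c-ad7f-4cbe-ae54-30946daf8593"}' | extract_session_id
```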
## Step 2: Transform Export for Target Environment

Before importing, you need to update database IDs and remove local-specific metadata:

```bash
./scripts/metabase/transform-metabase-export.sh <export-dir> <target-db-id> [target-collection-id]
```
Examples:

```bash
# For staging (assuming the staging database ID is 2)
./scripts/metabase/transform-metabase-export.sh ./metabase-exports/local 2

# For production (assuming the production database ID is 2 and the collection ID is 1)
./scripts/metabase/transform-metabase-export.sh ./metabase-exports/local 2 1
```
What it does:

- Removes IDs (`id`, `creator_id`, etc.) that shouldn't be imported
- Updates `database_id` to match the target environment
- Removes timestamps (`created_at`, `updated_at`, etc.)
- Removes `result_metadata` (it will be regenerated on import)
- Updates `collection_id` if provided
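The cleanup above can be sketched as a single `jq` filter (field list taken from this section; the real script may handle more cases):

```shell
# transform_question strips local-only fields and rewrites database_id.
# $1 = target database ID
transform_question() {
  jq --argjson db "$1" \
    'del(.id, .creator_id, .created_at, .updated_at, .result_metadata)
     | .database_id = $db'
}

# Demonstration on a minimal exported question:
echo '{"id":5,"creator_id":1,"database_id":1,"name":"Daily Sales","result_metadata":[]}' \
  | transform_question 2
```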
Finding the target database ID:

1. Log in to the target Metabase (staging/production)
2. Go to Settings → Admin → Databases
3. Note the ID of your app database (not the Metabase metadata database)
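The same information is available from the API (`GET /api/database`) if you'd rather not click through the admin UI. A sketch; `MB_SESSION` and the staging URL are placeholders, and newer Metabase versions wrap the list in a `data` key while older ones return a bare array, hence the fallback:

```shell
# list_databases prints "id<TAB>name" for each database in an /api/database response
list_databases() { jq -r '(.data? // .)[] | "\(.id)\t\(.name)"'; }

# Live usage:
#   curl -s -H "X-Metabase-Session: $MB_SESSION" \
#     https://metabase.staging.example.com/api/database | list_databases

# Offline demonstration:
echo '{"data":[{"id":2,"name":"app_db"},{"id":1,"name":"sample"}]}' | list_databases
```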
Output:

Creates a `<export-dir>_transformed` directory (e.g. `local_transformed`) with cleaned JSON files ready for import.
## Step 3: Import to Staging/Production

Import the transformed files to your target environment:

```bash
./scripts/metabase/import-metabase-content.sh <environment> <export-dir> [update-backend]
```
Examples:

```bash
# Import to staging
./scripts/metabase/import-metabase-content.sh staging ./metabase-exports/local_transformed

# Import to production and update the backend database
./scripts/metabase/import-metabase-content.sh production ./metabase-exports/local_transformed true
```
What it does:

1. Imports collections first (so questions/dashboards can reference them)
2. Imports questions (SQL queries) second
3. Imports dashboards third (with updated question IDs)
4. Creates ID mapping files (old ID → new ID) for all types
5. Optionally updates backend database tables
Output files:

- `question-id-mapping.json` - Maps old question IDs to new IDs
- `dashboard-id-mapping.json` - Maps old dashboard IDs to new IDs
- `collection-id-mapping.json` - Maps old collection IDs to new IDs
- `import-log-YYYYMMDD-HHMMSS.txt` - Detailed import log
## Step 4: Update Backend Database

After importing, update your backend database with the new dashboard IDs:

### Option A: Manual SQL Update
1. Get the new dashboard ID from the mapping file:

   ```bash
   cat ./metabase-exports/local_transformed/dashboard-id-mapping.json
   ```

2. Find your dashboard UUID in the backend:

   ```sql
   SELECT id, code, name FROM metabase_dashboard WHERE code = 'sales-dashboard';
   ```

3. Update the dashboard instance:

   ```sql
   UPDATE metabase_dashboard_instance
   SET metabase_dashboard_id = <new_dashboard_id>,
       updated_at = CURRENT_TIMESTAMP
   WHERE dashboard_id = '<dashboard_uuid>'
     AND environment = 'staging'; -- or 'production'
   ```
### Option B: Using the Helper Script

```bash
./scripts/metabase/update-backend-dashboard-ids.sh <environment> <mapping-file> [dashboard-code]
```
Example:

```bash
./scripts/metabase/update-backend-dashboard-ids.sh staging \
  ./metabase-exports/local_transformed/dashboard-id-mapping.json \
  sales-dashboard
```
This generates SQL statements that you can review and execute.
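To get a feel for what such generated SQL might look like, it can be sketched with `jq`. This assumes the mapping file is a flat `{"old_id": new_id}` object; check your actual mapping file, and always review the helper script's real output before executing anything:

```shell
# generate_update_sql emits one UPDATE statement per mapping entry.
# Sketch only: the real rows are also scoped by environment (see Option A).
generate_update_sql() {
  jq -r 'to_entries[]
    | "UPDATE metabase_dashboard_instance SET metabase_dashboard_id = \(.value) WHERE metabase_dashboard_id = \(.key | tonumber);"'
}

# Offline demonstration (hypothetical mapping: old dashboard 3 -> new 17):
echo '{"3": 17}' | generate_update_sql
```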
## Post-Import Steps

After importing, you need to:

1. **Verify dashboards in the Metabase UI:**
   - Log in to staging/production Metabase
   - Check that dashboards appear correctly
   - Verify questions execute properly

2. **Reconfigure embedding settings:**
   - Go to Dashboard → ⋮ → Embed this dashboard
   - Lock the `business_id` parameter (mark it as locked and hidden)
   - Configure other parameters as needed
   - Save the embedding settings

3. **Test embed URLs:**
   - Use your backend endpoint to generate embed URLs
   - Test that dashboards load correctly in your application
   - Verify `business_id` is locked and cannot be changed

4. **Update documentation:**
   - Document new dashboard IDs in your implementation plan
   - Update any hardcoded dashboard IDs in code/config
## Troubleshooting

### Authentication Issues

**Problem:** "Authentication failed (HTTP 401)"

**Solution:**

- Make sure you're using a valid session token
- Session tokens expire; get a fresh one if needed
- Verify you're logged in to the correct Metabase instance
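A quick way to test whether a token is still valid is to hit an authenticated endpoint and inspect the status code. A minimal helper; the function name is illustrative:

```shell
# check_session prints "valid" for HTTP 200, otherwise "invalid (HTTP <code>)".
# $1 = Metabase base URL, $2 = session token
check_session() {
  code=$(curl -s -o /dev/null -w "%{http_code}" \
    -H "X-Metabase-Session: $2" "$1/api/user/current")
  if [ "$code" = "200" ]; then echo "valid"; else echo "invalid (HTTP $code)"; fi
}

# Usage: check_session http://localhost:3002 "$MB_SESSION"
```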
### Database ID Mismatch

**Problem:** Questions fail to execute after import

**Solution:**

- Verify the target database ID is correct
- Check that the database connection exists in the target Metabase
- Re-run the transform script with the correct database ID

### Dashboard Cards Not Showing

**Problem:** The dashboard imports but cards are missing

**Solution:**

- Check that questions were imported first
- Verify that question IDs in the dashboard JSON match the imported question IDs
- Check the import log for errors

### Backend Database Update Issues

**Problem:** The backend still shows old dashboard IDs

**Solution:**

- Verify the dashboard UUID is correct
- Check that the environment matches (staging vs. production)
- Review the SQL statements before executing them
- Check database connection and permissions
## Best Practices

1. **Always test in staging first:**
   - Import to staging and verify everything works
   - Then import to production

2. **Keep backups:**
   - Export before making major changes
   - Keep exported files in version control (without session tokens)

3. **Document dashboard IDs:**
   - Maintain a mapping of dashboard codes to Metabase IDs
   - Update your implementation plan with new IDs

4. **Version control:**
   - Consider versioning your dashboard definitions
   - Use the export files as a backup/version-control mechanism

5. **Automate when possible:**
   - Once the process is stable, consider automating it
   - Create CI/CD pipelines for dashboard deployment
## Complete Workflow Example

```bash
# 1. Export from local
./scripts/metabase/export-metabase-content.sh ./metabase-exports/local

# 2. Transform for staging (database ID: 2)
./scripts/metabase/transform-metabase-export.sh ./metabase-exports/local 2

# 3. Import to staging
./scripts/metabase/import-metabase-content.sh staging ./metabase-exports/local_transformed

# 4. Update backend database
./scripts/metabase/update-backend-dashboard-ids.sh staging \
  ./metabase-exports/local_transformed/dashboard-id-mapping.json \
  sales-dashboard

# 5. Review and execute the generated SQL
cat ./metabase-exports/local_transformed/dashboard-id-mapping.json_update.sql

# 6. Repeat for production (after testing in staging)
./scripts/metabase/transform-metabase-export.sh ./metabase-exports/local 2
./scripts/metabase/import-metabase-content.sh production ./metabase-exports/local_transformed
./scripts/metabase/update-backend-dashboard-ids.sh production \
  ./metabase-exports/local_transformed/dashboard-id-mapping.json \
  sales-dashboard
```