Overview
Google Gemini provides advanced AI capabilities through Google Cloud’s Vertex AI platform, including state-of-the-art language models and multimodal features. This guide walks you through setting up Google Gemini as an AI Provider in Elementum.

Prerequisites: You’ll need a Google Cloud account with billing enabled and access to Vertex AI APIs.
Step 1: Set Up Google Cloud Project
Create or Select a Project
- Access Google Cloud Console
  - Go to console.cloud.google.com
  - Sign in with your Google account
- Create a New Project (or select an existing one)
  - Click on the project selector at the top of the page
  - Click “New Project”
  - Enter a project name (e.g., “Elementum AI Integration”)
  - Select your billing account
  - Click “Create”
- Enable Billing
  - Ensure your project has billing enabled
  - Navigate to Billing in the left sidebar
  - Link a billing account if not already configured
Enable Required APIs
Enable the necessary APIs for Vertex AI access:

1. Navigate to APIs & Services
   In the Google Cloud Console, go to APIs & Services → Library.
2. Enable Vertex AI API
   Search for “Vertex AI API” and click “Enable”. This may take a few minutes to complete.
3. Enable Cloud Resource Manager API
   Search for “Cloud Resource Manager API” and click “Enable”. This is required for project access.
4. Enable Additional APIs (Optional)
   Depending on your needs, you may also want to enable:
   - AI Platform API
   - Cloud Translation API
   - Cloud Vision API
Step 2: Create Service Account
Generate Service Account
1. Navigate to IAM & Admin
   In the Google Cloud Console, go to IAM & Admin → Service Accounts.
2. Create Service Account
   Click “Create Service Account” and fill in the details:
   - Service Account Name: Enter a descriptive name (e.g., “elementum-ai-service”)
   - Service Account ID: Will be auto-generated
   - Description: Optional description for the service account
3. Grant Permissions
   Assign the following roles to your service account:
   - Vertex AI User - Access to Vertex AI models
   - ML Engine Developer - Access to ML Engine APIs
   - BigQuery User - If integrating with BigQuery
   - Storage Object Viewer - If accessing Cloud Storage
4. Complete Creation
   Click “Continue” and then “Done” to create the service account.
Generate Service Account Key
1. Access Service Account
   In the Service Accounts list, click on your newly created service account.
2. Create Key
   Go to the Keys tab, then click “Add Key” → “Create new key”.
3. Select Key Type
   Choose JSON as the key type and click “Create”.
4. Download Key File
   The JSON key file will be downloaded automatically. Important: Store this file securely - it contains credentials for your service account.
Security Critical: The service account key file contains sensitive credentials. Store it securely and never commit it to version control or share it publicly.
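If you want to confirm the downloaded key can authenticate before configuring Elementum, a minimal sketch using the google-auth Python library is shown below. The key file name is illustrative; use the path to the file you just downloaded.

```python
# Minimal check that the downloaded service account key can authenticate.
# Assumes the google-auth package is installed; the file name is illustrative.
from google.auth.transport.requests import Request
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "elementum-ai-service.json",  # path to the key you just downloaded
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

credentials.refresh(Request())  # exchanges the key for an access token
print("Authenticated as:", credentials.service_account_email)
print("Token acquired:", credentials.valid)
```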
Step 3: Configure Gemini in Elementum
Adding the Provider
- Navigate to Organization Settings
  - In Elementum, go to Organization Settings
  - Select the Providers tab
- Create New Provider
  - Click “+ Provider”
  - Select Gemini from the provider options
- Configure Provider Settings
  - Basic Configuration
    - Provider Name: Enter a descriptive name (e.g., “Google Gemini Production”)
    - Location: Select your Google Cloud region (e.g., “us-central1”)
    - Project ID: Enter your Google Cloud project ID
    - CloudLink: Select “All CloudLinks” unless you have specific requirements
  - Service Account
    - Provide the service account JSON key you downloaded in Step 2
- Save Configuration
  - Click “Save” to create the provider
  - The system will validate your credentials and project access
Step 4: Test Your Connection
Verification Process
- Connection Test
  - After saving, Elementum will automatically test the connection
  - Look for a green checkmark indicating successful connection
- Model Availability
  - Navigate to the Services tab
  - Click “+ Service” to see available models
  - You should see Google Gemini models listed
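If the connection test fails, or you simply want to verify access independently of Elementum, a minimal sketch using the Vertex AI Python SDK (google-cloud-aiplatform) is shown below. The project ID, region, key path, and model ID are placeholders to replace with your own values.

```python
# Minimal sketch: verify that your project, region, and service account can reach a Gemini model.
# Assumes the google-cloud-aiplatform package is installed; all identifiers below are placeholders.
import vertexai
from google.oauth2 import service_account
from vertexai.generative_models import GenerativeModel

credentials = service_account.Credentials.from_service_account_file("elementum-ai-service.json")

vertexai.init(
    project="your-project-id",   # the Google Cloud project ID from Step 1
    location="us-central1",      # the region you plan to use in the provider settings
    credentials=credentials,
)

model = GenerativeModel("gemini-1.5-flash")  # illustrative model ID; use one available in your region
response = model.generate_content("Reply with the single word OK.")
print(response.text)
```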
Available Models
Your Gemini provider will give you access to the following models.

Language Models (LLMs)
| Model | Primary Use Case | Speed | Intelligence | Best For |
|---|---|---|---|---|
| Gemini 2.5 Pro | Complex reasoning and large responses | ⭐⭐ | ⭐⭐⭐⭐⭐ | Most complicated use cases, long-form content, advanced analysis |
| Gemini 2.5 | Balanced performance | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | General-purpose tasks, customer support, content creation |
| Gemini 1.5 Pro | Enhanced context understanding | ⭐⭐⭐ | ⭐⭐⭐⭐ | Large document analysis, comprehensive understanding |
| Gemini 1.5 Flash | Speed-optimized responses | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | Quick responses, simple automation, cost-effective |
Model Recommendations: Use Gemini 2.5 for most applications requiring balanced performance. Choose Gemini 2.5 Pro for the most complicated use cases and when you need large, detailed responses. Gemini 1.5 Flash is ideal for speed-critical applications.
Note: Embeddings for AI Search are handled exclusively through Snowflake Cortex. Gemini models are used for LLM services only.
Step 5: Create AI Services
With your Gemini provider configured, create AI Services:

1. Navigate to Services
   In Organization Settings, go to the Services tab.
2. Create LLM Service
   Click “+ Service” and select LLM (Language Model service), then configure:
   - Service Name: Give it a descriptive name (e.g., “Gemini Customer Support”)
   - Provider: Select your configured Gemini provider
   - Model: Select from available Gemini models
   - Cost Per Million Tokens: Set for cost tracking (optional)
3. Test Services
   Use the built-in testing interface to verify services work correctly.
Usage Guidelines
Cost Management
Google Cloud charges for Vertex AI usage. To manage costs:

Monitor Usage
- Cloud Console: Monitor usage in the Google Cloud Console
- Billing Alerts: Set up billing alerts for cost control
- Quotas: Review and adjust API quotas as needed
- Usage Reports: Review usage patterns regularly

Optimize Usage
- Model Selection: Choose appropriate models for tasks
- Batch Requests: Process multiple requests together
- Caching: Cache responses when possible (see the sketch below)
- Request Optimization: Minimize unnecessary API calls
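As a simple illustration of response caching, the sketch below keeps an in-process dictionary keyed by a hash of the prompt so identical prompts are only billed once. The helper name is hypothetical, and a production system would typically use a shared or persistent cache instead.

```python
# Minimal in-process response cache keyed by prompt hash (illustrative only).
import hashlib

_response_cache = {}

def cached_generate(model, prompt):
    """Return a cached answer for a repeated prompt instead of re-calling the model."""
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    if key not in _response_cache:
        # Assumes `model` is a Vertex AI GenerativeModel (see the earlier sketch).
        _response_cache[key] = model.generate_content(prompt).text
    return _response_cache[key]
```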
Best Practices
- Model Selection
  - Use Gemini 2.5 for most general-purpose tasks and customer support
  - Use Gemini 2.5 Pro for the most complicated use cases and large responses
  - Use Gemini 1.5 Flash for speed-critical applications
- Security
  - Regularly rotate service account keys
  - Follow the principle of least privilege
  - Monitor service account usage
  - Use appropriate IAM roles for access control
- Performance
  - Select regions closest to your users
  - Choose Gemini 2.5 Pro for tasks requiring detailed analysis
  - Use Gemini 1.5 Flash for high-volume, simple tasks
  - Implement retry logic for transient errors (see the sketch below)
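One way to implement retry logic for transient errors is exponential backoff with jitter around the model call, as sketched below. The retried exception types and limits are illustrative choices, not the only reasonable ones.

```python
# Retry transient Vertex AI failures (rate limits, brief outages) with exponential backoff and jitter.
import random
import time

from google.api_core import exceptions

TRANSIENT_ERRORS = (
    exceptions.ResourceExhausted,    # quota / rate limit
    exceptions.ServiceUnavailable,   # temporary outage
    exceptions.DeadlineExceeded,     # request timed out
)

def generate_with_retry(model, prompt, max_attempts=5):
    """Call model.generate_content, retrying only on transient errors."""
    for attempt in range(max_attempts):
        try:
            return model.generate_content(prompt)
        except TRANSIENT_ERRORS:
            if attempt == max_attempts - 1:
                raise
            time.sleep((2 ** attempt) + random.random())  # 1s, 2s, 4s, ... plus jitter
```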
Troubleshooting
Authentication Errors

Symptoms: Service account authentication failures

Common Causes:
- Invalid service account key
- Insufficient permissions
- Disabled APIs

Resolution:
- Verify the service account key is valid JSON
- Check service account roles and permissions
- Ensure required APIs are enabled
- Regenerate the service account key if needed
API Access Issues

Symptoms: Cannot access Vertex AI APIs

Common Causes:
- APIs not enabled
- Billing not configured
- Regional restrictions

Resolution:
- Enable the Vertex AI API in the Google Cloud Console
- Verify billing is enabled and active
- Check regional availability of services
- Review project quotas and limits
Model Unavailable

Symptoms: Expected Gemini models are not listed when creating a service. Check the regional availability of the model for your selected Location, and verify that the Vertex AI API is enabled and your provider credentials are valid.
Advanced Configuration
Multi-Region Setup
For global deployments:
- Region Selection: Choose regions closest to your users
- Data Residency: Consider data residency requirements
- Failover: Implement failover strategies across regions (see the sketch below)
- Compliance: Ensure regional compliance requirements are met
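A simple failover strategy is to try your primary region first and fall back to a secondary one when a call fails, as in the sketch below. The regions, model ID, and function name are illustrative.

```python
# Illustrative region failover: try regions in order until one serves the request.
import vertexai
from google.api_core import exceptions
from vertexai.generative_models import GenerativeModel

REGIONS = ["us-central1", "europe-west4"]  # primary first, then fallback (illustrative)

def generate_with_failover(project_id, prompt):
    last_error = None
    for region in REGIONS:
        try:
            vertexai.init(project=project_id, location=region)
            model = GenerativeModel("gemini-1.5-flash")  # illustrative model ID
            return model.generate_content(prompt).text
        except (exceptions.ServiceUnavailable, exceptions.ResourceExhausted) as err:
            last_error = err  # try the next region
    raise RuntimeError("All regions failed") from last_error
```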
Custom Model Access
For specialized or private models:
- Model Registration: Register custom models in Vertex AI
- Access Control: Configure proper IAM permissions
- Performance Tuning: Optimize for your specific use case
- Monitoring: Set up custom monitoring and alerting
Integration with Other Google Services
BigQuery Integration
- Direct Analysis: Run AI models directly on BigQuery data
- Data Pipeline: Integrate with your data pipeline
- Cost Optimization: Reduce data movement costs
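As one way to combine the two services from client code, the sketch below pulls a small result set from BigQuery and asks a Gemini model to summarize it. The project ID, dataset, query, and model ID are placeholders, and it assumes vertexai.init() has already been called as in the earlier sketch.

```python
# Illustrative BigQuery + Gemini combination: query rows, then summarize them with a model.
from google.cloud import bigquery
from vertexai.generative_models import GenerativeModel

bq_client = bigquery.Client(project="your-project-id")  # placeholder project ID
rows = bq_client.query(
    "SELECT ticket_id, summary FROM support.tickets LIMIT 20"  # placeholder query
).result()

ticket_text = "\n".join(f"{row['ticket_id']}: {row['summary']}" for row in rows)

model = GenerativeModel("gemini-1.5-flash")  # illustrative model ID
response = model.generate_content(f"Summarize the main themes in these tickets:\n{ticket_text}")
print(response.text)
```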
Cloud Storage Integration
- File Processing: Process files stored in Cloud Storage
- Batch Processing: Handle large-scale file processing
- Backup & Recovery: Secure storage for AI assets
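For file processing, Gemini models on Vertex AI can read files referenced by Cloud Storage URIs. The sketch below assumes vertexai.init() has already been called and uses placeholder bucket, file, and model names.

```python
# Illustrative Cloud Storage file processing: pass a gs:// URI to a Gemini model.
from vertexai.generative_models import GenerativeModel, Part

document = Part.from_uri(
    "gs://your-bucket/reports/q3-summary.pdf",  # placeholder Cloud Storage URI
    mime_type="application/pdf",
)

model = GenerativeModel("gemini-1.5-flash")  # illustrative model ID
response = model.generate_content([document, "Summarize the key findings in this document."])
print(response.text)
```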
Security Considerations
Service Account Security
- Key Management: Regularly rotate service account keys
- Access Control: Use IAM roles for fine-grained access
- Monitoring: Monitor service account usage
- Audit Logging: Enable audit logging for security
Data Privacy
- Data Handling: Understand Google’s data handling policies
- Compliance: Ensure compliance with data regulations
- Encryption: Data encrypted in transit and at rest
- Access Logs: Monitor data access patterns
Next Steps
With Google Gemini configured as your AI Provider:

- Create AI Services: Set up LLM services using Gemini models
- Configure Snowflake Cortex: Set up Snowflake Cortex for AI Search and embeddings
- Build Agents: Create conversational AI assistants using Gemini models
- Use AI Actions: Add Gemini AI capabilities to your automation workflows
Google Gemini provides cutting-edge AI capabilities with multimodal support and seamless integration with Google Cloud services. Once configured, you can leverage these advanced models across all of Elementum’s AI features.