Get your CloudLink connection up and running with Databricks Lakebase Postgres. This guide covers creating a native PostgreSQL role and connecting to the managed PostgreSQL database.

Secure Direct Access

Elementum provides secure, in-place data access to your Databricks Lakebase Postgres database:

Authentication

Elementum authenticates using a native PostgreSQL role and password. You retain full control over the data and can terminate access at any time.

In-Place Access

Data stays in your Databricks instance. No data is copied or moved to external systems.

How It Works

1

Databricks Side

A Lakebase Postgres instance provides a managed PostgreSQL database. Service principal permissions control access to the database.
2

Elementum Side

Elementum connects via standard PostgreSQL protocol (port 5432) using native Postgres credentials. Further access restrictions are applied through Elementum access policies.

What is Lakebase Postgres?

Lakebase Postgres is a fully managed, cloud-native PostgreSQL database within Databricks. Key features:

Managed PostgreSQL

A complete PostgreSQL database with its own compute and storage, not just a gateway to Delta tables.

Serverless Scaling

Automatically scales compute based on workload. Supports scale-to-zero for cost efficiency.

Unity Catalog Integration

Optionally register managed database catalogs to sync with Unity Catalog tables.

Standard Postgres

Uses standard PostgreSQL syntax, permissions, and tools. Connect with any Postgres client.

Security Architecture

Data Encryption

At Rest:
  • Database encrypted using industry-standard algorithms
  • Credentials encrypted and never exposed
In Transit:
  • All traffic encrypted using TLS (sslmode=require)
  • Secure connection via PostgreSQL protocol
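Once connected, you can confirm the session is actually TLS-encrypted using Postgres's built-in pg_stat_ssl view, which reports the encryption status of your own backend:

-- Check whether the current session is TLS-encrypted
SELECT ssl, version AS tls_version
FROM pg_stat_ssl
WHERE pid = pg_backend_pid();

The ssl column should be true for a Lakebase connection made with sslmode=require.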

Access Control

Authentication:
  • Native PostgreSQL role with password
  • No token expiration (persistent connections)
Authorization:
  • PostgreSQL role-based permissions
  • Managed at the Lakebase instance level

Setting up Elementum Access in Databricks

Prerequisites

Before starting:
  1. Ensure you have Workspace Admin or Account Admin access
  2. Have access to create Lakebase Postgres instances

Setup Steps Overview

1

Create Lakebase Postgres Instance

Set up the managed PostgreSQL database.
2

Enable Native Postgres Login

Enable password-based authentication for Postgres roles.
3

Create Elementum Role

Create a Postgres role with password for Elementum to use.
4

Grant Permissions

Grant the role appropriate access to your data.
5

Create Platform Schema

Create an empty schema for Elementum platform operations.

Run These Steps in Databricks

1

Create Lakebase Postgres Instance

Set up the managed PostgreSQL database:
  1. In your Databricks workspace, go to Compute
  2. Click the Lakebase Postgres tab
  3. Click Create
  4. Configure the instance:
    • Name: elementum-lakebase
    • Instance size (Capacity Unit): 2 (adjust based on workload)
    • Serverless usage policy: None (or configure as needed)
  5. Click Create
  6. Wait for the instance to show Status: Available
2

Enable Native Postgres Login

Enable password-based authentication for the Lakebase instance:
  1. On the Lakebase instance page, click Edit in the upper-right
  2. Turn on Enable Postgres Native Role Login
  3. Click Save
This allows creating Postgres roles with passwords that don’t expire, which is required for persistent connections like Elementum CloudLink.
3

Get Connection Details

Note the connection parameters from the Lakebase instance:
  1. Click on the instance name (elementum-lakebase)
  2. Go to the Connection details tab
  3. Note the Connection parameters:
    • host: instance-<uuid>.database.cloud.databricks.com
    • dbname: databricks_postgres (default database)
    • port: 5432
The hostname format is: instance-<instance-id>.database.cloud.databricks.com
4

Create Elementum Role with Password

Connect to the Lakebase instance via the New Query button or psql, and create a role for Elementum:
-- Create the Elementum role with a strong password
CREATE ROLE elementum LOGIN PASSWORD 'your-strong-password-here';

-- Grant database-level access
GRANT ALL PRIVILEGES ON DATABASE databricks_postgres TO elementum;
GRANT CONNECT ON DATABASE databricks_postgres TO elementum;
Important: Use a strong, unique password. Store it securely; you'll need it when configuring the Elementum CloudLink connection.
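To confirm the role was created correctly and can log in, query the standard pg_roles catalog:

-- Verify the elementum role exists and has LOGIN
SELECT rolname, rolcanlogin
FROM pg_roles
WHERE rolname = 'elementum';

rolcanlogin should be true; if it is false, recreate the role with the LOGIN attribute.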
5

Grant Permissions to Elementum Role

Grant the Elementum role appropriate access to your data:

Choose Your Permission Level

For the simplest setup, grant superuser privileges:
-- Grant superuser role to elementum
GRANT databricks_superuser TO elementum;
Note: This grants full access to all schemas and tables. Use specific permissions below if you need tighter access control.

Permission Examples

-- Grant full access to the sales schema
GRANT USAGE ON SCHEMA sales TO elementum;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA sales 
  TO elementum;

-- Apply to future tables as well
ALTER DEFAULT PRIVILEGES IN SCHEMA sales 
  GRANT SELECT, INSERT, UPDATE, DELETE ON TABLES TO elementum;
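If Elementum only needs to read from a schema, a read-only grant is a tighter alternative to the full read/write example above (shown here for the same example sales schema):

-- Read-only access to the sales schema
GRANT USAGE ON SCHEMA sales TO elementum;
GRANT SELECT ON ALL TABLES IN SCHEMA sales TO elementum;

-- Apply read-only access to future tables as well
ALTER DEFAULT PRIVILEGES IN SCHEMA sales
  GRANT SELECT ON TABLES TO elementum;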
6

Create Platform Schema for Elementum

Create the platform schema and grant the Elementum role full access:
-- Create the platform schema (MUST BE EMPTY - for Elementum internal use only)
CREATE SCHEMA IF NOT EXISTS elementum_platform;

-- Grant the elementum role full access to the platform schema
GRANT USAGE ON SCHEMA elementum_platform TO elementum;
GRANT ALL PRIVILEGES ON SCHEMA elementum_platform TO elementum;

-- Grant privileges on all existing objects
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA elementum_platform TO elementum;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA elementum_platform TO elementum;
GRANT ALL PRIVILEGES ON ALL FUNCTIONS IN SCHEMA elementum_platform TO elementum;

-- Grant privileges on future objects (required for Elementum to create tables)
ALTER DEFAULT PRIVILEGES IN SCHEMA elementum_platform 
  GRANT ALL ON TABLES TO elementum;
ALTER DEFAULT PRIVILEGES IN SCHEMA elementum_platform 
  GRANT ALL ON SEQUENCES TO elementum;
ALTER DEFAULT PRIVILEGES IN SCHEMA elementum_platform 
  GRANT ALL ON FUNCTIONS TO elementum;
Critical: The elementum_platform schema is for Elementum’s internal operations. Do NOT put your data tables here. Your data tables should be in separate schemas.
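You can sanity-check the grants with Postgres's built-in privilege functions; both queries should return true:

-- Confirm the elementum role can use and create objects in the platform schema
SELECT has_schema_privilege('elementum', 'elementum_platform', 'USAGE');
SELECT has_schema_privilege('elementum', 'elementum_platform', 'CREATE');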
7

(Optional) Register Managed Database Catalog

If you want to access Unity Catalog tables through Lakebase:
  1. On the Lakebase instance page, go to the Catalogs tab
  2. Click Create managed database catalog
  3. Select the Unity Catalog you want to expose
  4. This creates a bridge between your Delta tables and the Postgres interface
Managed database catalogs allow you to query Unity Catalog tables using standard PostgreSQL syntax through your Lakebase instance.
After completing the Databricks setup, configure the connection in Elementum:
1

Navigate to CloudLink Settings

Go to Settings > Cloud Links > Add Connection and select Databricks.
2

Enter Connection Details

Fill in the connection form:
  • Name: Production Databricks (a descriptive name for your connection)
  • Hostname: instance-<uuid>.database.cloud.databricks.com (your Lakebase Postgres hostname)
  • Port: 5432 (PostgreSQL port)
  • Database: databricks_postgres (the Lakebase database name)
  • Schema: elementum_platform (the empty platform schema, NOT your data schema)
  • Username: elementum (the Postgres role name you created)
  • Password: your-strong-password-here (the password you set for the role)
Critical - Schema Field: Enter the empty platform schema you created (e.g., elementum_platform), NOT your data schema. If you enter your data schema here, it will be hidden from workflow building and you won’t be able to access your data.
3

Test the Connection

Click Save to test the connection. The system will verify:
  • Network connectivity
  • PostgreSQL authentication
  • Schema access
If the connection saves successfully, your setup is complete.
4

Configure Data Access

After saving the connection:
  1. Select Tables: Choose which tables to expose in Elementum
  2. Configure Field Mapping: Map columns to Elementum field types
  3. Set Primary Key: Identify the unique identifier column for each table
  4. Configure Permissions: Set which users/roles can access the data

Verification and Testing

After completing the setup, verify everything is working:
1

Test Connection via psql

You can test the connection using the psql command from the Connection details tab:
psql "host=instance-<uuid>.database.cloud.databricks.com \
     user=elementum \
     dbname=databricks_postgres \
     port=5432 \
     sslmode=require"
When prompted for a password, enter the password you set for the elementum role.
2

Verify Schema Access

Once connected, verify the platform schema exists:
\dn  -- List schemas

-- Or:
SELECT schema_name FROM information_schema.schemata;
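To check the platform schema specifically, including the elementum role's access to it:

-- Confirm the platform schema exists and the role can use it
SELECT schema_name FROM information_schema.schemata
WHERE schema_name = 'elementum_platform';

SELECT has_schema_privilege('elementum', 'elementum_platform', 'USAGE');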
3

Test in Elementum

  1. Verify the connection shows as Connected in CloudLink settings
  2. Browse to the integrated tables in Elementum
  3. Verify data loads correctly
  4. Test creating/updating a record (if write access was granted)

Troubleshooting

Cannot Connect from Elementum:
  • Verify the Lakebase instance is in “Available” status
  • Confirm the password is correct
  • Check that the Postgres role has been created and granted appropriate permissions
  • Verify the hostname is correct (should be instance-<uuid>.database.cloud.databricks.com)
“Authentication failed” Error:
  • Verify the role name and password are correct
  • Ensure Enable Postgres Native Role Login is turned on
  • Confirm the role was created with LOGIN privilege
Tables not visible in Elementum:
  • Most common cause: You entered your data schema in the Schema field instead of the platform schema
  • Verify the Postgres role has the appropriate permissions
  • Check that tables exist in the database
  • If using managed database catalogs, verify the catalog is registered
“Permission denied” Errors:
  • Verify the Postgres role has appropriate GRANT permissions
  • Check that databricks_superuser or appropriate roles are granted
  • For specific table access, verify GRANT statements have been run
Instance not starting:
  • Check if you’ve reached capacity limits
  • Verify workspace has Lakebase Postgres enabled
  • Contact Databricks support if the instance stays in “Starting” state
Instance suspended:
  • Lakebase instances can scale to zero when idle
  • The instance will automatically resume when a connection is made
  • First connection after suspension may take a few seconds

Security Best Practices

Principle of Least Privilege

  • Grant only necessary PostgreSQL roles
  • Use specific table grants instead of superuser where possible
  • Regularly audit granted permissions
  • Remove unused Postgres roles
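One way to audit what the elementum role can currently do is to query the standard information_schema.role_table_grants view:

-- List table-level privileges currently granted to elementum
SELECT table_schema, table_name, privilege_type
FROM information_schema.role_table_grants
WHERE grantee = 'elementum'
ORDER BY table_schema, table_name;

Note that privileges inherited through granted roles (such as databricks_superuser) are not listed against elementum directly.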

Credential Security

  • Rotate passwords periodically (recommended: every 90 days)
  • Store passwords securely (never in code)
  • Use separate Postgres roles for different environments
  • Never share credentials outside authorized personnel
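Rotating the password is a single statement; update the CloudLink connection in Elementum with the new value afterward (the password below is a placeholder):

-- Rotate the elementum role's password
ALTER ROLE elementum WITH PASSWORD 'new-strong-password-here';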

Network Security

  • Lakebase uses TLS encryption by default (sslmode=require)
  • Consider workspace-level IP access controls
  • Monitor connection logs regularly
  • Set up alerts for suspicious activity

Monitoring

  • Review query history via Lakebase Metrics tab
  • Monitor compute costs
  • Set up cost alerts in Databricks
  • Track data access patterns

Next Steps

Configure Apps

Set up your first app in Elementum using your connected data

Create Automations

Build workflows that act on your Databricks data

Setup AI Features

Enable AI-powered search, automations, and insights

Data Best Practices

Optimize your data models for Elementum

Additional Resources

CloudLink Overview

Learn more about CloudLink architecture

Databricks Documentation

Official Lakebase Postgres documentation

Get Support

Contact our team for setup assistance

This guide reflects the latest Databricks Lakebase Postgres and Elementum best practices. For additional assistance, contact support@elementum.io.