databricks-enterprise-rbac

Configure Databricks enterprise SSO, Unity Catalog RBAC, and organization management. Use when implementing SSO integration, configuring role-based permissions, or setting up organization-level controls with Unity Catalog. Trigger with phrases like "databricks SSO", "databricks RBAC", "databricks enterprise", "unity catalog permissions", "databricks SCIM".

25 stars

Best use case

databricks-enterprise-rbac works best when you need a repeatable AI agent workflow rather than a one-off prompt: it packages the SSO, Unity Catalog RBAC, and organization-management steps described above into a skill the agent can re-run.

Teams using databricks-enterprise-rbac should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/databricks-enterprise-rbac/SKILL.md --create-dirs "https://raw.githubusercontent.com/ComeOnOliver/skillshub/main/skills/jeremylongshore/claude-code-plugins-plus-skills/databricks-enterprise-rbac/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/databricks-enterprise-rbac/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How databricks-enterprise-rbac Compares

| Feature | databricks-enterprise-rbac | Standard Approach |
|---------|----------------------------|-------------------|
| Platform Support | Not specified | Limited / varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

It configures Databricks enterprise SSO, Unity Catalog RBAC, and organization management: SCIM group provisioning, catalog and schema privilege grants, cluster policies, SQL warehouse permissions, row-level security and column masking, service principals for CI/CD, and audit queries.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Databricks Enterprise RBAC

## Overview
Implement enterprise access control using Unity Catalog privileges, SCIM-provisioned groups, workspace entitlements, cluster policies, and audit logging. Unity Catalog uses a three-level namespace (`catalog.schema.object`) with privilege inheritance: privileges granted on a catalog are inherited by its schemas and objects, and reading a table requires `USE CATALOG` on the catalog, `USE SCHEMA` on the schema, and an object privilege such as `SELECT`. Account-level SCIM syncs groups from your IdP (Okta, Azure AD, Google Workspace).
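
The inheritance rule can be sketched as a small resolution function — a local model for illustration, not a Databricks API — that unions the privileges granted at each level of the three-level namespace:

```python
# Local model of Unity Catalog privilege inheritance (illustration only):
# privileges granted on a catalog or schema are inherited by child objects.
def effective_privileges(grants: dict[str, set[str]], full_name: str) -> set[str]:
    """Union the grants on e.g. `analytics`, `analytics.gold`, `analytics.gold.sales`."""
    parts = full_name.split(".")
    resolved: set[str] = set()
    for i in range(1, len(parts) + 1):
        resolved |= grants.get(".".join(parts[:i]), set())
    return resolved

grants = {
    "analytics": {"USE CATALOG"},
    "analytics.gold": {"USE SCHEMA", "SELECT"},
}
print(sorted(effective_privileges(grants, "analytics.gold.sales")))
# ['SELECT', 'USE CATALOG', 'USE SCHEMA']
```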

## Prerequisites
- Databricks Premium or Enterprise with Unity Catalog enabled
- Account admin access for SCIM and group management
- Identity Provider supporting SAML 2.0 and SCIM 2.0

## Instructions

### Step 1: Provision Groups via SCIM API
Sync groups from your IdP at the account level. Max 10,000 users + service principals and 5,000 groups per account.

```bash
# Create account-level groups that map to IdP teams
databricks account groups create --json '{
  "displayName": "data-engineers",
  "entitlements": [
    {"value": "workspace-access"},
    {"value": "databricks-sql-access"}
  ]
}'

databricks account groups create --json '{
  "displayName": "data-analysts",
  "entitlements": [
    {"value": "workspace-access"},
    {"value": "databricks-sql-access"}
  ]
}'

databricks account groups create --json '{
  "displayName": "ml-engineers",
  "entitlements": [
    {"value": "workspace-access"},
    {"value": "databricks-sql-access"},
    {"value": "allow-cluster-create"}
  ]
}'
```

```python
# Assign groups to workspaces
from databricks.sdk import AccountClient
from databricks.sdk.service.iam import WorkspacePermission

acct = AccountClient()

# Find the target workspace by name
workspaces = list(acct.workspaces.list())
prod_ws = next(ws for ws in workspaces if ws.workspace_name == "production")

# Look up the account-level group ID by display name (SCIM filter syntax)
group = next(acct.groups.list(filter='displayName eq "data-engineers"'))

# Assign the group to the workspace with USER permission
acct.workspace_assignment.update(
    workspace_id=prod_ws.workspace_id,
    principal_id=int(group.id),
    permissions=[WorkspacePermission.USER],
)
```

### Step 2: Unity Catalog Privilege Hierarchy
```sql
-- Privilege model: CATALOG > SCHEMA > TABLE/VIEW/FUNCTION
-- Accessing an object requires USE CATALOG and USE SCHEMA on its parents;
-- privileges granted at a higher level are inherited by child objects.

-- Data Engineers: full ETL access
GRANT USE CATALOG ON CATALOG analytics TO `data-engineers`;
GRANT CREATE SCHEMA ON CATALOG analytics TO `data-engineers`;
GRANT USE SCHEMA, CREATE TABLE, MODIFY, SELECT ON SCHEMA analytics.bronze TO `data-engineers`;
GRANT USE SCHEMA, CREATE TABLE, MODIFY, SELECT ON SCHEMA analytics.silver TO `data-engineers`;
GRANT USE SCHEMA, SELECT ON SCHEMA analytics.gold TO `data-engineers`;

-- Data Analysts: read-only curated data
GRANT USE CATALOG ON CATALOG analytics TO `data-analysts`;
GRANT USE SCHEMA, SELECT ON SCHEMA analytics.gold TO `data-analysts`;

-- ML Engineers: full ML lifecycle
GRANT USE CATALOG ON CATALOG analytics TO `ml-engineers`;
GRANT USE SCHEMA, SELECT ON SCHEMA analytics.gold TO `ml-engineers`;
GRANT ALL PRIVILEGES ON SCHEMA analytics.ml_features TO `ml-engineers`;
GRANT ALL PRIVILEGES ON SCHEMA analytics.ml_models TO `ml-engineers`;

-- Service Principal: CI/CD automation (ALL PRIVILEGES on the catalog is inherited downward)
GRANT ALL PRIVILEGES ON CATALOG analytics TO `cicd-service-principal`;
```
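
As the role matrix grows, generating grants from a mapping keeps them reviewable in code review. A minimal sketch — the role names and privileges here mirror the examples above and are not a fixed API:

```python
# Generate idempotent GRANT statements from a role -> securable -> privileges map.
ROLE_GRANTS = {
    "data-analysts": {
        "CATALOG analytics": ["USE CATALOG"],
        "SCHEMA analytics.gold": ["USE SCHEMA", "SELECT"],
    },
}

def render_grants(role_grants: dict) -> list[str]:
    """Render one GRANT statement per (principal, securable) pair."""
    stmts = []
    for principal, securables in role_grants.items():
        for securable, privileges in securables.items():
            stmts.append(f"GRANT {', '.join(privileges)} ON {securable} TO `{principal}`;")
    return stmts

for stmt in render_grants(ROLE_GRANTS):
    print(stmt)
```

Because `GRANT` is idempotent, the rendered statements can be re-run safely whenever the mapping changes.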

### Step 3: Cluster Policies by Role
```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Analyst policy: small all-purpose clusters with aggressive auto-termination
analyst_policy = w.cluster_policies.create(
    name="analyst-compute-policy",
    definition="""{
        "cluster_type": {
            "type": "allowlist",
            "values": ["all-purpose"],
            "hidden": false
        },
        "autotermination_minutes": {
            "type": "range",
            "minValue": 10,
            "maxValue": 30,
            "defaultValue": 15
        },
        "num_workers": {
            "type": "range",
            "minValue": 0,
            "maxValue": 4
        },
        "node_type_id": {
            "type": "allowlist",
            "values": ["m5.xlarge", "m5.2xlarge"]
        }
    }""",
)

# Assign to analysts group
w.cluster_policies.set_permissions(
    cluster_policy_id=analyst_policy.policy_id,
    access_control_list=[{
        "group_name": "data-analysts",
        "all_permissions": [{"permission_level": "CAN_USE"}],
    }],
)
```
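
Cluster policies are enforced server-side, but validating a candidate cluster spec against the same definition in CI catches drift early. A simplified local checker covering the `fixed`, `range`, and `allowlist` rule types used above (nested paths and other rule types are omitted):

```python
import json

def violates(policy_json: str, cluster_spec: dict) -> list[str]:
    """Return policy violations for a flat cluster spec (simplified local model)."""
    errors = []
    for attr, rule in json.loads(policy_json).items():
        value = cluster_spec.get(attr)
        if rule["type"] == "fixed" and value != rule["value"]:
            errors.append(f"{attr} must be {rule['value']!r}")
        elif rule["type"] == "range" and value is not None and not (
            rule["minValue"] <= value <= rule["maxValue"]
        ):
            errors.append(f"{attr} must be in [{rule['minValue']}, {rule['maxValue']}]")
        elif rule["type"] == "allowlist" and value is not None and value not in rule["values"]:
            errors.append(f"{attr} not in allowlist")
    return errors

policy = '{"num_workers": {"type": "range", "minValue": 0, "maxValue": 4}}'
print(violates(policy, {"num_workers": 8}))   # one violation
print(violates(policy, {"num_workers": 2}))   # []
```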

### Step 4: SQL Warehouse Permissions
```bash
# Grant warehouse access by group
databricks permissions update sql/warehouses/$WAREHOUSE_ID --json '[
  {"group_name": "data-analysts", "permission_level": "CAN_USE"},
  {"group_name": "data-engineers", "permission_level": "CAN_MANAGE"},
  {"group_name": "ml-engineers", "permission_level": "CAN_USE"}
]'
```

### Step 5: Row-Level Security and Column Masking
```sql
-- Row filter: analysts only see their department's data
-- NOTE: current_user_department() is a placeholder — implement it as a SQL UDF
-- that resolves the caller's department, e.g. via a small mapping table.
CREATE OR REPLACE FUNCTION analytics.gold.dept_filter(dept STRING)
  RETURN IF(IS_ACCOUNT_GROUP_MEMBER('data-admins'), true,
            dept = current_user_department());

ALTER TABLE analytics.gold.sales
  SET ROW FILTER analytics.gold.dept_filter ON (department);

-- Column mask: hide email from non-engineers
CREATE OR REPLACE FUNCTION analytics.gold.mask_email(email STRING)
  RETURN IF(IS_ACCOUNT_GROUP_MEMBER('data-engineers'), email,
            REGEXP_REPLACE(email, '(.).*@', '$1***@'));

ALTER TABLE analytics.gold.customers
  ALTER COLUMN email SET MASK analytics.gold.mask_email;
```
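
The masking regex is easy to sanity-check locally before applying it table-wide. The same pattern in Python — note that `re.sub` uses `\1` for the backreference where Spark SQL's `REGEXP_REPLACE` uses `$1`:

```python
import re

def mask_email(email: str) -> str:
    # Same pattern as the SQL column mask: keep the first character of the
    # local part, replace the rest with ***, leave the domain intact.
    return re.sub(r"(.).*@", r"\1***@", email)

print(mask_email("alice@example.com"))  # a***@example.com
```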

### Step 6: Service Principal for Automation
```python
from databricks.sdk import AccountClient

acct = AccountClient()

# Create service principal
sp = acct.service_principals.create(
    display_name="cicd-pipeline",
    active=True,
)

# Generate OAuth secret
secret = acct.service_principal_secrets.create(
    service_principal_id=sp.id,
)
print(f"Client ID: {sp.application_id}")
print(f"Secret: {secret.secret}")  # Store securely — shown only once
```
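
The client ID and secret drive OAuth machine-to-machine auth. The Databricks SDK handles the token exchange automatically when `DATABRICKS_CLIENT_ID` and `DATABRICKS_CLIENT_SECRET` are set, but the underlying request — client credentials posted to the `/oidc/v1/token` endpoint with HTTP Basic auth — can be sketched locally (the host below is a placeholder; nothing is sent):

```python
import base64

def m2m_token_request(host: str, client_id: str, client_secret: str) -> dict:
    """Build (but do not send) the OAuth client-credentials request for a service principal."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": f"{host}/oidc/v1/token",
        "headers": {"Authorization": f"Basic {creds}"},
        "data": {"grant_type": "client_credentials", "scope": "all-apis"},
    }

req = m2m_token_request("https://example.cloud.databricks.com", "client-id", "client-secret")
print(req["url"])
```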

### Step 7: Audit Access Patterns
```sql
-- Who granted or revoked privileges in the last 7 days
SELECT event_time, user_identity.email AS actor,
       action_name, request_params
FROM system.access.audit
WHERE (action_name LIKE '%Grant%' OR action_name LIKE '%Revoke%')
  AND event_date > current_date() - INTERVAL 7 DAYS
ORDER BY event_time DESC;

-- Excessive privilege detection
SELECT user_identity.email, action_name, COUNT(*) AS access_count
FROM system.access.audit
WHERE event_date > current_date() - INTERVAL 30 DAYS
  AND service_name = 'unityCatalog'
GROUP BY user_identity.email, action_name
HAVING COUNT(*) > 100
ORDER BY access_count DESC;
```
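
The same excessive-privilege aggregation can run over audit rows exported from `system.access.audit` (e.g., fetched through a SQL warehouse). A local sketch of the `GROUP BY ... HAVING` logic, assuming rows arrive as dicts with `email` and `action_name` keys:

```python
from collections import Counter

def flag_heavy_actors(rows: list[dict], threshold: int = 100) -> list[tuple]:
    """Mirror of GROUP BY (email, action_name) HAVING COUNT(*) > threshold."""
    counts = Counter((r["email"], r["action_name"]) for r in rows)
    return sorted(
        [(key, n) for key, n in counts.items() if n > threshold],
        key=lambda kv: kv[1],
        reverse=True,
    )

rows = [{"email": "a@x.com", "action_name": "getTable"}] * 150
print(flag_heavy_actors(rows))  # [(('a@x.com', 'getTable'), 150)]
```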

## Output
- Account-level groups provisioned via SCIM matching IdP teams
- Unity Catalog grants enforcing least-privilege across medallion layers
- Cluster policies restricting compute by role (analysts vs engineers)
- SQL warehouse permissions assigned per group
- Row-level security and column masking for PII protection
- Service principal for CI/CD with OAuth M2M
- Audit queries for ongoing compliance monitoring

## Error Handling
| Issue | Cause | Solution |
|-------|-------|----------|
| `PERMISSION_DENIED` on table | Missing `USE CATALOG`/`USE SCHEMA` on parents | Grant the `USE` privilege at each namespace level |
| SCIM sync fails | Expired bearer token | Regenerate account-level PAT or use OAuth |
| Can't create cluster | No matching cluster policy | Assign a policy to the user's group |
| Can't see SQL warehouse | Missing `CAN_USE` grant | Add warehouse permission for the group |
| Row filter too slow | Complex subquery in filter function | Materialize permissions in a small lookup table |

## Examples

### Verify Current Permissions
```sql
SHOW GRANTS ON CATALOG analytics;
SHOW GRANTS `data-analysts` ON SCHEMA analytics.gold;
SHOW GRANTS ON TABLE analytics.gold.sales;
```

### Permission Matrix Reference
| Role | Bronze | Silver | Gold | ML | Clusters | Warehouses |
|------|--------|--------|------|----|----------|------------|
| Data Engineer | Read/Write | Read/Write | Read | - | Create (policy) | Use/Manage |
| Data Analyst | - | - | Read | - | Single-node (policy) | Use |
| ML Engineer | - | Read | Read | Read/Write | Create (policy) | Use |
| Admin | Full | Full | Full | Full | Unrestricted | Manage |
| CI/CD SP | Full | Full | Full | Full | Manage | - |
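
For automated drift checks, the matrix can be encoded as data and compared against parsed `SHOW GRANTS` output. The roles, layers, and privilege sets below are illustrative excerpts of the table above:

```python
# Expected privileges per role and medallion layer (excerpt of the matrix above).
EXPECTED = {
    "data-analysts": {"gold": {"SELECT"}},
    "data-engineers": {
        "bronze": {"SELECT", "MODIFY"},
        "silver": {"SELECT", "MODIFY"},
        "gold": {"SELECT"},
    },
}

def drift(actual: dict, expected: dict = EXPECTED) -> list[str]:
    """Report privileges present in `expected` but missing from `actual`."""
    problems = []
    for role, layers in expected.items():
        for layer, privs in layers.items():
            missing = privs - actual.get(role, {}).get(layer, set())
            if missing:
                problems.append(f"{role} missing {sorted(missing)} on {layer}")
    return problems

print(drift({"data-analysts": {"gold": {"SELECT"}}}))
```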

## Resources
- [Unity Catalog Privileges](https://docs.databricks.com/aws/en/data-governance/unity-catalog/manage-privileges/)
- [SCIM Provisioning](https://docs.databricks.com/aws/en/admin/users-groups/scim/)
- [Cluster Policies](https://docs.databricks.com/aws/en/admin/clusters/policy-definition)
- [Row and Column Filters](https://docs.databricks.com/aws/en/data-governance/unity-catalog/row-and-column-filters)

Related Skills

kubernetes-rbac-analyzer

25
from ComeOnOliver/skillshub

Kubernetes Rbac Analyzer - Auto-activating skill for Security Advanced. Triggers on: "kubernetes rbac analyzer". Part of the Security Advanced skill category.

exa-enterprise-rbac

25
from ComeOnOliver/skillshub

Manage Exa API key scoping, team access controls, and domain restrictions. Use when implementing multi-key access control, configuring per-team search limits, or setting up organization-level Exa governance. Trigger with phrases like "exa access control", "exa RBAC", "exa enterprise", "exa team keys", "exa permissions".

evernote-enterprise-rbac

25
from ComeOnOliver/skillshub

Implement enterprise RBAC for Evernote integrations. Use when building multi-tenant systems, implementing role-based access, or handling business accounts. Trigger with phrases like "evernote enterprise", "evernote rbac", "evernote business", "evernote permissions".

documenso-enterprise-rbac

25
from ComeOnOliver/skillshub

Configure Documenso enterprise role-based access control and team management. Use when implementing team permissions, configuring organizational roles, or setting up enterprise access controls. Trigger with phrases like "documenso RBAC", "documenso teams", "documenso permissions", "documenso enterprise", "documenso roles".

deepgram-enterprise-rbac

25
from ComeOnOliver/skillshub

Configure enterprise role-based access control for Deepgram integrations. Use when implementing team permissions, managing API key scopes, or setting up organization-level access controls. Trigger: "deepgram RBAC", "deepgram permissions", "deepgram access control", "deepgram team roles", "deepgram enterprise", "deepgram key scopes".

databricks-webhooks-events

25
from ComeOnOliver/skillshub

Configure Databricks job notifications, webhooks, and event handling. Use when setting up Slack/Teams notifications, configuring alerts, or integrating Databricks events with external systems. Trigger with phrases like "databricks webhook", "databricks notifications", "databricks alerts", "job failure notification", "databricks slack".

databricks-upgrade-migration

25
from ComeOnOliver/skillshub

Upgrade Databricks runtime versions and migrate between features. Use when upgrading DBR versions, migrating to Unity Catalog, or updating deprecated APIs and features. Trigger with phrases like "databricks upgrade", "DBR upgrade", "databricks migration", "unity catalog migration", "hive to unity".

databricks-security-basics

25
from ComeOnOliver/skillshub

Apply Databricks security best practices for secrets and access control. Use when securing API tokens, implementing least privilege access, or auditing Databricks security configuration. Trigger with phrases like "databricks security", "databricks secrets", "secure databricks", "databricks token security", "databricks scopes".

databricks-sdk-patterns

25
from ComeOnOliver/skillshub

Apply production-ready Databricks SDK patterns for Python and REST API. Use when implementing Databricks integrations, refactoring SDK usage, or establishing team coding standards for Databricks. Trigger with phrases like "databricks SDK patterns", "databricks best practices", "databricks code patterns", "idiomatic databricks".

databricks-reference-architecture

25
from ComeOnOliver/skillshub

Implement Databricks reference architecture with best-practice project layout. Use when designing new Databricks projects, reviewing architecture, or establishing standards for Databricks applications. Trigger with phrases like "databricks architecture", "databricks best practices", "databricks project structure", "how to organize databricks", "databricks layout".

databricks-rate-limits

25
from ComeOnOliver/skillshub

Implement Databricks API rate limiting, backoff, and idempotency patterns. Use when handling rate limit errors, implementing retry logic, or optimizing API request throughput for Databricks. Trigger with phrases like "databricks rate limit", "databricks throttling", "databricks 429", "databricks retry", "databricks backoff".

databricks-prod-checklist

25
from ComeOnOliver/skillshub

Execute Databricks production deployment checklist and rollback procedures. Use when deploying Databricks jobs to production, preparing for launch, or implementing go-live procedures. Trigger with phrases like "databricks production", "deploy databricks", "databricks go-live", "databricks launch checklist".