Use Databricks as a destination

Written by Julie Trenque

Updated on 07/08/2025

Data warehouse integrations are available as a premium add-on for our Web Experimentation and Feature Experimentation module. For more information, please contact your Customer Success Manager.

Once you have enabled the Databricks integration for your project, you can activate Use Databricks as a destination to seamlessly send events to your Databricks account whenever visitors are exposed to one of your Kameleoon experiments.

Activate Databricks as a destination

To activate Databricks as a destination:

  1. Click Use Databricks as a destination.
  2. Databricks catalog: enter the name of the Databricks catalog (the top-level folder) that will contain the Kameleoon schema to write to.
  3. Schema: enter the schema that will contain the tables written by this integration.

What “Databricks as a destination” does

Enabling Use Databricks as a destination will stream all Kameleoon experiment exposure events into Databricks.

Events are stored in the kameleoon_events schema of your Databricks catalog (the catalog you specified during setup, to which the Kameleoon user has write access), in a table named kameleoon_experiment_event.
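As a rough illustration of what that write access can look like in Databricks Unity Catalog (the catalog name and principal below are placeholders; refer to the Databricks integration setup article for the exact privileges Kameleoon requires):

 -- Illustrative only: replace my_catalog and the principal with your own values.
 -- Privileges granted on a catalog are inherited by the schemas and tables inside it.
 GRANT USE CATALOG, USE SCHEMA, CREATE SCHEMA, CREATE TABLE, MODIFY
 ON CATALOG my_catalog
 TO `kameleoon-service-user@yourcompany.com`;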

The SQL schema of this table is the following:

CREATE TABLE kameleoon_experiment_event (
    nonce BIGINT PRIMARY KEY, -- unique identifier of the event
    timestamp BIGINT, -- timestamp of the event, in milliseconds
    visitor_code VARCHAR(255), -- Kameleoon visitor identifier
    custom_visitor_id VARCHAR(255), -- visitor identifier used by your company
    experiment_id BIGINT, -- Kameleoon ID of the experiment
    variation_id BIGINT -- Kameleoon ID of the variation
);
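For instance, a minimal query along these lines counts daily exposures per experiment and variation (my_catalog is a placeholder for the catalog you configured above):

 -- Minimal sketch: daily unique visitors and exposures per experiment and variation.
 -- Replace my_catalog with the catalog you configured for the integration.
 SELECT
     DATE(TIMESTAMP_MILLIS(`timestamp`)) AS exposure_date,
     experiment_id,
     variation_id,
     COUNT(DISTINCT visitor_code) AS unique_visitors,
     COUNT(*) AS exposures
 FROM my_catalog.kameleoon_events.kameleoon_experiment_event
 GROUP BY exposure_date, experiment_id, variation_id
 ORDER BY exposure_date, experiment_id, variation_id;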

The custom_visitor_id column is populated from the Cross-Device Reconciliation custom data, if it has been set.

With campaign data stored in Databricks, you can perform in-depth analysis and reporting. Use the querying capabilities of Databricks to extract valuable insights from the collected data and make data-driven decisions that optimize your campaigns and user experience.
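For example, because custom_visitor_id can carry your own identifier, exposure events can be joined with your business data. The sketch below is only an illustration: my_catalog and the sales.orders table (with user_id, order_id, and created_at columns) are hypothetical placeholders.

 -- Hypothetical sketch: count orders placed after a visitor was exposed to each variation.
 -- my_catalog and sales.orders (user_id, order_id, created_at) are placeholders.
 SELECT
     e.experiment_id,
     e.variation_id,
     COUNT(DISTINCT e.custom_visitor_id) AS exposed_users,
     COUNT(DISTINCT o.order_id) AS orders
 FROM my_catalog.kameleoon_events.kameleoon_experiment_event AS e
 LEFT JOIN my_catalog.sales.orders AS o
     ON o.user_id = e.custom_visitor_id
     AND o.created_at >= TIMESTAMP_MILLIS(e.`timestamp`)
 GROUP BY e.experiment_id, e.variation_id;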

By centralizing Kameleoon campaign results in Databricks, you enrich your Databricks data and build a comprehensive repository of user data for analytics and business intelligence.
