Data warehouse integrations are available as a premium add-on for our Web Experimentation and Feature Experimentation modules. For more information, please contact your Customer Success Manager.
We currently support data warehouse integrations from the following provider:
- Google BigQuery
Support for the following providers is coming soon:
- Microsoft Azure
For more information, or to join our early adopter program, please contact your Customer Success Manager.
Integrating your data warehouse with Kameleoon offers the following benefits:
- Advanced Targeting: Use your data warehouse to build highly personalized, precise campaigns that boost user engagement and conversion rates.
- Streamlined Data Management: Centralize event data in your data warehouse for easy access, enabling in-depth analysis and reporting.
- Data-Driven Decision-Making: Leverage insights from your data warehouse to inform data-driven decisions and enhance marketing strategies.
Keep these things in mind when using this integration:
- Data Volume: The volume of data you plan to interact with affects both query performance and costs.
- Query Complexity: Complex queries may require more time and resources to execute. Optimize your queries for efficiency.
- Data Privacy: Ensure compliance with data privacy regulations when handling user data within your warehouse.
- Access Control: Implement proper access controls to limit who can configure and use the integration within your organization.
- Data Schema: Maintain a clear and consistent data schema to facilitate data retrieval and analysis.
- Monitoring: Regularly monitor your data warehouse usage to manage costs and performance effectively.
- Documentation: Maintain documentation for queries, configurations, and integration processes to facilitate collaboration and troubleshooting.
To configure this integration, you need the following:
- Google Cloud Account: Users must have a valid Google Cloud account to access Google BigQuery and generate the necessary credentials.
- Google Service Account: A Google Service Account with the appropriate permissions to access BigQuery and create credentials is required.
- BigQuery Project: Users need to have a Google BigQuery project set up where data will be stored and queried.
- Credential File: Users must generate a credential file from their Google Service Account, which will be used to securely access BigQuery.
1. Create a service account
In your BigQuery project, create a service account for Kameleoon with the BigQuery Data Viewer role.
- Type “service accounts” in the global search bar and click the suggested result.
- Click Create service account.
- Fill in the mandatory fields, then click Create and Continue.
- Start typing “BigQuery Data Viewer” in the Select a role field and click BigQuery Data Viewer when it appears in the dropdown search.
- Click Done at the bottom.
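If you prefer the command line, step 1 can also be sketched with `gcloud`. The project ID and service-account name below are placeholders, not values from this guide; the commands are echoed rather than executed so you can review them before running:

```shell
# Placeholder identifiers -- substitute your own project and account name.
PROJECT_ID="my-gcp-project"
SA_NAME="kameleoon-sa"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account (the console's "Create service account" step).
CREATE_CMD="gcloud iam service-accounts create ${SA_NAME} --project=${PROJECT_ID} --display-name=Kameleoon"

# Grant the BigQuery Data Viewer role on the project (the "Select a role" step).
GRANT_CMD="gcloud projects add-iam-policy-binding ${PROJECT_ID} --member=serviceAccount:${SA_EMAIL} --role=roles/bigquery.dataViewer"

# Echoed for review; drop the echo (or pipe to sh) to actually run them.
echo "$CREATE_CMD"
echo "$GRANT_CMD"
```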
2. Create a new dataset
Create a new dataset called “kameleoon” in your project.
- Select the region or multi-region that suits you best.
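The same dataset can be created with the `bq` command-line tool. The project ID and location below are placeholders; again the command is echoed for review rather than executed:

```shell
# Placeholder values -- substitute your own project ID and location.
PROJECT_ID="my-gcp-project"
LOCATION="EU"   # pick the region or multi-region that suits you best

# Create the "kameleoon" dataset (equivalent of the console step above).
MK_CMD="bq --location=${LOCATION} mk --dataset ${PROJECT_ID}:kameleoon"

echo "$MK_CMD"   # echoed for review; drop the echo to execute
```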
3. Grant permissions to the service account
Grant the service account the BigQuery Data Owner role on the “kameleoon” dataset.
- In the BigQuery dashboard, click on the kameleoon dataset in the Explorer bar on the left.
- Click Sharing on the top right then Permissions.
- Click ADD PRINCIPAL.
- Type the Kameleoon service account name in the Add principals field (the interface auto-completes the full service account name in a drop-down menu; click it).
- Underneath, in the Assign roles field, type “BigQuery Data Owner” (as above, click the entry in the auto-complete drop-down).
- Finish by clicking SAVE.
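If you script dataset permissions instead of using the console, one common approach is to edit the dataset's access list as JSON: `bq show --format=prettyjson PROJECT:kameleoon` dumps the dataset resource, you append an OWNER entry for the service account, then feed the file back with `bq update --source`. A minimal sketch of the edit step, with a placeholder service-account email:

```python
import json

def add_owner_entry(dataset_resource: dict, sa_email: str) -> dict:
    """Append an OWNER access entry for a service account to a BigQuery
    dataset resource (the JSON shape printed by `bq show`)."""
    entry = {"role": "OWNER", "userByEmail": sa_email}
    access = dataset_resource.setdefault("access", [])
    if entry not in access:  # idempotent: don't duplicate the grant
        access.append(entry)
    return dataset_resource

# Hypothetical dataset resource as dumped by `bq show --format=prettyjson`.
dataset = {"access": [{"role": "OWNER", "specialGroup": "projectOwners"}]}
updated = add_owner_entry(
    dataset, "kameleoon-sa@my-gcp-project.iam.gserviceaccount.com")
# Save this JSON to a file and pass it to `bq update --source`.
print(json.dumps(updated, indent=2))
```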
4. Create a custom role
Create a custom role that has the bigquery.jobs.create permission.
- Go to Roles, that you can find by typing “Roles” in the console global search field.
- Click CREATE ROLE at the top left.
- Fill in the configuration fields as you wish (title, description…).
- Click ADD PERMISSIONS and add bigquery.jobs.create from the drop-down list in the pop-in window, then click ADD.
- Finalize the custom role by clicking CREATE.
Add this custom role to the service account you created.
- Go to IAM in the sidebar of Google Cloud dashboard.
- You should see a list of Principals (a service account is a Principal). Find the kameleoon service account in the list and click the pencil icon on the left to edit the principal.
- A configuration sidebar appears on the right, click Add Another Role and find the custom role you created above in the drop-down auto-complete menu.
- Click SAVE.
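Both parts of step 4 also have a `gcloud` equivalent. The project ID and the custom-role ID below are hypothetical placeholders, and the commands are echoed for review rather than executed:

```shell
# Placeholder identifiers -- substitute your own.
PROJECT_ID="my-gcp-project"
SA_EMAIL="kameleoon-sa@${PROJECT_ID}.iam.gserviceaccount.com"
ROLE_ID="kameleoonJobCreator"   # hypothetical custom-role ID

# Create a custom role holding only bigquery.jobs.create (step 4, part 1).
CREATE_ROLE_CMD="gcloud iam roles create ${ROLE_ID} --project=${PROJECT_ID} --permissions=bigquery.jobs.create"

# Attach that role to the Kameleoon service account (step 4, part 2).
BIND_CMD="gcloud projects add-iam-policy-binding ${PROJECT_ID} --member=serviceAccount:${SA_EMAIL} --role=projects/${PROJECT_ID}/roles/${ROLE_ID}"

echo "$CREATE_ROLE_CMD"
echo "$BIND_CMD"
```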
Download the service account's JSON credentials file to your computer.
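Before uploading the file to Kameleoon, you can sanity-check that it is a complete service-account key. This stdlib-only sketch checks for the fields present in the standard Google service-account key format; the dummy key values are placeholders:

```python
import json
import os
import tempfile

REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_credentials(path: str) -> list:
    """Return the list of problems found in a service-account key file."""
    with open(path) as f:
        key = json.load(f)
    problems = sorted(REQUIRED_FIELDS - key.keys())
    if key.get("type") != "service_account":
        problems.append("type != 'service_account'")
    return problems

# Example run against a dummy key written to a temporary file.
dummy = {"type": "service_account", "project_id": "my-gcp-project",
         "private_key": "-----BEGIN PRIVATE KEY-----...",
         "client_email": "kameleoon-sa@my-gcp-project.iam.gserviceaccount.com"}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(dummy, f)
print(check_credentials(f.name))  # an empty list means the file looks complete
os.unlink(f.name)
```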
5. Enable the integration for your project
- Access the Integrations Page: Log in to your Kameleoon account and navigate to the “Integrations” page. You can usually find this in the navigation menu, under Admin > Integrations.
- Select the BigQuery Integration: On the Integrations page, you’ll see a list of available integrations. Locate and click on the “BigQuery” option to begin the setup.
- Choose a Project: On the left side of the Integrations page, you’ll find a list of projects, each represented by a checkbox. To enable the BigQuery Integration for a specific project, select the checkbox corresponding to that project. You can select multiple projects if needed.
- Upload the Credential File: Once you’ve selected the project, you’ll need to upload the JSON file associated with your Google Service Account. Click on the “Upload JSON file” button, and a file upload dialog will appear. Locate and select your JSON file. This file contains the necessary permissions to access BigQuery for the chosen project.
- Save and Apply Changes: After uploading the credential file, click on the “Validate” button. This action will save and apply your configuration settings for the selected project.
6. Manage multiple projects
If you have multiple projects and want to configure the BigQuery Integration for each of them, you can do so from the same Kameleoon account. To switch between project configurations, use the project dropdown menu provided within the integration settings. This dropdown allows you to select and manage the integration settings for each project individually.
Keep in mind that each project has its own distinct configuration settings, so you will need to upload a unique JSON credential file for each project. However, if the permissions granted by a single credential file apply to several projects, you can reuse that credential file for those projects.
Once you have enabled the BigQuery integration for your project, you can: