How to connect Batch to Snowflake?
Here's a user guide on how to install and use Batch's Snowflake Native app to synchronize your Snowflake data with Batch, for segmentation, personalization, and orchestration purposes.
Batch Data Sync Snowflake Native app
Batch leverages Snowflake's Native App Framework to provide a ready-to-install app that lets you sync your customer data from Snowflake into Batch in no time:
Securely manage connections to your Batch projects using Project Keys and REST API Keys.
Create and configure synchronization tasks to send data from your Snowflake dynamic tables to Batch.
Monitor and manage the status of your active data syncs, with options to suspend and resume them as needed.
This guide will walk you through each feature of the application.
Important requirement: Snowpark Container Services must be available on your Snowflake account for you to be able to use this app.
Please be aware that this application consumes Snowflake credits for its operations, primarily from the use of virtual warehouses and compute pools. We recommend you monitor your credit usage to understand the cost implications.
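One way to keep an eye on this consumption is to query the `SNOWFLAKE.ACCOUNT_USAGE.METERING_HISTORY` view. This is a generic illustration, not part of the app: it shows credits consumed by all warehouses and compute pools over the last 7 days, which you can use to spot the resources the app creates.

```sql
-- Credits consumed per warehouse / compute pool over the last 7 days.
-- Requires access to the SNOWFLAKE database (ACCOUNT_USAGE views can
-- lag by up to a few hours).
SELECT
    name,
    service_type,
    SUM(credits_used) AS total_credits
FROM snowflake.account_usage.metering_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY name, service_type
ORDER BY total_credits DESC;
```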
👉 To have access to Batch's Native app on your Snowflake account, you can install it from the Snowflake Marketplace.
1. Install and launch the Native app
These steps need to be performed by a Snowflake Administrator (ACCOUNTADMIN role).
Once the app has been shared with your Snowflake account, you'll be able to find it in your installed apps by navigating to Data products > Apps from your Snowsight console.
When the app is launched for the first time, you'll need to authorize the following privileges needed for the app to work properly:
Step 1: Account privileges
1 → CREATE COMPUTE POOL : This is required by the native app to run Job services triggered by scheduled tasks. These jobs transform your Snowflake data and send it to Batch using Batch APIs.
2 → EXECUTE TASK : This is required for the native app to synchronize updates to your Snowflake data in a scheduled, recurring manner with Batch.
3 → CREATE WAREHOUSE : This is required for the native app to execute recurring synchronization tasks.
4 → Click on the "Grant" button to grant all three privileges.

Step 2: External Access Integration
To synchronize data with Batch, the Native app needs to be able to make external calls to Batch's APIs (api.batch.com:443). This is done through an External Access Integration.
You can first click on the "Review" button to display additional information on the required external access:

Once that is done, simply click on the "Connect" button.

Now, all that is left to do is to click the "Activate" button in the top right corner of the screen. You should then be redirected to an information page from which you can launch the app.

2. Manage Batch Projects
The first step is to register your Batch projects with the application. This securely stores your credentials, allowing the app to send data to the correct Batch project.
1 → Go to the "Manage Batch Projects" tab and then to the "Add a New Batch Project" section.

2 → Enter a display name for your project.
3 → Enter your Batch Project Key.
4 → Enter your Batch REST API Key.
5 → Click the "Add Project" button.
The application can connect to multiple Batch projects. The maximum number of projects you can add is limited to 4 by default.
Once projects are added, they are listed at the top of the tab and can be deleted via the "Delete project" button.
3. Create a Synchronization Task
This section allows you to create a task that automatically synchronizes data from a Snowflake dynamic table to one of your registered Batch projects.
3.1. Prerequisites
Prior to creating a synchronization task, you'll need to make sure the following requirements are met:
A. Identify the data you want to synchronize and store it in a dynamic table
Start by deciding which Profile attributes you want to synchronize from Snowflake to Batch.
To make sure the data is available for syncing, create a Dynamic table in Snowflake containing the data you want to send.
Using a Dynamic table will make sure all changes and inserts on your base tables are dynamically applied to your dynamic table, which will serve as an input for the Batch Native app.
Here is an example of how you can create your Dynamic table:
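The sketch below is a hypothetical example; the database, schema, table, warehouse, and column names are placeholders to adapt to your own data model. It illustrates the naming rules described in the "Important" notes that follow.

```sql
-- Hypothetical example: adapt all names to your own data model.
CREATE OR REPLACE DYNAMIC TABLE MY_DATABASE.MY_SCHEMA.BATCH_PROFILE_DATA
  TARGET_LAG = '1 hour'
  WAREHOUSE = MY_WAREHOUSE
AS
SELECT
    user_id          AS CUSTOM_ID,          -- mapping key (same value as Batch's Custom User ID)
    email            AS "$email_address",   -- mapped to a Batch native attribute
    last_purchase_at AS date_last_purchase, -- "date_" prefix: synced as a date attribute
    loyalty_tier                            -- synced as a String attribute
FROM MY_DATABASE.MY_SCHEMA.CUSTOMERS;
```

Keep the defining query simple and deterministic so the dynamic table supports incremental refresh.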
Important:
Your table must contain a User ID column to be used as a mapping key with Batch’s Profile base. This must be the same identifier you use as Batch’s Custom User ID.
Batch’s Native app uses Batch’s Profile API to send profile data, so your table data must comply with the API's requirements (e.g. attribute max length). Non-compliant data will be ignored during synchronization.
Columns that should be mapped to Batch native attributes must be named according to Batch's native attribute documentation, e.g. to map email addresses, name the column "$email_address".
Columns containing dates or URLs must be prefixed with date_ or url_ respectively. Otherwise, these fields will be synchronized as String attributes.
The app uses Snowflake Streams to capture changes on your Dynamic table. Therefore, ensure that your underlying query supports incremental refresh.
B. Grant the app the required permissions 🔐
Snowflake Native apps do not have default access to the consumer's account objects. Thus, to allow Batch's Native app to manipulate the required objects for synchronization, the following permissions need to be granted by an Administrator:
SELECT on the Dynamic table containing the data you want to synchronize with Batch
USAGE on both the Database and Schema in which the Dynamic table is stored
Here is how you can grant these rights:
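As a sketch, assuming the app is installed under the name `BATCH_DATA_SYNC_APP` and the dynamic table from the previous step is used (replace all names with your own):

```sql
-- Run as an Administrator. Replace BATCH_DATA_SYNC_APP with the name
-- under which you installed the app, and the object names with yours.
GRANT USAGE ON DATABASE MY_DATABASE TO APPLICATION BATCH_DATA_SYNC_APP;
GRANT USAGE ON SCHEMA MY_DATABASE.MY_SCHEMA TO APPLICATION BATCH_DATA_SYNC_APP;
GRANT SELECT ON DYNAMIC TABLE MY_DATABASE.MY_SCHEMA.BATCH_PROFILE_DATA
  TO APPLICATION BATCH_DATA_SYNC_APP;
```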
3.2. Create a new synchronization task
This allows you to create a background task that is scheduled to run periodically (every 24 hours by default), which will make sure all updates (inserts/deletes/edits) on your source Dynamic table are sent to Batch.
1 → Go to the "Create a Sync task" tab.

2 → Fill in the Project Key: The Batch Project Key for the project you want to send data to. This must match one of the projects you added in the "Manage Batch Projects" tab.
3 → Fill in the Dynamic Table Name: The fully qualified name of the source dynamic table in Snowflake (e.g., MY_DATABASE.MY_SCHEMA.BATCH_PROFILE_DATA). The application will create a stream on this table to capture new and updated rows.
4 → Fill in the Identifier Column name: The name of the column in your dynamic table that contains the user identifier (e.g., CUSTOM_ID). This is used to match users in Batch, so it must be the same identifier used in your Batch Profile base.
5 → After filling in all the fields, click the "Activate data synchronization" button.
Note that the first sync will include all data contained in the Dynamic table.
If a task with the exact same configuration (Project Key and Dynamic Table) already exists but is paused, the app will update and resume it for you. Note that updating a task this way will trigger a synchronization of all data contained in the Dynamic table. To simply "Resume" a paused task without a full refresh, go to the "Manage Sync tasks" tab.
4. Manage Synchronization Tasks
This tab provides a view to monitor and control all your active and paused synchronization tasks.
1 → Go to the "Manage Sync Tasks" tab.
For each task, the following information is displayed:
Task Name: The unique name of the task in Snowflake.
Project Name and Key: The associated Batch project name (the display name selected during project registration) and Project Key.
Table: The source dynamic table used by the task.
Created: The timestamp when the task was created.
Status: The current state of the task (STARTED or SUSPENDED).
Suspended Date: If the task is suspended, this shows when it was paused.

2 → Click the "Resume" button to resume a paused task: the task's status will change to STARTED.
3 → Click the "Suspend" button to pause a running task: the task's status will change to SUSPENDED.
4 → Click the "🗑️" button to delete a task.
5 → A status message below the task displays information on the last task run.
6 → Click the "Refresh Tasks" button to see the most up-to-date view of existing tasks.
The application enforces a limit on the number of concurrently running tasks (2 by default).
5. Test and Monitor the workflow
Testing
You can use Batch’s Profile View to verify that the customer data has reached its destination, on the basis of a customer ID (the same one contained in your User ID column): How to find a user's profile on the dashboard?.
Monitoring task executions
To monitor task execution activity, you can check the information displayed in the "Manage sync tasks" tab. This will let you know what tasks are running/suspended, as well as additional information on the last executions.
If any issues occur during task execution, a red status message will be displayed below the task with additional information on the encountered error.

Accessing container logs
All logs generated by the sync task containers are automatically captured and stored in your account's central snowflake.telemetry.events table. This provides a persistent, queryable record of all activity.
To find logs relevant to this application, you can run a query that filters these events based on the application's unique compute pool name. This is the most effective way to see a complete history of all your sync tasks.
Run the following command in a Snowflake worksheet to retrieve the latest logs:
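The query below is a sketch: the compute pool name is a placeholder, and you can list the pools on your account with `SHOW COMPUTE POOLS` to find the one created by the app.

```sql
-- Latest logs emitted by the app's containers. Replace the compute
-- pool name with the one created by the app on your account.
SELECT
    timestamp,
    record_type,
    value AS log_message
FROM snowflake.telemetry.events
WHERE resource_attributes:"snow.compute_pool.name" = 'MY_BATCH_APP_COMPUTE_POOL'
ORDER BY timestamp DESC
LIMIT 100;
```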
This query will show you the log messages and all their associated metadata, giving you a comprehensive view for debugging.
Debug Mode
For more detailed troubleshooting, you can enable a global Debug Mode. When active, the data sync container generates more verbose logs, which can help diagnose complex issues with data processing or API communication.
When to Use It: Enable this mode only when you are actively troubleshooting a failing sync task and the standard execution logs do not provide enough detail. Remember to disable it once the troubleshooting is done to avoid unnecessary storage of a high volume of detailed logs.
How to Enable/Disable It: Debug Mode is an application-level setting. To enable it, run the following SQL command in a Snowflake worksheet:
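The exact command is defined by the app itself; the sketch below is hypothetical, assuming the app exposes a configuration procedure for this setting. Check the app's README or configuration reference for the actual name.

```sql
-- Hypothetical sketch: the procedure name and schema are app-specific.
CALL BATCH_DATA_SYNC_APP.CONFIG.SET_DEBUG_MODE('TRUE');
```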
To disable it, simply set it back to 'FALSE'.