How to connect Batch to Snowflake?
Here's a user guide on how to install and use Batch's Snowflake Native app to synchronize your Snowflake data with Batch, for segmentation, personalization, and orchestration purposes.
Batch Data Sync Snowflake Native app
Batch leverages Snowflake's Native App Framework to provide a ready-to-install Snowflake app that lets you sync your customer data from Snowflake into Batch in no time. With it, you can:
Securely manage connections to your Batch projects using Project Keys and REST API Keys.
Create and configure synchronization tasks to send data from your Snowflake dynamic tables to Batch.
Monitor and manage the status of your active data syncs, with options to suspend and resume them as needed.
This guide will walk you through each feature of the application.
Important requirement: Snowpark Container Services must be available on your Snowflake account for you to be able to use this app.
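If you are unsure whether Snowpark Container Services are enabled on your account, one quick, non-destructive check is to list compute pools from a worksheet; on accounts where the feature is not available, the command typically fails with an unsupported-feature error. This is only a convenience check, not an official availability test:
-- Lists existing compute pools; errors out if Snowpark Container Services
-- are not available on this account.
SHOW COMPUTE POOLS;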
👉 To have access to Batch's Native app on your Snowflake account, please reach out to your Customer Success Manager or contact support@batch.com.
1. Install and launch the Native app
These steps need to be performed by a Snowflake Administrator (ACCOUNTADMIN role).
Once the app has been shared with your Snowflake account, you'll be able to find it in your installed apps by navigating to Data products > Apps from your Snowsight console.
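If you prefer SQL, you can also list the applications installed on the account from a worksheet; the exact name under which the Batch app appears may differ depending on how it was installed:
-- List all installed Native apps visible to your role
SHOW APPLICATIONS;
-- Optionally narrow the list down, e.g.:
SHOW APPLICATIONS LIKE 'BATCH%';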
When the app is launched for the first time, you'll need to authorize the privileges the app requires to work properly:
Step 1: Account privileges
1 → CREATE COMPUTE POOL: This is required by the native app to run the Job services triggered by scheduled tasks. These jobs transform your Snowflake data and send it to Batch using Batch APIs.
2 → EXECUTE TASK: This is required for the native app to synchronize updates to your Snowflake data with Batch on a scheduled, recurring basis.
3 → Click on the "Grant" button to grant both these privileges.
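If you prefer to grant (or later audit) these privileges in SQL rather than through the Snowsight prompt, the equivalent account-level grants look roughly like the sketch below. The application name BATCH_SYNC_NATIVE_APP is assumed here and may differ on your account:
-- Allow the app to create the compute pool that runs its sync jobs
GRANT CREATE COMPUTE POOL ON ACCOUNT TO APPLICATION BATCH_SYNC_NATIVE_APP;
-- Allow the app to run its scheduled synchronization tasks
GRANT EXECUTE TASK ON ACCOUNT TO APPLICATION BATCH_SYNC_NATIVE_APP;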

Step 2: External Access Integration
To synchronize data with Batch, the Native app needs to be able to make external calls to Batch's APIs (api.batch.com:443). This is done through an External Access Integration.
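For reference, an external access integration scoped to Batch's API endpoint is conceptually similar to the plain SQL below. The object names are illustrative, and the app's "Connect" flow handles this for you, so you normally don't need to run it yourself:
-- Run in a database/schema of your choosing (network rules are schema-level objects)
-- Egress network rule restricted to Batch's API endpoint
CREATE OR REPLACE NETWORK RULE batch_api_network_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('api.batch.com:443');

-- External access integration the app's services can use to reach api.batch.com
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION batch_api_access_integration
  ALLOWED_NETWORK_RULES = (batch_api_network_rule)
  ENABLED = TRUE;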
You can first click on the "Review" button to display additional information on the required external access:

Once that is done, simply click on the "Connect" button.

Now, all that is left to do is to click the "Activate" button that appears in the top right corner of the screen. You should then be redirected to an information page from which you can launch the app.

2. Manage Batch Projects
The first step is to register your Batch projects with the application. This securely stores your credentials, allowing the app to send data to the correct Batch project.
1 → Go to the "Manage Batch Projects" tab, then to the "Add a New Batch Project" section.

2 → Enter a display name for your project.
3 → Enter your Batch Project Key.
4 → Enter your Batch REST API Key.
5 → Click the "Add Project" button.
Once projects are added, they are listed at the top of the tab and can be deleted via the "Delete project" button.
3. Create a Synchronization Task
This section allows you to create a task that automatically synchronizes data from a Snowflake dynamic table to one of your registered Batch projects.
3.1. Prerequisites
Prior to creating a synchronization task, you'll need to make sure the following requirements are met:
The Batch project you want to send data to has been registered in the "Manage Batch Projects" tab.
A source dynamic table containing your profile data exists in Snowflake, with a column holding the same user identifier used in your Batch Profile base.
A Snowflake warehouse is available for the task to run on, and the application has USAGE permission on it.
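For the last point, granting the warehouse USAGE permission to the application could look like the following sketch; the warehouse name is illustrative, and the application name BATCH_SYNC_NATIVE_APP is assumed and may differ on your account:
-- Let the application use the warehouse that will run the sync task
GRANT USAGE ON WAREHOUSE MY_WAREHOUSE TO APPLICATION BATCH_SYNC_NATIVE_APP;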
3.2. Create a new synchronization task
This allows you to create a background task that is scheduled to run periodically (every 24 hours by default), which will make sure all updates (inserts/deletes/edits) on your source Dynamic table are sent to Batch.
1 → Go to the "Create a Sync task" tab.

2 → Fill in the Warehouse Name: The name of the Snowflake warehouse that the task will use to run. This warehouse must exist, and the application must have USAGE permission on it.
3 → Fill in the Project Key: The Batch Project Key for the project you want to send data to. This must match one of the projects you added in the "Manage Batch Projects" tab.
4 → Fill in the Dynamic Table Name: The fully qualified name of the source dynamic table in Snowflake (e.g., MY_DATABASE.MY_SCHEMA.BATCH_PROFILE_DATA). The application will create a stream on this table to capture new and updated rows (a minimal example of such a table is shown after this list).
5 → Fill in the Identifier Column name: The name of the column in your dynamic table that contains the user identifier (e.g., CUSTOM_ID). This is used to match users in Batch, so it must be the same identifier used in your Batch Profile base.
6 → After filling in all the fields, click the "Activate data synchronization" button.
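For illustration only, a source dynamic table matching the example names used above might be defined along these lines. The source table, column names, warehouse, and target lag are assumptions to adapt to your own data model; only the identifier column (CUSTOM_ID here) needs to match the identifier used in your Batch Profile base:
-- Illustrative dynamic table exposing the profile attributes to sync with Batch
CREATE OR REPLACE DYNAMIC TABLE MY_DATABASE.MY_SCHEMA.BATCH_PROFILE_DATA
  TARGET_LAG = '1 hour'        -- how fresh Snowflake keeps the table
  WAREHOUSE = MY_WAREHOUSE     -- warehouse used to refresh the table
AS
SELECT
  CUSTOM_ID,                   -- identifier also used in your Batch Profile base
  EMAIL,
  FIRST_NAME,
  LAST_PURCHASE_DATE
FROM MY_DATABASE.MY_SCHEMA.CUSTOMERS;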
4. Manage Synchronization Tasks
This tab provides a view to monitor and control all your active and paused synchronization tasks.
1 → Go to the "Manage Sync Tasks" tab.
For each task, the following information is displayed:
Task Name: The unique name of the task in Snowflake.
Project Name and Key: The associated Batch project name (the display name selected during project registration) and Project Key.
Table: The source dynamic table used by the task.
Created: The timestamp when the task was created.
Status: The current state of the task (STARTED or SUSPENDED).
Suspended Date: If the task is suspended, this shows when it was paused.

2 → Click the "Resume" button to resume a paused task: The task's status will change to STARTED.
3 → Click the "Suspend" button to pause a running task: The task's status will change to SUSPENDED.
4 → Click the "🗑️" button to delete a task.
5 → A status message below the task displays information on the last task run.
6 → Click the "Refresh Tasks" button to see the most up-to-date view of existing tasks.
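If you also want to cross-check executions from SQL, Snowflake's standard task history function can help. Note that, depending on your role and on how the application owns its tasks, app-managed tasks may not always be visible to you, so treat this as a complementary check rather than the reference view:
-- Run with a database selected in your session (e.g. USE DATABASE MY_DATABASE;)
-- Recent task executions visible to your role (last 24 hours)
SELECT
  name,
  state,
  scheduled_time,
  completed_time,
  error_message
FROM TABLE(
  INFORMATION_SCHEMA.TASK_HISTORY(
    SCHEDULED_TIME_RANGE_START => DATEADD('hour', -24, CURRENT_TIMESTAMP())
  )
)
ORDER BY scheduled_time DESC;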
5. Test and Monitor the workflow
Testing
You can use Batch's Profile View to verify that the customer data has reached its destination, by looking up a customer ID (the same identifier contained in your Identifier Column): How to find a user's profile on the dashboard?
Monitoring task executions
To monitor task execution activity, you can check the information displayed in the "Manage sync tasks" tab. This will let you know what tasks are running/suspended, as well as additional information on the last executions.
If any issues occur during task execution, a red status message will be displayed below the task with additional information on the encountered error.

Accessing container logs
All logs generated by the sync task containers are automatically captured and stored in your account's central snowflake.telemetry.events table. This provides a persistent, queryable record of all activity.
To find logs relevant to this application, you can run a query that filters these events based on the application's unique compute pool name. This is the most effective way to see a complete history of all your sync tasks.
Run the following command in a Snowflake worksheet to retrieve the latest logs:
SELECT *
FROM
snowflake.telemetry.events
WHERE
-- Filter to logs specifically from this application's compute pool
RESOURCE_ATTRIBUTES['snow.compute_pool.name']::string = 'BATCH_SYNC_NATIVE_APP_COMPUTE_POOL'
ORDER BY
timestamp DESC;
This query will show you the log messages and all their associated metadata, giving you a comprehensive view for debugging.
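If the full history is too noisy, you can narrow the same query down, for example to error-level records from the last 24 hours. The severity filter below assumes the sync containers emit standard log records to Snowflake's event table:
SELECT
  timestamp,
  RECORD['severity_text']::string AS severity,
  VALUE::string AS message
FROM
  snowflake.telemetry.events
WHERE
  -- Filter to logs specifically from this application's compute pool
  RESOURCE_ATTRIBUTES['snow.compute_pool.name']::string = 'BATCH_SYNC_NATIVE_APP_COMPUTE_POOL'
  AND RECORD_TYPE = 'LOG'
  AND RECORD['severity_text']::string = 'ERROR'
  AND timestamp > DATEADD('hour', -24, CURRENT_TIMESTAMP())
ORDER BY
  timestamp DESC;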
Debug Mode
For more detailed troubleshooting, you can enable a global Debug Mode. When active, the data sync container generates more verbose logs, which can help diagnose complex issues with data processing or API communication.
When to Use It: Enable this mode only when you are actively troubleshooting a failing sync task and the standard execution logs do not provide enough detail. Remember to disable it once the troubleshooting is done to avoid unnecessary storage of a high volume of detailed logs.
How to Enable/Disable It: Debug Mode is an application-level setting that is changed by calling the app's configuration procedure. To enable it, run the following SQL command in a Snowflake worksheet:
CALL BATCH_SYNC_NATIVE_APP.batch_app.update_config('DEBUG_MODE','TRUE');
To disable it, simply set it back to 'FALSE'.
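For example, using the same procedure:
CALL BATCH_SYNC_NATIVE_APP.batch_app.update_config('DEBUG_MODE','FALSE');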