Create a Sync from BigQuery to Batch Profile attributes
Before you start
To create a BigQuery → Batch sync, you’ll need:
Access to the Batch dashboard
A BigQuery table or view containing one row per profile
A Google Cloud service account key (JSON) to grant Batch read access
A table (or view) that follows the Cloud Sync input format (see below)
1) Prepare your BigQuery table
Cloud Sync expects your BigQuery source (table or view) to include:
A profile identifier (to know which profile to update)
A cursor field (to know what changed since the last run)
Any number of attribute columns (sent to Batch as profile attributes)
1.1 One row per profile
Your source must contain one row per profile. Each row is interpreted as an update to a single Batch profile.
1.2 Required columns
Your table (or view) must include:
custom_id (required): the profile identifier in Batch
last_updated_at (required): the cursor used for incremental sync
Important: last_updated_at must be updated every time any synced attribute changes, otherwise updates may not be picked up by the next run.
1.3 Attribute naming rules (BigQuery-compatible)
Cloud Sync reads BigQuery columns and converts them into Batch profile attributes.
However, BigQuery column names cannot contain characters like $, (, or ). That means you can’t use the exact Profile API formats such as:
url(avatar)
date(birthday)
$email_address
✅ Instead, Cloud Sync relies on prefixes in column names to represent typed or native fields.
Supported prefixes
date__: date attribute (example: date__birthday)
url__: URL attribute (example: url__avatar)
batch__: native profile fields, instead of $... (example: batch__email_address)
1.4 Example schema (e-commerce)
Here’s a table format you can use as a reference:
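For illustration, such a table could be declared with the following BigQuery DDL (the ecommerce dataset name and the exact column types are assumptions, not requirements; only the column naming conventions matter):

```sql
CREATE TABLE ecommerce.batch_profiles (
  custom_id            STRING NOT NULL,    -- Batch profile identifier (required)
  batch__email_address STRING,             -- native email field
  plan                 STRING,             -- plain attribute
  country              STRING,             -- plain attribute
  lifetime_value       FLOAT64,            -- plain attribute
  is_vip               BOOL,               -- plain attribute
  url__avatar          STRING,             -- URL attribute
  date__birthday       TIMESTAMP,          -- date attribute
  date__last_purchase  TIMESTAMP,          -- date attribute
  last_updated_at      TIMESTAMP NOT NULL  -- incremental sync cursor (required)
);
```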
How this maps in Batch:
custom_id identifies the profile
batch__email_address updates the profile's native email field
plan, country, lifetime_value, and is_vip become attributes
url__avatar is interpreted as a URL attribute
date__birthday and date__last_purchase are interpreted as date attributes
1.5 Using a View
If your raw table doesn’t match the expected naming or format, create a BigQuery View that converts your schema into the correct conventions.
Example:
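A minimal sketch of such a view, assuming a hypothetical raw.users source table with user_id, email, avatar_url, plan, and updated_at columns:

```sql
CREATE OR REPLACE VIEW ecommerce.batch_profiles_view AS
SELECT
  user_id    AS custom_id,             -- profile identifier
  email      AS batch__email_address,  -- native email field
  avatar_url AS url__avatar,          -- URL attribute
  plan,                                -- plain attribute
  updated_at AS last_updated_at        -- sync cursor
FROM raw.users;
```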
This approach lets you:
rename fields with the correct prefixes (batch__, date__, url__)
compute a reliable last_updated_at
ensure you always expose one row per profile
1.6 Handling nulls
If a column value is NULL, Batch interprets it as attribute removal for that profile.
If you don’t want an attribute removed:
ensure your view returns a non-null value, or
exclude the column from the sync entirely.
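For example, a view could substitute a default value so the attribute is never removed (the 'free' fallback below is purely illustrative):

```sql
SELECT
  custom_id,
  COALESCE(plan, 'free') AS plan,  -- never NULL, so never removed
  last_updated_at
FROM ecommerce.batch_profiles;
```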
1.7 Attributes limits and constraints
When syncing data from BigQuery to Batch, all attributes sent through Cloud Sync must respect the same limits and constraints as the Batch Profile API. See the attributes object in the Profile API documentation.
2) Create a Service Account key in Google Cloud
Batch uses a Service Account Key (JSON) to securely access your BigQuery dataset.
Go to Google Cloud Console → IAM & Admin → Service Accounts
Create a service account (or reuse an existing one)
Generate a JSON key
Grant the service account:
roles/bigquery.jobUser
Dataset-level permission: BigQuery Data Editor on the dataset containing your source table/view
3) Create the Sync in the Batch dashboard
Cloud Sync is configured from the dashboard via a dedicated Sync module.
Open the Batch dashboard
Go to Data → Cloud Sync
Click Create Sync
Select BigQuery as the source
3.1 Configure your BigQuery connection
Enter:
Dataset
Table or View
Upload your Service Account Key (JSON)
Batch validates the connection before continuing.
3.2 Configure profile mapping
Cloud Sync applies a simple mapping model:
custom_id → identifies which Batch profile to update
all other columns → mapped to profile attributes
last_updated_at → used only for incremental sync logic
4) How incremental sync works
Cloud Sync uses incremental processing, which means it does not re-import your full dataset at every run. Instead, it fetches only the rows that changed since the last successful sync.
4.1 The last_updated_at cursor
Batch stores the last successful cursor value internally.
At each run, Batch fetches only rows where:
last_updated_at is greater than the last stored cursor
This makes sync runs faster, more scalable, and more cost-efficient.
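Conceptually, each run boils down to a query like the following (the actual query Batch executes is internal; @last_cursor stands for the stored cursor value):

```sql
SELECT *
FROM ecommerce.batch_profiles          -- your configured table or view
WHERE last_updated_at > @last_cursor   -- only rows changed since the last run
ORDER BY last_updated_at;
```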
4.2 Inserts, updates, and deletes
Incremental syncs naturally capture:
✅ inserts
✅ updates
They do not automatically capture:
❌ deletes
If you need deletions reflected in Batch, rely on a different pipeline or implement soft deletes by setting all synced attributes to null in the BigQuery view when a profile is deleted.
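A soft-delete view could look like this, assuming your source table carries a deleted flag (all names here are illustrative):

```sql
SELECT
  user_id AS custom_id,
  -- Expose NULLs for deleted profiles so Batch removes the attributes
  IF(deleted, NULL, plan)  AS plan,
  IF(deleted, NULL, email) AS batch__email_address,
  updated_at AS last_updated_at  -- must also change when the profile is deleted
FROM raw.users;
```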
4.3 Best practices for reliable incremental syncs
To avoid missing changes:
Ensure last_updated_at updates every time a synced column changes
Avoid timestamps that only reflect partial updates
Use a View if you need computed fields or type conversions
Partition or cluster on last_updated_at for large datasets
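For instance, a large source table could be declared with partitioning and clustering along these lines (a sketch under the same hypothetical schema, not a requirement):

```sql
CREATE TABLE ecommerce.batch_profiles (
  custom_id       STRING NOT NULL,
  plan            STRING,
  last_updated_at TIMESTAMP NOT NULL
)
PARTITION BY DATE(last_updated_at)  -- lets incremental scans prune old partitions
CLUSTER BY custom_id;
```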
5) Test and enable your Sync
Before enabling the schedule:
Run a test sync
Verify:
Profiles are created or updated correctly
batch__, date__, and url__ fields are interpreted correctly
Null values behave as expected (null → attribute removal)
Once enabled, Batch automatically handles:
batching
retries