How to export data to Google BigQuery
Note: This is a beta feature. Contact support@getvero.com for access.
You can connect Vero to your Google BigQuery data warehouse to sync your customer, campaign and message interaction activity. Vero has partnered with Prequel.co to offer this data destination feature.
Prerequisites
Before you begin, ensure you have:
- A Google Cloud Platform account with a BigQuery project.
- Administrative access to your BigQuery instance and Google Cloud Storage.
- Administrative access to your Vero account.
Overview
To set up BigQuery as a data destination, you'll need to:
- Add a BigQuery Data Source in Vero.
- Create a dataset in BigQuery where Vero will write data.
- Update your service account to enable write access.
- Grant dataset permissions to your service account.
- Create a staging bucket in Google Cloud Storage.
- Grant permissions to Vero's data syncing service.
- Add the connection in Vero.
- Contact Vero support to activate.
Step 1: Follow the steps to add a BigQuery Data Source
Follow all of the steps outlined in the How to connect to Google BigQuery help article and ensure your connection is working.
Step 2: Create a dataset in BigQuery
This dataset will be where Vero writes data.
- Log in to your Google Cloud Platform account and navigate to BigQuery.
- Click Create dataset.
- Enter a name for your dataset (e.g., "vero_data"). This is where Vero will write your data.
- Select a location (region) for your dataset. Make note of this region, as your staging bucket (set up in a later step) must be in the same region.
- Configure any additional settings as needed for your organization (default table expiration, encryption, etc.).
- Click Create dataset.
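If you prefer to script this step, here is a minimal sketch using the google-cloud-bigquery Python client. The project ID, dataset name and region are placeholder values; substitute your own.

```python
from google.cloud import bigquery

# Placeholders: replace with your own project ID and preferred region.
PROJECT_ID = "my-gcp-project"
DATASET_ID = f"{PROJECT_ID}.vero_data"
REGION = "US"  # must match the staging bucket region created later

client = bigquery.Client(project=PROJECT_ID)

dataset = bigquery.Dataset(DATASET_ID)
dataset.location = REGION

# Creates the dataset; exists_ok=True makes the call idempotent.
dataset = client.create_dataset(dataset, exists_ok=True)
print(f"Created dataset {dataset.full_dataset_id} in {dataset.location}")
```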
Step 3: Update your service account to enable write access
- In the Google Cloud Console, navigate to IAM & Admin → Service Accounts.
- Find the service account you set up when configuring BigQuery as a Vero data source. Click on it to view the details.
- Grant users access to this service account. In the Service account users role field, enter datasync-donwtbtw@prql-prod.iam.gserviceaccount.com. Vero uses this principal to transfer data.
- Select the Permissions tab, find the principal datasync-donwtbtw@prql-prod.iam.gserviceaccount.com, click the Edit principal button (pencil icon), click Add another role, select the Service Account Token Creator role, and click Save.
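The same grant can be made programmatically. Below is a sketch using the IAM API via the google-api-python-client library, assuming application default credentials; the service account email is a placeholder for the one you configured when setting up the data source.

```python
from googleapiclient import discovery

# Placeholder: the service account you created when configuring
# BigQuery as a Vero data source.
YOUR_SA = "vero-bigquery@my-gcp-project.iam.gserviceaccount.com"
VERO_PRINCIPAL = "serviceAccount:datasync-donwtbtw@prql-prod.iam.gserviceaccount.com"

iam = discovery.build("iam", "v1")
resource = f"projects/-/serviceAccounts/{YOUR_SA}"

# Read-modify-write the service account's IAM policy.
policy = iam.projects().serviceAccounts().getIamPolicy(resource=resource).execute()
policy.setdefault("bindings", []).append({
    "role": "roles/iam.serviceAccountTokenCreator",
    "members": [VERO_PRINCIPAL],
})
iam.projects().serviceAccounts().setIamPolicy(
    resource=resource, body={"policy": policy}
).execute()
print("Granted Service Account Token Creator to Vero's data syncing service")
```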
Minimum project permissions
As an alternative to granting the Service Account Token Creator role, the following is the minimum permission required by Vero at the project level. You can apply it directly to the principal instead:
- bigquery.jobs.create
Step 4: Grant dataset permissions to your service account
In addition to the above, you also need to give the service account permission to write data to the specific dataset you created.
- In BigQuery, select the dataset you created in the steps above.
- Click Share → Manage permissions.
- Click Add Principal.
- In the "New principals" field, enter your service account email (from Step 2, step 8).
- In the "Select a role" dropdown, choose BigQuery Data Owner.
- Click Save.
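If you'd rather grant the dataset-level access in code, the sketch below appends an owner entry to the dataset's access list using google-cloud-bigquery. The legacy "OWNER" entry corresponds to BigQuery Data Owner at the dataset level; the project and email values are placeholders.

```python
from google.cloud import bigquery

PROJECT_ID = "my-gcp-project"  # placeholder
DATASET_ID = f"{PROJECT_ID}.vero_data"
YOUR_SA = "vero-bigquery@my-gcp-project.iam.gserviceaccount.com"  # placeholder

client = bigquery.Client(project=PROJECT_ID)
dataset = client.get_dataset(DATASET_ID)

# Append an access entry; "OWNER" is the legacy equivalent of the
# BigQuery Data Owner role at the dataset level.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="OWNER",
        entity_type="userByEmail",
        entity_id=YOUR_SA,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
print(f"Granted dataset access on {DATASET_ID} to {YOUR_SA}")
```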
Minimum dataset permissions
As an alternative to granting the BigQuery Data Owner role, the following are the minimum permissions required by Vero for the dataset. These can be granted to the principal and applied to the dataset specifically:
- bigquery.tables.create
- bigquery.tables.delete
- bigquery.tables.get
- bigquery.tables.getData
- bigquery.tables.list
- bigquery.tables.update
- bigquery.tables.updateData
- bigquery.routines.get
- bigquery.routines.list
Step 5: Create a staging bucket in Google Cloud Storage
- In the Google Cloud Console, navigate to Cloud Storage → Buckets.
- Click Create to create a new bucket.
- Name the bucket vero-etl-staging. This is the standardized bucket name used for Vero data syncing.
- Select a location (region) for the staging bucket. Important: This location must match the region where you created your BigQuery dataset in Step 2.
- Configure the remaining options according to your preferences and click Create.
- Once created, click on the vero-etl-staging bucket to open the Bucket details page.
- Click the Permissions tab, then click Grant Access (or Add).
- In the New principals field, enter your service account email (the same account from Step 3).
- Select the Storage Admin role and click Save.
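As a scripted alternative, here is a sketch using google-cloud-storage that creates the staging bucket in the same region as your dataset and grants your service account the Storage Admin role on that bucket only. All values shown are placeholders.

```python
from google.cloud import storage

PROJECT_ID = "my-gcp-project"  # placeholder
BUCKET_NAME = "vero-etl-staging"
REGION = "US"  # must match your BigQuery dataset region
YOUR_SA = "vero-bigquery@my-gcp-project.iam.gserviceaccount.com"  # placeholder

client = storage.Client(project=PROJECT_ID)

# Create the staging bucket in the same region as the dataset.
bucket = client.create_bucket(BUCKET_NAME, location=REGION)

# Grant the service account Storage Admin on this bucket only.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.admin",
    "members": {f"serviceAccount:{YOUR_SA}"},
})
bucket.set_iam_policy(policy)
print(f"Created gs://{BUCKET_NAME} in {REGION} and granted access to {YOUR_SA}")
```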
Minimum staging bucket permissions
As an alternative to granting the Storage Admin role, the following are the minimum permissions required by Vero for the staging bucket. These can be granted to the principal and applied to the bucket specifically:
- storage.buckets.get
- storage.objects.list
- storage.objects.get
- storage.objects.create
- storage.objects.delete
Step 6: Find your Project ID
- In the Google Cloud Console, click the project dropdown at the top of the page.
- Make note of your Project ID. You'll need this when adding the connection in Vero.
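If you are working from a script, you can also resolve the active project ID from your application default credentials, as in this small sketch using the google-auth library:

```python
import google.auth

# Resolves the project ID from your application default credentials.
credentials, project_id = google.auth.default()
print(f"Project ID: {project_id}")
```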
Step 7: Contact Vero support to activate
Once you've completed the setup steps above, contact Vero support to complete the activation of your data destination. Please provide:
- Your Vero account email address.
- The data connection name you created in Vero.
- Bucket Location: The region/location of your staging bucket (from Step 5).
Our support team will complete the final setup steps to begin syncing your data to BigQuery.
Testing your connection
After Vero support completes the setup we will notify you. At this point:
- Prequel.co will perform a full data sync to your BigQuery dataset.
- After the initial sync, updates will be synced periodically.
- You can view the synced tables in your BigQuery console (see the sketch after this list).
- Allow up to 24 hours for the initial full data sync to complete.
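To confirm tables are arriving, you can list the contents of the dataset once the initial sync has had time to complete. The project and dataset names in this sketch are placeholders:

```python
from google.cloud import bigquery

PROJECT_ID = "my-gcp-project"  # placeholder
DATASET_ID = f"{PROJECT_ID}.vero_data"

client = bigquery.Client(project=PROJECT_ID)

# List every table Vero has synced into the dataset so far.
for table in client.list_tables(DATASET_ID):
    print(table.table_id)
```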
Supported data
For a detailed list of the tables Vero will create and the data synced in each table, refer to the Exporting data to a data destination help article.
Securing your connection
- Vero's service account uses role-based access and can only impersonate the service account you created
- The service account you created only has access to the specific dataset and staging bucket
- You maintain full control over your data in BigQuery
- You can revoke access at any time by removing the service account or its permissions (see the sketch after this list)
- All data is transmitted securely using industry-standard encryption
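As an example, here is a minimal sketch of revoking the bucket-level grant with google-cloud-storage (placeholder values again); removing the dataset access entry and the Token Creator binding follows the same read-modify-write pattern.

```python
from google.cloud import storage

BUCKET_NAME = "vero-etl-staging"
YOUR_SA = "vero-bigquery@my-gcp-project.iam.gserviceaccount.com"  # placeholder

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Remove the Storage Admin binding for the service account.
policy = bucket.get_iam_policy(requested_policy_version=3)
for binding in policy.bindings:
    if binding["role"] == "roles/storage.admin":
        binding["members"].discard(f"serviceAccount:{YOUR_SA}")
policy.bindings = [b for b in policy.bindings if b["members"]]
bucket.set_iam_policy(policy)
print("Revoked Vero staging bucket access")
```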
FAQs
What is Prequel?
Prequel is the trusted vendor Vero uses to ETL data to our customers' data warehouses. Prequel is a market leader in the industry and is trusted by businesses such as LaunchDarkly, Zuora, Gong, Webflow, Drata and more.
How often does data sync?
Prequel.co performs a full data sync when your connection is first established, and thereafter syncs updates periodically.
Can I sync to multiple datasets?
Yes, you can create multiple BigQuery data connections, each pointing to different datasets or projects.
What happens if the connection fails?
If the connection fails, Vero will retry automatically. You can also check the connection status in your Data Sources settings.
Can I delete synced data from BigQuery?
Yes, you have full control over the data in your BigQuery instance. However, because this is a sync connection, any tables you delete will be recreated during the next sync cycle.
If you need to delete data and prevent it from being recreated, you'll need to contact support to adjust your sync configuration. If you've deleted data and need it restored, you can request a full sync from Vero support.
What is the staging bucket used for?
The staging bucket (vero-etl-staging) is used by Prequel as a temporary storage location when transferring data from Vero to BigQuery. This is a standard practice for efficient bulk data loading into BigQuery.
Do I need to choose a specific region?
Yes, your vero-etl-staging bucket location must match the BigQuery dataset region you selected in Step 2. Make note of the region you choose for your dataset so you can select the same region when creating your staging bucket, and be sure to include this region when you contact support.