BigQuery
Boltic supports integration with data warehouses such as BigQuery. It fetches data from your BigQuery data source and allows you to create a Bolt that contains a sequence of actions such as Integration, Transformation, and Destination. In addition, you can schedule a Bolt to run at a fixed interval or at a specific date and time.
Setup Guide to Integrate BigQuery With Boltic
This guide outlines the steps required to integrate BigQuery with Boltic.
Prerequisites
Before you connect BigQuery to Boltic, you must complete the following prerequisites:
On the BigQuery configuration page, the Project Key is always required. To get this key, you need to create a Google service account.
Verify that the BigQuery dataset containing the source and target tables has read and write access.
To connect Boltic to BigQuery, you need a service account with the proper set of permissions:
Purpose | Permissions |
---|---|
To read data from or write data to a Google BigQuery table | `bigquery.jobs.create`, `bigquery.tables.get` |
To enable large results | `bigquery.tables.getData`, `bigquery.tables.create`, `bigquery.tables.updateData` |
For BigQuery as a sink | `bigquery.tables.create`, `bigquery.tables.delete` |
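The permission requirements above can be captured as a small lookup, for example to check which permissions a service account is still missing before connecting. This is an illustrative sketch, not part of Boltic; the permission names and their grouping follow the table above, while the helper itself is hypothetical:

```python
# Required BigQuery IAM permissions per capability (grouping taken from
# the table above).
REQUIRED_PERMISSIONS = {
    "read_write": [
        "bigquery.jobs.create",
        "bigquery.tables.get",
    ],
    "large_results": [
        "bigquery.tables.getData",
        "bigquery.tables.create",
        "bigquery.tables.updateData",
    ],
    "sink": [
        "bigquery.tables.create",
        "bigquery.tables.delete",
    ],
}

def missing_permissions(granted, capabilities=("read_write",)):
    """Return the sorted list of permissions still missing for the
    requested capabilities, given the set already granted."""
    needed = {p for cap in capabilities for p in REQUIRED_PERMISSIONS[cap]}
    return sorted(needed - set(granted))

# Example: an account that can only create jobs still needs table access.
print(missing_permissions({"bigquery.jobs.create"}))
# → ['bigquery.tables.get']
```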
Step 1: Add Integration
Now, visit the Add Integration page and select BigQuery as an Integration.
Step 2: Configuration
Integrate Google BigQuery data into Boltic either by using a Google account or a service account.
Google Account
- Enter an Integration Name by which you want to refer to this BigQuery integration.
- Click Sign in with Google Account.
- Choose your Gmail account and then click the Allow button to let Boltic read your account.
- Click the Select Project dropdown to select the project ID.
- You can also add a second Gmail account by clicking the Add Another Account plus button.
- Click the Select Dataset dropdown to select a dataset available for the selected project ID.
Service Account
- Click Use Service Account.
- Enter an Integration Name by which you want to refer to this BigQuery integration, set the Database Name, and then upload the Private Key.
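Before uploading the private key, it can be useful to sanity-check that the file is a well-formed Google service-account key. The required fields below are the standard ones in the JSON key that Google Cloud generates; the checker itself is a hypothetical pre-upload helper, not part of Boltic:

```python
import json

# Standard fields of a Google service-account JSON key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key(raw_json):
    """Return a list of problems found in a service-account key file's
    contents; an empty list means the basic structure looks valid."""
    try:
        key = json.loads(raw_json)
    except json.JSONDecodeError:
        return ["file is not valid JSON"]
    problems = [
        f"missing field: {field}"
        for field in sorted(REQUIRED_FIELDS - key.keys())
    ]
    if key.get("type") != "service_account":
        problems.append("'type' must be 'service_account'")
    return problems

# Example: a key missing its credentials fields.
sample = '{"type": "service_account", "project_id": "my-project"}'
print(validate_key(sample))
# → ['missing field: client_email', 'missing field: private_key']
```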
Use Advanced Options
Options | Description |
---|---|
Allow Large Results | Keep the Allow Large Results option enabled to avoid errors such as "Response too large to return". |
Large Result Dataset | Enter the name of the dataset that contains the tables. |
Large Result Table Name | Enter the name of the table that you want to replicate from the dataset. |
Timeout Duration | The amount of time before a data load query times out. Can be set anywhere from 10 nanoseconds to days; the default is 10 minutes. |
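The Timeout Duration bounds can be sketched as a simple clamp over a value in seconds. This is purely illustrative and not Boltic's implementation; the minimum and default come from the table above, while the upper bound ("days") is assumed to be 7 days here:

```python
# Timeout bounds, in seconds.
MIN_TIMEOUT_S = 10e-9          # 10 nanoseconds (table minimum)
MAX_TIMEOUT_S = 7 * 24 * 3600  # assumed upper bound of 7 days
DEFAULT_TIMEOUT_S = 10 * 60    # 10-minute default (from the table)

def effective_timeout(requested=None):
    """Clamp a requested timeout (in seconds) into the allowed range,
    falling back to the default when no value is given."""
    if requested is None:
        return DEFAULT_TIMEOUT_S
    return min(max(requested, MIN_TIMEOUT_S), MAX_TIMEOUT_S)

print(effective_timeout())   # 600 seconds, i.e. the 10-minute default
print(effective_timeout(0))  # clamped up to the 10 ns minimum
```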
Step 3: Test & Save
To validate the data source configuration, click Test & Save and check whether the connection is established successfully.
Add Info
The integration's metadata, which includes a description, can be used to surface information to end users and as tags for monitoring. Click the More Options button to enter the metadata.
Any Question? 🤓
We are always an email away to help you resolve your queries. If you need any help, write to us at - 📧 support@boltic.io