Snowflake
Before You Begin
Access to your Snowflake Console is required to complete this integration guide.
Configure a Snowflake Storage Integration and Stored Procedure
Step 1
Create a storage integration using the CREATE STORAGE INTEGRATION command.
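A minimal sketch of the integration, assuming a GCS bucket path supplied by Clario; the integration name and bucket location below are placeholders, not values from this guide:

```sql
-- Illustrative only: replace the integration name and bucket path
-- with the values agreed with your Clario Customer Success team.
CREATE STORAGE INTEGRATION clario_gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://<clario-provided-bucket>/');
```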
Retrieve the Cloud Storage Service Account for your Snowflake account, and provide the value of STORAGE_GCP_SERVICE_ACCOUNT to your Clario Customer Success team.
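The service account can be read from the integration's description; the integration name below assumes the name chosen in Step 1:

```sql
-- Look for the STORAGE_GCP_SERVICE_ACCOUNT property in the output.
DESC STORAGE INTEGRATION clario_gcs_int;
```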
Step 2
Create a stored procedure to handle the export:
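One possible shape for the procedure, sketched from the inputs described below; the stage name, file format, and output path layout are assumptions to confirm with Clario, and the final parameter is named table_name because table is a reserved word:

```sql
-- Hypothetical sketch: unloads one table to a GCS stage, partitioned
-- by brand/sourceType/feedType. Verify the real definition with Clario.
CREATE OR REPLACE PROCEDURE clario_unloader_sp(
    brand VARCHAR, sourceType VARCHAR, feedType VARCHAR,
    db VARCHAR, schema VARCHAR, table_name VARCHAR)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  EXECUTE IMMEDIATE
    'COPY INTO @clario_stage/' || brand || '/' || sourceType || '/' || feedType ||
    '/ FROM ' || db || '.' || schema || '.' || table_name ||
    ' FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP) HEADER = TRUE';
  RETURN 'Unloaded ' || db || '.' || schema || '.' || table_name;
END;
$$;
```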
Schedule Unload Jobs
For each table you wish to unload, create a task using a schedule that fits your needs.
The inputs for clario_unloader_sp
are as follows:
brand
Should be a meaningful abbreviation of the brand you are sending data for (e.g., Avalanche Clothing would be ac).
sourceType
Should represent the type of data you are sending (e.g., customer_master, product_master, esp_activity, esp_subscriber, etc.).
feedType
ongoing (for recurring feeds) or backfill (for one-time historical loads).
db
The name of the Snowflake database your table lives in.
schema
The schema your Snowflake table is organized under.
table
The Snowflake table you are unloading to GCS.
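Putting the inputs together, a task might look like the following sketch; the warehouse name and cron schedule are placeholders, and the task-name pattern follows the convention shown in the Note below:

```sql
-- Placeholder values throughout: substitute your own warehouse,
-- schedule, and the procedure inputs described above.
CREATE TASK clario_unload_<brand>_<sourceType>_<feedType>
  WAREHOUSE = <your_warehouse>
  SCHEDULE = 'USING CRON 0 3 * * * UTC'
AS
  CALL clario_unloader_sp('<brand>', '<sourceType>', '<feedType>',
                          '<db>', '<schema>', '<table>');
```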
Note:
All unload tasks are suspended upon creation. Be sure to resume them:
ALTER TASK IF EXISTS clario_unload_<brand>_<sourceType>_<feedType> RESUME;
Example Unload Job
You work on the data team for Avalanche Clothing. You are in the process of onboarding with Clario and need to establish a feed that pulls transaction header data out of your Snowflake warehouse. Clario is asking for a historical backfill and an ongoing feed. The following assumptions can be made:
Warehouse: AVALANCHE_WH
Brand: ac (Avalanche Clothing)
Source Type: transaction_header
Schema: ERP
Table: TR_HEADER
Scheduled tasks will run daily at 3:00 AM UTC.
The data team maintains two databases in the warehouse: HISTORICAL and CURRENT.
Schedule and Enable the Backfill
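A sketch of the backfill task under the assumptions above, reading from the HISTORICAL database; the procedure signature mirrors the hypothetical clario_unloader_sp sketched earlier:

```sql
-- One-time historical load: feedType is 'backfill'.
CREATE TASK clario_unload_ac_transaction_header_backfill
  WAREHOUSE = AVALANCHE_WH
  SCHEDULE = 'USING CRON 0 3 * * * UTC'
AS
  CALL clario_unloader_sp('ac', 'transaction_header', 'backfill',
                          'HISTORICAL', 'ERP', 'TR_HEADER');

-- Tasks are created suspended; resume to enable.
ALTER TASK IF EXISTS clario_unload_ac_transaction_header_backfill RESUME;
```

Once the backfill has completed, the task can be suspended or dropped since it is a one-time load.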
Schedule and Enable the Ongoing Feed
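The ongoing feed follows the same pattern, reading from the CURRENT database with feedType 'ongoing':

```sql
-- Recurring daily unload at 3:00 AM UTC.
CREATE TASK clario_unload_ac_transaction_header_ongoing
  WAREHOUSE = AVALANCHE_WH
  SCHEDULE = 'USING CRON 0 3 * * * UTC'
AS
  CALL clario_unloader_sp('ac', 'transaction_header', 'ongoing',
                          'CURRENT', 'ERP', 'TR_HEADER');

ALTER TASK IF EXISTS clario_unload_ac_transaction_header_ongoing RESUME;
```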
Working With Scheduled Jobs
List
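To list the unload tasks, filter on the naming convention; the LIKE pattern below assumes all tasks were created with the clario_unload_ prefix:

```sql
SHOW TASKS LIKE 'clario_unload_%';
```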
View History
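Run history is available through Snowflake's TASK_HISTORY table function; the task name below is from the example above:

```sql
SELECT *
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(
  TASK_NAME => 'CLARIO_UNLOAD_AC_TRANSACTION_HEADER_ONGOING'))
ORDER BY SCHEDULED_TIME DESC;
```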