SnowflakeOperator

Snowflake

Executes SQL code in a Snowflake database

Last Updated: Mar. 10, 2023

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
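A minimal sketch of both steps, assuming Airflow 2.x, the apache-airflow-providers-snowflake package, and a pre-configured connection named snowflake_default; the DAG id and query are placeholders:

# Install the provider into your environment first, e.g.:
#   pip install apache-airflow-providers-snowflake
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_operator_example",  # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Runs one statement using the warehouse/database/schema/role
    # configured on the "snowflake_default" connection.
    select_one = SnowflakeOperator(
        task_id="select_one",
        snowflake_conn_id="snowflake_default",
        sql="SELECT 1;",
    )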

Parameters

snowflake_conn_id: Reference to a Snowflake connection id.
sql: The SQL code to be executed, as a single string, a list of strings (SQL statements), or a reference to a template file. Template references are recognized by a str ending in '.sql'.
autocommit: If True, each command is automatically committed (default: True).
parameters: (optional) The parameters to render the SQL query with.
warehouse: Name of the warehouse (overrides any warehouse defined in the connection's extra JSON).
database: Name of the database (overrides the database defined in the connection).
schema: Name of the schema (overrides the schema defined in the connection).
role: Name of the role (overrides any role defined in the connection's extra JSON).
authenticator: Authenticator for Snowflake. 'snowflake' (default) uses the internal Snowflake authenticator; 'externalbrowser' authenticates through your web browser with Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) defined for your account; 'https://<your_okta_account_name>.okta.com' authenticates through native Okta.
session_parameters: Session-level parameters to set when you connect to Snowflake.
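A sketch showing these parameters in use inside a DAG context; the connection id, warehouse, database, schema, role, and query below are hypothetical, and each keyword overrides the corresponding connection setting for this task only:

transform_orders = SnowflakeOperator(
    task_id="transform_orders",
    snowflake_conn_id="snowflake_default",
    # sql accepts one string, a list of statements, or a str ending in
    # ".sql", which is resolved as a template file. This plain string is
    # rendered with the "parameters" mapping below.
    sql="SELECT * FROM raw_orders WHERE order_date = %(run_date)s;",
    parameters={"run_date": "2023-03-10"},
    autocommit=True,                 # each command commits automatically
    warehouse="TRANSFORM_WH",        # overrides the connection's warehouse
    database="ANALYTICS",            # overrides the connection's database
    schema="STAGING",                # overrides the connection's schema
    role="TRANSFORMER",              # overrides any role in the extra JSON
    session_parameters={"QUERY_TAG": "registry_example"},  # session-level
)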

Documentation

See also

For more information on how to use this operator, take a look at the guide: SnowflakeOperator

Example DAGs

lineage_emission_dag

Lineage Emission

DataHub
Data Management & Governance
Using Census for Reverse ETL in Airflow

This reverse-ETL pipeline syncs customer data from a Snowflake data warehouse to a marketing platform using Census.

Census, Apache Airflow, Snowflake
Databases, ETL/ELT
Using Hightouch for Reverse ETL in Airflow

An ELT pipeline that extracts Salesforce customer data into Snowflake for transformation and enrichment, then syncs it back to Salesforce via Hightouch.

Hightouch, Amazon, Snowflake, Apache Airflow
Databases, ETL/ELT, Storage, Big Data & Analytics
Modern ELT from Salesforce to Snowflake

A modern ELT pipeline that extracts Salesforce data, then loads and transforms it in Snowflake.

Amazon, Snowflake, Apache Airflow
Databases, ETL/ELT, Storage, Big Data & Analytics
Extract Zendesk to Snowflake

Uploads the following Zendesk objects to S3: Tickets, Organizations, and Users, then loads them from S3 into Snowflake. Loads can run as daily increments or full extracts.

Snowflake, Amazon, Apache Airflow, HTTP
ETL/ELT, Storage, Databases
Google Analytics to Snowflake

The google_analytics_to_snowflake DAG transfers data from Google Analytics (GA) to Google Cloud Storage (GCS) and then loads it into Snowflake staging tables. From staging, data is loaded into Snowflake…

Apache Airflow, Snowflake
ETL/ELT, Data Processing, Storage, Databases, Data Management & Governance
Write-Audit-Publish Great Expectations Example With Snowflake

Example DAG showcasing a write-audit-publish data quality pattern with Snowflake and Great Expectations.

Apache Airflow, Snowflake, Amazon, Great Expectations
Databases, Data Management & Governance, Data Quality, ETL/ELT
Great Expectations Snowflake Example

Example DAG showcasing loading and data quality checking with Snowflake and Great Expectations.

Apache Airflow, Great Expectations, Snowflake, Amazon
Data Quality, ETL/ELT
Taxi Data Quality Example With Snowflake

An example DAG showing the use of SnowflakeCheckOperators.

Apache Airflow, Snowflake, Amazon
ETL/ELT, Data Quality, Databases
Write-Audit-Publish Snowflake Example with Dynamic Task Mapping

Example DAG showcasing loading and data quality checking with Snowflake and dynamic task mapping.

Apache Airflow, Snowflake
Databases, ETL/ELT, Data Quality
Complex Snowflake Transform Example

Example DAG showcasing loading, transforming, and data quality checking with multiple datasets in Snowflake.

Apache Airflow, Snowflake, Common SQL
Databases, ETL/ELT, Data Quality
Compare Datasets using Dynamic Task Mapping

This DAG shows an example implementation of comparing data between two S3 buckets and a table in Snowflake using the new .zip() method of the XComArg object (a minimal sketch follows this list).

Amazon, Snowflake
Airflow Fundamentals, ETL/ELT
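As mentioned in the last entry, a minimal sketch of XComArg.zip() with dynamic task mapping (available from Airflow 2.4); the listing tasks and the comparison step are hypothetical stand-ins for the S3 and Snowflake reads in that DAG:

from datetime import datetime

from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def compare_datasets_example():
    @task
    def keys_from_bucket_a():
        return ["row_1", "row_2", "row_3"]

    @task
    def keys_from_bucket_b():
        return ["row_1", "row_2", "row_x"]

    @task
    def compare(pair):
        left, right = pair  # .zip() yields one tuple per mapped task instance
        return left == right

    # Pair the two upstream outputs element-wise; one "compare" task
    # instance is mapped for each (a, b) tuple.
    compare.expand(pair=keys_from_bucket_a().zip(keys_from_bucket_b()))

compare_datasets_example()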
