S3ToSnowflakeOperator

Provider: Snowflake

Executes a COPY command to load files from S3 into Snowflake.

Last Updated: Oct. 23, 2022

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
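
A minimal sketch of both steps, assuming the apache-airflow-providers-snowflake package and the import path that provider exposes:

    # Install the provider into the Airflow environment, for example:
    #   pip install apache-airflow-providers-snowflake
    from airflow.providers.snowflake.transfers.s3_to_snowflake import S3ToSnowflakeOperator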

Parameters

s3_keys: reference to a list of S3 keys
table (Required): reference to a specific table in the Snowflake database
schema: name of schema (will overwrite schema defined in connection)
stage (Required): reference to a specific Snowflake stage. If the stage’s schema is not the same as the table’s, it must be specified
prefix: cloud storage location specified to limit the set of files to load
file_format (Required): reference to a specific file format
warehouse: name of warehouse (will overwrite any warehouse defined in the connection’s extra JSON)
database: reference to a specific database in the Snowflake connection
columns_array: reference to a specific columns array in the Snowflake database
pattern: regular expression pattern string specifying the file names and/or paths to match. Note: the regular expression will be automatically enclosed in single quotes, and all single quotes in the expression will be replaced by two single quotes.
snowflake_conn_id: reference to the Snowflake connection id
role: name of role (will overwrite any role defined in the connection’s extra JSON)
authenticator: authenticator for Snowflake. ‘snowflake’ (default) to use the internal Snowflake authenticator; ‘externalbrowser’ to authenticate using your web browser and Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) that has been defined for your account; ‘https://<your_okta_account_name>.okta.com’ to authenticate through native Okta
session_parameters: session-level parameters to set at the time you connect to Snowflake
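
As a usage sketch, the operator can be instantiated inside a DAG as shown below. The connection id, stage, table, and S3 keys are hypothetical placeholders, and file_format is given inline (a reference to a named Snowflake file format works as well):

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.transfers.s3_to_snowflake import S3ToSnowflakeOperator

    with DAG(
        dag_id="s3_to_snowflake_example",  # hypothetical DAG id
        start_date=datetime(2022, 10, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        load_orders = S3ToSnowflakeOperator(
            task_id="load_orders",
            snowflake_conn_id="snowflake_default",  # assumes a configured Snowflake connection
            stage="MY_S3_STAGE",  # hypothetical stage pointing at the S3 bucket
            s3_keys=["orders/2022/10/23/orders.csv"],  # hypothetical keys under the stage location
            table="ORDERS",  # hypothetical target table
            schema="PUBLIC",
            file_format="(type = 'CSV', field_delimiter = ',', skip_header = 1)",
        )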

Documentation

Executes a COPY command to load files from S3 into Snowflake.
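
For intuition, the operator essentially templates its parameters into a Snowflake COPY INTO statement and runs it through the configured Snowflake connection. The function below is an illustrative approximation of that mapping, not the provider’s actual source:

    # Illustrative approximation of how the operator's parameters map onto COPY INTO.
    def build_copy_sql(table, stage, file_format, schema=None, s3_keys=None,
                       prefix=None, columns_array=None, pattern=None):
        into = f"{schema}.{table}" if schema else table
        if columns_array:
            into = f"{into}({', '.join(columns_array)})"
        sql = f"COPY INTO {into} FROM @{stage}/{prefix or ''}"
        if s3_keys:
            sql += " files=({})".format(", ".join(f"'{key}'" for key in s3_keys))
        sql += f" file_format={file_format}"
        if pattern:
            # single quotes in the pattern are doubled, then the expression is quoted
            sql += " pattern='{}'".format(pattern.replace("'", "''"))
        return sql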

See also

For more information on how to use this operator, take a look at the guide: S3ToSnowflakeOperator

Was this page helpful?