Executes a COPY INTO command to load files from an external cloud stage into Snowflake


Last Updated: Feb. 10, 2023

Access Instructions

Install the Snowflake provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.


namespace: Snowflake namespace
table (Required): Snowflake table
file_format (Required): file format name, e.g. CSV, AVRO, etc.
stage (Required): reference to a specific Snowflake stage. If the stage's schema is not the same as the table's, it must be specified
prefix: cloud storage location specified to limit the set of files to load
files: files to load into the table
pattern: pattern used to select files from the external location to load into the table
copy_into_postifx: optional SQL postfix for the COPY INTO query, such as formatTypeOptions and copyOptions
snowflake_conn_id: reference to the Snowflake connection id
account: Snowflake account name
warehouse: name of the Snowflake warehouse
database: name of the Snowflake database
region: name of the Snowflake region
role: name of the Snowflake role
schema: name of the Snowflake schema
authenticator: authenticator for Snowflake. 'snowflake' (default) uses the internal Snowflake authenticator; 'externalbrowser' authenticates through your web browser with Okta, ADFS, or any other SAML 2.0-compliant identity provider (IdP) defined for your account
session_parameters: session-level parameters set at the time you connect to Snowflake
copy_options: Snowflake COPY INTO syntax copy options
validation_mode: Snowflake COPY INTO syntax validation mode
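As a rough illustration of how these parameters map onto the COPY INTO statement the operator runs, here is a sketch; `build_copy_into` is a hypothetical helper, not the operator's actual implementation:

```python
def build_copy_into(table, stage, file_format, schema=None, prefix=None,
                    files=None, pattern=None, copy_into_postifx=""):
    """Assemble a Snowflake COPY INTO statement from operator-style parameters.

    Illustrative sketch only: the real operator builds its SQL internally.
    """
    # Qualify the table with the schema when one is given.
    qualified_table = f"{schema}.{table}" if schema else table
    # The prefix narrows the stage location to a sub-path.
    location = f"@{stage}/{prefix or ''}"
    sql = f"COPY INTO {qualified_table}\nFROM {location}\n"
    if files:
        quoted = ", ".join(f"'{f}'" for f in files)
        sql += f"FILES=({quoted})\n"
    if pattern:
        sql += f"PATTERN='{pattern}'\n"
    sql += f"FILE_FORMAT={file_format}"
    if copy_into_postifx:
        # Trailing options such as copyOptions are appended verbatim.
        sql += f" {copy_into_postifx}"
    return sql


print(build_copy_into("orders", "my_s3_stage", "(format_name='my_csv_format')",
                      schema="public", prefix="2023/02/", pattern=".*[.]csv"))
```

Running the call above produces a statement of the form `COPY INTO public.orders FROM @my_s3_stage/2023/02/ PATTERN='.*[.]csv' FILE_FORMAT=(format_name='my_csv_format')`.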



This operator requires the snowflake_conn_id connection. The Snowflake host, login, and password fields must be set up in the connection. Other inputs can be defined either in the connection or at hook instantiation.
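One common way to supply those fields is Airflow's `AIRFLOW_CONN_<CONN_ID>` environment-variable convention, where the connection is encoded as a URI. A sketch with placeholder credentials, assuming the connection id `snowflake_default`:

```python
import os

# Placeholder credentials and account values; Airflow resolves a connection
# named "snowflake_default" from this environment variable at runtime.
os.environ["AIRFLOW_CONN_SNOWFLAKE_DEFAULT"] = (
    "snowflake://my_user:my_password@my_host"
    "?account=my_account&warehouse=my_wh&database=my_db"
    "&role=my_role&region=us-east-1"
)
```

The same fields can instead be entered in the Airflow UI under Admin > Connections.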
