OracleToAzureDataLakeOperator

Microsoft Azure

Moves data from Oracle to Azure Data Lake. The operator runs the SQL query against Oracle, saves the result to a local file, and then uploads that file to Azure Data Lake.


Last Updated: Oct. 23, 2022

Access Instructions

Install the Microsoft Azure provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.

Parameters

filename (Required): File name for the CSV file written locally and uploaded to Azure Data Lake.
azure_data_lake_conn_id (Required): Destination Azure Data Lake connection.
azure_data_lake_path (Required): Destination path in Azure Data Lake where the file is placed.
oracle_conn_id (Required): Source Oracle connection.
sql (Required): SQL query to execute against the Oracle database. (templated)
sql_params: Parameters to use in the SQL query. (templated)
delimiter: Field delimiter in the file.
encoding: Encoding type for the file.
quotechar: Character to use for quoting.
quoting: Quoting strategy. See the unicodecsv quoting options for more information.

