AzureDataLakeHook

Microsoft Azure

This module contains integration with Azure Data Lake.

Last Updated: Mar. 10, 2023

Access Instructions

Install the Microsoft Azure provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
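For example, a minimal sketch of what this might look like in a DAG file is shown below. The package name and import path follow recent releases of the Microsoft Azure provider (older releases placed the hook under hooks.azure_data_lake), so adjust them to your installed version.

```python
# Minimal sketch, assuming the provider package is installed, e.g.:
#   pip install apache-airflow-providers-microsoft-azure
# and that an `azure_data_lake` connection named "azure_data_lake_default"
# has been created in your Airflow environment.
from airflow.providers.microsoft.azure.hooks.data_lake import AzureDataLakeHook

hook = AzureDataLakeHook(azure_data_lake_conn_id="azure_data_lake_default")
```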

Parameters

azure_data_lake_conn_id: Reference to the Azure Data Lake connection.

Documentation

This module contains integration with Azure Data Lake.

AzureDataLakeHook communicates via a REST API compatible with WebHDFS. Make sure that an Airflow connection of type azure_data_lake exists. Authorization can be done by supplying a login (= Client ID), a password (= Client Secret), and the extra fields tenant (Tenant) and account_name (Account Name); see the connection azure_data_lake_default for an example.
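As a hedged illustration, such a connection could also be created programmatically; normally you would define it through the Airflow UI, CLI, or an environment variable, and the placeholder values here are hypothetical.

```python
import json

from airflow.models import Connection

# Sketch of an azure_data_lake connection; placeholder values are hypothetical.
conn = Connection(
    conn_id="azure_data_lake_default",
    conn_type="azure_data_lake",
    login="<client_id>",         # Client ID
    password="<client_secret>",  # Client Secret
    extra=json.dumps({"tenant": "<tenant_id>", "account_name": "<account_name>"}),
)
```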

Interacts with Azure Data Lake.

Client ID and client secret should be in the user and password parameters. Tenant and account name should be in the extra field as {"tenant": "<TENANT>", "account_name": "<ACCOUNT_NAME>"}.
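A short usage sketch follows, assuming the connection above exists and that the remote paths are valid in your Data Lake store; the upload_file, check_for_file, and list helpers are taken from recent versions of the provider's hook.

```python
from airflow.providers.microsoft.azure.hooks.data_lake import AzureDataLakeHook

hook = AzureDataLakeHook(azure_data_lake_conn_id="azure_data_lake_default")

# Upload a local file, confirm it arrived, and list the target directory.
hook.upload_file(local_path="/tmp/report.csv", remote_path="raw/report.csv")
if hook.check_for_file(file_path="raw/report.csv"):
    print(hook.list(path="raw/"))
```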
