DatabricksSqlHook

Databricks

Hook to interact with Databricks SQL.

Last Updated: Nov. 26, 2022

Access Instructions

Install the Databricks provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
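The provider package is distributed on PyPI; a typical install into an existing Airflow environment looks like this (the exact `pip` invocation depends on how your Airflow deployment manages dependencies):

```shell
# Install the Databricks provider package for Apache Airflow
pip install apache-airflow-providers-databricks
```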

Parameters

databricks_conn_id: Reference to the Databricks connection.
http_path: Optional string specifying the HTTP path of a Databricks SQL endpoint or cluster. If not specified, it must either be set in the Databricks connection's extra parameters, or sql_endpoint_name must be specified.
sql_endpoint_name: Optional name of a Databricks SQL endpoint. If not specified, http_path must be provided as described above.
session_configuration: An optional dictionary of Spark session parameters. Defaults to None. If not specified here, it may be set in the Databricks connection's extra parameters.
http_headers: An optional list of (k, v) pairs that will be set as HTTP headers on every request.
catalog: An optional initial catalog to use. Requires DBR version 9.0+.
schema: An optional initial schema to use. Requires DBR version 9.0+.
kwargs: Additional parameters passed through to the Databricks SQL Connector.
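Putting the parameters together, a minimal sketch of instantiating the hook in a DAG file might look like the following. It assumes an Airflow connection named `databricks_default` pointing at your workspace; the HTTP path, session settings, and header values are hypothetical placeholders, not values from this page:

```python
from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook

hook = DatabricksSqlHook(
    databricks_conn_id="databricks_default",      # reference to the Databricks connection
    http_path="/sql/1.0/warehouses/abc123",       # hypothetical SQL endpoint path
    session_configuration={"spark.sql.shuffle.partitions": "8"},  # optional Spark session params
    http_headers=[("x-example-header", "demo")],  # optional (k, v) header pairs
    catalog="main",                               # requires DBR 9.0+
    schema="default",                             # requires DBR 9.0+
)

# run() is inherited from Airflow's DbApiHook base class; the handler
# receives the DB-API cursor after each statement executes.
rows = hook.run("SELECT 1", handler=lambda cur: cur.fetchall())
```

Note that instead of `http_path` you could pass `sql_endpoint_name` and let the hook resolve the endpoint by name, as described in the parameter list above. Running this sketch requires a live Databricks workspace and configured credentials.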
