DataprocSubmitHiveJobOperator

Google

Start a Hive query job on a Cloud Dataproc cluster.


Last Updated: Feb. 25, 2023

Access Instructions

Install the Google provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters, as shown in the sketch below.
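A minimal sketch of importing and instantiating the operator inside a DAG, assuming Airflow 2.4+ with the Google provider installed. The DAG id, project, region, cluster name, and query are placeholders, not values taken from this page.

```python
# Requires the provider package: pip install apache-airflow-providers-google
from __future__ import annotations

import pendulum

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitHiveJobOperator

with DAG(
    dag_id="example_dataproc_hive",  # hypothetical DAG id
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    # Submit an inline Hive query to an existing Dataproc cluster.
    # cluster_name, region, and project_id are placeholders; replace with your own.
    show_databases = DataprocSubmitHiveJobOperator(
        task_id="show_databases",
        query="SHOW DATABASES;",
        cluster_name="example-cluster",
        region="us-central1",
        project_id="my-gcp-project",
    )
```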

Parameters

query: The query or reference to the query file (q extension).
query_uri: The HCFS URI of the script that contains the Hive queries.
variables: Map of named parameters for the query.
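As a sketch of how these parameters combine, query and query_uri are alternative ways to supply the Hive statements, and variables passes name/value pairs to the script. The task below would live in the same DAG context as above; the bucket, file path, and variable value are hypothetical.

```python
    # Run a query file stored in GCS instead of an inline query.
    # Bucket, path, and variable values are placeholders for illustration.
    daily_report = DataprocSubmitHiveJobOperator(
        task_id="daily_report",
        query_uri="gs://my-bucket/hive/daily_report.q",  # HCFS URI of the script
        variables={"run_date": "2023-01-01"},            # named parameters for the query
        cluster_name="example-cluster",
        region="us-central1",
        project_id="my-gcp-project",
    )
```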

Documentation

Start a Hive query job on an existing Cloud Dataproc cluster.
