DataprocSubmitSparkSqlJobOperator

Google

Start a Spark SQL query job on a Cloud Dataproc cluster.


Last Updated: Feb. 25, 2023

Access Instructions

Install the Google provider package (`apache-airflow-providers-google`) into your Airflow environment, e.g. `pip install apache-airflow-providers-google`.

Import the module into your DAG file and instantiate it with your desired parameters.

Parameters

query — The query or a reference to the query file (.q extension). (templated)

query_uri — The HCFS URI of the script that contains the SQL queries.

variables — Map of named parameters for the query. (templated)
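Under the hood, the operator submits a Dataproc SparkSqlJob. The sketch below shows how `query`/`query_uri` and `variables` map onto the job body; the field names follow the Dataproc v1 API, but the helper itself is illustrative, not the provider's actual code:

```python
def build_spark_sql_job(query=None, query_uri=None, variables=None):
    """Illustrative sketch of the SparkSqlJob body assembled from the params."""
    if (query is None) == (query_uri is None):
        # Exactly one of query / query_uri must be given.
        raise ValueError("Provide exactly one of query or query_uri")
    job = {}
    if query is not None:
        # An inline query string goes into the query list.
        job["query_list"] = {"queries": [query]}
    else:
        # A script file reference (e.g. a gs:// URI to a .q file) goes here.
        job["query_file_uri"] = query_uri
    if variables:
        # `variables` become script variables, referenced in SQL as ${name}.
        job["script_variables"] = dict(variables)
    return {"spark_sql_job": job}


# Hypothetical usage: query text and dates here are made-up examples.
job = build_spark_sql_job(
    query="SELECT * FROM sales WHERE dt = '${run_date}'",
    variables={"run_date": "2023-02-25"},
)
```

Note that the `variables` map pairs with `${name}` placeholders inside the SQL itself, separately from Airflow's own Jinja templating of the `query` field.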

