transform_file

Astro SDK · Certified

This function runs the SQL in a specified file and returns a Table object that can be passed to future tasks.


Last Updated: Aug. 18, 2022

Access Instructions

Install the Astro SDK provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
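
A minimal sketch of those two steps, assuming the provider is distributed as astro-sdk-python and that transform_file is exposed through astro.sql; import paths vary slightly between SDK releases:

```python
# Install the provider into your Airflow environment (assumed package name):
#   pip install astro-sdk-python

from astro import sql as aql      # exposes aql.transform_file
from astro.table import Table     # on older releases: from astro.sql.table import Table
```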

Parameters

file_path (str, required): File path for the SQL file you would like to parse. This can be an absolute path, or a relative path if the `template_searchpath` variable is set in your DAG.
conn_id (str): Connection ID for the database you want to connect to. If you do not pass in a value, the connection ID is inferred from the first Table passed to the function. Required if there are no table arguments.
parameters (Optional[mapping or iterable]): Parameters to pass into the SQL query.
database (str): Database within the SQL instance you want to access. If left blank, this defaults to the table.metadata.database of the first Table passed to the function. Required if there are no table arguments.
schema (str): Schema within the SQL instance you want to access. If left blank, this defaults to the table.metadata.schema of the first Table passed to the function. Required if there are no table arguments.
kwargs (dict): Any keyword arguments supported by the BaseOperator (e.g. queue, owner).
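
As a hedged sketch of how these parameters map onto a call — the DAG name, file path, connection ID, database, schema, and parameter values below are placeholders, not values from this page:

```python
from datetime import datetime

from airflow import DAG
from astro import sql as aql

with DAG(
    dag_id="transform_file_params_example",   # placeholder DAG name
    start_date=datetime(2022, 8, 1),
    schedule_interval=None,
    catchup=False,
):
    # No Table is passed in `parameters` here, so conn_id, database, and
    # schema are supplied explicitly; `owner` is forwarded to the underlying
    # BaseOperator through **kwargs.
    recent_orders = aql.transform_file(
        file_path="/usr/local/airflow/include/recent_orders.sql",  # placeholder path
        conn_id="postgres_default",                                # placeholder connection
        database="analytics",                                      # placeholder database
        schema="public",                                           # placeholder schema
        parameters={"cutoff_date": "2022-08-01"},                  # plain query parameters
        owner="data-team",                                         # BaseOperator kwarg
    )
```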

Documentation

This function runs the SQL in a specified file and returns a Table object that can be passed to future tasks. Tables can be inserted into the query via the parameters kwarg.
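
For example, a Table inserted via the parameters kwarg can be referenced inside the SQL file, and the returned Table handed to a downstream task. The SQL file contents, table name, and the follow-up @aql.dataframe task below are illustrative assumptions, not part of this page:

```python
# Illustrative contents of include/top_animations.sql -- {{ input_table }} is
# replaced with the Table supplied under that key in `parameters`:
#   SELECT title, rating
#   FROM {{ input_table }}
#   WHERE genre = 'Animation'
#   ORDER BY rating DESC
#   LIMIT 5;

from datetime import datetime

import pandas as pd
from airflow import DAG
from astro import sql as aql
from astro.table import Table

with DAG(
    dag_id="transform_file_chaining_example",  # placeholder DAG name
    start_date=datetime(2022, 8, 1),
    schedule_interval=None,
    catchup=False,
):
    # The Table inserted via `parameters` also lets the SDK infer the
    # connection, so conn_id is not passed to transform_file here.
    top_five = aql.transform_file(
        file_path="/usr/local/airflow/include/top_animations.sql",  # placeholder path
        parameters={"input_table": Table(name="imdb_movies", conn_id="sqlite_default")},
    )

    # The returned Table can be passed straight to a downstream task.
    @aql.dataframe
    def print_results(df: pd.DataFrame):
        print(df)

    print_results(top_five)
```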
