Astro SDK (Certified)

Load a file or bucket into either a SQL table or a pandas dataframe.


Last Updated: Aug. 18, 2022

Access Instructions

Install the Astro SDK provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
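The two steps above can be sketched as a minimal DAG. This is an illustrative example only: the connection IDs, bucket path, and table name are placeholders, and the exact import paths (here matching the types listed below, such as astro.files.File) may vary between SDK versions.

```python
import pendulum
from airflow.decorators import dag

# Import paths follow the parameter types documented on this page;
# check your installed astro-sdk-python version for the exact modules.
from astro import sql as aql
from astro.files import File
from astro.table import Table


@dag(start_date=pendulum.datetime(2022, 1, 1), schedule=None, catchup=False)
def load_file_example():
    # Placeholder connection IDs and paths; replace with your own.
    aql.load_file(
        input_file=File(path="s3://my-bucket/data.csv", conn_id="aws_default"),
        output_table=Table(name="my_table", conn_id="postgres_default"),
        if_exists="replace",
    )


load_file_example()
```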


Parameters

input_file (astro.files.base.File, required): File path and conn_id for object stores.
output_table (astro.sql.table.Table): Table to create.
task_id (str, optional): Task ID.
if_exists (Literal['replace', 'append'], default 'replace'): How to handle an existing table. Options: fail, replace, append.
ndjson_normalize_sep (str): Separator used to normalize nested NDJSON. For example, {'a': {'b': 'c'}} results in a column 'a_b' when ndjson_normalize_sep = '_'.
use_native_support (bool): Use native support for data transfer if available on the destination.
native_support_kwargs (Dict): kwargs passed to the method involved in the native support flow.
columns_names_capitalization (Literal['upper', 'lower', 'original']): Whether to convert all column names to lowercase or uppercase in the resulting dataframe.
enable_native_fallback (bool): Set enable_native_fallback=True to fall back to the default transfer if native transfer fails.
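The flattening rule described for ndjson_normalize_sep can be previewed with pandas.json_normalize, which applies the same separator-joining convention to nested keys (this is an illustration of the documented behavior, not necessarily the SDK's internal implementation):

```python
import pandas as pd

# Flatten a nested record the way ndjson_normalize_sep='_' is described:
# the nested key {'a': {'b': 'c'}} collapses into a single column 'a_b'.
record = {"a": {"b": "c"}}
df = pd.json_normalize(record, sep="_")

print(df.columns.tolist())  # expected: ['a_b']
print(df["a_b"][0])         # expected: c
```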


