SqlToSlackApiFileOperator

Slack

Executes an SQL statement against a given SQL connection and sends the results to the Slack API as a file.


Last Updated: Nov. 16, 2022

Access Instructions

Install the Slack provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
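For reference, the provider package is published on PyPI as apache-airflow-providers-slack; a typical install looks like the following (exact version pinning will depend on your Airflow deployment):

```shell
pip install apache-airflow-providers-slack
```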

Parameters

sql (Required): The SQL query to be executed.
sql_conn_id (Required): Reference to a specific DB-API connection.
slack_conn_id (Required): Slack API connection.
slack_filename (Required): Filename for display in Slack. Must have a supported extension (see SUPPORTED_FILE_FORMATS). Compression can also be set via the extension: filename.csv.gzip, filename.json.zip, etc.
sql_hook_params: Extra configuration parameters to be passed to the underlying hook. Should match the desired hook constructor's parameters.
parameters: The parameters to pass to the SQL query.
slack_channels: Comma-separated list of channel names or IDs where the file will be shared. If omitted, the file is sent to the workspace.
slack_initial_comment: The message text introducing the file in the specified slack_channels.
slack_title: Title of the file.
df_kwargs: Keyword arguments forwarded to the pandas DataFrame.to_{format}() method.

Documentation


Example:
SqlToSlackApiFileOperator(
    task_id="sql_to_slack",
    sql="SELECT 1 a, 2 b, 3 c",
    sql_conn_id="sql-connection",
    slack_conn_id="slack-api-connection",
    slack_filename="awesome.json.gz",
    slack_channels="#random,#general",
    slack_initial_comment="Awesome load to compressed multiline JSON.",
    df_kwargs={
        "orient": "records",
        "lines": True,
    },
)
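The df_kwargs in the example are forwarded to pandas when the query results are serialized; for a .json filename they map onto DataFrame.to_json. A quick sketch of what those settings produce, using pandas alone (no Airflow required):

```python
import pandas as pd

# Rows as they would come back from the example query: SELECT 1 a, 2 b, 3 c
df = pd.DataFrame({"a": [1], "b": [2], "c": [3]})

# df_kwargs={"orient": "records", "lines": True} yields newline-delimited JSON:
# one JSON object per row, i.e. the "multiline JSON" the example refers to.
ndjson = df.to_json(orient="records", lines=True)
print(ndjson)
```

With more rows, each row becomes its own line, which is what makes the compressed .json.gz upload easy to stream and inspect.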
