RedshiftHook

Provider: Amazon

Interact with Amazon Redshift. Provides a thin wrapper around boto3.client("redshift").


Last Updated: Mar. 21, 2023

Access Instructions

Install the Amazon provider package (apache-airflow-providers-amazon) into your Airflow environment.

Import the hook into your DAG file and instantiate it with your desired parameters.
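As a minimal sketch (assuming the aws_default connection id and the redshift_cluster import path, which can differ between provider versions):

    # Shell: install the Amazon provider package into the Airflow environment.
    #   pip install apache-airflow-providers-amazon

    from airflow.providers.amazon.aws.hooks.redshift_cluster import RedshiftHook

    # Instantiate the hook inside a DAG file or task callable; "aws_default" is
    # only an example connection id, and region_name is optional.
    hook = RedshiftHook(aws_conn_id="aws_default", region_name="us-east-1")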

Parameters

aws_conn_id: The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If Airflow runs in a distributed manner and aws_conn_id is None or empty, the default boto3 configuration is used and must be maintained on each worker node.
verify: Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
region_name: AWS region name. If not specified, the default boto3 behaviour is used.
client_type: Reference to the boto3.client service_name, e.g. ‘emr’, ‘batch’, ‘s3’, etc. Mutually exclusive with resource_type.
resource_type: Reference to the boto3.resource service_name, e.g. ‘s3’, ‘ec2’, ‘dynamodb’, etc. Mutually exclusive with client_type.
config: Configuration for the botocore client (see the usage sketch after this list). See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
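The sketch below combines several of these parameters; the retry settings are arbitrary example values rather than recommendations, and the import path is the same assumption noted above:

    from botocore.config import Config
    from airflow.providers.amazon.aws.hooks.redshift_cluster import RedshiftHook

    # Example only: pin the region, keep SSL verification on, and pass a
    # botocore Config that raises the retry limit.
    hook = RedshiftHook(
        aws_conn_id="aws_default",
        region_name="us-east-1",
        verify=True,
        config=Config(retries={"max_attempts": 5, "mode": "standard"}),
    )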

Documentation

Interact with Amazon Redshift. Provides a thin wrapper around boto3.client("redshift").

Additional arguments (such as aws_conn_id) may be specified and are passed down to the underlying AwsBaseHook.
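For example (a sketch assuming get_conn() returns the wrapped boto3 Redshift client; "my-cluster" is a placeholder identifier):

    from airflow.providers.amazon.aws.hooks.redshift_cluster import RedshiftHook

    hook = RedshiftHook(aws_conn_id="aws_default")

    # get_conn() hands back the underlying boto3.client("redshift"), so any
    # Redshift API call is available on it directly.
    client = hook.get_conn()
    response = client.describe_clusters(ClusterIdentifier="my-cluster")
    print(response["Clusters"][0]["ClusterStatus"])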
