S3ToRedshiftOperator
Executes a COPY command to load files from Amazon S3 to Redshift
Access Instructions
Install the Amazon provider package into your Airflow environment.
Import the operator into your DAG file and instantiate it with your desired parameters, as in the sketch below.
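A minimal sketch of the steps above, assuming Airflow 2.x with a recent version of the Amazon provider installed (`pip install apache-airflow-providers-amazon`); the bucket, key, schema, table, and connection IDs are placeholders:

```python
# Minimal sketch: load a CSV file from S3 into a Redshift table.
# Bucket, key, schema/table, and connection IDs are placeholders.
import pendulum

from airflow import DAG
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="example_s3_to_redshift",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    load_to_redshift = S3ToRedshiftOperator(
        task_id="load_to_redshift",
        schema="public",
        table="my_table",
        s3_bucket="my-bucket",
        s3_key="path/to/data.csv",
        redshift_conn_id="redshift_default",  # Redshift connection defined in Airflow
        aws_conn_id="aws_default",            # AWS credentials connection
        copy_options=["CSV"],                 # forwarded to the COPY statement
    )
```

The operator resolves credentials from the `aws_conn_id` connection and runs the COPY through the `redshift_conn_id` connection, so no secrets need to appear in the DAG file itself.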
Parameters
Documentation
Executes a COPY command to load files from Amazon S3 to Redshift
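The operator assembles the COPY statement from its arguments: entries in `copy_options` are appended verbatim, and in recent provider versions a `method` argument selects between appending, replacing, or upserting rows. A hedged sketch with placeholder names, meant to sit inside a DAG body like the one above:

```python
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

# Inside a DAG body such as the one sketched earlier.
load_csv = S3ToRedshiftOperator(
    task_id="load_csv_with_header",
    schema="public",
    table="my_table",                         # placeholder table
    s3_bucket="my-bucket",                    # placeholder bucket
    s3_key="exports/data.csv",                # placeholder key
    copy_options=["CSV", "IGNOREHEADER 1"],  # appended verbatim to the COPY statement
    method="REPLACE",                         # truncate-and-load instead of append
)
```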
See also
For more information on how to use this operator, take a look at the guide: Amazon S3 To Amazon Redshift transfer operator
Example DAGs
This DAG shows an example implementation of executing predictions from a machine learning model using AWS SageMaker.
An example pipeline demonstrating how to perform data quality checks in Redshift using SQL Check Operators.
Example DAG showcasing loading and data quality checking with Redshift and Great Expectations.
This is the second in a series of DAGs showing an EL pipeline with data integrity checking of data in S3 as well as Redshift.
This is the third in a series of DAGs showing an EL pipeline with data integrity and data quality checking for data in S3 and Redshift using ETag verification and row-based data quality checks where t…
Run an ETL pipeline that resumes a Redshift cluster, extracts data from S3, transforms data, loads data back to S3, then pauses the cluster.