S3ListPrefixesOperator

Amazon

List all subfolders in a bucket whose names begin with a given string prefix.


Last Updated: Jan. 24, 2023

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.

Parameters

bucket (Required): The S3 bucket in which to find the subfolders. (templated)
prefix (Required): Prefix string used to filter subfolders; only names that begin with this prefix are returned. (templated)
delimiter (Required): The delimiter that marks the subfolder hierarchy. (templated)
aws_conn_id: The connection ID to use when connecting to S3 storage.
verify: Whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values: False: do not validate SSL certificates. SSL will still be used (unless use_ssl is False), but SSL certificates will not be verified. path/to/cert/bundle.pem: a filename of the CA cert bundle to use. You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.

Documentation

List all subfolders in a bucket whose names begin with a given string prefix.

This operator returns a Python list of subfolder names, which downstream tasks can retrieve via XCom.
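Because the operator pushes its return value to XCom, a downstream PythonOperator callable could pull the list like this. This is a hedged sketch: the callable name `collect_prefixes` is hypothetical, and it assumes the listing task uses the task_id `list_s3_prefixes` from the example further down this page.

```python
def collect_prefixes(ti):
    """Hypothetical python_callable for a downstream PythonOperator.

    Pulls the list of subfolder prefixes that S3ListPrefixesOperator
    (assumed task_id: 'list_s3_prefixes') pushed to XCom.
    """
    prefixes = ti.xcom_pull(task_ids="list_s3_prefixes") or []
    for prefix in prefixes:
        print(prefix)
    return prefixes
```

In a DAG, this function would be passed as the `python_callable` of a PythonOperator that runs after the listing task.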

See also

For more information on how to use this operator, take a look at the guide: List Amazon S3 prefixes

Example:

The following operator lists all subfolders under the customers/2018/04/ prefix in the data bucket.

s3_file = S3ListPrefixesOperator(
    task_id='list_s3_prefixes',
    bucket='data',
    prefix='customers/2018/04/',
    delimiter='/',
    aws_conn_id='aws_customers_conn'
)
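To illustrate what the operator returns, here is a pure-Python sketch (no AWS calls) of how S3 derives common prefixes from object keys given a prefix and a delimiter. The key names are hypothetical; the real operator delegates this to the S3 API.

```python
def common_prefixes(keys, prefix, delimiter):
    """Mimic S3's CommonPrefixes computation: for each key under
    `prefix`, keep everything up to and including the first
    `delimiter` that follows the prefix."""
    found = []
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        idx = rest.find(delimiter)
        if idx == -1:
            continue  # object sits directly under the prefix; not a subfolder
        sub = prefix + rest[: idx + len(delimiter)]
        if sub not in found:
            found.append(sub)
    return found

# Hypothetical bucket contents
keys = [
    "customers/2018/04/01/a.csv",
    "customers/2018/04/01/b.csv",
    "customers/2018/04/02/a.csv",
    "customers/2018/05/01/a.csv",
]
print(common_prefixes(keys, "customers/2018/04/", "/"))
# → ['customers/2018/04/01/', 'customers/2018/04/02/']
```

With the example operator above, the list printed here is the shape of the value pushed to XCom.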
