Creates a new object from data as string or bytes.


Last Updated: Jan. 24, 2023

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.
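Assuming the provider package is installed, the two steps above might look like the following minimal DAG. The bucket name, key, and task/DAG IDs are placeholders, not values prescribed by the provider:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.s3 import S3CreateObjectOperator

with DAG(
    dag_id="s3_create_object_example",  # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_object = S3CreateObjectOperator(
        task_id="create_object",
        s3_bucket="my-bucket",          # placeholder bucket name
        s3_key="data/greeting.txt",     # placeholder key
        data="Hello, S3!",
        replace=True,
        aws_conn_id="aws_default",
    )
```

Because `s3_key` here is a relative path, `s3_bucket` is supplied; with a full `s3://my-bucket/data/greeting.txt` key, `s3_bucket` would be omitted instead.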


s3_bucket: Name of the S3 bucket in which to save the object. (templated) It should be omitted when s3_key is provided as a full s3:// URL.

s3_key (required): The key of the object to be created. (templated) It can be either a full s3:// style URL or a path relative to the root level. When it is specified as a full s3:// URL, omit s3_bucket.

data (required): String or bytes to save as the object's content.

replace: If True, overwrite the key if it already exists.

encrypt: If True, the file will be encrypted server-side by S3 and stored in encrypted form at rest.

acl_policy: String specifying the canned ACL policy for the file being uploaded to the S3 bucket.

encoding: The string-to-bytes encoding. It should be specified only when data is provided as a string.

compression: Type of compression to use; currently only gzip is supported. It can be specified only when data is provided as a string.

aws_conn_id: Connection ID of the S3 connection to use.

verify: Whether or not to verify SSL certificates for the S3 connection. By default, SSL certificates are verified. You can provide the following values: False — do not validate SSL certificates (SSL is still used, but the certificates are not verified); path/to/cert/bundle.pem — filename of a CA cert bundle to use, if you want a different CA cert bundle than the one used by botocore.
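The interplay of data, encoding, and compression can be illustrated with a stdlib-only sketch of the payload preparation described above. The prepare_payload helper is hypothetical, not part of the provider; it only mirrors the documented rules that encoding and compression apply when data is a string:

```python
import gzip


def prepare_payload(data, encoding="utf-8", compression=None):
    """Hypothetical sketch: turn a `data` argument into upload-ready bytes.

    Bytes pass through unchanged; strings are encoded with `encoding`,
    then optionally gzip-compressed (gzip is the only supported type).
    """
    if isinstance(data, bytes):
        # encoding/compression are documented as string-only options
        return data
    payload = data.encode(encoding)
    if compression == "gzip":
        payload = gzip.compress(payload)
    return payload


body = prepare_payload("Hello, S3!")                        # b"Hello, S3!"
zipped = prepare_payload("Hello, S3!", compression="gzip")
print(gzip.decompress(zipped).decode())                     # Hello, S3!
```

Passing bytes with an encoding or compression set would be contradictory, which is why the documentation restricts both options to string input.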



See also

For more information on how to use this operator, take a look at the guide: Create an Amazon S3 object
