Submits a Salesforce query and uploads the results to AWS S3.


Last Updated: Sep. 13, 2022

Access Instructions

Install the Amazon provider package (apache-airflow-providers-amazon) into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.


Parameters

salesforce_query (Required): The query to send to Salesforce.
s3_bucket_name (Required): The bucket name to upload to.
s3_key (Required): The object name to set when uploading the file.
salesforce_conn_id (Required): The name of the connection that has the parameters needed to connect to Salesforce.
export_format: Desired format of the exported files.
query_params: Additional optional arguments to be passed to the HTTP request querying Salesforce.
include_deleted: True if the query should include deleted records.
coerce_to_timestamp: True if you want all datetime fields converted to Unix timestamps; False to leave them in the format Salesforce returned, in which case datetimes remain strings. Default: False
record_time_added: True if you want a Unix timestamp field added to the resulting data, marking when the data was fetched from Salesforce. Default: False
aws_conn_id: The name of the connection that has the parameters needed to connect to S3.
replace: Whether to overwrite the S3 key if it already exists. If set to False and the key exists, an error is raised.
encrypt: If True, the file is encrypted server-side by S3 and stored in encrypted form at rest.
gzip: If True, the file is compressed locally before upload.
acl_policy: String specifying the canned ACL policy for the file being uploaded to the S3 bucket.



See also

For more information on how to use this operator, take a look at the guide: Extract data from Salesforce to Amazon S3 transfer operator
