SFTPOperator


SFTPOperator transfers files from a remote host to the local machine, or vice versa. The operator uses sftp_hook to open an SFTP transport channel that serves as the basis for the file transfer.
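For illustration, a minimal sketch of such a transfer using an explicitly constructed hook; it assumes an Airflow connection named sftp_default already exists, and the task id and file paths are placeholders:

    from airflow.providers.sftp.hooks.sftp import SFTPHook
    from airflow.providers.sftp.operators.sftp import SFTPOperator

    # Upload a local file over the SFTP channel opened by the hook.
    # "sftp_default" and the file paths are placeholder values.
    upload = SFTPOperator(
        task_id="upload_report",
        sftp_hook=SFTPHook(ssh_conn_id="sftp_default"),
        local_filepath="/tmp/report.csv",
        remote_filepath="/data/incoming/report.csv",
        operation="put",
    )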


Access Instructions

Install the SFTP provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
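The provider package is published on PyPI as apache-airflow-providers-sftp and can be installed with pip. The sketch below shows one way to import and instantiate the operator inside a DAG file, assuming a recent Airflow 2.x environment; the connection id sftp_default and the file paths are placeholders:

    import pendulum

    from airflow import DAG
    from airflow.providers.sftp.operators.sftp import SFTPOperator

    # Sketch of instantiating the operator inside a DAG file.
    # "sftp_default" and the file paths are placeholder values.
    with DAG(
        dag_id="sftp_example",
        start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
        schedule=None,
        catchup=False,
    ) as dag:
        put_file = SFTPOperator(
            task_id="put_file",
            ssh_conn_id="sftp_default",
            local_filepath="/tmp/file.txt",
            remote_filepath="/upload/file.txt",
            operation="put",
        )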

Parameters

ssh_conn_id: ssh connection id from Airflow Connections. ssh_conn_id will be ignored if ssh_hook or sftp_hook is provided.
sftp_hook: predefined SFTPHook to use. Either sftp_hook or ssh_conn_id needs to be provided.
ssh_hook: deprecated; predefined SSHHook to use for remote execution. Use sftp_hook instead.
remote_host: remote host to connect to (templated). Nullable. If provided, it replaces the remote_host defined in sftp_hook/ssh_hook or predefined in the connection of ssh_conn_id.
local_filepath: required. Local file path, or list of local file paths, to get or put. (templated)
remote_filepath: required. Remote file path, or list of remote file paths, to get or put. (templated)
operation: specify the operation, 'get' or 'put'; defaults to 'put'. A sketch of a 'get' transfer follows this list.
confirm: specify whether the SFTP operation should be confirmed; defaults to True.
create_intermediate_dirs: create missing intermediate directories when copying from remote to local and vice versa; defaults to False. Example: the following task would copy file.txt to the remote host at /tmp/tmp1/tmp2/ while creating tmp, tmp1 and tmp2 if they don't exist. If the parameter is not passed, the task errors because the directory does not exist.

    put_file = SFTPOperator(
        task_id="test_sftp",
        ssh_conn_id="ssh_default",
        local_filepath="/tmp/file.txt",
        remote_filepath="/tmp/tmp1/tmp2/file.txt",
        operation="put",
        create_intermediate_dirs=True,
        dag=dag,
    )
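As referenced above, here is a hedged sketch of a 'get' transfer that pulls several remote files in one task; the connection id and all paths are placeholders, and the two path lists are expected to have the same length:

    from airflow.providers.sftp.operators.sftp import SFTPOperator

    # Download two remote files to local paths in a single task.
    # "ssh_default" and all paths are placeholder values; the two
    # lists are expected to be the same length.
    get_files = SFTPOperator(
        task_id="download_logs",
        ssh_conn_id="ssh_default",
        remote_filepath=["/var/log/app/app.log", "/var/log/app/error.log"],
        local_filepath=["/tmp/logs/app.log", "/tmp/logs/error.log"],
        operation="get",
        create_intermediate_dirs=True,
    )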
