Add fail_on_file_not_exist option to SFTPToS3Operator #44320

Open · wants to merge 1 commit into main
Conversation


@Guaqamole Guaqamole commented Nov 24, 2024

Fixes: #40576

I added a fail_on_file_not_exist parameter to SFTPToS3Operator so that users can configure whether the operator fails when the SFTP file does not exist; with the flag set to False, a missing file is logged and the task succeeds instead of failing.
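For reviewers who want the gist without opening the diff, here is a minimal sketch of the intended behavior. The helper name download_or_skip is illustrative only, not the PR's actual code, which presumably guards the download inside SFTPToS3Operator.execute:

# Minimal sketch only -- illustrative, not the exact diff in this PR.
import logging

log = logging.getLogger(__name__)

def download_or_skip(sftp_client, sftp_path, local_path, fail_on_file_not_exist=True):
    """Fetch sftp_path via the SFTP client, optionally tolerating a missing file."""
    try:
        # paramiko's SFTPClient raises an OSError subclass (FileNotFoundError)
        # when the remote path does not exist.
        sftp_client.get(sftp_path, local_path)
        return True
    except FileNotFoundError:
        if fail_on_file_not_exist:
            raise  # old behavior: the task fails
        log.info("File %s not found on the SFTP server; skipping transfer.", sftp_path)
        return False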


^ Add meaningful description above
Read the Pull Request Guidelines for more information.
In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.

In case of backwards incompatible changes please leave a note in a newsfragment file, named {pr_number}.significant.rst or {issue_number}.significant.rst, in newsfragments.


Here's the test result from a run in my local environment.
(screenshot: sftp_to_s3_test_local)
You can see that with fail_on_file_not_exist=False the DAG does not fail, but it still logs that the file does not exist.

sftp_to_s3_dag.py

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.amazon.aws.transfers.sftp_to_s3 import SFTPToS3Operator

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=1),
}

with DAG(
    dag_id="sftp_to_s3_example",
    default_args=default_args,
    description="Transfer a file from SFTP to S3",
    schedule_interval=None,
    start_date=datetime(2023, 1, 1),
    catchup=False,
    tags=["example", "sftp", "s3"],
) as dag:
    transfer_file = SFTPToS3Operator(
        task_id="transfer_file",
        sftp_conn_id="sftp_default",
        sftp_path="/Users/john/test.mv.dba",
        s3_conn_id="aws_conn",
        s3_bucket="airflow",
        s3_key="test/test.mv.db",
        # With the new flag set to False, a missing SFTP file is logged
        # instead of raising and failing the task.
        fail_on_file_not_exist=False,
    )
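Assuming the flag defaults to True, existing DAGs keep the old fail-fast behavior, and nothing changes unless a user opts in with fail_on_file_not_exist=False. The DAG above can be exercised locally without a scheduler via airflow dags test sftp_to_s3_example 2023-01-01.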


boring-cyborg bot commented Nov 24, 2024

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contributors' Guide (https://github.com/apache/airflow/blob/main/contributing-docs/README.rst)
Here are some useful points:

  • Pay attention to the quality of your code (ruff, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide. Consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow the ASF Code of Conduct for all communication, including (but not limited to) comments on Pull Requests, the Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
  • Always keep your Pull Requests rebased, otherwise your build might fail due to changes not related to your commits.

Apache Airflow is a community-driven project and together we are making it better 🚀.
In case of doubts contact the developers at:
Mailing List: [email protected]
Slack: https://s.apache.org/airflow-slack

Labels: area:providers, provider:amazon-aws (AWS/Amazon-related issues)
Successfully merging this pull request may close these issues: Airflow SFTPToS3Operator file exist check (#40576)