I just went through the process of configuring my Airflow setup to be capable of parallel processing by following this article and using this article.

Everything seems to be working fine in the sense that I was able to run all of those commands from the articles without any errors, warnings, or exceptions. I was able to start up the airflow webserver and airflow scheduler, and I'm able to go on the UI and view all my DAGs, but now none of my DAGs that were previously working are starting.

I had this basic example DAG that was working when my executor was set to SequentialExecutor, but now that I have it set to LocalExecutor it never runs. All of the tasks in the DAG are colored white on the graph view with no status, when the first one should be in the running state while it waits for the S3 file to appear. I've already cleared all of its PAST, FUTURE, and UPSTREAM history on the UI, and I have the DAG turned on, so that's not the issue. The scheduler is currently running too. I've tried using this Stackoverflow Post on the same topic as well, but to no avail.

Here is the code I have:

```python
from airflow import DAG
from airflow.operators import SimpleHttpOperator, HttpSensor, EmailOperator, S3KeySensor
from airflow.operators.bash_operator import BashOperator

dag = DAG('myDag', default_args=default_args, schedule_interval=

BashOperator(
    bash_command='echo "Dag Ran Successfully!" > /home/ec2-user/output.txt',
```

And if needed, here is my airflow.cfg file (note the only lines I changed were `executor = LocalExecutor` and `sql_alchemy_conn =`):

```
# The home folder for airflow, default is ~/airflow

# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository
dags_folder = /home/ec2-user/airflow/dags

# The folder where airflow should store its log files
base_log_folder = /home/ec2-user/airflow/logs

# Airflow can store logs remotely in AWS S3 or Google Cloud Storage. Users
# must supply an Airflow connection id that provides access to the storage

# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.path.default_local_settings.LOGGING_CONFIG

log_format = %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s

# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor
executor = LocalExecutor

# The SqlAlchemy connection string to the metadata database.
# SqlAlchemy supports many different database engines, more information
# on their website
#sql_alchemy_conn = sqlite:////home/ec2-user/airflow/airflow.db
sql_alchemy_conn =

# The SqlAlchemy pool size is the maximum number of database connections
# in the pool

# The SqlAlchemy pool recycle is the number of seconds a connection
# can be idle in the pool before it is invalidated

# The amount of parallelism as a setting to the executor. This defines
# the max number of task instances that should run simultaneously

# The number of task instances allowed to run concurrently by the scheduler

# When not using pools, tasks are run in the "default pool",
# whose size is guided by this config element

# The maximum number of active DAG runs per DAG

# Whether to load the examples that ship with Airflow. It's good to
# get started, but you probably want to set this to False in a production
# environment

plugins_folder = /home/ec2-user/airflow/plugins

# Secret key to save connection passwords in the db
fernet_key = ibwZ5uSASmZGphBmwdJ4BIhd1-5WZXMTTgMF9u1_dGM=

# How long before timing out a python file import while filling the DagBag

# The class to use for running task instances in a subprocess

# If set, tasks without a `run_as_user` argument will be run with this user
# Can be used to de-elevate a sudo user running Airflow when executing tasks

# What security module to use (for example kerberos):

# Turn unit test mode on (overwrites many configuration options with test
# values at runtime)

# Name of handler to read task instance logs.

# Whether to enable pickling for xcom (note that this is insecure and allows for
# RCE exploits). This will be deprecated in Airflow 2.0 (be forced to False).

# When a task is killed forcefully, this is the amount of time in seconds that
# it has to cleanup after it is sent a SIGTERM, before it is SIGKILLED

# In what way should the cli access the API.
```
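As a quick sanity check on a setup like the one described (this is a sketch added for illustration, not part of the original question), the two settings that matter for LocalExecutor can be read back with Python's standard `configparser`. The file path and the Postgres connection string below are placeholder assumptions; the key point is that LocalExecutor requires a non-SQLite `sql_alchemy_conn`:

```python
from configparser import ConfigParser

# Hypothetical excerpt mirroring the [core] settings discussed above;
# in practice you would read /home/ec2-user/airflow/airflow.cfg instead.
cfg_text = """
[core]
executor = LocalExecutor
sql_alchemy_conn = postgresql+psycopg2://user:pass@localhost:5432/airflow
"""

parser = ConfigParser()
parser.read_string(cfg_text)

executor = parser.get("core", "executor")
conn = parser.get("core", "sql_alchemy_conn")

# LocalExecutor cannot run on SQLite (SQLite only supports one
# writer at a time), so the connection string must not be sqlite.
assert executor == "LocalExecutor"
assert not conn.startswith("sqlite")
```

If the assertion on `conn` fails against the real file, the scheduler is still pointed at SQLite and tasks will not be picked up in parallel.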