Executing a complex workflow usually does not mean running a single command, but rather a set of interdependent tasks that together form the workflow. In most cases, these tasks can be described by a directed acyclic graph (DAG). Workflows represented as DAGs can then be executed on computing infrastructure.
An important requirement for us is the ability to orchestrate tasks both on HPC batch scheduling systems and on container orchestration platforms such as Kubernetes. This functionality is implemented with Apache Airflow, extended by a custom task provider that exposes tasks for HPC job management through HEAppE and for data transfer management through the Distributed Data Interface. A minimal sketch of such a workflow is shown below.
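The following sketch illustrates how such a workflow could be expressed as an Airflow DAG: data is staged in, an HPC job is executed, and results are staged out. The helper functions (stage_in, submit_hpc_job, stage_out) are hypothetical placeholders introduced only for illustration; in the actual setup they would be replaced by the operators exposed by the custom HEAppE/DDI task provider described above.

```python
# Minimal sketch of an Airflow DAG wiring data staging and an HPC job together.
# The three callables below are placeholders, not the actual provider API.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_in(**context):
    # Placeholder: transfer input data to the HPC site via the Distributed Data Interface.
    pass


def submit_hpc_job(**context):
    # Placeholder: create, submit, and monitor a batch job through the HEAppE middleware.
    pass


def stage_out(**context):
    # Placeholder: transfer results back from the HPC site via the Distributed Data Interface.
    pass


with DAG(
    dag_id="hpc_workflow_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,   # triggered on demand rather than on a schedule
    catchup=False,
) as dag:
    t_stage_in = PythonOperator(task_id="stage_in", python_callable=stage_in)
    t_hpc_job = PythonOperator(task_id="run_hpc_job", python_callable=submit_hpc_job)
    t_stage_out = PythonOperator(task_id="stage_out", python_callable=stage_out)

    # The DAG encodes the task dependencies: stage data in, run the job, stage data out.
    t_stage_in >> t_hpc_job >> t_stage_out
```

Airflow derives the execution order from these dependencies, so the same pattern extends naturally to larger workflows with branching and parallel tasks.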