airflow.providers.amazon.aws.executors.ecs.utils
AWS ECS Executor Utilities.
Data classes and utility functions used by the ECS executor.
Classes

EcsQueuedTask
    Represents an ECS task that is queued. The task will be run in the next heartbeat.
EcsTaskInfo
    Contains information about a currently running ECS task.
RunTaskKwargsConfigKeys
    Keys loaded into the config which are valid ECS run_task kwargs.
AllEcsConfigKeys
    All keys loaded into the config which are related to the ECS Executor.
EcsExecutorTask
    Data Transfer Object for an ECS Task.
EcsTaskCollection
    A five-way dictionary between Airflow task ids, Airflow cmds, ECS ARNs, and ECS task objects.
Functions

parse_assign_public_ip(assign_public_ip[, ...])
    Convert "assign_public_ip" from True/False to "ENABLED"/"DISABLED".
camelize_dict_keys(nested_dict)
    Accept a potentially nested dictionary and recursively convert all keys into camelCase.
Module Contents

- airflow.providers.amazon.aws.executors.ecs.utils.CommandType[source]
- airflow.providers.amazon.aws.executors.ecs.utils.ExecutorConfigFunctionType[source]
- airflow.providers.amazon.aws.executors.ecs.utils.ExecutorConfigType[source]
- airflow.providers.amazon.aws.executors.ecs.utils.ECS_LAUNCH_TYPE_EC2 = 'EC2'[source]
- airflow.providers.amazon.aws.executors.ecs.utils.ECS_LAUNCH_TYPE_FARGATE = 'FARGATE'[source]
- airflow.providers.amazon.aws.executors.ecs.utils.CONFIG_GROUP_NAME = 'aws_ecs_executor'[source]
- airflow.providers.amazon.aws.executors.ecs.utils.CONFIG_DEFAULTS[source]
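A minimal sketch of how these constants are typically consumed, reading executor options from the CONFIG_GROUP_NAME section of the Airflow configuration (the values used as fallbacks and in the comments are placeholders for the example, not documented defaults):

    from airflow.configuration import conf
    from airflow.providers.amazon.aws.executors.ecs.utils import (
        CONFIG_GROUP_NAME,
        ECS_LAUNCH_TYPE_FARGATE,
        AllEcsConfigKeys,
    )

    # Read a few executor options from the [aws_ecs_executor] section.
    cluster = conf.get(CONFIG_GROUP_NAME, AllEcsConfigKeys.CLUSTER, fallback=None)          # e.g. "my-ecs-cluster"
    region_name = conf.get(CONFIG_GROUP_NAME, AllEcsConfigKeys.REGION_NAME, fallback=None)  # e.g. "us-east-1"
    launch_type = conf.get(CONFIG_GROUP_NAME, AllEcsConfigKeys.LAUNCH_TYPE, fallback=ECS_LAUNCH_TYPE_FARGATE)  # assumed fallback for the example

The same options can also be supplied through environment variables using Airflow's AIRFLOW__AWS_ECS_EXECUTOR__<KEY> naming convention.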
- class airflow.providers.amazon.aws.executors.ecs.utils.EcsQueuedTask[source]
  Represents an ECS task that is queued. The task will be run in the next heartbeat.
  - key: airflow.models.taskinstance.TaskInstanceKey[source]
  - command: CommandType[source]
  - queue: str[source]
  - executor_config: ExecutorConfigType[source]
  - attempt_number: int[source]
  - next_attempt_time: datetime.datetime[source]
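For illustration only, constructing one of these queued-task records might look like the following, assuming EcsQueuedTask is a plain data class that accepts the fields above as keyword arguments (all identifiers are placeholders):

    from datetime import datetime, timedelta, timezone

    from airflow.models.taskinstance import TaskInstanceKey
    from airflow.providers.amazon.aws.executors.ecs.utils import EcsQueuedTask

    key = TaskInstanceKey(dag_id="example_dag", task_id="example_task", run_id="manual__2024-01-01", try_number=1)
    queued = EcsQueuedTask(
        key=key,
        command=["airflow", "tasks", "run", "example_dag", "example_task", "manual__2024-01-01"],
        queue="default",
        executor_config={},
        attempt_number=1,
        next_attempt_time=datetime.now(timezone.utc) + timedelta(seconds=30),  # when the next run_task attempt may happen
    )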
- class airflow.providers.amazon.aws.executors.ecs.utils.EcsTaskInfo[source]
  Contains information about a currently running ECS task.
  - cmd: CommandType[source]
  - queue: str[source]
  - config: ExecutorConfigType[source]
- class airflow.providers.amazon.aws.executors.ecs.utils.RunTaskKwargsConfigKeys[source]
  Bases: airflow.providers.amazon.aws.executors.utils.base_config_keys.BaseConfigKeys
  Keys loaded into the config which are valid ECS run_task kwargs.
  - ASSIGN_PUBLIC_IP = 'assign_public_ip'[source]
  - CAPACITY_PROVIDER_STRATEGY = 'capacity_provider_strategy'[source]
  - CLUSTER = 'cluster'[source]
  - CONTAINER_NAME = 'container_name'[source]
  - LAUNCH_TYPE = 'launch_type'[source]
  - PLATFORM_VERSION = 'platform_version'[source]
  - SECURITY_GROUPS = 'security_groups'[source]
  - SUBNETS = 'subnets'[source]
  - TASK_DEFINITION = 'task_definition'[source]
- class airflow.providers.amazon.aws.executors.ecs.utils.AllEcsConfigKeys[source]
  Bases: RunTaskKwargsConfigKeys
  All keys loaded into the config which are related to the ECS Executor.
  - AWS_CONN_ID = 'conn_id'[source]
  - CHECK_HEALTH_ON_STARTUP = 'check_health_on_startup'[source]
  - MAX_RUN_TASK_ATTEMPTS = 'max_run_task_attempts'[source]
  - REGION_NAME = 'region_name'[source]
  - RUN_TASK_KWARGS = 'run_task_kwargs'[source]
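A rough sketch of how these key classes feed the boto3 run_task call: collect the run_task-related options that are set in the executor's config section, then camelize them into boto3-style parameter names. This is illustrative only; in practice some of these keys (for example subnets, security_groups, assign_public_ip and container_name) end up nested inside networkConfiguration or overrides rather than as top-level kwargs:

    from airflow.configuration import conf
    from airflow.providers.amazon.aws.executors.ecs.utils import (
        CONFIG_GROUP_NAME,
        RunTaskKwargsConfigKeys,
        camelize_dict_keys,
    )

    # Gather each documented run_task config key that has a value configured.
    raw_options = {}
    for attr_name, config_key in vars(RunTaskKwargsConfigKeys).items():
        if attr_name.startswith("_") or not isinstance(config_key, str):
            continue
        value = conf.get(CONFIG_GROUP_NAME, config_key, fallback=None)
        if value is not None:
            raw_options[config_key] = value

    # snake_case config keys -> camelCase boto3 names, e.g. task_definition -> taskDefinition.
    run_task_style_kwargs = camelize_dict_keys(raw_options)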
- exception airflow.providers.amazon.aws.executors.ecs.utils.EcsExecutorException[source]
  Bases: Exception
  Thrown when something unexpected has occurred within the ECS ecosystem.
- class airflow.providers.amazon.aws.executors.ecs.utils.EcsExecutorTask(task_arn, last_status, desired_status, containers, started_at=None, stopped_reason=None, external_executor_id=None)[source]
  Data Transfer Object for an ECS Task.
  - task_arn[source]
  - last_status[source]
  - desired_status[source]
  - containers[source]
  - started_at = None[source]
  - stopped_reason = None[source]
  - external_executor_id = None[source]
  - get_task_state()[source]
    Determine the state of an ECS task based on its status and other relevant attributes.
    It can return one of the following states (a rough sketch of this mapping follows the class entry):
      QUEUED - Task is being provisioned.
      RUNNING - Task is launched on ECS.
      REMOVED - Task provisioning has failed for some reason. See stopped_reason.
      FAILED - Task is completed and at least one container has failed.
      SUCCESS - Task is completed and all containers have succeeded.
  - __repr__()[source]
    Return a string representation of the ECS task.
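The mapping described under get_task_state() could be sketched roughly as below, using Airflow's State constants. This is not the provider's implementation (which also inspects stopped_reason and the exact container fields returned by ECS); it only illustrates the documented decision order, assuming boto3-style container dicts with an "exitCode" field:

    from airflow.utils.state import State

    def sketch_task_state(last_status, desired_status, containers, started_at=None):
        """Illustrative mapping from ECS task status to an Airflow state."""
        if last_status == "RUNNING":
            return State.RUNNING
        if desired_status == "RUNNING":
            # ECS is still provisioning or launching the task.
            return State.QUEUED
        if started_at is None:
            # Stopped before it ever started: provisioning failed.
            return State.REMOVED
        # The task finished; succeed only if every container exited cleanly.
        if all(container.get("exitCode") == 0 for container in containers):
            return State.SUCCESS
        return State.FAILED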
- class airflow.providers.amazon.aws.executors.ecs.utils.EcsTaskCollection[source]
  A five-way dictionary between Airflow task ids, Airflow cmds, ECS ARNs, and ECS task objects.
  - key_to_arn: dict[airflow.models.taskinstance.TaskInstanceKey, str][source]
  - arn_to_key: dict[str, airflow.models.taskinstance.TaskInstanceKey][source]
  - tasks: dict[str, EcsExecutorTask][source]
  - key_to_failure_counts: dict[airflow.models.taskinstance.TaskInstanceKey, int][source]
  - key_to_task_info: dict[airflow.models.taskinstance.TaskInstanceKey, EcsTaskInfo][source]
  - add_task(task, airflow_task_key, queue, airflow_cmd, exec_config, attempt_number)[source]
    Add a task to the collection.
  - update_task(task)[source]
    Update the state of the given task based on its task ARN.
  - task_by_key(task_key)[source]
    Get a task by Airflow Task Instance Key.
  - task_by_arn(arn)[source]
    Get a task by AWS ARN.
  - pop_by_key(task_key)[source]
    Delete a task from the collection based on its Airflow Task Instance Key.
  - get_all_arns()[source]
    Get all AWS ARNs in the collection.
  - get_all_task_keys()[source]
    Get all Airflow Task Keys in the collection.
  - failure_count_by_key(task_key)[source]
    Get the number of times a task has failed given an Airflow Task Key.
  - increment_failure_count(task_key)[source]
    Increment the failure counter given an Airflow Task Key.
  - info_by_key(task_key)[source]
    Get the task info (EcsTaskInfo) for a given Airflow task key.
  - __getitem__(value)[source]
    Get a task by AWS ARN.
  - __len__()[source]
    Determine the number of tasks in the collection.
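A brief usage sketch of the collection, assuming the constructor signatures shown above (the ARN, dag and task identifiers are placeholders):

    from airflow.models.taskinstance import TaskInstanceKey
    from airflow.providers.amazon.aws.executors.ecs.utils import (
        EcsExecutorTask,
        EcsTaskCollection,
    )

    collection = EcsTaskCollection()

    task = EcsExecutorTask(
        task_arn="arn:aws:ecs:us-east-1:123456789012:task/example-cluster/abc123",
        last_status="PROVISIONING",
        desired_status="RUNNING",
        containers=[],
    )
    key = TaskInstanceKey(dag_id="example_dag", task_id="example_task", run_id="manual__2024-01-01", try_number=1)

    # Register the task; the collection keeps the key <-> ARN <-> task mappings in sync.
    collection.add_task(
        task=task,
        airflow_task_key=key,
        queue="default",
        airflow_cmd=["airflow", "tasks", "run", "example_dag", "example_task", "manual__2024-01-01"],
        exec_config={},
        attempt_number=1,
    )

    by_key = collection.task_by_key(key)            # look up via the Airflow task instance key
    by_arn = collection.task_by_arn(task.task_arn)  # or via the ECS task ARN
    collection.increment_failure_count(key)
    retries_so_far = collection.failure_count_by_key(key)  # 1 after a single increment
    finished = collection.pop_by_key(key)           # remove the entry once the task is done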
- airflow.providers.amazon.aws.executors.ecs.utils.parse_assign_public_ip(assign_public_ip, is_launch_type_ec2=False)[source]
  Convert "assign_public_ip" from True/False to "ENABLED"/"DISABLED".
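A short sketch of where the parsed value typically ends up: the awsvpcConfiguration block of the run_task network configuration. The subnet and security-group IDs are placeholders, and the raw config value is read as a string from the executor's config section:

    from airflow.configuration import conf
    from airflow.providers.amazon.aws.executors.ecs.utils import (
        CONFIG_GROUP_NAME,
        AllEcsConfigKeys,
        parse_assign_public_ip,
    )

    raw_value = conf.get(CONFIG_GROUP_NAME, AllEcsConfigKeys.ASSIGN_PUBLIC_IP, fallback="False")
    assign_public_ip = parse_assign_public_ip(raw_value, is_launch_type_ec2=False)

    network_configuration = {
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "securityGroups": ["sg-0123456789abcdef0"],
            "assignPublicIp": assign_public_ip,
        }
    }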
- airflow.providers.amazon.aws.executors.ecs.utils.camelize_dict_keys(nested_dict)[source]
  Accept a potentially nested dictionary and recursively convert all keys into camelCase.
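A usage illustration: a nested snake_case mapping comes back with camelCase keys at every level, matching the parameter style of the boto3 run_task API. The expected shape shown in the comment follows from the docstring above:

    from airflow.providers.amazon.aws.executors.ecs.utils import camelize_dict_keys

    snake_cased = {
        "task_definition": "my-task-def",
        "network_configuration": {
            "awsvpc_configuration": {"assign_public_ip": "ENABLED"},
        },
    }

    camelized = camelize_dict_keys(snake_cased)
    # Roughly (values are untouched, only keys are converted):
    # {
    #     "taskDefinition": "my-task-def",
    #     "networkConfiguration": {
    #         "awsvpcConfiguration": {"assignPublicIp": "ENABLED"},
    #     },
    # }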