airflow.providers.google.cloud.triggers.dataflow

Module Contents

Classes

TemplateJobStartTrigger

Dataflow trigger to check if a templated job has finished.

DataflowJobStatusTrigger

Trigger that monitors if a Dataflow job has reached any of the expected statuses.

DataflowStartYamlJobTrigger

Dataflow trigger that checks the state of a Dataflow YAML job.

DataflowJobMetricsTrigger

Trigger that checks for metrics associated with a Dataflow job.

DataflowJobAutoScalingEventTrigger

Trigger that checks for autoscaling events associated with a Dataflow job.

DataflowJobMessagesTrigger

Trigger that checks for job messages associated with a Dataflow job.

Attributes

DEFAULT_DATAFLOW_LOCATION

airflow.providers.google.cloud.triggers.dataflow.DEFAULT_DATAFLOW_LOCATION = 'us-central1'[source]
class airflow.providers.google.cloud.triggers.dataflow.TemplateJobStartTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None, cancel_timeout=5 * 60)[source]

Bases: airflow.triggers.base.BaseTrigger

Dataflow trigger to check if a templated job has finished.

Parameters
  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • job_id (str) – Required. ID of the job.

  • location (str) – Optional. The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Optional. Time (seconds) to wait between two consecutive calls to check the job.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • cancel_timeout (int | None) – Optional. How long (in seconds) the operator should wait for the pipeline to be successfully cancelled when the task is being killed.

serialize()[source]

Serialize class arguments and classpath.
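
The serialize() method follows the BaseTrigger contract: it returns the trigger's import path plus the keyword arguments needed to re-create it on the triggerer. The sketch below is illustrative only; the exact field names and defaults are assumptions modeled on the signature above, not the provider source.

```python
# Illustrative sketch of the BaseTrigger serialize() contract for
# TemplateJobStartTrigger. The kwargs keys mirror the constructor
# signature shown above; this is not the actual provider code.

def serialize_trigger(job_id, project_id, location="us-central1",
                      gcp_conn_id="google_cloud_default", poll_sleep=10,
                      impersonation_chain=None, cancel_timeout=5 * 60):
    classpath = (
        "airflow.providers.google.cloud.triggers.dataflow.TemplateJobStartTrigger"
    )
    kwargs = {
        "job_id": job_id,
        "project_id": project_id,
        "location": location,
        "gcp_conn_id": gcp_conn_id,
        "poll_sleep": poll_sleep,
        "impersonation_chain": impersonation_chain,
        "cancel_timeout": cancel_timeout,
    }
    # The triggerer uses the classpath to import the class and the kwargs
    # to instantiate it, so every constructor argument must appear here.
    return classpath, kwargs

path, kwargs = serialize_trigger("job-123", "my-project")
print(path.rsplit(".", 1)[-1])  # TemplateJobStartTrigger
```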

async run()[source]

Fetch the job status or yield certain Events.

Main loop of the class, in which it fetches the job status and yields certain Events.

If the job has succeeded, it yields a TriggerEvent with success status; if the job has failed, one with error status. In any other case, the trigger waits for the amount of time stored in the self.poll_sleep variable before checking again.
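
The polling loop described above can be sketched in plain asyncio, with a stubbed status source standing in for the Dataflow API. The JOB_STATE_* names match Dataflow's JobState enum; the event payloads and the hook are simplified assumptions.

```python
import asyncio

JOB_STATE_DONE = "JOB_STATE_DONE"
JOB_STATE_FAILED = "JOB_STATE_FAILED"

async def run_trigger(get_status, poll_sleep=0.01):
    """Poll a job status coroutine until a terminal outcome is reached."""
    while True:
        status = await get_status()
        if status == JOB_STATE_DONE:
            return {"status": "success", "message": "Job completed"}
        if status == JOB_STATE_FAILED:
            return {"status": "error", "message": "Dataflow job failed"}
        # Job is still running: sleep and poll again.
        await asyncio.sleep(poll_sleep)

# Stub: the job reports RUNNING twice, then DONE.
statuses = iter(["JOB_STATE_RUNNING", "JOB_STATE_RUNNING", JOB_STATE_DONE])

async def fake_status():
    return next(statuses)

event = asyncio.run(run_trigger(fake_status))
print(event["status"])  # success
```

In the real trigger the status comes from the async Dataflow hook and the result is yielded as a TriggerEvent rather than returned, but the control flow is the same.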

class airflow.providers.google.cloud.triggers.dataflow.DataflowJobStatusTrigger(job_id, expected_statuses, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None)[source]

Bases: airflow.triggers.base.BaseTrigger

Trigger that monitors if a Dataflow job has reached any of the expected statuses.

Parameters
  • job_id (str) – Required. ID of the job.

  • expected_statuses (set[str]) – The expected state(s) of the operation. See: https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs#Job.JobState

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – Optional. The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Time (seconds) to wait between two consecutive calls to check the job.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

serialize()[source]

Serialize class arguments and classpath.

async run()[source]

Loop until the job reaches an expected or terminal state.

Yields a TriggerEvent with success status if the client returns an expected job status.

Yields a TriggerEvent with error status if the client returns an unexpected terminal job status or any exception is raised while looping.

In any other case, the trigger waits for the amount of time stored in the self.poll_sleep variable before checking again.
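
The decision made on each poll can be sketched as a small classifier: succeed on any expected status, fail on an unexpected terminal status, otherwise keep waiting. The terminal-state set below is abbreviated for illustration; consult Dataflow's JobState reference for the full list.

```python
# Illustrative sketch of the status check in DataflowJobStatusTrigger.run().
# TERMINAL_STATES is a partial, assumed set; the real trigger uses the
# hook's canonical terminal-state definitions.
TERMINAL_STATES = {"JOB_STATE_DONE", "JOB_STATE_FAILED", "JOB_STATE_CANCELLED"}

def classify(status, expected_statuses):
    if status in expected_statuses:
        return "success"   # an expected status was reached
    if status in TERMINAL_STATES:
        return "error"     # terminal, but not one of the expected statuses
    return "wait"          # non-terminal: poll again after poll_sleep seconds

print(classify("JOB_STATE_RUNNING", {"JOB_STATE_RUNNING"}))  # success
print(classify("JOB_STATE_FAILED", {"JOB_STATE_DONE"}))      # error
print(classify("JOB_STATE_PENDING", {"JOB_STATE_DONE"}))     # wait
```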

async_hook()[source]

class airflow.providers.google.cloud.triggers.dataflow.DataflowStartYamlJobTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, cancel_timeout=5 * 60, expected_terminal_state=None, impersonation_chain=None)[source]

Bases: airflow.triggers.base.BaseTrigger

Dataflow trigger that checks the state of a Dataflow YAML job.

Parameters
  • job_id (str) – Required. ID of the job.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Optional. The time in seconds to sleep between polling Google Cloud Platform for the Dataflow job.

  • cancel_timeout (int | None) – Optional. How long (in seconds) the operator should wait for the pipeline to be successfully cancelled when the task is being killed.

  • expected_terminal_state (str | None) – Optional. The expected terminal state of the Dataflow job at which the operator task is set to succeed. Defaults to ‘JOB_STATE_DONE’ for batch jobs and ‘JOB_STATE_RUNNING’ for streaming jobs.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

serialize()[source]

Serialize class arguments and classpath.

async run()[source]

Fetch job and yield events depending on the job’s type and state.

Yields a TriggerEvent if the job reaches a terminal state; otherwise waits for the amount of time stored in the self.poll_sleep variable before checking again.
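
The defaulting rule for expected_terminal_state described above can be sketched as follows; the JOB_TYPE_* names mirror Dataflow's JobType enum, and the function is an assumption for illustration, not the provider's internal helper.

```python
# Sketch of the default terminal state for a Dataflow YAML job: batch jobs
# are considered finished at JOB_STATE_DONE, streaming jobs once they reach
# JOB_STATE_RUNNING. An explicit expected_terminal_state overrides both.

def default_terminal_state(job_type, expected_terminal_state=None):
    if expected_terminal_state is not None:
        return expected_terminal_state
    if job_type == "JOB_TYPE_STREAMING":
        return "JOB_STATE_RUNNING"
    return "JOB_STATE_DONE"

print(default_terminal_state("JOB_TYPE_BATCH"))      # JOB_STATE_DONE
print(default_terminal_state("JOB_TYPE_STREAMING"))  # JOB_STATE_RUNNING
```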

class airflow.providers.google.cloud.triggers.dataflow.DataflowJobMetricsTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None, fail_on_terminal_state=True)[source]

Bases: airflow.triggers.base.BaseTrigger

Trigger that checks for metrics associated with a Dataflow job.

Parameters
  • job_id (str) – Required. ID of the job.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – Optional. The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Time (seconds) to wait between two consecutive calls to check the job.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • fail_on_terminal_state (bool) – If set to True, the trigger will yield a TriggerEvent with error status if the job reaches a terminal state.

serialize()[source]

Serialize class arguments and classpath.

async run()[source]

Loop until a terminal job status or any job metrics are returned.

Yields a TriggerEvent with success status if the client returns any job metrics and the fail_on_terminal_state attribute is False.

Yields a TriggerEvent with error status if the client returns a job status with a terminal state value and the fail_on_terminal_state attribute is True.

Yields a TriggerEvent with error status if any exception is raised while looping.

In any other case, the trigger waits for the amount of time stored in the self.poll_sleep variable before checking again.
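
This loop shape is shared by the autoscaling-event and job-message triggers below: return results as soon as the API yields any, and handle a terminal job state according to fail_on_terminal_state. The sketch uses stubs in place of the async Dataflow client and an abbreviated, assumed terminal-state set.

```python
import asyncio

TERMINAL_STATES = {"JOB_STATE_DONE", "JOB_STATE_FAILED", "JOB_STATE_CANCELLED"}

async def poll_for_results(get_status, get_results,
                           fail_on_terminal_state=True, poll_sleep=0.01):
    """Poll until results arrive, or fail on a terminal job state."""
    while True:
        status = await get_status()
        if fail_on_terminal_state and status in TERMINAL_STATES:
            return {"status": "error", "message": f"Job is in {status} state"}
        results = await get_results()
        if results:
            return {"status": "success", "result": results}
        # No results yet: sleep and poll again.
        await asyncio.sleep(poll_sleep)

async def running_status():  # stub: job stays in a running state
    return "JOB_STATE_RUNNING"

# Stub: the metrics endpoint returns nothing first, then one metric.
metric_batches = iter([[], [{"name": "ElementCount", "scalar": 42}]])

async def fake_metrics():
    return next(metric_batches)

event = asyncio.run(poll_for_results(running_status, fake_metrics))
print(event["status"])  # success
```

The real triggers yield TriggerEvents from an async generator instead of returning, and serialize the client responses before emitting them, but the branching is the same.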

async get_job_metrics()[source]

Wait for the Dataflow client response and then return it in a serialized list.

async_hook()[source]

class airflow.providers.google.cloud.triggers.dataflow.DataflowJobAutoScalingEventTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None, fail_on_terminal_state=True)[source]

Bases: airflow.triggers.base.BaseTrigger

Trigger that checks for autoscaling events associated with a Dataflow job.

Parameters
  • job_id (str) – Required. ID of the job.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – Optional. The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Time (seconds) to wait between two consecutive calls to check the job.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • fail_on_terminal_state (bool) – If set to True, the trigger will yield a TriggerEvent with error status if the job reaches a terminal state.

serialize()[source]

Serialize class arguments and classpath.

async run()[source]

Loop until a terminal job status or any autoscaling events are returned.

Yields a TriggerEvent with success status if the client returns any autoscaling events and the fail_on_terminal_state attribute is False.

Yields a TriggerEvent with error status if the client returns a job status with a terminal state value and the fail_on_terminal_state attribute is True.

Yields a TriggerEvent with error status if any exception is raised while looping.

In any other case, the trigger waits for the amount of time stored in the self.poll_sleep variable before checking again.

async list_job_autoscaling_events()[source]

Wait for the Dataflow client response and then return it in a serialized list.

async_hook()[source]

class airflow.providers.google.cloud.triggers.dataflow.DataflowJobMessagesTrigger(job_id, project_id, location=DEFAULT_DATAFLOW_LOCATION, gcp_conn_id='google_cloud_default', poll_sleep=10, impersonation_chain=None, fail_on_terminal_state=True)[source]

Bases: airflow.triggers.base.BaseTrigger

Trigger that checks for job messages associated with a Dataflow job.

Parameters
  • job_id (str) – Required. ID of the job.

  • project_id (str | None) – Required. The Google Cloud project ID in which the job was started.

  • location (str) – Optional. The location where the job is executed. If set to None then the value of DEFAULT_DATAFLOW_LOCATION will be used.

  • gcp_conn_id (str) – The connection ID to use for connecting to Google Cloud.

  • poll_sleep (int) – Time (seconds) to wait between two consecutive calls to check the job.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional. Service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • fail_on_terminal_state (bool) – If set to True, the trigger will yield a TriggerEvent with error status if the job reaches a terminal state.

serialize()[source]

Serialize class arguments and classpath.

async run()[source]

Loop until a terminal job status or any job messages are returned.

Yields a TriggerEvent with success status if the client returns any job messages and the fail_on_terminal_state attribute is False.

Yields a TriggerEvent with error status if the client returns a job status with a terminal state value and the fail_on_terminal_state attribute is True.

Yields a TriggerEvent with error status if any exception is raised while looping.

In any other case, the trigger waits for the amount of time stored in the self.poll_sleep variable before checking again.

async list_job_messages()[source]

Wait for the Dataflow client response and then return it in a serialized list.

async_hook()[source]
