tests.system.google.cloud.dataproc.example_dataproc_spark

Example Airflow DAG for DataprocSubmitJobOperator with a Spark job.

Attributes

ENV_ID, DAG_ID, PROJECT_ID, CLUSTER_NAME_BASE, CLUSTER_NAME_FULL, CLUSTER_NAME, REGION, CLUSTER_CONFIG, SPARK_JOB, create_cluster, test_run

Module Contents

tests.system.google.cloud.dataproc.example_dataproc_spark.ENV_ID
tests.system.google.cloud.dataproc.example_dataproc_spark.DAG_ID = 'dataproc_spark'
tests.system.google.cloud.dataproc.example_dataproc_spark.PROJECT_ID
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME_BASE = ''
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME_FULL = ''
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_NAME = ''
tests.system.google.cloud.dataproc.example_dataproc_spark.REGION = 'europe-west1'
tests.system.google.cloud.dataproc.example_dataproc_spark.CLUSTER_CONFIG
tests.system.google.cloud.dataproc.example_dataproc_spark.SPARK_JOB
tests.system.google.cloud.dataproc.example_dataproc_spark.create_cluster
tests.system.google.cloud.dataproc.example_dataproc_spark.test_run
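The listing above gives only attribute names and the few literal values the page exposes (DAG_ID, REGION). As a rough sketch of what CLUSTER_CONFIG and SPARK_JOB typically contain in Airflow's Dataproc system tests, the plain-Python payloads below follow the Dataproc API shapes for a cluster config and a Spark job; the machine types, disk sizes, environment-variable names, and the CLUSTER_NAME construction are illustrative assumptions, not the module's exact source.

```python
import os

# Environment-derived values; the variable names are assumed from Airflow
# system-test conventions, with fallbacks so the sketch runs standalone.
ENV_ID = os.environ.get("SYSTEMS_TEST_ENV_ID", "example-env")
DAG_ID = "dataproc_spark"  # matches DAG_ID in the listing above
PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT", "example-project")
REGION = "europe-west1"    # matches REGION in the listing above

# Dataproc cluster names must not contain underscores, hence the replace().
CLUSTER_NAME = f"cluster-{ENV_ID}-{DAG_ID}".replace("_", "-")

# Minimal Dataproc cluster config: one master, two workers
# (machine types and disk sizes are illustrative).
CLUSTER_CONFIG = {
    "master_config": {
        "num_instances": 1,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
    "worker_config": {
        "num_instances": 2,
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 32},
    },
}

# Spark job payload in the shape DataprocSubmitJobOperator expects:
# run the SparkPi example class from the jar shipped on Dataproc images.
SPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "spark_job": {
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
        "main_class": "org.apache.spark.examples.SparkPi",
    },
}
```

In the DAG itself, CLUSTER_CONFIG would be handed to a cluster-creation task (the `create_cluster` attribute above) and SPARK_JOB to `DataprocSubmitJobOperator(job=SPARK_JOB, region=REGION, project_id=PROJECT_ID)`, with the tasks chained so the job runs only after the cluster exists.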
