Package apache-airflow-providers-apache-spark

Apache Spark

Release: 2.0.0

Provider package

This is the provider package for the apache.spark provider. All classes for this provider are in the airflow.providers.apache.spark Python package.
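As a quick sketch of what the package layout above implies, the snippet below imports two of the provider's classes from their module paths under airflow.providers.apache.spark. The import is guarded so the snippet also runs in an environment where Airflow or the provider is not installed; the class and module names are taken from the provider package itself.

```python
# Hedged sketch: the provider's classes live under the
# airflow.providers.apache.spark package. Guard the imports so this
# snippet degrades gracefully where the provider is not installed.
try:
    from airflow.providers.apache.spark.operators.spark_submit import (
        SparkSubmitOperator,
    )
    from airflow.providers.apache.spark.hooks.spark_sql import SparkSqlHook

    HAS_PROVIDER = True
except ImportError:
    # Airflow or apache-airflow-providers-apache-spark is not installed.
    HAS_PROVIDER = False

print(HAS_PROVIDER)
```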

Installation

You can install this package on top of an existing Airflow 2.1+ installation via pip install apache-airflow-providers-apache-spark.
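The installation step above is just the pip command; a pinned variant, matching the release documented on this page, is shown as well for reproducible environments.

```shell
# Install the provider on top of an existing Airflow 2.1+ installation.
pip install apache-airflow-providers-apache-spark

# Optionally pin the release documented on this page:
pip install "apache-airflow-providers-apache-spark==2.0.0"
```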

PIP requirements

PIP package        Version required
-----------------  ----------------
apache-airflow     >=2.1.0
pyspark

Changelog

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

Warning

Due to the removal of the apply_default decorator, this version of the provider requires Airflow 2.1.0+. If your Airflow version is earlier than 2.1.0 and you want to install this provider version, first upgrade Airflow to at least version 2.1.0. Otherwise, your Airflow package will be upgraded automatically and you will have to run airflow db upgrade manually to complete the migration.

Bug fixes

  • Make SparkSqlHook use Connection (#15794)

1.0.3

Bug fixes

  • Fix 'logging.exception' redundancy (#14823)

1.0.2

Bug fixes

  • Use apache.spark provider without kubernetes (#14187)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.