
Clarify policy for supported Spark versions #1579

@andygrove

Description

What is the problem the feature request solves?

Our documentation currently states that we support all Spark 3.4.x and 3.5.x versions. However, we only test against a single 3.4.x version and a single 3.5.x version in CI, so this claim is misleading and causes confusion among users. There is also no guarantee that we can support new Spark patch releases without additional development work.

I want to update the documentation to list the specific minor releases we support and ensure we have some CI testing for those versions. The level of support and testing could vary by version. For example, we could support Spark 3.5.0 through 3.5.4 for development/testing purposes with minimal testing in CI, and recommend 3.5.5 for production use with full testing in CI.

Additionally, we should state the general policy, such as "support the most recent N patch releases for 3.5.x".
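A policy like "support the most recent N patch releases" could be expressed mechanically. The sketch below is purely illustrative (the function name, input format, and N are assumptions, not anything the project has defined): given the list of published patch releases for a minor line, it selects the newest N as the supported set.

```python
def supported_patches(releases, minor_prefix, n):
    """Hypothetical helper: return the most recent n patch releases
    of the given minor line (e.g. "3.5") from a list of versions."""
    # Keep only versions in the requested minor line, e.g. "3.5.x".
    line = [v for v in releases if v.startswith(minor_prefix + ".")]
    # Sort numerically by (major, minor, patch) so "3.5.10" > "3.5.9".
    line.sort(key=lambda v: tuple(int(p) for p in v.split(".")))
    # The supported set is the last n entries (the newest patches).
    return line[-n:]

# Example with the Spark 3.5.x releases mentioned above and N = 3.
releases = ["3.4.2", "3.4.3", "3.5.0", "3.5.1", "3.5.2", "3.5.3", "3.5.4", "3.5.5"]
print(supported_patches(releases, "3.5", 3))  # ['3.5.3', '3.5.4', '3.5.5']
```

Under such a policy, CI could run the full test suite against the newest patch (the recommended production version) and a reduced suite against the older supported patches.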

Describe the potential solution

No response

Additional context

No response

Labels: enhancement (New feature or request)
