Merge pull request #8253 from pxLi/23041-merge-to-main
Merge branch-23.04 to main [skip ci]
pxLi committed May 9, 2023
2 parents d5acb6b + d325540 commit 630e79f
Showing 29 changed files with 75 additions and 65 deletions.
9 changes: 7 additions & 2 deletions CHANGELOG.md
@@ -1,14 +1,16 @@
# Change log
Generated on 2023-04-18
Generated on 2023-05-09

## Release 23.04

### Features
|||
|:---|:---|
|[#7992](https://github.com/NVIDIA/spark-rapids/issues/7992)|[Audit][SPARK-40819][SQL][3.3] Timestamp nanos behaviour regression (parquet reader)|
|[#7985](https://github.com/NVIDIA/spark-rapids/issues/7985)|[FEA] Expose Alluxio master URL to support K8s Env|
|[#7880](https://github.com/NVIDIA/spark-rapids/issues/7880)|[FEA] retry framework task level metrics|
|[#7394](https://github.com/NVIDIA/spark-rapids/issues/7394)|[FEA] Support Delta Lake auto compaction|
|[#7920](https://github.com/NVIDIA/spark-rapids/issues/7920)|[FEA] Remove SpillCallback and executor level spill metrics|
|[#7463](https://github.com/NVIDIA/spark-rapids/issues/7463)|[FEA] Drop support for Databricks-9.1 ML LTS|
|[#7253](https://github.com/NVIDIA/spark-rapids/issues/7253)|[FEA] Implement OOM retry framework|
|[#7042](https://github.com/NVIDIA/spark-rapids/issues/7042)|[FEA] Add support in the tools event parsing for ML functions, libraries, and expressions|
@@ -83,6 +85,10 @@ Generated on 2023-04-18
### PRs
|||
|:---|:---|
|[#8247](https://github.com/NVIDIA/spark-rapids/pull/8247)|Bump up plugin version to 23.04.1-SNAPSHOT|
|[#8248](https://github.com/NVIDIA/spark-rapids/pull/8248)|[Doc] update versions for 2304 hot fix [skip ci]|
|[#8246](https://github.com/NVIDIA/spark-rapids/pull/8246)|Cherry-pick hotfix: Use weak keys in executor broadcast plan cache|
|[#8092](https://github.com/NVIDIA/spark-rapids/pull/8092)|Init changelog for 23.04 [skip ci]|
|[#8109](https://github.com/NVIDIA/spark-rapids/pull/8109)|Bump up JNI and private version to released 23.04.0|
|[#7939](https://github.com/NVIDIA/spark-rapids/pull/7939)|[Doc]update download docs for 2304 version[skip ci]|
|[#8127](https://github.com/NVIDIA/spark-rapids/pull/8127)|Avoid SQL result check of Delta Lake full delete on Databricks|
@@ -228,7 +234,6 @@ Generated on 2023-04-18
|[#6698](https://github.com/NVIDIA/spark-rapids/issues/6698)|[FEA] Support json_tuple|
|[#6885](https://github.com/NVIDIA/spark-rapids/issues/6885)|[FEA] Support reverse|
|[#6879](https://github.com/NVIDIA/spark-rapids/issues/6879)|[FEA] Support Databricks 11.3 ML LTS|
|[#5618](https://github.com/NVIDIA/spark-rapids/issues/5618)|Qualification tool use expressions parsed in duration and speedup factors|

### Performance
|||
2 changes: 1 addition & 1 deletion README.md
@@ -73,7 +73,7 @@ as a `provided` dependency.
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark_2.12</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<scope>provided</scope>
</dependency>
```
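
Because the artifact above is declared with `provided` scope, the jar still has to be supplied to Spark at runtime. A minimal sketch of doing so at submit time (the jar path, application class, and application jar below are placeholders, not part of the original docs):

```shell
# Sketch only: supply the provided-scope RAPIDS jar when submitting an application.
# /opt/sparkRapidsPlugin, com.example.MyApp, and my-app.jar are placeholders.
${SPARK_HOME}/bin/spark-submit \
  --jars /opt/sparkRapidsPlugin/rapids-4-spark_2.12-23.04.1-cuda11.jar \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --class com.example.MyApp \
  my-app.jar
```
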
4 changes: 2 additions & 2 deletions aggregator/pom.xml
@@ -22,12 +22,12 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
</parent>
<artifactId>rapids-4-spark-aggregator_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Aggregator</name>
<description>Creates an aggregated shaded package of the RAPIDS plugin for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<properties>
<!--
4 changes: 2 additions & 2 deletions api_validation/pom.xml
@@ -22,10 +22,10 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
</parent>
<artifactId>rapids-4-spark-api-validation</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>

<profiles>
<profile>
4 changes: 2 additions & 2 deletions delta-lake/delta-20x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-20x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.0.x Support</name>
<description>Delta Lake 2.0.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions delta-lake/delta-21x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-21x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.1.x Support</name>
<description>Delta Lake 2.1.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions delta-lake/delta-22x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-22x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.2.x Support</name>
<description>Delta Lake 2.2.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark321db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark321db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 10.4 Delta Lake Support</name>
<description>Databricks 10.4 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark330db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark330db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 11.3 Delta Lake Support</name>
<description>Databricks 11.3 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions delta-lake/delta-stub/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-stub_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake Stub</name>
<description>Delta Lake stub for the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
4 changes: 2 additions & 2 deletions dist/pom.xml
@@ -22,12 +22,12 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
</parent>
<artifactId>rapids-4-spark_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Distribution</name>
<description>Creates the distribution package of the RAPIDS plugin for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
2 changes: 1 addition & 1 deletion docs/configs.md
@@ -10,7 +10,7 @@ The following is the list of options that `rapids-plugin-4-spark` supports.
On startup use: `--conf [conf key]=[conf value]`. For example:

```
${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.04.0-cuda11.jar \
${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.04.1-cuda11.jar \
--conf spark.plugins=com.nvidia.spark.SQLPlugin \
--conf spark.rapids.sql.concurrentGpuTasks=2
```
2 changes: 1 addition & 1 deletion docs/demo/Databricks/generate-init-script.ipynb
@@ -3,7 +3,7 @@
{
"cell_type":"code",
"source":[
"dbutils.fs.mkdirs(\"dbfs:/databricks/init_scripts/\")\n \ndbutils.fs.put(\"/databricks/init_scripts/init.sh\",\"\"\"\n#!/bin/bash\nsudo wget -O /databricks/jars/rapids-4-spark_2.12-23.04.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.0/rapids-4-spark_2.12-23.04.0.jar\n\"\"\", True)"
"dbutils.fs.mkdirs(\"dbfs:/databricks/init_scripts/\")\n \ndbutils.fs.put(\"/databricks/init_scripts/init.sh\",\"\"\"\n#!/bin/bash\nsudo wget -O /databricks/jars/rapids-4-spark_2.12-23.04.1.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar\n\"\"\", True)"
],
"metadata":{

12 changes: 6 additions & 6 deletions docs/dev/shims.md
@@ -68,17 +68,17 @@ Using JarURLConnection URLs we create a Parallel World of the current version wi
Spark 3.0.2's URLs:

```text
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/spark302/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/spark302/
```

Spark 3.2.0's URLs :

```text
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.0.jar!/spark320/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.04.1.jar!/spark320/
```
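
A quick way to see these parallel-world directories is simply to list the dist jar's contents; a sketch assuming the JDK `jar` tool is on the `PATH` and the jar sits in the current directory:

```shell
# Sketch: list the per-Spark-version "parallel world" directories packed into the dist jar.
jar tf rapids-4-spark_2.12-23.04.1.jar | grep -E '^spark(3xx-common|[0-9]{3})/' | cut -d/ -f1 | sort -u
```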

### Late Inheritance in Public Classes
13 changes: 7 additions & 6 deletions docs/download.md
@@ -18,7 +18,7 @@ cuDF jar, that is either preinstalled in the Spark classpath on all nodes or sub
that uses the RAPIDS Accelerator For Apache Spark. See the [getting-started
guide](https://nvidia.github.io/spark-rapids/Getting-Started/) for more details.

## Release v23.04.0
## Release v23.04.1
Hardware Requirements:

The plugin is tested on the following architectures:
@@ -41,9 +41,9 @@ for your hardware's minimum driver version.
*For Cloudera and EMR support, please refer to the
[Distributions](./FAQ.md#which-distributions-are-supported) section of the FAQ.

### Download v23.04.0
### Download v23.04.1
* Download the [RAPIDS
Accelerator for Apache Spark 23.04.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.0/rapids-4-spark_2.12-23.04.0.jar)
Accelerator for Apache Spark 23.04.1 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar)

This package is built against CUDA 11.8 and all CUDA 11.x versions are supported through [CUDA forward
compatibility](https://docs.nvidia.com/deploy/cuda-compatibility/index.html). It is tested
@@ -52,17 +52,18 @@ do not have CUDA forward compatibility (for example, GeForce), CUDA 11.5 or late
need to ensure the minimum driver (450.80.02) and CUDA toolkit are installed on each Spark node.
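
A quick per-node sanity check can confirm the driver meets that minimum; a minimal sketch using standard NVIDIA tooling:

```shell
# Sketch: the reported driver version should be 450.80.02 or newer.
nvidia-smi --query-gpu=driver_version --format=csv,noheader
```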

### Verify signature
* Download the [RAPIDS Accelerator for Apache Spark 23.04.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.0/rapids-4-spark_2.12-23.04.0.jar)
and [RAPIDS Accelerator for Apache Spark 23.04.0 jars.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.0/rapids-4-spark_2.12-23.04.0.jar.asc)
* Download the [RAPIDS Accelerator for Apache Spark 23.04.1 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar)
and [RAPIDS Accelerator for Apache Spark 23.04.1 jars.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar.asc)
* Download the [PUB_KEY](https://keys.openpgp.org/search?q=sw-spark@nvidia.com).
* Import the public key: `gpg --import PUB_KEY`
* Verify the signature: `gpg --verify rapids-4-spark_2.12-23.04.0.jar.asc rapids-4-spark_2.12-23.04.0.jar`
* Verify the signature: `gpg --verify rapids-4-spark_2.12-23.04.1.jar.asc rapids-4-spark_2.12-23.04.1.jar`

The output of the signature verification:

gpg: Good signature from "NVIDIA Spark (For the signature of spark-rapids release jars) <sw-spark@nvidia.com>"
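
The individual steps above can also be run end to end; a sketch assuming `wget` and `gpg` are available and `PUB_KEY` is the key file downloaded from the link above:

```shell
# Sketch of the full download-and-verify flow for the 23.04.1 release jar.
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.04.1/rapids-4-spark_2.12-23.04.1.jar.asc
gpg --import PUB_KEY
gpg --verify rapids-4-spark_2.12-23.04.1.jar.asc rapids-4-spark_2.12-23.04.1.jar
```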

### Release Notes
The 23.04.1 release patches a possible driver OOM which can occur on an executor broadcast.
New functionality and performance improvements for this release include:
* Introduces OOM retry framework for automatic OOM handling in memory-intensive operators, such as: join, aggregates and windows, coalescing, projections and filters.
* Support dynamic repartitioning in large/skewed hash joins
2 changes: 1 addition & 1 deletion docs/get-started/getting-started-databricks.md
@@ -139,7 +139,7 @@ cluster.
```bash
spark.rapids.sql.python.gpu.enabled true
spark.python.daemon.module rapids.daemon_databricks
spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-23.04.0.jar:/databricks/spark/python
spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-23.04.1.jar:/databricks/spark/python
```
Note that the python memory pool requires the cudf library, so you need to install cudf on each worker node (`pip install cudf-cu11 --extra-index-url=https://pypi.nvidia.com`) or disable the python memory pool
4 changes: 2 additions & 2 deletions docs/get-started/getting-started-on-prem.md
@@ -53,13 +53,13 @@ CUDA and will not run on other versions. The jars use a classifier to keep them
- CUDA 11.x => classifier cuda11

For example, here is a sample version of the jar with CUDA 11.x support:
- rapids-4-spark_2.12-23.04.0-cuda11.jar
- rapids-4-spark_2.12-23.04.1-cuda11.jar

For simplicity export the location to this jar. This example assumes the sample jar above has
been placed in the `/opt/sparkRapidsPlugin` directory:
```shell
export SPARK_RAPIDS_DIR=/opt/sparkRapidsPlugin
export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-23.04.0-cuda11.jar
export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-23.04.1-cuda11.jar
```
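
The exported variable can then be referenced when launching Spark, mirroring the earlier `spark-shell` example; a sketch (the config values are examples only):

```shell
# Sketch: launch spark-shell with the exported RAPIDS jar and enable the plugin.
${SPARK_HOME}/bin/spark-shell --jars ${SPARK_RAPIDS_PLUGIN_JAR} \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.rapids.sql.concurrentGpuTasks=2
```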

## Install the GPU Discovery Script
6 changes: 3 additions & 3 deletions integration_tests/README.md
@@ -250,7 +250,7 @@ individually, so you don't risk running unit tests along with the integration te
http://www.scalatest.org/user_guide/using_the_scalatest_shell

```shell
spark-shell --jars rapids-4-spark-tests_2.12-23.04.0-tests.jar,rapids-4-spark-integration-tests_2.12-23.04.0-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
spark-shell --jars rapids-4-spark-tests_2.12-23.04.1-tests.jar,rapids-4-spark-integration-tests_2.12-23.04.1-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
```

First you import the `scalatest_shell` and tell the tests where they can find the test files you
@@ -273,7 +273,7 @@ If you just want to verify the SQL replacement is working you will need to add t
assumes CUDA 11.0 is being used.

```
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.04.0-cuda11.jar" ./runtests.py
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.04.1-cuda11.jar" ./runtests.py
```

You don't have to enable the plugin for this to work, the test framework will do that for you.
@@ -372,7 +372,7 @@ To run cudf_udf tests, need following configuration changes:
As an example, here is the `spark-submit` command with the cudf_udf parameter on CUDA 11.0:

```
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.04.0-cuda11.jar,rapids-4-spark-tests_2.12-23.04.0.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.04.0-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.04.0-cuda11.jar" ./runtests.py --cudf_udf
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.04.1-cuda11.jar,rapids-4-spark-tests_2.12-23.04.1.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.04.1-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.04.1-cuda11.jar" ./runtests.py --cudf_udf
```

### Enabling fuzz tests
4 changes: 2 additions & 2 deletions integration_tests/pom.xml
@@ -22,10 +22,10 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
</parent>
<artifactId>rapids-4-spark-integration-tests_2.12</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
<properties>
<target.classifier/>
</properties>
2 changes: 1 addition & 1 deletion jenkins/databricks/create.py
@@ -27,7 +27,7 @@ def main():
workspace = 'https://dbc-9ff9942e-a9c4.cloud.databricks.com'
token = ''
sshkey = ''
cluster_name = 'CI-GPU-databricks-23.04.0'
cluster_name = 'CI-GPU-databricks-23.04.1'
idletime = 240
runtime = '7.0.x-gpu-ml-scala2.12'
num_workers = 1
4 changes: 2 additions & 2 deletions jenkins/version-def.sh
@@ -28,8 +28,8 @@ IFS=$PRE_IFS

CUDF_VER=${CUDF_VER:-"23.04.0"}
CUDA_CLASSIFIER=${CUDA_CLASSIFIER:-"cuda11"}
PROJECT_VER=${PROJECT_VER:-"23.04.0"}
PROJECT_TEST_VER=${PROJECT_TEST_VER:-"23.04.0"}
PROJECT_VER=${PROJECT_VER:-"23.04.1"}
PROJECT_TEST_VER=${PROJECT_TEST_VER:-"23.04.1"}
SPARK_VER=${SPARK_VER:-"3.1.1"}
# Make a best attempt to set the default value for the shuffle shim.
# Note that SPARK_VER for non-Apache Spark flavors (i.e. databricks,
2 changes: 1 addition & 1 deletion pom.xml
@@ -23,7 +23,7 @@
<artifactId>rapids-4-spark-parent</artifactId>
<name>RAPIDS Accelerator for Apache Spark Root Project</name>
<description>The root project of the RAPIDS Accelerator for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>
<packaging>pom</packaging>

<url>https://nvidia.github.io/spark-rapids/</url>
4 changes: 2 additions & 2 deletions shuffle-plugin/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.04.0</version>
<version>23.04.1</version>
</parent>

<artifactId>rapids-4-spark-shuffle_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Shuffle Plugin</name>
<description>Accelerated shuffle plugin for the RAPIDS plugin for Apache Spark</description>
<version>23.04.0</version>
<version>23.04.1</version>

<dependencies>
<dependency>
