Commit 5fc65ab
chore: bump version numbers (#1203)
mhamilton723 authored Oct 12, 2021
1 parent 993da81 commit 5fc65ab
Showing 6 changed files with 77 additions and 77 deletions.
README.md (13 additions, 13 deletions)

@@ -4,9 +4,9 @@

[![Build Status](https://msdata.visualstudio.com/A365/_apis/build/status/microsoft.SynapseML?branchName=master)](https://msdata.visualstudio.com/A365/_build/latest?definitionId=17563&branchName=master) [![codecov](https://codecov.io/gh/Azure/mmlspark/branch/master/graph/badge.svg)](https://codecov.io/gh/Azure/mmlspark) [![Gitter](https://badges.gitter.im/Microsoft/MMLSpark.svg)](https://gitter.im/Microsoft/MMLSpark?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)

-[![Release Notes](https://img.shields.io/badge/release-notes-blue)](https://github.com/Azure/mmlspark/releases) [![Scala Docs](https://img.shields.io/static/v1?label=api%20docs&message=scala&color=blue&logo=scala)](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc3/scala/index.html#package) [![PySpark Docs](https://img.shields.io/static/v1?label=api%20docs&message=python&color=blue&logo=python)](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc3/pyspark/index.html) [![Academic Paper](https://img.shields.io/badge/academic-paper-7fdcf7)](https://arxiv.org/abs/1810.08744)
+[![Release Notes](https://img.shields.io/badge/release-notes-blue)](https://github.com/Azure/mmlspark/releases) [![Scala Docs](https://img.shields.io/static/v1?label=api%20docs&message=scala&color=blue&logo=scala)](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc4/scala/index.html#package) [![PySpark Docs](https://img.shields.io/static/v1?label=api%20docs&message=python&color=blue&logo=python)](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc4/pyspark/index.html) [![Academic Paper](https://img.shields.io/badge/academic-paper-7fdcf7)](https://arxiv.org/abs/1810.08744)

-[![Version](https://img.shields.io/badge/version-1.0.0--rc2-blue)](https://github.com/Azure/mmlspark/releases) [![Snapshot Version](https://mmlspark.blob.core.windows.net/icons/badges/master_version3.svg)](#sbt)
+[![Version](https://img.shields.io/badge/version-1.0.0--rc4-blue)](https://github.com/Azure/mmlspark/releases) [![Snapshot Version](https://mmlspark.blob.core.windows.net/icons/badges/master_version3.svg)](#sbt)


MMLSpark is an ecosystem of tools aimed towards expanding the distributed computing framework
@@ -22,10 +22,10 @@
can embed **any** web service into their SparkML models. In this vein, MMLSpark
SparkML transformers for a wide variety of [Microsoft Cognitive Services](https://azure.microsoft.com/en-us/services/cognitive-services/). For production grade deployment, the Spark Serving project enables high throughput,
sub-millisecond latency web services, backed by your Spark cluster.

-MMLSpark requires Scala 2.11, Spark 2.4+, and Python 3.5+.
+MMLSpark requires Scala 2.12, Spark 3.0+, and Python 3.6+.
See the API documentation [for
-Scala](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc3/scala/index.html#package) and [for
-PySpark](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc3/pyspark/index.html).
+Scala](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc4/scala/index.html#package) and [for
+PySpark](https://mmlspark.blob.core.windows.net/docs/1.0.0-rc4/pyspark/index.html).
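A quick way to check the Python and Spark prerequisites locally (a sketch; it assumes `python3` and `spark-submit` are on your PATH):

```bash
# Both commands print version information; expect Python 3.6+ and Spark 3.0+.
python3 --version
spark-submit --version
```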

<details>
<summary><strong><em>Table of Contents</em></strong></summary>
@@ -149,7 +149,7 @@
the above example, or from python:
```python
import pyspark
spark = pyspark.sql.SparkSession.builder.appName("MyApp") \
-    .config("spark.jars.packages", "com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3") \
+    .config("spark.jars.packages", "com.microsoft.ml.spark:mmlspark:1.0.0-rc4") \
    .config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven") \
    .getOrCreate()
import mmlspark
```

@@ -162,7 +162,7 @@
your `build.sbt`:

```scala
resolvers += "MMLSpark" at "https://mmlspark.azureedge.net/maven"
-libraryDependencies += "com.microsoft.ml.spark" %% "mmlspark" % "1.0.0-rc3"
+libraryDependencies += "com.microsoft.ml.spark" %% "mmlspark" % "1.0.0-rc4"

```
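After adding those lines, a sketch of verifying that the dependency resolves (assuming `sbt` is installed and you run this from the project root):

```bash
# Resolves the mmlspark dependency declared above from the custom resolver.
sbt update
```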

@@ -172,9 +172,9 @@
MMLSpark can be conveniently installed on existing Spark clusters via the
`--packages` option, examples:

```bash
-spark-shell --packages com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3
-pyspark --packages com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3
-spark-submit --packages com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3 MyApp.jar
+spark-shell --packages com.microsoft.ml.spark:mmlspark:1.0.0-rc4
+pyspark --packages com.microsoft.ml.spark:mmlspark:1.0.0-rc4
+spark-submit --packages com.microsoft.ml.spark:mmlspark:1.0.0-rc4 MyApp.jar
```
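If the `spark.jars.repositories` setting is not already configured, the custom resolver can be passed on the command line as well; a sketch for the shell case:

```bash
# --repositories adds the MMLSpark Maven resolver for the --packages lookup.
spark-shell --packages com.microsoft.ml.spark:mmlspark:1.0.0-rc4 \
            --repositories https://mmlspark.azureedge.net/maven
```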

This can be used in other Spark contexts too. For example, you can use MMLSpark
@@ -189,15 +189,15 @@
cloud](http://community.cloud.databricks.com), create a new [library from Maven
coordinates](https://docs.databricks.com/user-guide/libraries.html#libraries-from-maven-pypi-or-spark-packages)
in your workspace.

-For the coordinates use: `com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3`
+For the coordinates use: `com.microsoft.ml.spark:mmlspark:1.0.0-rc4`
with the resolver: `https://mmlspark.azureedge.net/maven`. Ensure this library is
attached to your target cluster(s).
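The same attachment can be scripted; a sketch using the Databricks CLI, where `<cluster-id>` is a placeholder for your target cluster:

```bash
# Assumes the databricks CLI is installed and configured for your workspace.
databricks libraries install \
  --cluster-id <cluster-id> \
  --maven-coordinates com.microsoft.ml.spark:mmlspark:1.0.0-rc4 \
  --maven-repo https://mmlspark.azureedge.net/maven
```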

Finally, ensure that your Spark cluster has at least Spark 3.0 and Scala 2.12.

You can use MMLSpark in both your Scala and PySpark notebooks. To get started with our example notebooks, import the following Databricks archive:

-`https://mmlspark.blob.core.windows.net/dbcs/MMLSparkExamplesv1.0.0-rc3.dbc`
+`https://mmlspark.blob.core.windows.net/dbcs/MMLSparkExamplesv1.0.0-rc4.dbc`
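If you prefer not to use the workspace UI, a sketch of importing that archive with the Databricks CLI (the destination path is hypothetical):

```bash
# Download the archive, then import it as DBC into your workspace.
wget https://mmlspark.blob.core.windows.net/dbcs/MMLSparkExamplesv1.0.0-rc4.dbc
databricks workspace import --format DBC \
  MMLSparkExamplesv1.0.0-rc4.dbc /Users/you@example.com/MMLSparkExamples
```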

### Apache Livy and HDInsight

@@ -210,7 +210,7 @@
Excluding certain packages from the library may be necessary due to current issues:
```json
{
  "name": "mmlspark",
  "conf": {
-    "spark.jars.packages": "com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3",
+    "spark.jars.packages": "com.microsoft.ml.spark:mmlspark:1.0.0-rc4",
    "spark.jars.repositories": "https://mmlspark.azureedge.net/maven",
    "spark.jars.excludes": "org.scala-lang:scala-reflect,org.apache.spark:spark-tags_2.11,org.scalactic:scalactic_2.11,org.scalatest:scalatest_2.11"
  }
}
```
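For reference, a sketch of creating a session with that configuration through Livy's REST API (the endpoint and file name are assumptions; adjust for your cluster):

```bash
# POST the session config above (saved as session.json) to a Livy endpoint.
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d @session.json \
  http://localhost:8998/sessions
```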
docs/R-setup.md (3 additions, 3 deletions)

@@ -10,7 +10,7 @@
To install the current MMLSpark package for R, use:

```R
...
-devtools::install_url("https://mmlspark.azureedge.net/rrr/mmlspark-1.0.0-rc3.zip")
+devtools::install_url("https://mmlspark.azureedge.net/rrr/mmlspark-1.0.0-rc4.zip")
...
```

@@ -23,7 +23,7 @@
It will take some time to install all dependencies. Then, run:
library(sparklyr)
library(dplyr)
config <- spark_config()
-config$sparklyr.defaultPackages <- "com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3"
+config$sparklyr.defaultPackages <- "com.microsoft.ml.spark:mmlspark:1.0.0-rc4"
sc <- spark_connect(master = "local", config = config)
...
```
@@ -83,7 +83,7 @@
and then use spark_connect with method = "databricks":

```R
install.packages("devtools")
-devtools::install_url("https://mmlspark.azureedge.net/rrr/mmlspark-1.0.0-rc3.zip")
+devtools::install_url("https://mmlspark.azureedge.net/rrr/mmlspark-1.0.0-rc4.zip")
library(sparklyr)
library(dplyr)
sc <- spark_connect(method = "databricks")
...
```
