
Commit

Merge pull request #52 from mlverse/updates
Updates
edgararuiz authored Oct 18, 2023
2 parents 7812e2a + b648b47 commit beb4b31
Showing 3 changed files with 7 additions and 6 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/test-coverage.yaml
@@ -2,9 +2,9 @@
 # Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
 on:
   push:
-    branches: main
+    branches: temp
   pull_request:
-    branches: main
+    branches: temp
 
 name: test-coverage
 
4 changes: 3 additions & 1 deletion DESCRIPTION
@@ -22,7 +22,7 @@ Imports:
     reticulate (>= 1.31),
     methods,
     rlang,
-    sparklyr (>= 1.8.3),
+    sparklyr (>= 1.8.3.9001),
     tidyselect,
     fs,
     magrittr,
@@ -40,3 +40,5 @@ Suggests:
     tibble,
     withr
 Config/testthat/edition: 3
+Remotes:
+    sparklyr/sparklyr
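
The new Remotes field is what makes the raised floor installable: sparklyr 1.8.3.9001 is a development version, so installers that honor Remotes (remotes, pak) will fetch sparklyr/sparklyr from GitHub rather than CRAN. A minimal sketch of the install-time effect; the version check is illustrative, not part of this commit:

    # Install this package's dependencies; the Remotes: field makes
    # remotes resolve "sparklyr/sparklyr" to github.com/sparklyr/sparklyr.
    remotes::install_deps(dependencies = TRUE)

    # Illustrative check that the development floor is now met:
    stopifnot(utils::packageVersion("sparklyr") >= "1.8.3.9001")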
5 changes: 2 additions & 3 deletions R/spark-connect.R
@@ -7,7 +7,7 @@ spark_connect_method.spark_method_spark_connect <- function(
     config,
     app_name,
     version,
-    hadoop_version,
+    packages,
     extensions,
     scala_version,
     ...) {
@@ -29,14 +29,13 @@ spark_connect_method.spark_method_databricks_connect <- function(
     config,
     app_name,
     version,
-    hadoop_version,
+    packages,
     extensions,
     scala_version,
     ...) {
   py_spark_connect(master = master, method = method, config = config, ...)
 }
-
 
 py_spark_connect <- function(master,
                              token = Sys.getenv("DATABRICKS_TOKEN"),
                              cluster_id = NULL,
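
For context, spark_connect() reaches these S3 methods through sparklyr's connection dispatch; after this change the methods accept a packages argument where hadoop_version used to be (the databricks method shown here forwards only master, method, and config to py_spark_connect()). A minimal, hypothetical usage sketch, assuming a local Spark Connect endpoint; the URL and Spark version are placeholders:

    library(sparklyr)
    library(pysparklyr)

    # Dispatches to spark_connect_method.spark_method_spark_connect()
    sc <- spark_connect(
      master  = "sc://localhost:15002",
      method  = "spark_connect",
      version = "3.5"
    )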
