
Commit f815127

zjffdu authored and shivaram committed
[SPARK-12318][SPARKR] Save mode in SparkR should be error by default
shivaram Please help review.

Author: Jeff Zhang <zjffdu@apache.org>

Closes apache#10290 from zjffdu/SPARK-12318.

(cherry picked from commit 2eb5af5)
Signed-off-by: Shivaram Venkataraman <shivaram@cs.berkeley.edu>
1 parent 16edd93 commit f815127

2 files changed (+13 −6 lines)


R/pkg/R/DataFrame.R

Lines changed: 5 additions & 5 deletions

@@ -1925,7 +1925,7 @@ setMethod("except",
 #' @param df A SparkSQL DataFrame
 #' @param path A name for the table
 #' @param source A name for external data source
-#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode
+#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode (it is 'error' by default)
 #'
 #' @family DataFrame functions
 #' @rdname write.df
@@ -1942,7 +1942,7 @@ setMethod("except",
 #' }
 setMethod("write.df",
           signature(df = "DataFrame", path = "character"),
-          function(df, path, source = NULL, mode = "append", ...){
+          function(df, path, source = NULL, mode = "error", ...){
             if (is.null(source)) {
               sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
               source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
@@ -1967,7 +1967,7 @@ setMethod("write.df",
 #' @export
 setMethod("saveDF",
           signature(df = "DataFrame", path = "character"),
-          function(df, path, source = NULL, mode = "append", ...){
+          function(df, path, source = NULL, mode = "error", ...){
             write.df(df, path, source, mode, ...)
           })
 
@@ -1990,7 +1990,7 @@ setMethod("saveDF",
 #' @param df A SparkSQL DataFrame
 #' @param tableName A name for the table
 #' @param source A name for external data source
-#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode
+#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode (it is 'error' by default)
 #'
 #' @family DataFrame functions
 #' @rdname saveAsTable
@@ -2007,7 +2007,7 @@ setMethod("saveDF",
 setMethod("saveAsTable",
           signature(df = "DataFrame", tableName = "character", source = "character",
                     mode = "character"),
-          function(df, tableName, source = NULL, mode="append", ...){
+          function(df, tableName, source = NULL, mode="error", ...){
             if (is.null(source)) {
               sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
               source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
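
To make the effect of this change concrete, here is a minimal SparkR sketch of the new write.df default; the sparkR.init()/sparkRSQL.init() setup and R's built-in faithful dataset are illustrative assumptions from the 1.6-era API, not part of this commit:

library(SparkR)

sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)

# The first write succeeds. With this commit, mode defaults to "error",
# so repeating the call against the same path now fails rather than
# silently appending as it did before.
write.df(df, path = "faithful.parquet", source = "parquet")

# Callers that relied on the old implicit append must now opt in:
write.df(df, path = "faithful.parquet", source = "parquet", mode = "append")

Since saveDF simply forwards to write.df, it picks up the same default.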

docs/sparkr.md

Lines changed: 8 additions & 1 deletion

@@ -148,7 +148,7 @@ printSchema(people)
 </div>
 
 The data sources API can also be used to save out DataFrames into multiple file formats. For example we can save the DataFrame from the previous example
-to a Parquet file using `write.df`
+to a Parquet file using `write.df` (Until Spark 1.6, the default mode for writes was `append`. It was changed in Spark 1.7 to `error` to match the Scala API)
 
 <div data-lang="r" markdown="1">
 {% highlight r %}
@@ -387,3 +387,10 @@ The following functions are masked by the SparkR package:
 Since part of SparkR is modeled on the `dplyr` package, certain functions in SparkR share the same names with those in `dplyr`. Depending on the load order of the two packages, some functions from the package loaded first are masked by those in the package loaded after. In such case, prefix such calls with the package name, for instance, `SparkR::cume_dist(x)` or `dplyr::cume_dist(x)`.
 
 You can inspect the search path in R with [`search()`](https://stat.ethz.ch/R-manual/R-devel/library/base/html/search.html)
+
+
+# Migration Guide
+
+## Upgrading From SparkR 1.6 to 1.7
+
+- Until Spark 1.6, the default mode for writes was `append`. It was changed in Spark 1.7 to `error` to match the Scala API.
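
The migration note above also covers saveAsTable. A hedged sketch (the table name faithful_tbl is hypothetical; source and mode are passed explicitly because the S4 signature declares them as character):

# Under the new default, writing to an existing table with mode = "error"
# stops with an error instead of appending.
saveAsTable(df, tableName = "faithful_tbl", source = "parquet", mode = "error")

# To keep the pre-1.7 behavior, or to replace the table, name the mode:
saveAsTable(df, tableName = "faithful_tbl", source = "parquet", mode = "append")
saveAsTable(df, tableName = "faithful_tbl", source = "parquet", mode = "overwrite")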
