
Commit 43ffa03

[SPARK-12000] Fix API doc generation issues
This pull request fixes multiple issues with API doc generation.

- Modify the Jekyll plugin so that the entire doc build fails if API docs cannot be generated. This will make it easy to detect when the doc build breaks, since this will now trigger Jenkins failures.
- Change how we handle the `-target` compiler option flag in order to fix `javadoc` generation.
- Incorporate doc changes from thunterdb (in apache#10048).

Closes apache#10048.

Author: Josh Rosen <joshrosen@databricks.com>
Author: Timothy Hunter <timhunter@databricks.com>

Closes apache#10049 from JoshRosen/fix-doc-build.

(cherry picked from commit d3ca8cf)
Signed-off-by: Josh Rosen <joshrosen@databricks.com>
1 parent 4361517 commit 43ffa03

File tree

4 files changed: +14 additions, -9 deletions


docs/_plugins/copy_api_dirs.rb

Lines changed: 3 additions & 3 deletions
@@ -27,7 +27,7 @@
   cd("..")

   puts "Running 'build/sbt -Pkinesis-asl clean compile unidoc' from " + pwd + "; this may take a few minutes..."
-  puts `build/sbt -Pkinesis-asl clean compile unidoc`
+  system("build/sbt -Pkinesis-asl clean compile unidoc") || raise("Unidoc generation failed")

   puts "Moving back into docs dir."
   cd("docs")
@@ -117,7 +117,7 @@

   puts "Moving to python/docs directory and building sphinx."
   cd("../python/docs")
-  puts `make html`
+  system("make html") || raise("Python doc generation failed")

   puts "Moving back into home dir."
   cd("../../")
@@ -131,7 +131,7 @@
   # Build SparkR API docs
   puts "Moving to R directory and building roxygen docs."
   cd("R")
-  puts `./create-docs.sh`
+  system("./create-docs.sh") || raise("R doc generation failed")

   puts "Moving back into home dir."
   cd("../")
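The rationale for replacing backticks with `system(...) || raise` can be sketched in plain Ruby (an illustrative snippet, not part of the patch): backticks capture a command's stdout but silently ignore a non-zero exit status, whereas `system` returns `false` on failure, so chaining `|| raise` aborts the build loudly.

```ruby
# Backticks run the command and capture stdout, but a failing exit status
# is only visible via $? and is easy to overlook:
output = `ruby -e "exit 1"`
puts "backticks continued despite exit status #{$?.exitstatus}"

# system(...) returns false when the command exits non-zero, so || raise
# turns a silent doc-generation failure into a hard build failure:
begin
  system("ruby -e 'exit 1'") || raise("doc generation failed")
rescue RuntimeError => e
  puts "raised: #{e.message}"
end
```

This is why, after this commit, a broken API doc build surfaces as a Jenkins failure instead of passing unnoticed.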

network/common/src/main/java/org/apache/spark/network/client/StreamCallback.java

Lines changed: 2 additions & 2 deletions
@@ -21,8 +21,8 @@
 import java.nio.ByteBuffer;

 /**
- * Callback for streaming data. Stream data will be offered to the {@link onData(ByteBuffer)}
- * method as it arrives. Once all the stream data is received, {@link onComplete()} will be
+ * Callback for streaming data. Stream data will be offered to the {@link onData(String, ByteBuffer)}
+ * method as it arrives. Once all the stream data is received, {@link onComplete(String)} will be
  * called.
  * <p>
  * The network library guarantees that a single thread will call these methods at a time, but

network/common/src/main/java/org/apache/spark/network/server/RpcHandler.java

Lines changed: 1 addition & 1 deletion
@@ -55,7 +55,7 @@ public abstract void receive(

   /**
    * Receives an RPC message that does not expect a reply. The default implementation will
-   * call "{@link receive(TransportClient, byte[], RpcResponseCallback}" and log a warning if
+   * call "{@link receive(TransportClient, byte[], RpcResponseCallback)}" and log a warning if
    * any of the callback methods are called.
    *
    * @param client A channel client which enables the handler to make requests back to the sender

project/SparkBuild.scala

Lines changed: 8 additions & 3 deletions
@@ -160,7 +160,12 @@ object SparkBuild extends PomBuild {

     javacOptions in Compile ++= Seq(
       "-encoding", "UTF-8",
-      "-source", javacJVMVersion.value,
+      "-source", javacJVMVersion.value
+    ),
+    // This -target option cannot be set in the Compile configuration scope since `javadoc` doesn't
+    // play nicely with it; see https://github.com/sbt/sbt/issues/355#issuecomment-3817629 for
+    // additional discussion and explanation.
+    javacOptions in (Compile, compile) ++= Seq(
       "-target", javacJVMVersion.value
     ),

@@ -547,9 +552,9 @@ object Unidoc {
     publish := {},

     unidocProjectFilter in(ScalaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn),
+      inAnyProject -- inProjects(OldDeps.project, repl, examples, tools, streamingFlumeSink, yarn, testTags),
     unidocProjectFilter in(JavaUnidoc, unidoc) :=
-      inAnyProject -- inProjects(OldDeps.project, repl, bagel, examples, tools, streamingFlumeSink, yarn),
+      inAnyProject -- inProjects(OldDeps.project, repl, bagel, examples, tools, streamingFlumeSink, yarn, testTags),

     // Skip actual catalyst, but include the subproject.
     // Catalyst is not public API and contains quasiquotes which break scaladoc.
