Commit db00c07

README
1 parent 1115a09 commit db00c07

File tree: 4 files changed (+66, -15 lines)

README.md (28 additions & 13 deletions)
@@ -2,21 +2,42 @@
 
 [If github is too slow to view ,please click me](http://www.jianshu.com/p/3c19f8b9341c)
 
+
+## Declarative workflows for building Spark Streaming
+
+![](https://github.com/allwefantasy/streamingpro/blob/master/images/Snip20160510_4.png)
+[If no picture shows, please click me](http://upload-images.jianshu.io/upload_images/1063603-968e744a1ef2e334.png?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)
+
+
+## Spark Streaming
+
+Spark Streaming is an extension of the core Spark API that enables stream processing
+from a variety of sources.
+Spark is an extensible and programmable framework for massive distributed processing of datasets,
+called Resilient Distributed Datasets (RDDs). Spark Streaming receives input data streams and
+divides the data into batches, which are then processed by the Spark engine to generate the results.
+
+Spark Streaming data is organized into a sequence of DStreams,
+represented internally as a sequence of RDDs.
+
+## StreamingPro
+
 StreamingPro is not a complete
-application, but rather a code library and API that can easily be used
-to build your streaming application which may run on Spark Streaming.
+application, but rather an extensible and programmable framework for Spark Streaming (it also covers plain Spark and Storm)
+that can easily be used to build your streaming application.
 
-StreamingPro also make it possible that all you should do to build streaming program is assembling components(eg. SQL Component) in configuration file.
-Of source , if you are a geek who like do every thing by programing,we also encourage you use API provided
-by StreamingPro which is more easy to use then original API designed by Spark/Storm.
+
+StreamingPro also makes it possible to build a streaming program simply by
+assembling components (e.g. the SQL component) in a configuration file.
 
 ## Features
 
-* Pure spark streaming program (Storm/Spark in future)
+* Pure Spark Streaming (or plain Spark) program (Storm in the future)
 * No need of coding, only declarative workflows
+* Rest API for interaction
 * SQL-Oriented workflows support
 * Data continuously streamed in & processed in near real-time
-* dynamically CURD of workflows at runtime via Rest API (in single instance or multiple instances)
+* Dynamic CRUD of workflows at runtime via Rest API
 * Flexible workflows (input, output, parsers, etc...)
 * High performance
 * Scalable
@@ -28,7 +49,6 @@ by StreamingPro which is more easy to use then original API designed by Spark/St
 * [Build](https://github.com/allwefantasy/streamingpro/wiki/Build)
 * [Run your first application](https://github.com/allwefantasy/streamingpro/wiki/Run-your-first-application)
 * [Submit application](https://github.com/allwefantasy/streamingpro/wiki/Submit-application)
-* [Dynamically add Job via Rest API](https://github.com/allwefantasy/streamingpro/wiki/Dynamically-add-Job-via-Rest-API)
 * [dynamically CURD of workflows at runtime via Rest API](https://github.com/allwefantasy/streamingpro/wiki/How-To-Add-New-Compositor)
 * [Recovery](https://github.com/allwefantasy/streamingpro/wiki/Recovery)
 * [Useful modules introduction](https://github.com/allwefantasy/streamingpro/wiki/Common-compositors-introduction)
@@ -40,11 +60,6 @@ by StreamingPro which is more easy to use then original API designed by Spark/St
 ![](https://github.com/allwefantasy/streamingpro/blob/master/images/Snip20160510_3.png)
 [If no picture show,please click me](http://upload-images.jianshu.io/upload_images/1063603-383c19104e141031.png?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)
 
-### Declarative workflows
-
-![](https://github.com/allwefantasy/streamingpro/blob/master/images/Snip20160510_4.png)
-[If no picture show,please click me](http://upload-images.jianshu.io/upload_images/1063603-968e744a1ef2e334.png?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)
-
 ### Implementation
 
 ![](https://github.com/allwefantasy/streamingpro/blob/master/images/Snip20160510_1.png)
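For readers new to the model the added README text describes, here is a minimal, generic Spark Streaming sketch. It uses the plain Spark API, not StreamingPro; the socket source, port 9999, application name, and 5-second batch interval are arbitrary choices for illustration. Each batch interval of received data becomes one RDD in the DStream, and every batch is processed by the regular Spark engine, which is the DStream-as-a-sequence-of-RDDs picture above.

    // Generic Spark Streaming word count: a DStream is a sequence of RDDs,
    // one per batch interval, processed by the normal Spark engine.
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object DStreamSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[2]").setAppName("dstream-sketch")
        val ssc = new StreamingContext(conf, Seconds(5)) // 5-second micro-batches

        val lines = ssc.socketTextStream("localhost", 9999) // DStream[String]
        lines.flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _) // runs as an ordinary Spark job on each batch's RDD
          .print()

        ssc.start()
        ssc.awaitTermination()
      }
    }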
ParamsHelper.scala, new file under package streaming.common (32 additions & 0 deletions)
@@ -0,0 +1,32 @@
+package streaming.common
+
+import java.util.{Map => JMap}
+
+/**
+  * 5/14/16 WilliamZhu(allwefantasy@gmail.com)
+  */
+object ParamsHelper {
+  implicit def mapToParams(_params: JMap[Any, Any]): Params = {
+    new Params(_params)
+  }
+}
+
+class Params(_params: JMap[Any, Any]) {
+  def paramAsInt(key: String, defaultValue: Int = 0) = {
+    _params.getOrDefault(key, defaultValue).toString.toInt
+  }
+
+  def paramAsDouble(key: String, defaultValue: Double = 0) = {
+    _params.getOrDefault(key, defaultValue).toString.toDouble
+  }
+
+  def param(key: String, defaultValue: String = null) = {
+    _params.getOrDefault(key, defaultValue).toString
+  }
+
+  def paramAsBoolean(key: String, defaultValue: Boolean = false) = {
+    _params.getOrDefault(key, defaultValue).toString.toBoolean
+  }
+
+}
+
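A minimal usage sketch for the helper above, assuming a hypothetical settings map (the keys and values are only illustrative). Importing ParamsHelper._ brings the implicit mapToParams view into scope, so the typed accessors appear to be methods on a plain java.util.Map, which is exactly how the SparkRuntime change below consumes it.

    import java.util.{HashMap => JHashMap, Map => JMap}
    import streaming.common.ParamsHelper._

    object ParamsUsageSketch {
      def main(args: Array[String]): Unit = {
        // hypothetical settings, mirroring the JMap[Any, Any] that SparkRuntime receives
        val settings: JMap[Any, Any] = new JHashMap[Any, Any]()
        settings.put("streaming.spark.service", "true")
        settings.put("streaming.duration", "10")

        // the implicit conversion wraps the Java map in Params
        println(settings.paramAsBoolean("streaming.spark.service", false)) // true
        println(settings.paramAsInt("streaming.duration", 30))             // 10
        println(settings.param("streaming.name", "default-app"))           // falls back to the default
      }
    }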

src/main/java/streaming/core/LocalSparkApp.scala (2 additions & 1 deletion)
@@ -9,7 +9,8 @@ object LocalSparkApp {
       "-streaming.master", "local[2]",
       "-streaming.name", "god",
       "-streaming.rest", "true",
-      "-streaming.platform", "spark"
+      "-streaming.platform", "spark",
+      "-streaming.spark.service", "true"
     ))
   }
 }
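How these "-key value" arguments reach SparkRuntime as a JMap[Any, Any] is not shown in this commit, so the following is only a hypothetical sketch of such a conversion, meant to make the flow from LocalSparkApp down to params.paramAsBoolean(...) concrete; the names ArgsToParamsSketch and toParams are invented for the example.

    import java.util.{HashMap => JHashMap, Map => JMap}

    object ArgsToParamsSketch {
      // pair each "-name" with the value that follows it
      def toParams(args: Array[String]): JMap[Any, Any] = {
        val params = new JHashMap[Any, Any]()
        args.grouped(2).foreach {
          case Array(key, value) if key.startsWith("-") => params.put(key.stripPrefix("-"), value)
          case _ => // ignore a malformed trailing argument
        }
        params
      }

      def main(args: Array[String]): Unit = {
        val params = toParams(Array("-streaming.platform", "spark", "-streaming.spark.service", "true"))
        println(params.get("streaming.spark.service")) // true
      }
    }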

src/main/java/streaming/core/strategy/platform/SparkRuntime.scala (4 additions & 1 deletion)
@@ -6,6 +6,7 @@ import java.util.{List => JList, Map => JMap}
 
 import net.csdn.common.logging.Loggers
 import org.apache.spark.{SparkConf, SparkContext, SparkRuntimeOperator}
+import streaming.common.ParamsHelper._
 
 import scala.collection.JavaConversions._
 
@@ -47,7 +48,9 @@ class SparkRuntime(_params: JMap[Any, Any]) extends StreamingRuntime with Platfo
   }
 
   override def awaitTermination: Unit = {
-    Thread.currentThread().join()
+    if (params.paramAsBoolean("streaming.spark.service", false)) {
+      Thread.currentThread().join()
+    }
   }
 
   override def streamingRuntimeInfo: StreamingRuntimeInfo = sparkRuntimeInfo
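A note on the guarded Thread.currentThread().join(): joining on the current thread blocks forever, because join waits for the thread to die and the calling thread is, by definition, still alive. That is what keeps a service-mode driver (and its REST API) running. The sketch below is self-contained and not StreamingPro code; the --service argument stands in for the real streaming.spark.service flag.

    object SelfJoinSketch {
      def main(args: Array[String]): Unit = {
        val runAsService = args.contains("--service") // stand-in for streaming.spark.service
        println("jobs submitted")
        if (runAsService) {
          // parks the main thread indefinitely, like the guarded call in SparkRuntime:
          // the process stays alive to keep serving REST requests
          Thread.currentThread().join()
        }
        // reached immediately when not running as a service; a batch-style run can exit here
        println("exiting")
      }
    }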
