Then go to the spark-hbase installation directory and issue

```
./bin/pyspark-hbase
```

A successful message is as follows:

You are using Spark SQL on HBase!!!
HBaseSQLContext available as hsqlContext.

To run a Python script, the PYTHONPATH environment variable should be set to the "python" directory of the Spark-HBase installation. For example:

```
export PYTHONPATH=/root-of-Spark-HBase/python
```

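As a rough sketch of what such a script might look like (this is not taken from the project's documentation): the `pyspark.sql.hbase` import path, the `HBaseSQLContext(sc)` constructor, and the table name `mytable` below are assumptions; the shell banner above only confirms that an HBaseSQLContext is exposed as `hsqlContext`.

```python
# Hypothetical example script; run it with bin/spark-submit or the pyspark-hbase
# shell after setting PYTHONPATH as shown above.
from pyspark import SparkContext
from pyspark.sql.hbase import HBaseSQLContext  # assumed module path

sc = SparkContext(appName="SparkSQLOnHBaseExample")
hsqlContext = HBaseSQLContext(sc)  # plays the role of hsqlContext in the shell

# "mytable" is a placeholder for a table you have already mapped to HBase.
for row in hsqlContext.sql("SELECT * FROM mytable LIMIT 10").collect():
    print(row)
```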
Note that the shell commands are not included in the Zip file of the Spark release. They are for developers' use only in this 1.0.0 version. Instead, users can use "$SPARK_HOME/bin/spark-shell --packages Huawei-Spark/Spark-SQL-on-HBase:1.0.0" for the SQL shell or "$SPARK_HOME/bin/pyspark --packages Huawei-Spark/Spark-SQL-on-HBase:1.0.0" for the Python shell.
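For convenience, those are the two commands quoted above, as typed at a shell prompt (with SPARK_HOME pointing at your Spark installation):

```
$SPARK_HOME/bin/spark-shell --packages Huawei-Spark/Spark-SQL-on-HBase:1.0.0
$SPARK_HOME/bin/pyspark --packages Huawei-Spark/Spark-SQL-on-HBase:1.0.0
```
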
Testing first requires [building Spark HBase](#building-spark). Once Spark HBase is built ...

Run all test suites from Maven:

```
mvn -Phbase,hadoop-2.4 test
```

Run a single test suite from Maven, for example:

```
mvn -Phbase,hadoop-2.4 test -DwildcardSuites=org.apache.spark.sql.hbase.BasicQueriesSuite
```

## IDE Setup
We use IntelliJ IDEA for Spark HBase development. You can get the community edition for free and install the JetBrains Scala plugin from Preferences > Plugins.

To import the current Spark HBase project for IntelliJ:

6. When you run the Scala tests, you may sometimes get an out-of-memory exception. You can increase the VM memory with the following settings, for example:

```
-XX:MaxPermSize=512m -Xmx3072m
```
You can also make those settings the default by setting them under "Defaults -> ScalaTest".