Commit b0d3119

🚧 Spark
1 parent aa18543 commit b0d3119

2 files changed: +47 -1 lines changed
Lines changed: 46 additions & 0 deletions
@@ -0,0 +1,46 @@
# Spark Installation and Configuration

## Introduction

- Latest release as of December 2018: version 2.4.0
- Official site: <https://spark.apache.org/>
- Official documentation: <https://spark.apache.org/documentation.html>
- Official downloads: <https://spark.apache.org/downloads.html>
- Official GitHub: <https://github.com/apache/spark>

## Local Mode Installation

- CentOS 7.4
- IP address: `192.168.0.105`
- JDK 8.x is required
- For personal reasons my Hadoop is still version 2.6.5, so the Spark version to use here is 2.2.0
- Spark 2.2.0 documentation: <https://spark.apache.org/docs/2.2.0/>
- The archive is about 192 MB, so the download can be a bit slow
- `cd /usr/local && wget https://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.6.tgz`
- Extract: `tar zxvf spark-2.2.0-bin-hadoop2.6.tgz`
- Rename: `mv /usr/local/spark-2.2.0-bin-hadoop2.6 /usr/local/spark`
- Add the environment variables:

```
vim /etc/profile

SPARK_HOME=/usr/local/spark
PATH=$PATH:${SPARK_HOME}/bin:${SPARK_HOME}/sbin
export SPARK_HOME
export PATH

source /etc/profile
```

- Copy the config template: `cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh`
- Edit the config: `vim $SPARK_HOME/conf/spark-env.sh`
- Assuming my Hadoop path is /usr/local/hadoop-2.6.5, append at the very end:

```
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.5/etc/hadoop
```
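With the variables loaded, a quick sanity check of the install might look like the sketch below (it assumes the `/usr/local/spark` layout and the PATH entries from the steps above; it can only run on a machine where Spark is actually installed):

```shell
# Confirm the Spark binaries are on the PATH and print the version banner.
spark-shell --version

# Run the bundled SparkPi example in local mode; on success it prints a
# line like "Pi is roughly 3.1...".
run-example SparkPi 10
```

If `run-example` is not found, `source /etc/profile` again or open a new shell so the PATH change takes effect.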
## References

- <https://cloud.tencent.com/developer/article/1010903>

markdown-file/monitor.md

Lines changed: 1 addition & 1 deletion
@@ -568,7 +568,7 @@ TOTAL: (total traffic) 12.9GB 229Mb 190Mb 193Mb

- Before: `### Port Usage`
- After: `### Port Usage (can also be used to check which process occupies a port)`
#### lsof
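As a sketch of the port-occupancy use mentioned in the renamed heading (port 8080 is just an example; the output depends on what is running on the machine):

```shell
# List processes listening on TCP port 8080.
# -n/-P skip DNS and port-name resolution for faster, numeric output;
# -sTCP:LISTEN restricts the list to listening sockets.
lsof -nP -iTCP:8080 -sTCP:LISTEN
```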
