## Running Jobs

- Perform the following steps on the master node
- Try running a MapReduce job:
    - Compute π: `hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar pi 5 10`
- Run a job against a file in HDFS (a consolidated sketch follows this list):
    - The input that a Hadoop job reads must already be in HDFS, so we first have to copy the file we want to run wordcount on from the local file system into HDFS.
    - Inspect the current HDFS root directory: `hadoop fs -ls /`
    - Create an input directory: `hadoop fs -mkdir -p /tmp/zch/wordcount_input_dir`
    - Upload the file: `hadoop fs -put /opt/input.txt /tmp/zch/wordcount_input_dir`
    - Check that the file landed in the directory: `hadoop fs -ls /tmp/zch/wordcount_input_dir`
    - Submit the job to YARN to count the words: `hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar wordcount /tmp/zch/wordcount_input_dir /tmp/zch/wordcount_output_dir`
    - List the job's output directory: `hadoop fs -ls /tmp/zch/wordcount_output_dir`
    - View the computed result: `hadoop fs -cat /tmp/zch/wordcount_output_dir/part-r-00000`
- List the running Hadoop applications: `yarn application -list`
- Kill a running Hadoop application (an example follows the list): `yarn application -kill <ApplicationId>`
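For quick copy and paste, here is a minimal end-to-end sketch of the wordcount steps above. It assumes the sample file `/opt/input.txt` already exists on the master node and reuses the `/tmp/zch/...` paths and the 2.6.5 examples jar from this guide; adjust them to your installation.

```bash
# Copy the local sample file into HDFS and run the wordcount example job.
hadoop fs -mkdir -p /tmp/zch/wordcount_input_dir
hadoop fs -put /opt/input.txt /tmp/zch/wordcount_input_dir

# Note: the job fails if the output directory already exists.
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar \
    wordcount /tmp/zch/wordcount_input_dir /tmp/zch/wordcount_output_dir

# The reducer writes its result to part-r-00000 in the output directory.
hadoop fs -cat /tmp/zch/wordcount_output_dir/part-r-00000
```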
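If a job hangs or needs to be stopped, the two `yarn` commands above can be combined like this. The application id shown here is only a hypothetical example; use the real id printed in the first column of `yarn application -list`.

```bash
# Find the id of the running application, then kill it.
yarn application -list

# Replace the id below with the one printed by the previous command
# (application_1526100291229_0001 is a made-up example).
yarn application -kill application_1526100291229_0001
```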
-------------------------------------------------------------------
## References
- <https://www.linode.com/docs/databases/hadoop/how-to-install-and-set-up-hadoop-cluster/>
- <http://www.cnblogs.com/Leo_wl/p/7426496.html>
- <https://blog.csdn.net/bingduanlbd/article/details/51892750>
- <https://blog.csdn.net/whdxjbw/article/details/81050597>