<div id="article_content" class="article_content clearfix csdn-tracking-statistics" data-pid="blog" data-mod="popu_307" data-dsm="post">
<div class="markdown_views">
<blockquote>
<p>This post shows how to set up a MapReduce development environment in IntelliJ IDEA by creating a Maven project.</p>
</blockquote>
<h1 id="一软件环境"><a name="t0"></a>1. Software Environment</h1>
<p>The software versions used here are:</p>
<ol>
<li>IntelliJ IDEA 2017.1</li>
<li>Maven 3.3.9</li>
<li>A pseudo-distributed Hadoop installation (see <a href="http://blog.csdn.net/napoay/article/details/54136398" rel="nofollow" target="_blank">this guide</a> for setup)</li>
</ol>
<h1 id="二创建maven工程"><a name="t1"></a>2. Creating the Maven Project</h1>
<p>Open IDEA, choose File -&gt; New -&gt; Project, and select Maven in the left panel. (For a MapReduce-only project a plain Java-style Maven project is enough, so leave "Create from archetype" unchecked; check it if you want a web project or a specific archetype.) <br>
<img src="https://img-blog.csdn.net/20170330195641898?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQv/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""> <br>
Set the GroupId and ArtifactId, then click Next. <br>
<img src="https://img-blog.csdn.net/20170330195925542?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbmFwb2F5/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""> <br>
Set the project location, then click Next. <br>
<img src="https://img-blog.csdn.net/20170330200053760?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbmFwb2F5/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""> <br>
After clicking Finish, the empty project looks like this:</p>
<p><img src="https://img-blog.csdn.net/20170330204326182?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbmFwb2F5/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""></p>
<p>The complete project layout is shown below: <br>
<img src="https://img-blog.csdn.net/20170330204504371?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbmFwb2F5/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""></p>
<h1 id="三添加maven依赖"><a name="t2"></a>3. Adding Maven Dependencies</h1>
<p>Add the dependencies to pom.xml. For Hadoop 2.7.3, the following artifacts are needed:</p>
<ul>
<li>hadoop-common</li>
<li>hadoop-hdfs</li>
<li>hadoop-mapreduce-client-core</li>
<li>hadoop-mapreduce-client-jobclient</li>
<li><p>log4j (for logging)</p>
<p>The dependency section of pom.xml is as follows:</p></li>
</ul>
<pre class="prettyprint" name="code"><code class="hljs xml">&lt;dependencies&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;junit&lt;/groupId&gt;
        &lt;artifactId&gt;junit&lt;/artifactId&gt;
        &lt;version&gt;4.12&lt;/version&gt;
        &lt;scope&gt;test&lt;/scope&gt;
    &lt;/dependency&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;org.apache.hadoop&lt;/groupId&gt;
        &lt;artifactId&gt;hadoop-common&lt;/artifactId&gt;
        &lt;version&gt;2.7.3&lt;/version&gt;
    &lt;/dependency&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;org.apache.hadoop&lt;/groupId&gt;
        &lt;artifactId&gt;hadoop-hdfs&lt;/artifactId&gt;
        &lt;version&gt;2.7.3&lt;/version&gt;
    &lt;/dependency&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;org.apache.hadoop&lt;/groupId&gt;
        &lt;artifactId&gt;hadoop-mapreduce-client-core&lt;/artifactId&gt;
        &lt;version&gt;2.7.3&lt;/version&gt;
    &lt;/dependency&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;org.apache.hadoop&lt;/groupId&gt;
        &lt;artifactId&gt;hadoop-mapreduce-client-jobclient&lt;/artifactId&gt;
        &lt;version&gt;2.7.3&lt;/version&gt;
    &lt;/dependency&gt;
    &lt;dependency&gt;
        &lt;groupId&gt;log4j&lt;/groupId&gt;
        &lt;artifactId&gt;log4j&lt;/artifactId&gt;
        &lt;version&gt;1.2.17&lt;/version&gt;
    &lt;/dependency&gt;
&lt;/dependencies&gt;</code></pre>
<h1 id="四配置log4j"><a name="t3"></a>4. Configuring log4j</h1>
<p>Create a log4j configuration file <code>log4j.properties</code> under <code>src/main/resources</code> with the following content:</p>
<pre class="prettyprint" name="code"><code class="hljs properties">log4j.rootLogger = debug,stdout
### log to the console ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n
</code></pre>
<h1 id="五启动hadoop"><a name="t4"></a>5. Starting Hadoop</h1>
<p>Start Hadoop by running:</p>
<pre class="prettyprint" name="code"><code class="hljs bash">cd hadoop-2.7.3/
./sbin/start-all.sh</code></pre>
<p>Visit <a href="http://localhost:50070" rel="nofollow" target="_blank">http://localhost:50070/</a> to verify that Hadoop started correctly.</p>
<h1 id="六运行wordcount从本地读取文件"><a name="t5"></a><strong>6. Running WordCount (Reading a Local File)</strong></h1>
<p>Create an input folder under the project root, add a dream.txt inside it, and write some words into it, for example:</p>
<pre class="prettyprint" name="code"><code class="hljs text">I have a dream
a dream</code></pre>
<p>Create a package under src/main/java and add FileUtil.java, containing a helper that deletes the output directory so it does not have to be removed by hand before every run:</p>
<pre class="prettyprint" name="code"><code class="hljs java">package com.mrtest.hadoop;

import java.io.File;

/**
 * Created by bee on 3/25/17.
 */
public class FileUtil {

    public static boolean deleteDir(String path) {
        File dir = new File(path);
        if (dir.exists()) {
            for (File f : dir.listFiles()) {
                if (f.isDirectory()) {
                    // Recurse with the full path, not just the file name.
                    deleteDir(f.getAbsolutePath());
                } else {
                    f.delete();
                }
            }
            dir.delete();
            return true;
        } else {
            System.out.println("File or directory does not exist!");
            return false;
        }
    }
}</code></pre>
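<p>As an aside, the same cleanup can also be written with the <code>java.nio.file</code> API, which avoids explicit recursion and reports I/O errors as exceptions. The sketch below (the class name <code>FileUtilNio</code> is my own, not part of the original project) walks the tree and deletes deepest entries first:</p>

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;
import java.util.stream.Stream;

public class FileUtilNio {

    // Recursively delete a directory tree; returns false if the path does not exist.
    public static boolean deleteDir(String path) throws IOException {
        Path root = Paths.get(path);
        if (!Files.exists(root)) {
            return false;
        }
        try (Stream<Path> walk = Files.walk(root)) {
            // Sort deepest-first so files are deleted before the directories containing them.
            for (Path p : walk.sorted(Comparator.reverseOrder()).toArray(Path[]::new)) {
                Files.delete(p);
            }
        }
        return true;
    }
}
```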
<p>Then write the WordCount MapReduce program, WordCount.java:</p>
<pre class="prettyprint" name="code"><code class="hljs java">package com.mrtest.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;
import java.util.StringTokenizer;

/**
 * Created by bee on 3/25/17.
 */
public class WordCount {

    public static class TokenizerMapper extends
            Mapper&lt;Object, Text, Text, IntWritable&gt; {

        public static final IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // Emit (word, 1) for every whitespace-separated token.
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                this.word.set(itr.nextToken());
                context.write(this.word, one);
            }
        }
    }

    public static class IntSumReduce extends
            Reducer&lt;Text, IntWritable, Text, IntWritable&gt; {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable&lt;IntWritable&gt; values,
                           Context context)
                throws IOException, InterruptedException {
            // Sum the counts for each word.
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            this.result.set(sum);
            context.write(key, this.result);
        }
    }

    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {

        // Remove the previous output directory, if any.
        FileUtil.deleteDir("output");
        Configuration conf = new Configuration();

        String[] otherArgs = new String[]{"input/dream.txt", "output"};
        if (otherArgs.length != 2) {
            System.err.println("Usage: WordCount &lt;in&gt; &lt;out&gt;");
            System.exit(2);
        }

        Job job = Job.getInstance(conf, "WordCount");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(WordCount.TokenizerMapper.class);
        job.setReducerClass(WordCount.IntSumReduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}</code></pre>
<p>After the job finishes, an output folder appears under the project root. Open output/part-r-00000; it contains:</p>
<pre class="prettyprint" name="code"><code class="hljs text">I	1
a	2
dream	2
have	1</code></pre>
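<p>The output is sorted by the keys' byte order, which is why the uppercase I precedes the lowercase words. As a quick sanity check, the same tokenize-and-sum logic can be mirrored in plain Java (the class <code>LocalWordCount</code> is illustrative only; a TreeMap reproduces the job's key ordering for ASCII input):</p>

```java
import java.util.List;
import java.util.SortedMap;
import java.util.StringTokenizer;
import java.util.TreeMap;

public class LocalWordCount {

    // Mirror of the map/reduce logic: tokenize each line, then sum per-word counts.
    // TreeMap sorts keys the way the job output does for ASCII ('I' precedes 'a').
    public static SortedMap<String, Integer> count(List<String> lines) {
        SortedMap<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            StringTokenizer itr = new StringTokenizer(line);
            while (itr.hasMoreTokens()) {
                counts.merge(itr.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }
}
```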
<p>Here the input and output paths are hard-coded in a String array inside main. If you would rather receive them through main's args array and specify the paths at run time, that works too: before running WordCount, edit the Run Configuration and fill in Program arguments. <br>
<img src="https://img-blog.csdn.net/20170330203027273?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbmFwb2F5/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast" alt="这里写图片描述" title=""></p>
<hr>
<h1 id="七运行wordcount从hdfs读取文件"><a name="t6"></a><strong>7. Running WordCount (Reading from HDFS)</strong></h1>
<p>Create a directory on HDFS:</p>
<pre class="prettyprint" name="code"><code class="hljs bash">hadoop fs -mkdir /worddir</code></pre>
<p>If the NameNode is in safe mode, the command fails with:</p>
<pre class="prettyprint" name="code"><code class="hljs text">mkdir: Cannot create directory /worddir. Name node is in safe mode.</code></pre>
<p>Leave safe mode with:</p>
<pre class="prettyprint" name="code"><code class="hljs bash">hadoop dfsadmin -safemode leave</code></pre>
<p>Upload the local file:</p>
<pre class="prettyprint" name="code"><code class="hljs bash">hadoop fs -put dream.txt /worddir</code></pre>
<p>Then change otherArgs so the input points at the file's path on HDFS:</p>
<pre class="prettyprint" name="code"><code class="hljs java">String[] otherArgs = new String[]{"hdfs://localhost:9000/worddir/dream.txt", "output"};</code></pre>
<h1 id="八代码下载"><a name="t7"></a>8. Code Download</h1>
<p>The full source code can be downloaded from <a href="http://download.csdn.net/detail/napoay/9799523" rel="nofollow" target="_blank">http://download.csdn.net/detail/napoay/9799523</a>.</p> </div>
</div>