Format the NameNode
hadoop namenode -format
Start Hadoop
start-all.sh
Stop Hadoop
stop-all.sh
The jps command shows all of Hadoop's daemon processes.
Check with hdfs dfsadmin -report; the cluster is healthy only if the DataNodes show up in the report.
You can verify that the cluster started successfully through the Hadoop web interfaces, at the following addresses:
YARN ResourceManager: http://192.168.96.128:8088/cluster
NameNode: http://192.168.96.128:50070/dfshealth.html#tab-overview
(The JobTracker UI at http://localhost:50030/ exists only in Hadoop 1.x; in 2.x its role is taken by the ResourceManager above.)
At this point Hadoop is deployed only on the Linux side.
Many blog posts say you must download the hadoop-eclipse-plugin-2.6.0.jar plugin. I downloaded it and put it into the plugins folder of the Eclipse installation, but the little elephant icon never appeared. It turns out WordCount runs fine without the plugin; the plugin is version-sensitive, but an exact version match is not required.
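For reference, the counting logic that the WordCount job performs can be sketched in plain Java, without the Hadoop Mapper/Reducer classes. This is only an illustration of the map-then-reduce step (the class and method names here are my own, not Hadoop APIs):

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {
    // map phase: emit (word, 1) per token; reduce phase: sum counts per word.
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String token : text.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                counts.merge(token, 1, Integer::sum); // "reduce": sum per key
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count("hello hadoop hello world"));
    }
}
```

The real job distributes this same logic: mappers tokenize input splits, and reducers sum the per-word counts.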
1. Download hadoop.dll and winutils.exe
https://github.com/steveloughran/winutils/blob/master/hadoop-2.8.3/bin/hadoop.dll
Put winutils.exe in hadoop-2.8.4\bin
Put hadoop.dll in C:\Windows\System32
Accessing HDFS reports a permission error:
Permission denied: user=administrator, access=WRITE,
Set an environment variable
Set HADOOP_USER_NAME to the Hadoop user on the server. When login.login runs, it calls Hadoop's HadoopLoginModule, which first reads the HADOOP_USER_NAME system environment variable, then the Java system property of the same name, and if neither is set falls back to the NTUserPrincipal.
After setting the variable, restart Eclipse.
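Because the lookup order above also consults the Java system property, an alternative to the Windows environment variable is to set the property in code before the first HDFS call. A minimal sketch (the property name HADOOP_USER_NAME is real; the value "hadoop" is just an example server-side user):

```java
public class HdfsUserSetup {
    public static void main(String[] args) {
        // Must run before the first FileSystem.get(...) call: the Hadoop
        // client performs its login once, when its classes initialize.
        System.setProperty("HADOOP_USER_NAME", "hadoop"); // example user name
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

This avoids having to restart Eclipse every time the user changes, at the cost of hard-coding the user in the client code.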
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>HadoopJar</groupId>
  <artifactId>Hadoop</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>Hadoop</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <hadoop.version>2.8.4</hadoop.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-common -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-jobclient -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
      <version>1.8</version>
      <scope>system</scope>
      <systemPath>D:\Java\jdk1.8.0_101\lib\tools.jar</systemPath>
    </dependency>
  </dependencies>
  <build>
    <finalName>Hadoop</finalName>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
          <encoding>UTF-8</encoding>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <configuration>
          <encoding>UTF-8</encoding>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
The hostname must be replaced with the IP address (core-site.xml):
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop-2.8.4/tmp</value>
    <description>Abase for other temporary directories.</description>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.96.128:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>4096</value>
  </property>
</configuration>
hdfs dfs -mkdir /hi