Hadoop namenode format on Windows - java.lang.UnsupportedOperationException
I am in a database class at school and my professor is having us work with Hadoop v3.2.1. While following a YouTube tutorial to install it on Windows, I got stuck on the namenode formatting step. This is what comes up in cmd:
2020-03-15 15:38:05,819 INFO util.GSet: Computing capacity for map NameNodeRetryCache
2020-03-15 15:38:05,819 INFO util.GSet: VM type = 64-bit
2020-03-15 15:38:05,820 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
2020-03-15 15:38:05,820 INFO util.GSet: capacity = 2^15 = 32768 entries
2020-03-15 15:38:05,883 INFO namenode.FSImage: Allocated new BlockPoolId: BP-381120843-10.0.0.230-1584301085876
2020-03-15 15:38:05,884 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsupportedOperationException
at java.nio.file.Files.setPosixFilePermissions(Files.java:2044)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:452)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:591)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:613)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:188)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1206)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1649)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
2020-03-15 15:38:05,887 INFO util.ExitUtil: Exiting with status 1: java.lang.UnsupportedOperationException
2020-03-15 15:38:05,889 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at DrStrange/10.0.0.230
************************************************************/
These are my properties:
core-site.xml:
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
mapred-site.xml:
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
hdfs-site.xml:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>C:\hadoop-3.2.1\data\namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>C:\hadoop-3.2.1\data\datanode</value>
</property>
</configuration>
yarn-site.xml:
<configuration>
<!-- Site specific YARN configuration properties -->
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
</configuration>
I was following this tutorial: How to Install Hadoop on Windows, until about halfway through, when I realized it was too old and switched to this one: How to Install Hadoop 3.2.0 in Windows10.
Also, I have no idea if this is related to my current problem, but I'll mention it anyway: when I skip ahead to the next step and type start-all, the ResourceManager and NodeManager both error out. Figured I'd put it all into one question.
Accepted Answer
The error you are seeing is a bug in Hadoop 3.2.1.
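The stack trace shows StorageDirectory.clearDirectory() in Storage.java calling java.nio.file.Files.setPosixFilePermissions(), which only works on file systems that expose POSIX permissions. Windows/NTFS does not, so the call throws UnsupportedOperationException. A minimal standalone sketch (not Hadoop code) that reproduces the same behaviour:

import java.nio.file.*;
import java.nio.file.attribute.PosixFilePermissions;

// On Windows the default file system has no "posix" attribute view, so
// Files.setPosixFilePermissions() throws UnsupportedOperationException --
// the same exception NNStorage.format() runs into in Hadoop 3.2.1.
public class PosixPermissionCheck {
    public static void main(String[] args) throws Exception {
        // On Windows this prints something like [owner, dos, acl, basic, user] -- no "posix".
        System.out.println(FileSystems.getDefault().supportedFileAttributeViews());

        Path dir = Files.createTempDirectory("namenode-format-test");
        try {
            Files.setPosixFilePermissions(dir, PosixFilePermissions.fromString("rwx------"));
            System.out.println("POSIX permissions applied (POSIX-capable file system)");
        } catch (UnsupportedOperationException e) {
            System.out.println("UnsupportedOperationException: this file system does not support POSIX permissions");
        }
    }
}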
This issue will be solved within the next release. For now, you can fix it temporarily using the following steps:
- Download the hadoop-hdfs-3.2.1.jar file from the following link.
- Rename the existing hadoop-hdfs-3.2.1.jar to hadoop-hdfs-3.2.1.bak in the folder %HADOOP_HOME%\share\hadoop\hdfs.
- Copy the downloaded hadoop-hdfs-3.2.1.jar into the folder %HADOOP_HOME%\share\hadoop\hdfs (a scripted version of this swap is sketched after this list).
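If you prefer to script the swap rather than do it by hand, here is a rough java.nio.file equivalent of the rename and copy steps. The download location (the Downloads folder) and the HADOOP_HOME environment variable are assumptions; adjust them to your machine. A plain cmd ren plus copy does the same job.

import java.nio.file.*;

// Sketch only: back up the shipped hadoop-hdfs-3.2.1.jar and drop in the patched one.
public class SwapHdfsJar {
    public static void main(String[] args) throws Exception {
        // Assumes HADOOP_HOME is set, e.g. C:\hadoop-3.2.1
        Path hdfsLib = Paths.get(System.getenv("HADOOP_HOME"), "share", "hadoop", "hdfs");
        Path oldJar  = hdfsLib.resolve("hadoop-hdfs-3.2.1.jar");
        Path backup  = hdfsLib.resolve("hadoop-hdfs-3.2.1.bak");
        // Assumed location of the patched jar you downloaded:
        Path downloaded = Paths.get(System.getProperty("user.home"), "Downloads", "hadoop-hdfs-3.2.1.jar");

        Files.move(oldJar, backup);                                        // rename the original jar
        Files.copy(downloaded, hdfsLib.resolve("hadoop-hdfs-3.2.1.jar"));  // copy in the patched jar
        System.out.println("Replaced " + oldJar + " (backup at " + backup + ")");
    }
}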
References
- Install Hadoop 3.2.1 single-node cluster on Windows 10
- Step-by-step guide to installing Hadoop 3.2.1 on Windows 10