/bin/bash: /bin/java: No such file or directory error in a YARN application on macOS

2022-01-13 00:00:00 macos hadoop mapreduce java hadoop-yarn

I was trying to run a simple WordCount MapReduce program using the Java 1.7 SDK and Hadoop 2.7.1 on Mac OS X El Capitan 10.11, and I am getting the following error message in the container log "stderr": /bin/bash: /bin/java: No such file or directory

Application logs:

15/11/27 02:52:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/27 02:52:33 INFO client.RMProxy: Connecting to ResourceManager at /192.168.200.96:8032
15/11/27 02:52:34 INFO input.FileInputFormat: Total input paths to process : 0
15/11/27 02:52:34 INFO mapreduce.JobSubmitter: number of splits:0
15/11/27 02:52:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1448608832342_0003
15/11/27 02:52:34 INFO impl.YarnClientImpl: Submitted application application_1448608832342_0003
15/11/27 02:52:34 INFO mapreduce.Job: The url to track the job: http://mymac.local:8088/proxy/application_1448608832342_0003/
15/11/27 02:52:34 INFO mapreduce.Job: Running job: job_1448608832342_0003
15/11/27 02:52:38 INFO mapreduce.Job: Job job_1448608832342_0003 running in uber mode : false
15/11/27 02:52:38 INFO mapreduce.Job:  map 0% reduce 0%
15/11/27 02:52:38 INFO mapreduce.Job: Job job_1448608832342_0003 failed with state FAILED due to: Application application_1448608832342_0003 failed 2 times due to AM Container for appattempt_1448608832342_0003_000002 exited with  exitCode: 127
For more detailed output, check application tracking page:http://mymac.local:8088/cluster/app/application_1448608832342_0003Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1448608832342_0003_02_000001
Exit code: 127
Stack trace: ExitCodeException exitCode=127:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
    at org.apache.hadoop.util.Shell.run(Shell.java:456)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 127
Failing this attempt. Failing the application.
15/11/27 02:52:38 INFO mapreduce.Job: Counters: 0

The command I am running:

hadoop jar wordcount.jar org.myorg.WordCount /user/gangadharkadam/input/ /user/gangadharkadam/output/

My ENV variables are:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_MAPRED_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_COMMON_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_HDFS_HOME=/usr/local/hadoop/hadoop-2.7.1
export YARN_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.7.1/etc/hadoop
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$M2_HOME/bin:$ANT_HOME/bin:$IVY_HOME/bin:$FB_HOME/bin:$MYSQL_HOME/bin:$MYSQL_HOME/lib:$SQOOP_HOME/bin
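
A quick sanity check of these paths (a sketch only, using the stock /usr/libexec/java_home helper that ships with macOS):

# confirm where macOS thinks the 1.7 JDK lives and that the binary exists
/usr/libexec/java_home -v 1.7       # should print .../jdk1.7.0_79.jdk/Contents/Home
echo "$JAVA_HOME"                   # should match the line above
ls -l "$JAVA_HOME/bin/java"         # the executable YARN should be launching
ls /bin/java                        # fails: this is the path the container actually tries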

The problem seems to be that YARN is using a different path for the Java executable than the one in my OS. The local log for the failed task in "stderr" shows: /bin/bash: /bin/java: No such file or directory

I tried to create a soft link from $JAVA_HOME/bin/java to /bin/java, but El Capitan does not allow it. The new OS X El Capitan has rootless mode (System Integrity Protection), and users cannot create anything in certain restricted folders such as /bin/; roughly what the attempt looks like is sketched below. Any workaround for this issue is highly appreciated. Thanks in advance.
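
A sketch of that failed attempt (paths as in my environment above; the exact error text may vary):

# attempt to expose the real java binary at the path the container uses
sudo ln -s "$JAVA_HOME/bin/java" /bin/java
# ln: /bin/java: Operation not permitted    (SIP blocks writes to /bin)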

Recommended answer

This answer applies to Hadoop version 2.6.0 and earlier. Disabling SIP and creating a symbolic link does provide a workaround, but a better solution is to fix hadoop-config.sh so that it picks up your JAVA_HOME correctly.

In $HADOOP_HOME/libexec/hadoop-config.sh, look for the if condition under the comment # Attempt to set JAVA_HOME if it is not set.
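
One quick way to jump to that block (a convenience only; assumes HADOOP_HOME is exported as in the question):

# print the JAVA_HOME auto-detection block with a few lines of context
grep -n -A 4 "Attempt to set JAVA_HOME" "$HADOOP_HOME/libexec/hadoop-config.sh"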

Remove the extra parentheses in the export JAVA_HOME lines as shown below. Change this:

if [ -x /usr/libexec/java_home ]; then
    export JAVA_HOME=($(/usr/libexec/java_home))
else
    export JAVA_HOME=(/Library/Java/Home)
fi

to this:

if [ -x /usr/libexec/java_home ]; then
    # note that the extra parentheses are removed
    export JAVA_HOME=$(/usr/libexec/java_home)
else
    export JAVA_HOME=/Library/Java/Home
fi
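
Why the parentheses matter (a minimal sketch of the bash behavior, not of Hadoop itself): the assignment turns JAVA_HOME into a bash array, and bash does not pass arrays to child processes, so the container launch script sees an empty JAVA_HOME and $JAVA_HOME/bin/java collapses to /bin/java.

# run in a fresh shell to see the effect
unset JAVA_HOME
export JAVA_HOME=($(/usr/libexec/java_home))        # JAVA_HOME is now a bash array
bash -c 'echo "child JAVA_HOME=[$JAVA_HOME]"'       # prints an empty value
bash -c 'echo "launch path: $JAVA_HOME/bin/java"'   # prints /bin/java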

Restart YARN after you have made this change.
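
Assuming the standard scripts under $HADOOP_HOME/sbin, something like:

# restart YARN so the ResourceManager and NodeManager pick up the edited hadoop-config.sh
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/start-yarn.sh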

More details can be found at https://issues.apache.org/jira/browse/HADOOP-8717; it appears that Hadoop 3.0.0-alpha1 is the first release with the fix.
