Exception while submitting a mapreduce job from a remote system

2022-01-13 00:00:00 hadoop linux mapreduce remote-server java

I got an exception while submitting a mapreduce job from remote system

13/10/28 18:49:52 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/F:/Workspaces/Test/Hadoop/test

My hadoop and mapreduce environment is configured on a Linux machine. I submit the wordcount job from a local Windows PC as follows:

public static void main(String[] args) throws Exception {

    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("root");

    try {
        ugi.doAs(new PrivilegedExceptionAction<Void>() {

            public Void run() throws Exception {

                JobConf conf = new JobConf(MapReduce.class);
                conf.set("mapred.job.name", "MyApp");
                conf.set("mapred.job.tracker", "192.168.1.149:9001");
                conf.set("fs.default.name","hdfs://192.168.1.149:9000");
                conf.set("hadoop.job.ugi", "root");

                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);

                conf.setMapperClass(Map.class);
                conf.setCombinerClass(Reduce.class);
                conf.setReducerClass(Reduce.class);

                conf.setInputFormat(TextInputFormat.class);
                conf.setOutputFormat(TextOutputFormat.class);

                FileInputFormat.setInputPaths(conf, new Path("test"));
                FileOutputFormat.setOutputPath(conf, new Path("test"));

                JobClient.runJob(conf);

                return null;
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
    }
}

where 192.168.1.149 is the Linux PC where hadoop is configured. I started the hadoop and mapreduce services there. The test directory was also created with the same Java API, and that worked, but mapreduce does not.

**Please help..**

Accepted answer

It was actually a mistake in my configuration:

I missed the mapred.local.dir property in mapred-site.xml.
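For reference, the missing property can be declared in mapred-site.xml roughly like this. The directory value below is only a placeholder; point it at a writable local directory on the node running the services:

```xml
<configuration>
  <property>
    <name>mapred.local.dir</name>
    <!-- Local directory where MapReduce stores intermediate data.
         /tmp/hadoop/mapred/local is a placeholder; use your own path. -->
    <value>/tmp/hadoop/mapred/local</value>
  </property>
</configuration>
```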
