Hadoop: the number of reducers is not equal to what I set in my program

2022-01-13 00:00:00 hadoop mapreduce java

I have set mapred.tasktracker.reduce.tasks.maximum to 10 in mapred-site.xml, and I also call jobConf.setNumReduceTasks(5) in my job.
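For reference, a sketch of the cluster-side setting described above, assuming the classic MR1 configuration implied by the property name. Note that this property caps the number of reduce *slots* per TaskTracker; it does not set how many reducers a job gets (that is what setNumReduceTasks controls):

```xml
<!-- mapred-site.xml (MR1 sketch): caps concurrent reduce slots per
     TaskTracker; it does NOT determine a job's reducer count. -->
<property>
  <name>mapred.tasktracker.reduce.tasks.maximum</name>
  <value>10</value>
</property>
```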

Everything works if I run the job from the shell.

But when I run the same job from Eclipse, only one reducer is launched.

I tried editing the Map/Reduce Locations in Eclipse and setting mapred.reduce.tasks to 10, but that still doesn't work.

Are there any other parameters I can adjust in Eclipse?

Answer

Running the job from Eclipse appears to use the local job runner (LocalJobRunner), which supports only 0 or 1 reducers. If you configure it to use more than one reducer, the setting is ignored and a single reducer is used anyway.
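A likely explanation, assuming the MR1 setup implied by the property names in the question: when a job is submitted from Eclipse without the cluster configuration on the classpath, mapred.job.tracker falls back to its default value of "local", which selects the LocalJobRunner. Pointing it at a real JobTracker makes the job run on the cluster, where multiple reducers are possible. A sketch of the relevant fragment (the hostname and port below are placeholders for your cluster's JobTracker address):

```xml
<!-- mapred-site.xml sketch: "local" (the default when no cluster config
     is found) selects LocalJobRunner and at most one reducer; a real
     JobTracker address enables normal cluster execution. -->
<property>
  <name>mapred.job.tracker</name>
  <value>jobtracker-host:9001</value>
</property>
```

Equivalently, making sure the cluster's conf directory (with its mapred-site.xml) is on the Eclipse run configuration's classpath should have the same effect.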
