Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4. How can I resolve this?

2022-01-13 · hadoop, mapreduce, netbeans, java


I am using Hadoop 2.7.0 and Oracle JDK 1.7.0_79 with NetBeans IDE 8.0.2. When I try to communicate with Hadoop from a Java program, I get the following error. Are there any dependency issues involved, and how can I resolve this error?


I have seen posts about related issues, but none of them conveyed the answer clearly. So please help me out here. Thanks!

    Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
        at org.apache.hadoop.ipc.Client.call(Client.java:1066)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:118)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:222)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:187)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1328)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:65)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1346)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:244)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:352)
        at pir.PIR.run(PIR.java:317)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at pir.PIR.main(PIR.java:256)

Answer


If you are using Maven, check the version of the hadoop-client dependency in your POM file. It may be older than the Hadoop version currently running on your cluster (it should be 2.7.0). The error itself points this way: IPC version 9 is the protocol spoken by Hadoop 2.x servers, while "client version 4" indicates a Hadoop 1.x-era client jar on your classpath.

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.0</version>
    </dependency>
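Even with hadoop-client 2.7.0 declared, an old Hadoop 1.x jar such as `hadoop-core` can still be pulled in transitively by another dependency and shadow the newer client. You can inspect the resolved tree with `mvn dependency:tree` and, if an offending artifact shows up, exclude it. A minimal sketch of such an exclusion (the outer dependency coordinates here are placeholders; substitute whatever artifact your tree shows is dragging in `hadoop-core`):

```xml
<!-- Sketch: exclude a transitive Hadoop 1.x client.
     "some.group:some-artifact" is hypothetical; replace it with the
     dependency mvn dependency:tree identifies as the culprit. -->
<dependency>
    <groupId>some.group</groupId>
    <artifactId>some-artifact</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId> <!-- Hadoop 1.x-era artifact -->
        </exclusion>
    </exclusions>
</dependency>
```

After editing the POM, re-run `mvn dependency:tree` to confirm only the 2.7.0 Hadoop artifacts remain on the classpath, then rebuild and rerun the job.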
