How to convert a custom Java class to a Spark Dataset
I can't figure out a way to convert a List of Test objects to a Dataset in Spark. This is my class:
public class Test {
    public String a;
    public String b;

    public Test(String a, String b) {
        this.a = a;
        this.b = b;
    }

    public List getList() {
        List l = new ArrayList();
        l.add(this.a);
        l.add(this.b);
        return l;
    }
}
Recommended Answer
Your code in the comments to create a DataFrame is correct. However, there is a problem with the way you define Test: createDataFrame can build a DataFrame this way only from Java Beans, and your Test class is not a Java Bean (it lacks a public no-argument constructor and getters/setters for its fields). Once you fix that, you can use the following line to create a DataFrame:
Dataset<Row> dataFrame = spark.createDataFrame(listOfTestClasses, Test.class);
and these lines to create a typed Dataset:
Encoder<Test> encoder = Encoders.bean(Test.class);
Dataset<Test> dataset = spark.createDataset(listOfTestClasses, encoder);
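For reference, here is a minimal, self-contained sketch of what the fixed bean class and the two calls above might look like end to end. It assumes a local SparkSession and made-up sample values ("a1", "b1", etc.); the class and application names are illustrative, not taken from the question.

import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestDatasetExample {

    // Java Bean version of Test: a public no-arg constructor plus
    // getters/setters for every field, and Serializable.
    public static class Test implements Serializable {
        private String a;
        private String b;

        public Test() { }  // required no-arg constructor

        public Test(String a, String b) {
            this.a = a;
            this.b = b;
        }

        public String getA() { return a; }
        public void setA(String a) { this.a = a; }
        public String getB() { return b; }
        public void setB(String b) { this.b = b; }
    }

    public static void main(String[] args) {
        // Local session purely for illustration
        SparkSession spark = SparkSession.builder()
                .appName("test-bean-example")
                .master("local[*]")
                .getOrCreate();

        List<Test> listOfTestClasses = Arrays.asList(
                new Test("a1", "b1"),
                new Test("a2", "b2"));

        // Untyped DataFrame built from the bean class
        Dataset<Row> dataFrame = spark.createDataFrame(listOfTestClasses, Test.class);
        dataFrame.show();

        // Typed Dataset built with a bean encoder
        Encoder<Test> encoder = Encoders.bean(Test.class);
        Dataset<Test> dataset = spark.createDataset(listOfTestClasses, encoder);
        dataset.show();

        spark.stop();
    }
}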