overloaded method error using spark-csv

Problem description

I'm working with the Databricks spark-csv package (via the Scala API) and am having problems defining a custom schema.

After starting up the console with

spark-shell  --packages com.databricks:spark-csv_2.11:1.2.0

I import the necessary types:

import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

and then simply try to define this schema:

val customSchema = StructType(
    StructField("user_id", IntegerType, true),
    StructField("item_id", IntegerType, true),
    StructField("artist_id", IntegerType, true),
    StructField("scrobble_time", StringType, true))

But I get the following error:

<console>:26: error: overloaded method value apply with alternatives:
  (fields: Array[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: java.util.List[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: Seq[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
 cannot be applied to (org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField)
       val customSchema = StructType(

I'm very new to Scala, so I'm having trouble parsing this, but what am I doing wrong here? I'm following the very simple example here.

Answer

You need to pass the set of StructFields in as a Seq.

Something like any of the following works:

val customSchema = StructType(Seq(
  StructField("user_id", IntegerType, true),
  StructField("item_id", IntegerType, true),
  StructField("artist_id", IntegerType, true),
  StructField("scrobble_time", StringType, true)))

val customSchema = (new StructType)
  .add("user_id", IntegerType, true)
  .add("item_id", IntegerType, true)
  .add("artist_id", IntegerType, true)
  .add("scrobble_time", StringType, true)

val customSchema = StructType(
  StructField("user_id", IntegerType, true) ::
  StructField("item_id", IntegerType, true) ::
  StructField("artist_id", IntegerType, true) ::
  StructField("scrobble_time", StringType, true) :: Nil)

I'm not sure why it's not presented like this in the README, but if you check the StructType documentation, it's clear about this.
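
As a follow-up, the resulting schema can then be handed to the spark-csv reader. The sketch below assumes Spark 1.x with spark-csv 1.2.0 (as in the question) and a hypothetical input file name scrobbles.csv; adjust the path and the header option to your data.

// Minimal sketch: read a CSV with the custom schema via spark-csv.
// "scrobbles.csv" is a hypothetical placeholder path.
import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

val customSchema = StructType(Seq(
  StructField("user_id", IntegerType, true),
  StructField("item_id", IntegerType, true),
  StructField("artist_id", IntegerType, true),
  StructField("scrobble_time", StringType, true)))

val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "false")   // assumption: the file has no header row
  .schema(customSchema)
  .load("scrobbles.csv")

df.printSchema()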
