Overloaded method error using spark-csv


Problem description

I'm working with the Databricks spark-csv package (via the Scala API), and I'm having problems defining a custom schema.

Using

spark-shell  --packages com.databricks:spark-csv_2.11:1.2.0

I import the types I need:

import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

and then simply try to define this schema:

val customSchema = StructType(
    StructField("user_id", IntegerType, true),
    StructField("item_id", IntegerType, true),
    StructField("artist_id", IntegerType, true),
    StructField("scrobble_time", StringType, true))

but I get the following error:

<console>:26: error: overloaded method value apply with alternatives:
  (fields: Array[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: java.util.List[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType <and>
  (fields: Seq[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
 cannot be applied to (org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField, org.apache.spark.sql.types.StructField)
       val customSchema = StructType(

I'm very new to Scala, so I'm having trouble parsing this, but what am I doing wrong here? I'm following the very simple example here.

Recommended answer

You need to pass your set of StructFields as a Seq.

Any of the following works:

val customSchema = StructType(Seq(
  StructField("user_id", IntegerType, true),
  StructField("item_id", IntegerType, true),
  StructField("artist_id", IntegerType, true),
  StructField("scrobble_time", StringType, true)))

val customSchema = (new StructType)
  .add("user_id", IntegerType, true)
  .add("item_id", IntegerType, true)
  .add("artist_id", IntegerType, true)
  .add("scrobble_time", StringType, true)

val customSchema = StructType(
  StructField("user_id", IntegerType, true) ::
  StructField("item_id", IntegerType, true) ::
  StructField("artist_id", IntegerType, true) ::
  StructField("scrobble_time", StringType, true) :: Nil)

I'm not sure why it isn't presented this way in the README, but if you check the StructType documentation, it's clear about this.
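
Once the schema compiles, it can be passed to the spark-csv reader so that schema inference is skipped. Here is a minimal sketch along the lines of the spark-csv README; the file name scrobbles.csv is a hypothetical placeholder, and sqlContext is the one provided by the spark-shell session:

// Read a CSV file with the custom schema via the spark-csv data source.
// "scrobbles.csv" is a placeholder path; adjust the header option to your file.
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "false") // the columns above suggest no header row
  .schema(customSchema)      // use our types instead of inferring them
  .load("scrobbles.csv")

df.printSchema()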
