Prove that a runtimeClass satisfies a type Bound in Scala


Question

I have a method that writes one of my classes Foo, which is defined as Thrift, in Parquet form.

  import Foo
  import org.apache.spark.rdd.RDD
  import org.apache.thrift.TBase
  import org.apache.hadoop.mapreduce.Job
  import org.apache.parquet.hadoop.ParquetOutputFormat
  import org.apache.parquet.hadoop.thrift.ParquetThriftOutputFormat

  def writeThriftParquet(rdd: RDD[Foo], outputPath: String): Unit = {
    val job = Job.getInstance()
    ParquetThriftOutputFormat.setThriftClass(job, classOf[Foo])
    ParquetOutputFormat.setWriteSupportClass(job, classOf[Foo])

    rdd
      .map(x => (null, x))
      .saveAsNewAPIHadoopFile(
        outputPath,
        classOf[Void],
        classOf[Foo],
        classOf[ParquetThriftOutputFormat[Foo]],
        job.getConfiguration)
  }

This works fine, but I'd prefer to write a more generic method. I tried the (relatively) simple:

  def writeThriftParquetGeneral[A <: TBase[_, _]](rdd: RDD[A], outputPath: String): Unit = {
    val job = Job.getInstance()
    ParquetThriftOutputFormat.setThriftClass(job, classOf[A])
    ParquetOutputFormat.setWriteSupportClass(job, classOf[A])

    rdd
      .map(x => (null, x))
      .saveAsNewAPIHadoopFile(
        outputPath,
        classOf[Void],
        classOf[A],
        classOf[ParquetThriftOutputFormat[A]],
        job.getConfiguration)
  }

but that fails with errors like:

  class type required but A found
      ParquetThriftOutputFormat.setThriftClass(job, classOf[A])
  class type required but A found
      ParquetOutputFormat.setWriteSupportClass(job, classOf[A])

To try to remedy that, I've used a ClassTag, but haven't gotten things to compile.

  import scala.reflect._
  implicit val ct = ClassTag[Foo](classOf[Foo])

  def writeThriftParquetGeneral[A <: TBase[_, _]](rdd: RDD[A], outputPath: String)(
    implicit tag: ClassTag[A]): Unit = {
    val job = Job.getInstance()

    // The problem line
    ParquetThriftOutputFormat.setThriftClass(job, tag.runtimeClass)

    // Seems OK from here
    ParquetOutputFormat.setWriteSupportClass(job, tag.runtimeClass)

    rdd
      .map(x => (null, x))
      .saveAsNewAPIHadoopFile(
        outputPath,
        classOf[Void],
        tag.runtimeClass,
        classOf[ParquetThriftOutputFormat[A]],
        job.getConfiguration)
  }

This fails at the line ParquetThriftOutputFormat.setThriftClass(job, tag.runtimeClass) with:

[error]  found   : Class[_$1] where type _$1
[error]  required: Class[_ <: org.apache.thrift.TBase[_, _]]

I'm surprised the compiler (Scala 2.11) isn't recognizing that tag.runtimeClass must be classOf[A], and that A satisfies the type bound by definition.

Answer

ClassTag#runtimeClass only returns a Class[_]:

https://github.com/scala/scala/blob/2.13.x/src/library/scala/reflect/ClassTag.scala#L55

Class[_ <: TBase[_, _]] is an existential type different from Class[_] (actually a subtype of it):

  implicitly[Class[_ <: TBase[_, _]] <:< Class[_]]
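To see why an explicit narrowing is still needed in the other direction, here is a minimal self-contained sketch. Base, Impl, and RuntimeClassDemo are hypothetical stand-ins (for TBase and a Thrift-generated class), used only to illustrate the types involved:

```scala
import scala.reflect.ClassTag

// Hypothetical stand-ins for TBase and a Thrift-generated class.
trait Base
final class Impl extends Base

object RuntimeClassDemo extends App {
  val tag: ClassTag[Impl] = implicitly[ClassTag[Impl]]

  // runtimeClass is statically typed as Class[_], even though the
  // value it actually holds is classOf[Impl].
  val raw: Class[_] = tag.runtimeClass

  // asSubclass performs a runtime-checked narrowing from Class[_]
  // to the existential type Class[_ <: Base].
  val narrowed: Class[_ <: Base] = raw.asSubclass(classOf[Base])

  println(narrowed == classOf[Impl]) // prints true
}
```

The cast is checked at runtime: asSubclass throws a ClassCastException if the runtime class is not actually a subtype of the given class, so it fails loudly rather than silently.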

Try replacing the problem line with:

  ParquetThriftOutputFormat.setThriftClass(job, tag.runtimeClass.asSubclass(classOf[TBase[_, _]]))
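Applying that fix, the whole generic method might look as follows. This is a sketch only, assuming the same imports as in the question, and not compiled against any particular Spark/Parquet/Thrift version:

```scala
import org.apache.spark.rdd.RDD
import org.apache.thrift.TBase
import org.apache.hadoop.mapreduce.Job
import org.apache.parquet.hadoop.ParquetOutputFormat
import org.apache.parquet.hadoop.thrift.ParquetThriftOutputFormat
import scala.reflect.ClassTag

def writeThriftParquetGeneral[A <: TBase[_, _]](rdd: RDD[A], outputPath: String)(
    implicit tag: ClassTag[A]): Unit = {
  val job = Job.getInstance()

  // asSubclass narrows Class[_] to Class[_ <: TBase[_, _]],
  // which is the type setThriftClass requires.
  ParquetThriftOutputFormat.setThriftClass(job, tag.runtimeClass.asSubclass(classOf[TBase[_, _]]))
  ParquetOutputFormat.setWriteSupportClass(job, tag.runtimeClass)

  rdd
    .map(x => (null, x))
    .saveAsNewAPIHadoopFile(
      outputPath,
      classOf[Void],
      tag.runtimeClass,
      classOf[ParquetThriftOutputFormat[A]],
      job.getConfiguration)
}
```

Callers then only need a ClassTag[A] in scope, which the compiler materializes automatically for concrete types, e.g. writeThriftParquetGeneral(fooRdd, "/some/path") with fooRdd: RDD[Foo].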

