Parquet file in Spark SQL
Problem Description
I am trying to use Spark SQL with the Parquet file format. When I try the basic example:
import org.apache.spark.{SparkConf, SparkContext}

object parquet {
  case class Person(name: String, age: Int)

  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setMaster("local").setAppName("HdfsWordCount")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    // createSchemaRDD is used to implicitly convert an RDD to a SchemaRDD.
    import sqlContext.createSchemaRDD
    val people = sc.textFile("C:/Users/pravesh.jain/Desktop/people/people.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt))
    people.saveAsParquetFile("C:/Users/pravesh.jain/Desktop/people/people.parquet")
    val parquetFile = sqlContext.parquetFile("C:/Users/pravesh.jain/Desktop/people/people.parquet")
  }
}
I get a null pointer exception:
Exception in thread "main" java.lang.NullPointerException
    at org.apache.spark.parquet$.main(parquet.scala:16)
which points to the saveAsParquetFile line. What's the issue here?
Recommended Answer
This error occurred when I was using Spark in Eclipse on Windows. I tried the same code in spark-shell and it works fine. I guess Spark might not be 100% compatible with Windows.
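A frequent cause of this kind of NullPointerException when running Spark locally on Windows (an assumption on my part; the answer above does not diagnose it) is that Hadoop's native Windows binaries are missing, so Hadoop cannot find winutils.exe. A minimal sketch of the usual workaround, assuming winutils.exe has been placed in a hypothetical C:/hadoop/bin directory:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object parquetOnWindows {
  def main(args: Array[String]) {
    // Point Hadoop at a directory containing bin/winutils.exe
    // BEFORE any Spark/Hadoop classes are initialized.
    // "C:/hadoop" is a hypothetical location; adjust it to where
    // you actually downloaded winutils.exe.
    System.setProperty("hadoop.home.dir", "C:/hadoop")

    val sparkConf = new SparkConf().setMaster("local").setAppName("HdfsWordCount")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.createSchemaRDD
    // ... same save/load logic as in the question ...
    sc.stop()
  }
}
```

Note also that saveAsParquetFile, parquetFile, and SchemaRDD belong to the old Spark 1.x SQLContext API; in later Spark versions the equivalent calls are sqlContext.read.parquet(...) and df.write.parquet(...).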