NoSuchMethodError: org.apache.spark.sql.SQLContext.applySchema

This article describes how to resolve NoSuchMethodError: org.apache.spark.sql.SQLContext.applySchema; the question and accepted answer below may be a useful reference for anyone hitting the same problem.

Problem description


I am trying to query a file stored in HDFS using the SQLContext provided in Apache Spark, with the code below, but I am getting a NoSuchMethodError:

package SQL

import org.apache.spark.SparkContext
import org.apache.spark.sql._
import org.apache.spark.sql.types.{StructType, StructField, StringType}

object SparSQLCSV {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[*]", "home")
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val people = sc.textFile("/home/devan/Documents/dataset/peoplesTest.csv")
    val delimiter = ","
    val schemaString = "a,b".split(delimiter) // csv header
    // Automated schema creation
    val schema = StructType(schemaString.map(fieldName => StructField(fieldName, StringType, true)))
    val peopleLines = people.flatMap(x => x.split("\n"))
    val rowRDD = peopleLines.map(p => Row.fromSeq(p.split(delimiter)))
    val peopleSchemaRDD = sqlContext.applySchema(rowRDD, schema)
    peopleSchemaRDD.registerTempTable("people")
    sqlContext.sql("SELECT b FROM people").foreach(println)
  }
}

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.applySchema(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;
        at scalding.Main_Obj$.main(Main_Obj.scala:34)
        at scalding.Main_Obj.main(Main_Obj.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I have tried the same thing in the spark-shell that ships with Spark, and it works, but when I create a Scala project and try to run it I get the above error. What am I doing wrong?

Solution

NoSuchMethodError usually means that you have an incompatibility between libraries. In this particular case it looks like you might be using a version of spark-csv that requires Spark 1.3 together with an older version of Spark.
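One way this mismatch typically arises is when the Spark artifacts declared in the build do not all share the same version, or differ from the version on the cluster. As a minimal sketch of an aligned `build.sbt` (the versions here are assumptions; substitute whatever you actually deploy against):

```scala
// build.sbt -- hypothetical versions; the key point is that spark-core and
// spark-sql must be on the SAME Spark version, matching the cluster.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.3.0" % "provided"
)
```

Note also that from Spark 1.3 onward `SQLContext.applySchema` is deprecated in favour of `sqlContext.createDataFrame(rowRDD, schema)`; if your cluster runs 1.3+, switching to `createDataFrame` sidesteps the renamed method entirely. You can confirm which Spark version the job actually sees at runtime with `println(sc.version)`.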

