How to Execute SQL Queries in Apache Spark


Question

I am very new to Apache Spark.
I have already configured Spark 2.0.2 on my local Windows machine and worked through the "word count" example. Now I am having trouble executing SQL queries. I have searched for guidance on this, but have not found a clear answer.


Any help is greatly appreciated!
Thanks in advance!

Recommended Answer

You need to do the following to get this done.

In Spark 2.0.2 we have SparkSession, which contains a SparkContext instance as well as a sqlContext instance.

So the steps are:

Step 1: Create a SparkSession

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("MyApp").master("local[*]").getOrCreate()
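Since the session wraps both entry points, you can still pull them out when an older API needs them directly (a minimal sketch; it only assumes the spark value created above):

// The underlying SparkContext, e.g. for RDD work.
val sc = spark.sparkContext
// The legacy SQLContext, kept for backwards compatibility.
val sqlContext = spark.sqlContext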

Step 2: Load the data from the database, in your case MySQL.

val loadedData = spark
  .read
  .format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306/mydatabase")
  .option("driver", "com.mysql.jdbc.Driver")
  // "dbtable" names the table to read from the database.
  .option("dbtable", "mytable")
  .option("user", "root")
  .option("password", "toor")
  .load()

loadedData.createOrReplaceTempView("mytable")
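To confirm the view is registered before querying, you can print its schema (a quick check that only assumes the load above succeeded):

// Show the column names and types Spark read from the JDBC metadata.
spark.table("mytable").printSchema()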

Step 3: Now you can run your SQL query just like you would in a SQL database.

val dataFrame = spark.sql("SELECT * FROM mytable")
dataFrame.show()
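Anything you would write against the database works the same way through the view. For example, a filtered aggregate; the columns id and status here are hypothetical, so substitute your own:

// Count rows per status for ids above a threshold.
val summary = spark.sql("SELECT status, COUNT(*) AS cnt FROM mytable WHERE id > 100 GROUP BY status")
summary.show()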

P.S.: It would be better to use the DataFrame API, or better still the Dataset API, but for those you need to go through the documentation.
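For comparison, the filtered aggregate above can also be written directly with DataFrame operations instead of a query string; this is only a sketch, and the id and status columns are again hypothetical:

// Equivalent of the SQL aggregate using the DataFrame API.
val summary2 = spark.table("mytable")
  .where("id > 100")
  .groupBy("status")
  .count()
summary2.show()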

