Load dataframe from PySpark

Problem description

I am trying to use spark.read.jdbc:

import os
from pyspark.sql import *
from pyspark.sql.functions import *
from pyspark import SparkContext
from pyspark.sql.session import SparkSession
sc = SparkContext.getOrCreate()
spark = SparkSession(sc)

df = spark.read \
     .format('jdbc') \
     .option('url', 'jdbc:sqlserver://local:1433') \
     .option('user', 'sa') \
     .option('password', '12345') \
     .option('dbtable', '(select COL1, COL2 from tbl1 WHERE COL1 = 2)')

Then I call df.load() and it returns an error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\spark\spark\python\pyspark\sql\readwriter.py", line 172, in load
    return self._df(self._jreader.load())
  File "C:\spark\spark\python\lib\py4j-0.10.7-src.zip\py4j\java_gateway.py", line 1256, in __call__
  File "C:\spark\spark\python\pyspark\sql\utils.py", line 63, in deco
    return f(*a, **kw)
  File "C:\spark\spark\python\lib\py4j-0.10.7-src.zip\py4j\protocol.py", line 326, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o42.load.
: java.sql.SQLException: No suitable driver
        at java.sql.DriverManager.getDriver(Unknown Source)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:105)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:105)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:104)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)

What is wrong?

Recommended answer

You need to download the JDBC driver and put it into your spark/jars folder.
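
Below is a minimal sketch of the corrected read once the jar is in place (the jar path, the mssql-jdbc version, and the AS t subquery alias are illustrative assumptions, not part of the original answer). Naming the driver class explicitly sidesteps the DriverManager lookup that raises "No suitable driver", and SQL Server expects an alias on a derived table used as dbtable:

from pyspark.sql import SparkSession

# Set spark.jars before any SparkContext exists; it is ignored on an
# already-running session. Skip it if the jar is already in spark/jars.
spark = (
    SparkSession.builder
    .appName('jdbc-load')
    .config('spark.jars', 'C:/spark/spark/jars/mssql-jdbc-8.4.1.jre8.jar')
    .getOrCreate()
)

df = (
    spark.read.format('jdbc')
    .option('url', 'jdbc:sqlserver://local:1433')
    .option('user', 'sa')
    .option('password', '12345')
    # Naming the driver class avoids the lookup that fails with
    # "No suitable driver".
    .option('driver', 'com.microsoft.sqlserver.jdbc.SQLServerDriver')
    # SQL Server requires an alias on a derived table used as dbtable.
    .option('dbtable', '(select COL1, COL2 from tbl1 WHERE COL1 = 2) AS t')
    .load()
)

df.show()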

For the SQL Server JDBC driver, you can download it from https://docs.microsoft.com/en-us/sql/connect/jdbc/download-microsoft-jdbc-driver-for-sql-server?view=sql-server-ver15
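
As an alternative to copying the jar by hand, Spark can also resolve the driver from Maven when the session starts. Another hedged sketch; the artifact version is an assumption, so use whichever mssql-jdbc release matches your JRE:

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # Spark fetches the driver (and its dependencies) from Maven Central
    # at session startup, so no manual copy into spark/jars is needed.
    .config('spark.jars.packages', 'com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8')
    .getOrCreate()
)

The same coordinate can also be passed on the command line via the --packages flag of pyspark or spark-submit.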
