Convert spark dataframe to sparklyR table "tbl_spark"


Problem Description

I'm trying to convert a Spark DataFrame (org.apache.spark.sql.DataFrame) to a sparklyr table (tbl_spark). I tried sdf_register, but it failed with the following error.

Here, df is the Spark DataFrame.

sdf_register(df, name = "my_tbl")

The error is:

Error: org.apache.spark.sql.AnalysisException: Table not found: my_tbl; line 2 pos 17
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:306)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:315)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:310)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:281)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)

Have I missed anything? Or is there a better way to convert it to tbl_spark?

Thanks!

Recommended Answer

Use sdf_copy_to() or dplyr::copy_to(), e.g. my_tbl <- sdf_copy_to(sc, df, "my_tbl")
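
As a minimal sketch of that route (assuming a local Spark connection; the data frame contents and the table name "my_tbl" are placeholders, not from the original question):

library(sparklyr)
library(dplyr)

# Connect to a local Spark instance (adjust master/config as needed)
sc <- spark_connect(master = "local")

# A small R data frame standing in for the data to be copied into Spark
df <- data.frame(x = 1:3, y = c("a", "b", "c"))

# sdf_copy_to() ships the data to Spark and registers it under "my_tbl",
# returning a tbl_spark handle
my_tbl <- sdf_copy_to(sc, df, "my_tbl", overwrite = TRUE)

# The equivalent dplyr form:
# my_tbl <- copy_to(sc, df, "my_tbl", overwrite = TRUE)

class(my_tbl)   # "tbl_spark" "tbl_sql" "tbl_lazy" "tbl"

spark_disconnect(sc)

Either call returns a tbl_spark, so the result can be used directly with dplyr verbs or passed to other sparklyr functions.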

