Ignoring non-spark config property: hive.exec.dynamic.partition.mode


Problem Description

How to run a spark-shell with hive.exec.dynamic.partition.mode=nonstrict?

I tried (as suggested here):

  export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf

but it only prints the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict".

PS: using Spark version 2.2.0.2.6.4.0-91, Scala version 2.11.8

The need arises after hitting an error on df.write.mode("overwrite").insertInto("db.partitionedTable"):

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
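For context, a minimal sketch of the kind of write that hits this error; the source table name here is hypothetical, and the session is assumed to have Hive support enabled:

```scala
// Hypothetical repro: overwrite-insert into a Hive table whose
// partition columns are all supplied dynamically (no static values).
// With hive.exec.dynamic.partition.mode=strict (the default), Hive
// rejects such a write with the SparkException shown above.
val df = spark.table("db.sourceTable")   // assumed source dataframe

df.write
  .mode("overwrite")
  .insertInto("db.partitionedTable")     // all partition columns dynamic
```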

Answer

You can try using the spark.hadoop.* prefix, as suggested in the "Custom Hadoop/Hive Configuration" section of the Spark 2.3 documentation. It might work in 2.2 as well, if it was just a doc bug :)

spark-shell \
  --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
  --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict" \
  ...
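If the launch-time flag is still ignored, another commonly used option is to set the property from inside the spark-shell session itself; with Hive support enabled, Spark SQL forwards SET statements to the session configuration (a sketch, assuming the property names from the question are what your Hive build expects):

```scala
// Inside spark-shell: enable dynamic partitioning for this session only.
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
```

Since this runs after the session starts, it sidesteps the "non-spark config property" warning that spark-shell emits for unprefixed --conf keys.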

