Enable case sensitivity for spark.sql globally


Problem description


The option spark.sql.caseSensitive controls whether column names etc. should be case-sensitive or not. It can be set, e.g., by

spark_session.sql('set spark.sql.caseSensitive=true')

and is false by default.
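For illustration, here is a sketch of the difference the flag makes within a session (the table `t` and its column `id` are hypothetical):

```sql
-- With the default (caseSensitive=false), column-name lookup is
-- case-insensitive, so both of these resolve to the same column:
SELECT id FROM t;
SELECT ID FROM t;

-- After enabling case sensitivity for the session...
SET spark.sql.caseSensitive=true;

-- ...only the exact name resolves; `SELECT ID FROM t` would now fail
-- with an unresolved-column error if the column was defined as `id`.
SELECT id FROM t;
```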

It does not seem to be possible to enable it globally in $SPARK_HOME/conf/spark-defaults.conf with

spark.sql.caseSensitive: True

though. Is that intended, or is there some other file for setting SQL options?
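As a side note, the examples in the Spark configuration documentation write spark-defaults.conf entries with whitespace between key and value (the file is parsed as Java-style properties, so `=` and `:` separators are also accepted):

```properties
# $SPARK_HOME/conf/spark-defaults.conf
# The documented format separates key and value with whitespace:
spark.sql.caseSensitive true
```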

Also, the source states that enabling this is highly discouraged. What is the rationale behind that advice?

Solution

As it turns out, setting

spark.sql.caseSensitive: True

in $SPARK_HOME/conf/spark-defaults.conf DOES work after all. It just has to be done in the configuration of the Spark driver as well, not the master or workers. Apparently I forgot that when I last tried.
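In other words, the property has to reach the driver's configuration. Besides the conf file on the driver host, the same effect can be had at submit time, since `--conf` entries end up in the driver's SparkConf (a sketch; `app.py` is a placeholder name):

```shell
# Supply the option directly to the driver when submitting the job:
spark-submit --conf spark.sql.caseSensitive=true app.py
```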

