Spark SQL - Escape Query String


Problem Description

I can't believe I am asking this but...

HOW DO YOU ESCAPE A SQL QUERY STRING IN SPARK SQL USING SCALA?

I have tried everything and searched everywhere. I thought the Apache Commons library would do it, but no luck:

import org.apache.commons.lang.StringEscapeUtils

var sql = StringEscapeUtils.escapeSql("'Ulmus_minor_'Toledo'");

df.filter("topic = '" + sql + "'").map(_.getValuesMap[Any](List("hits","date"))).collect().foreach(println);
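For context on why this fails: `StringEscapeUtils.escapeSql` follows the ANSI SQL convention of doubling each single quote, and Spark's `SqlParser` does not understand that convention inside a single-quoted literal. A minimal sketch of the doubling, emulated here with a plain `replace` (the exact commons-lang behavior is an assumption):

```scala
object AnsiEscapeDemo {
  def main(args: Array[String]): Unit = {
    val raw = "'Ulmus_minor_'Toledo'"
    // ANSI-style escaping doubles every single quote, which is
    // what StringEscapeUtils.escapeSql is documented to do
    val ansiEscaped = raw.replace("'", "''")
    println(ansiEscaped)                     // ''Ulmus_minor_''Toledo''
    // The filter expression Spark is then asked to parse:
    println("topic = '" + ansiEscaped + "'") // topic = '''Ulmus_minor_''Toledo'''
  }
}
```

Note that the second printed string is exactly the `topic = '''Ulmus_minor_''Toledo'''` expression shown at the top of the stack trace below.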

which returns the following:

topic = '''Ulmus_minor_''Toledo''' ^ at scala.sys.package$.error(package.scala:27) at org.apache.spark.sql.catalyst.SqlParser.parseExpression(SqlParser.scala:45) at org.apache.spark.sql.DataFrame.filter(DataFrame.scala:651) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:29) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:34) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:36) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.(:38) at $iwC$$iwC$$iwC$$iwC$$iwC.(:40) at $iwC$$iwC$$iwC$$iwC.(:42) at $iwC$$iwC$$iwC.(:44) at $iwC$$iwC.(:46) at $iwC.(:48) at (:50) at .(:54) at .() at .(:7) at .() at $print() at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338) at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857) at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902) at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814) at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657) at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) at org.apache.spark.repl.Main$.main(Main.scala:31) at org.apache.spark.repl.Main.main(Main.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Help would be great.

j

Recommended Answer

It may be surprising, but this:

var sql = "'Ulmus_minor_'Toledo'"
df.filter(s"""topic = "$sql"""")

works just fine: because the SQL string literal is delimited with double quotes, the embedded single quotes need no escaping at all. That said, it would be much cleaner to use this:

df.filter($"topic" <=> sql)
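If the value really must be spliced into a SQL string containing single-quoted literals, backslash-escaping the embedded quotes is the form Spark's parser generally accepts. A sketch (the helper name is made up, and whether backslash escapes are honored can depend on the Spark version and parser in use):

```scala
object QuoteEscape {
  // Hypothetical helper: escape embedded single quotes with a backslash
  def escapeSingleQuotes(s: String): String = s.replace("'", "\\'")

  def main(args: Array[String]): Unit = {
    val topic = "'Ulmus_minor_'Toledo'"
    val expr = "topic = '" + escapeSingleQuotes(topic) + "'"
    println(expr) // topic = '\'Ulmus_minor_\'Toledo\''
    // df.filter(expr) would then be handed a parseable expression,
    // though the Column-based filter above avoids string parsing entirely
  }
}
```

Even so, the `$"topic" <=> sql` form is the safer choice, since it bypasses the SQL expression parser altogether (`<=>` is Spark's null-safe equality operator).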
