Phoenix UDF not working
Problem description
I am trying to run a custom UDF in Apache Phoenix, but I am getting an error. Please help me figure out the issue.
Following is my function class:
package co.abc.phoenix.customudfs;

import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.phoenix.expression.Expression;
import org.apache.phoenix.expression.function.ScalarFunction;
import org.apache.phoenix.parse.FunctionParseNode.Argument;
import org.apache.phoenix.parse.FunctionParseNode.BuiltInFunction;
import org.apache.phoenix.schema.tuple.Tuple;
import org.apache.phoenix.schema.types.PDataType;
import org.apache.phoenix.schema.types.PVarchar;
import org.joda.time.format.DateTimeFormatter;

import java.util.HashMap;
import java.util.Map;

import static java.lang.Long.parseLong;
import static org.joda.time.format.DateTimeFormat.forPattern;

@BuiltInFunction(name = EpochToDateFunction.NAME, args = {
        @Argument(allowedTypes = {PVarchar.class}), @Argument(allowedTypes = {PVarchar.class})})
public class EpochToDateFunction extends ScalarFunction {

    public static final String NAME = "EpochToDate";
    private static final Map<String, DateTimeFormatter> DATE_FORMATTERS = new HashMap<>();

    public String getName() {
        return NAME;
    }

    public boolean evaluate(Tuple tuple, ImmutableBytesWritable ptr) {
        Expression arg = getChildren().get(0);
        if (!arg.evaluate(tuple, ptr)) return false;
        String epochStr = new String(ptr.copyBytes());
        arg = getChildren().get(1);
        if (!arg.evaluate(tuple, ptr)) return false;
        String dfStr = new String(ptr.copyBytes());
        if (!DATE_FORMATTERS.containsKey(dfStr)) DATE_FORMATTERS.put(dfStr, forPattern(dfStr));
        String dateStr = DATE_FORMATTERS.get(dfStr).print(parseLong(epochStr));
        ptr.set(PVarchar.INSTANCE.toBytes(dateStr));
        return true;
    }

    public PDataType getDataType() {
        return PVarchar.INSTANCE;
    }
}
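The evaluate method boils down to: parse the first argument as epoch milliseconds, look up (and cache) a formatter for the second argument's pattern, and print the epoch through it. As a sanity check of that logic outside Phoenix, here is a minimal sketch using java.time from the JDK instead of Joda-Time; the class and method names are illustrative, and the zone is pinned to UTC (Joda's print(long) would use the JVM's default zone instead):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;

public class EpochToDateCheck {
    // Cache formatters per pattern, mirroring DATE_FORMATTERS in the UDF.
    private static final Map<String, DateTimeFormatter> FORMATTERS = new HashMap<>();

    static String epochToDate(String epochStr, String pattern) {
        DateTimeFormatter df = FORMATTERS.computeIfAbsent(
                pattern, p -> DateTimeFormatter.ofPattern(p).withZone(ZoneId.of("UTC")));
        // The UDF treats the first argument as epoch milliseconds.
        return df.format(Instant.ofEpochMilli(Long.parseLong(epochStr)));
    }

    public static void main(String[] args) {
        // 1489637458000 ms is 2017-03-16T04:10:58Z, so 'yyyy' prints 2017.
        System.out.println(epochToDate("1489637458000", "yyyy"));
    }
}
```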
Maven dependency
<dependency>
<groupId>org.apache.phoenix</groupId>
<artifactId>phoenix-core</artifactId>
<version>4.8.1-HBase-1.2</version>
</dependency>
hbase-site.xml
<configuration>
<property>
<name>phoenix.functions.allowUserDefinedFunctions</name>
<value>true</value>
</property>
<property>
<name>hbase.rootdir</name>
<value>hdfs://localhost:9000/hbase</value>
</property>
<property>
<name>hbase.dynamic.jars.dir</name>
<value>${hbase.rootdir}/lib</value>
</property>
<property>
<name>hbase.local.dir</name>
<value>${hbase.tmp.dir}/local/</value>
</property>
</configuration>
I added the custom jar to hbase.dynamic.jars.dir:
$ ./bin/hadoop fs -ls /hbase/lib/
Found 1 items
-rw-r--r-- 1 nj supergroup 79798208 2017-03-16 10:08 /hbase/lib/phoenix-custom-udfs-1.0-SNAPSHOT.jar
Create and execute function
0: jdbc:phoenix:localhost> CREATE FUNCTION EpochToDate(varchar, varchar) returns varchar as 'co.abc.phoenix.customudfs.EpochToDateFunction' using jar 'hdfs://localhost:9000/hbase/lib/phoenix-custom-udfs-1.0-SNAPSHOT.jar';
No rows affected (0.018 seconds)
0: jdbc:phoenix:localhost> select epochtodate('1489637458000', 'yyyy');
Error: ERROR 6001 (42F01): Function undefined. functionName=EPOCHTODATE (state=42F01,code=6001)
org.apache.phoenix.schema.FunctionNotFoundException: ERROR 6001 (42F01): Function undefined. functionName=EPOCHTODATE
at org.apache.phoenix.compile.FromCompiler$1.resolveFunction(FromCompiler.java:129)
at org.apache.phoenix.compile.ExpressionCompiler.visitLeave(ExpressionCompiler.java:313)
at org.apache.phoenix.compile.ProjectionCompiler$SelectClauseVisitor.visitLeave(ProjectionCompiler.java:688)
at org.apache.phoenix.compile.ProjectionCompiler$SelectClauseVisitor.visitLeave(ProjectionCompiler.java:584)
at org.apache.phoenix.parse.FunctionParseNode.accept(FunctionParseNode.java:86)
at org.apache.phoenix.compile.ProjectionCompiler.compile(ProjectionCompiler.java:416)
at org.apache.phoenix.compile.QueryCompiler.compileSingleFlatQuery(QueryCompiler.java:561)
at org.apache.phoenix.compile.QueryCompiler.compileSingleQuery(QueryCompiler.java:507)
at org.apache.phoenix.compile.QueryCompiler.compileSelect(QueryCompiler.java:202)
at org.apache.phoenix.compile.QueryCompiler.compile(QueryCompiler.java:157)
at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:406)
at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:380)
at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:271)
at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:266)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:265)
at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1446)
at sqlline.Commands.execute(Commands.java:822)
at sqlline.Commands.sql(Commands.java:732)
at sqlline.SqlLine.dispatch(SqlLine.java:807)
at sqlline.SqlLine.begin(SqlLine.java:681)
at sqlline.SqlLine.start(SqlLine.java:398)
at sqlline.SqlLine.main(SqlLine.java:292)
0: jdbc:phoenix:localhost>
Can someone help me and let me know where I am missing any configuration?
I had this problem in the past.
Basically, you need to select some row from a table for UDFs to work (provided that you've written the rest of your UDF properly).
So something like
select udffunc(1, 1) won't work
but
select udffunc(col1, 1) from table will
http://eyang3.github.io/2016/12/13/post/
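To make that workaround concrete, a sketch in Phoenix SQL: give the UDF a real row to scan by upserting into a one-row table and putting that table in the FROM clause. The table and column names below are illustrative, not from the original post.

```sql
-- Hypothetical one-row table just to give the UDF something to scan.
CREATE TABLE IF NOT EXISTS DUAL_T (PK INTEGER PRIMARY KEY, EPOCH_MS VARCHAR);
UPSERT INTO DUAL_T VALUES (1, '1489637458000');

-- With a FROM clause, the UDF resolves and runs:
SELECT EPOCHTODATE(EPOCH_MS, 'yyyy-MM-dd') FROM DUAL_T;
```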