Insert data into Hbase using Hive (JSON file)
Question
I have already created a table in hbase using hive:
hive> CREATE TABLE hbase_table_emp(id int, name string, role string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:name,cf1:role")
TBLPROPERTIES ("hbase.table.name" = "emp");
and created another table to load data into it:
hive> create table testemp(id int, name string, role string) row format delimited fields terminated by '\t';
hive> load data local inpath '/home/user/sample.txt' into table testemp;
and finally inserted the data into the hbase table:
hive> insert overwrite table hbase_table_emp select * from testemp;
hive> select * from hbase_table_emp;
OK
123 Ram TeamLead
456 Silva Member
789 Krishna Member
time taken: 0.160 seconds, Fetched: 3 row(s)
the table looks like this in hbase:
hbase(main):002:0> scan 'emp'
ROW COLUMN+CELL
123 column=cf1:name, timestamp=1422540225254, value=Ram
123 column=cf1:role, timestamp=1422540225254, value=TeamLead
456 column=cf1:name, timestamp=1422540225254, value=Silva
456 column=cf1:role, timestamp=1422540225254, value=Member
789 column=cf1:name, timestamp=1422540225254, value=Krishna
789 column=cf1:role, timestamp=1422540225254, value=Member
3 row(s) in 2.1230 seconds
Can I do the same for a JSON file:
{"id": 123, "name": "Ram", "role":"TeamLead"}
{"id": 456, "name": "Silva", "role":"Member"}
{"id": 789, "name": "Krishna", "role":"Member"}
and do:
hive> load data local inpath '/home/user/sample.json' into table testemp;
Please help! :)
You can use the get_json_object function to parse the data as a JSON object. For instance, if you create a staging table with your JSON data:
DROP TABLE IF EXISTS staging;
CREATE TABLE staging (json STRING);
LOAD DATA LOCAL INPATH '/local/path/to/jsonfile' INTO TABLE staging;
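After the load, each row of staging should hold one complete JSON record as a single raw string (this assumes the file has one JSON object per line, as in the sample above). A quick sanity-check query sketch:

```sql
-- Each row should come back as one raw JSON record,
-- e.g. {"id": 123, "name": "Ram", "role":"TeamLead"}
SELECT json FROM staging LIMIT 3;
```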
Then use get_json_object to extract the attributes you want to load into the table:
INSERT OVERWRITE TABLE hbase_table_emp SELECT
get_json_object(json, "$.id") AS id,
get_json_object(json, "$.name") AS name,
get_json_object(json, "$.role") AS role
FROM staging;
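get_json_object takes a JSON string and a JSONPath-style expression (here "$.field") and returns the matching value as a string, or NULL when the path is missing or the JSON is malformed. As a rough illustration outside Hive, the same extraction can be sketched in Python — this is only an analogy for simple top-level paths, not Hive's actual implementation:

```python
import json

def get_json_object(json_str, path):
    """Rough Python analogue of Hive's get_json_object for simple
    '$.field' paths: returns the value as a string, or None
    (Hive's NULL) if the JSON is invalid or the field is absent."""
    if not path.startswith("$."):
        return None
    try:
        obj = json.loads(json_str)
    except ValueError:
        return None
    value = obj.get(path[2:])
    return None if value is None else str(value)

# The three sample records from the question:
rows = [
    '{"id": 123, "name": "Ram", "role":"TeamLead"}',
    '{"id": 456, "name": "Silva", "role":"Member"}',
    '{"id": 789, "name": "Krishna", "role":"Member"}',
]
for row in rows:
    print(get_json_object(row, "$.id"),
          get_json_object(row, "$.name"),
          get_json_object(row, "$.role"))
# → 123 Ram TeamLead
# → 456 Silva Member
# → 789 Krishna Member
```

Note that get_json_object always returns a string; in the INSERT above Hive implicitly casts the "$.id" result to the int column of hbase_table_emp.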
There is a more comprehensive discussion of this function here.