
Looking back at an old project: Hive and HBase integration

Posted: 2019-02-20 23:08:47

1. Accessing values in Hive's complex data types

Structs: fields inside a struct are accessed with dot notation (.). For example, if a column c has type STRUCT{a INT; b INT}, field a is accessed as c.a.
Maps (key-value pairs): a value is looked up by key with ["key"]. For example, if a map M holds a group->gid pair, the gid value is retrieved as M['group'].
Arrays: all elements of an array share the same type and are accessed by zero-based index. For example, if array A holds ['a','b','c'], then A[1] is 'b'.

Using Struct
Create the table:

hive> create table student_test(id INT, info struct<name:STRING, age:INT>)
> ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
> COLLECTION ITEMS TERMINATED BY ':';
OK
Time taken: 0.446 seconds
'FIELDS TERMINATED BY': the delimiter between fields
'COLLECTION ITEMS TERMINATED BY': the delimiter between the items within a single field
Load the data:
$ cat test5.txt
1,zhou:30
2,yan:30
3,chen:20
4,li:80
hive> LOAD DATA LOCAL INPATH '/home/work/data/test5.txt' INTO TABLE student_test;
Copying data from file:/home/work/data/test5.txt
Copying file: file:/home/work/data/test5.txt
Loading data to table default.student_test
OK
Time taken: 0.35 seconds
Query:
hive> select info.age from student_test;
Total MapReduce jobs = 1
......
Total MapReduce CPU Time Spent: 490 msec
OK
30
30
20
80
Time taken: 21.677 seconds
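The way the two delimiters cooperate can be sketched in plain Python (this mimics the splitting logic for illustration only; it is not Hive's actual parser, and the column names come from the table above):

```python
def parse_struct_row(line, field_delim=",", item_delim=":"):
    """Split a line like '1,zhou:30' the way the DDL above describes:
    FIELDS TERMINATED BY ',' separates columns, COLLECTION ITEMS
    TERMINATED BY ':' separates the struct's members."""
    id_part, info_part = line.split(field_delim)
    name, age = info_part.split(item_delim)
    return {"id": int(id_part), "info": {"name": name, "age": int(age)}}

row = parse_struct_row("1,zhou:30")
print(row["info"]["age"])  # analogous to: select info.age ... -> 30
```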

Using Array
Create the table:
hive> create table class_test(name string, student_id_list array<INT>)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ','
> COLLECTION ITEMS TERMINATED BY ':';
OK
Time taken: 0.099 seconds
Load the data:
$ cat test6.txt
034,1:2:3:4
035,5:6
036,7:8:9:10
hive> LOAD DATA LOCAL INPATH '/home/work/data/test6.txt' INTO TABLE class_test;
Copying data from file:/home/work/data/test6.txt
Copying file: file:/home/work/data/test6.txt
Loading data to table default.class_test
OK
Time taken: 0.198 seconds
Query:
hive> select student_id_list[3] from class_test;
Total MapReduce jobs = 1
......
Total MapReduce CPU Time Spent: 480 msec
OK
4
NULL
10
Time taken: 21.574 seconds
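The NULL in the result above comes from row 035, whose array has only two elements. Hive returns NULL for an out-of-range index rather than failing, which can be sketched like this (a Python analogy, not Hive internals):

```python
def array_index(arr, i):
    """Hive-style array indexing: an out-of-range index yields NULL
    (None here) instead of raising an error."""
    return arr[i] if 0 <= i < len(arr) else None

# The three rows loaded from test6.txt above
rows = {"034": [1, 2, 3, 4], "035": [5, 6], "036": [7, 8, 9, 10]}
print([array_index(v, 3) for v in rows.values()])  # [4, None, 10]
```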

Using Map
Create the table:
hive> create table employee(id string, perf map<string, int>)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY '\t'
> COLLECTION ITEMS TERMINATED BY ','
> MAP KEYS TERMINATED BY ':';
OK
Time taken: 0.144 seconds
'MAP KEYS TERMINATED BY': the delimiter between each key and its value

Load the data:
$ cat test7.txt
1 job:80,team:60,person:70
2 job:60,team:80
3 job:90,team:70,person:100
hive> LOAD DATA LOCAL INPATH '/home/work/data/test7.txt' INTO TABLE employee;
Query:
hive> select perf['person'] from employee;
Total MapReduce jobs = 1
......
Total MapReduce CPU Time Spent: 460 msec
OK
70
NULL
100
Time taken: 20.902 seconds
hive> select perf['person'] from employee where perf['person'] is not null;
Total MapReduce jobs = 1
.......
Total MapReduce CPU Time Spent: 610 msec
OK
70
100
Time taken: 21.989 seconds
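Here all three delimiters are at work: '\t' splits the fields, ',' splits the map entries, and ':' splits each key from its value. A minimal Python sketch of that splitting (illustration only, not Hive's implementation):

```python
def parse_map_row(line, field_delim="\t", item_delim=",", kv_delim=":"):
    """Split a line like '2\tjob:60,team:80' into (id, perf_map),
    mirroring the three delimiters declared in the DDL above."""
    emp_id, perf_part = line.split(field_delim)
    perf = {}
    for item in perf_part.split(item_delim):
        key, value = item.split(kv_delim)
        perf[key] = int(value)
    return emp_id, perf

_, perf = parse_map_row("2\tjob:60,team:80")
print(perf.get("person"))  # missing key -> None, just like Hive's NULL
```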

 

2. HBase and Hive integration

HBase data layout (16 fields), table event_log:

\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:ac, timestamp=1550595638000, value=2
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:api_v, timestamp=1550595638000, value=1.0
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:app_id, timestamp=1550595638000, value=3
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:c_time, timestamp=1550595638000, value=20190219000108499
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:ch_id, timestamp=1550595638000, value=lenovo
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:city, timestamp=1550595638000, value=unknown
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:country, timestamp=1550595638000, value=unknown
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:en, timestamp=1550595638000, value=e_se
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:ip, timestamp=1550595638000, value=180.139.110.101
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:kw, timestamp=1550595638000, value=\xE6\x98\xA5\xE8\xBF\x90
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:net_t, timestamp=1550595638000, value=WIFI
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:pl, timestamp=1550595638000, value=1
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:province, timestamp=1550595638000, value=unknown
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:s_time, timestamp=1550595638000, value=1550505666013
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:user_id, timestamp=1550595638000, value=0
\x00\x01\x01\xDD\x00\x00\x00\x00\xBB\xF7@| column=info:uuid, timestamp=1550595638000, value=1d6705ad-c62a-30ec-a832-263eea022683

1. Map every column explicitly. Drawback: because some HBase rows carry more columns than others, every mapped column has to be written out, and since different events store different fields, the mapping can easily end up misaligned.

CREATE EXTERNAL TABLE hive_hbase_table(
rowkey string,
ac string,
api_v string,
app_id string,
c_time string,
ch_id string,
city string,
province string,
country string,
en string,
ip string,
kw string,
pl string,
s_time string,
user_id string,
uuid string,
ver string
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:ac,info:api_v,info:app_id,info:c_time,info:ch_id,info:city,info:province,info:country,info:en,info:ip,info:kw,info:pl,info:s_time,info:user_id,info:uuid,info:ver")
TBLPROPERTIES ("hbase.table.name" = "event_logs_20190219");

2. The second mapping: map the entire column family; the corresponding Hive type is map<string,string>.

 

CREATE EXTERNAL TABLE hive_hbase_table2 (
rowkey string,
info map<STRING,STRING>
) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:")
TBLPROPERTIES ("hbase.table.name" = "event_logs_20190219");

hive> select info["en"],info["uuid"] from hive_hbase_table2 limit 1;   -- example query
e_st 0b37d003-78c5-3088-8253-2966410a97d7
Time taken: 17.401 seconds, Fetched: 1 row(s)
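The trade-off between the two mappings can be sketched in plain Python (the rows and values below are hypothetical placeholders, not data from the table above, and this only models the semantics, not the storage handler itself):

```python
# Hypothetical cells for two rows of the same HBase table: different
# events store different qualifiers under the info: column family.
rows = [
    {"en": "e_se", "kw": "spring", "uuid": "aaa"},  # a search event carries kw
    {"en": "e_l", "uuid": "bbb"},                   # a launch event does not
]

# Per-column mapping must declare every qualifier up front; a qualifier
# that a row lacks comes back as NULL (None here).
declared = ["en", "kw", "uuid"]
per_column = [[r.get(q) for q in declared] for r in rows]

# Column-family-as-map mapping just hands each row over as a dict, so a
# variable per-event schema needs no DDL change.
print(per_column[1])  # ['e_l', None, 'bbb']
```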

 

Ways to integrate Hive with HBase

[Scheme 1] Create a Hive external table mapped to all column families of the HBase table china_mainland (including every column under each family)

Note: the key step here is specifying, in WITH SERDEPROPERTIES at table-creation time, which column family or column of the HBase table to associate with!


hive> CREATE EXTERNAL TABLE china_mainland(
> rowkey string,
> act map<STRING,FLOAT>,
> basic map<STRING,FLOAT>,
> docs map<STRING,FLOAT>,
> pref map<STRING,FLOAT>,
> rc map<STRING,FLOAT>
> ) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,act:,basic:,docs:,pref:,rc:")
> TBLPROPERTIES ("hbase.table.name" = "users:china_mainland")
> ;


[Scheme 2] Map a single column under a single column family

The two fields rowkey and act_url of Hive table china_mainland_acturl map, respectively, to the row key of HBase table users:china_mainland and to the url column under the act column family.

hive> CREATE EXTERNAL TABLE china_mainland_acturl(
> rowkey string,
> act_url STRING
> ) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,act:url")
> TBLPROPERTIES ("hbase.table.name" = "users:china_mainland")
> ;


[Scheme 3] Map multiple columns under a single column family

The three fields pp_profession, pp_salary, and pp_gender of Hive table china_mainland_kylin_test map to the columns pp_profession, pp_salary, and pp_gender under the act column family of HBase table users:china_mainland.

hive> CREATE EXTERNAL TABLE china_mainland_kylin_test(
> rowkey string,
> pp_profession string,
> pp_salary double,
> pp_gender int)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ("hbase.columns.mapping" =":key,act:pp_profession,act:pp_salary,act:pp_gender")
> TBLPROPERTIES ("hbase.table.name" = "users:china_mainland");


[Scheme 4]

Map all of the columns under a single column family of the HBase table
hive> CREATE EXTERNAL TABLE china_mainland_pref(
> rowkey STRING,
> pref map<STRING, STRING>
> )
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,pref:")
> TBLPROPERTIES ("hbase.table.name" = "users:china_mainland")
> ;


Source: https://www.cnblogs.com/hejunhong/p/10409739.html
