
Comparing Hive and Oracle Table Join Statements



When migrating Oracle stored procedures to the Hive platform, you will inevitably run into differences in table join syntax.

This article compares the table join syntax of Oracle and Hive in detail, covering inner joins, left and right outer joins, full outer joins, and the Cartesian product.

1. Creating the tables

ORACLE:

create table a
(
a1  number(10),
a2 varchar2(50)
);

create table b
(
b1  number(10),
b2 varchar2(50)
);

HIVE:

CREATE TABLE IF NOT EXISTS a (
a1 STRING,
a2 STRING)
COMMENT 'TABLE A'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
TBLPROPERTIES ( 'created_at'='2014-04-28','creator'='HENRY' );
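The article only shows the Hive DDL for table a. A corresponding statement for table b, assuming the same '|' delimiter, text storage, and two STRING columns to match the Oracle DDL above, would look like this sketch:

-- same layout as table a; column names b1/b2 mirror the Oracle definition
CREATE TABLE IF NOT EXISTS b (
b1 STRING,
b2 STRING)
COMMENT 'TABLE B'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
TBLPROPERTIES ( 'created_at'='2014-04-28','creator'='HENRY' );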

2. Inserting data

ORACLE:

insert into a(a1,a2) values(1,'X');
insert into a(a1,a2) values(2,'Y');
insert into a(a1,a2) values(3,'Z');

insert into b(b1,b2) values(1,'X');
insert into b(b1,b2) values(2,'Y');
insert into b(b1,b2) values(4,'Z');

HIVE:

hive (default)> load data local inpath './data1' into table a;
Copying data from file:/home/hadoop/roger/sql/renguihe/data
Copying file: file:/home/hadoop/roger/sql/renguihe/data
Loading data to table default.a
Table default.a stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 12, raw_data_size: 0]
OK
Time taken: 1.961 seconds
hive (default)> load data local inpath './data2' into table b;
Copying data from file:/home/hadoop/roger/sql/renguihe/data
Copying file: file:/home/hadoop/roger/sql/renguihe/data
Loading data to table default.b
Table default.b stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 12, raw_data_size: 0]
OK
Time taken: 0.392 seconds

The contents of the data1 file are:

1|X
2|Y
3|Z

The contents of the data2 file are:

1|X
2|Y
4|Z
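With both files loaded, a quick select against each table is an easy sanity check (a minimal sketch; the output is omitted here):

hive (default)> select * from a;
hive (default)> select * from b;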

3. Equi-join (inner join)

ORACLE:

select * from a,b where a.a1 = b.b1;

Or:

select * from a join b on a.a1 = b.b1;

Both queries return the two matching rows: (1, X, 1, X) and (2, Y, 2, Y).

HIVE:

select * from a join b on a.a1 = b.b1;

Note that in Hive you cannot express the join condition with a WHERE clause; the condition must go in the ON clause of an explicit JOIN.
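For clarity, here are the two forms side by side (a sketch; the comma-separated FROM plus WHERE is the Oracle-style form rejected here, at least in the Hive version used in this article):

-- Oracle-style implicit join: not accepted by this version of Hive
select * from a, b where a.a1 = b.b1;
-- Hive form: explicit JOIN with the condition in ON
select * from a join b on a.a1 = b.b1;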

The execution log and result are shown below:

hive (default)> select * from a join b on a.a1 = b.b1;       
Total MapReduce jobs = 1
setting HADOOP_USER_NAME        hadoop
Execution log at: /tmp/hadoop/.log
2014-04-29 09:13:27    Starting to launch local task to process map join;      maximum memory = 1908932608
2014-04-29 09:13:27    Processing rows:        3      Hashtable size: 3      Memory usage:  110981704      rate:  0.058
2014-04-29 09:13:27    Dump the hashtable into file: file:/tmp/hadoop/hive_2014-04-29_09-13-25_273_8486588204512196396/-local-10002/HashTable-Stage-3/MapJoin-mapfile00--.hashtable
2014-04-29 09:13:27    Upload 1 File to: file:/tmp/hadoop/hive_2014-04-29_09-13-25_273_8486588204512196396/-local-10002/HashTable-Stage-3/MapJoin-mapfile00--.hashtable File size: 438
2014-04-29 09:13:27    End of local task; Time Taken: 0.339 sec.
Execution completed successfully
Mapred Local Task Succeeded . Convert the Join into MapJoin
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201404251509_0131, Tracking URL = IP:50030/jobdetails.jsp?jobid=job_201404251509_0131
Kill Command = /home/hadoop/package/hadoop-1.0.4/libexec/../bin/hadoop job  -kill job_201404251509_0131
Hadoop job information for Stage-3: number of mappers: 1; number of reducers: 0
2014-04-29 09:13:39,979 Stage-3 map = 0%,  reduce = 0%
2014-04-29 09:13:46,025 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:47,034 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:48,044 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:49,052 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:50,061 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:51,069 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.59 sec
2014-04-29 09:13:52,077 Stage-3 map = 100%,  reduce = 100%, Cumulative CPU 1.59 sec
MapReduce Total cumulative CPU time: 1 seconds 590 msec
Ended Job = job_201404251509_0131
MapReduce Jobs Launched:
Job 0: Map: 1  Cumulative CPU: 1.59 sec  HDFS Read: 211 HDFS Write: 16 SUCCESS
Total MapReduce CPU Time Spent: 1 seconds 590 msec
OK
a1      a2      b1      b2
1      X      1      X
2      Y      2      Y
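The log above shows that Hive automatically converted this query into a map-side join ("Mapred Local Task Succeeded . Convert the Join into MapJoin"), building a hashtable from the small table locally. To compare against an ordinary reduce-side join, the automatic conversion can be switched off for the session; hive.auto.convert.join is the standard property name, assumed to be available in this Hive build:

hive (default)> set hive.auto.convert.join=false;
hive (default)> select * from a join b on a.a1 = b.b1;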

For the remaining join types, please continue to the next page.

