Big Data - When importing from local MySQL into Hive with Sqoop, why must Sqoop be in HDFS?
迷茫 2017-04-17 15:27:51

Here is how the problem came about: I was using Sqoop to import a MySQL table into Hive:

sqoop import --connect jdbc:mysql://127.0.0.1:3306/employees_db --table titles --username root -P --hive-import -- --default-character-set=utf-8

Then it threw this error:

16/08/10 22:08:36 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://192.168.1.128:9000/usr/local/sqoop/lib/mysql-connector-java-5.1.39-bin.jar

So I Googled it, and the answers were uniformly to create a /usr/local/sqoop directory in HDFS and then put the local sqoop directory up there, i.e.:

[root@Master local]# hadoop dfs -put /usr/local/sqoop /usr/local
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

16/08/10 22:23:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[root@Master local]# hadoop dfs -ls /usr/local
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

16/08/10 22:25:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxr-xr-x   - root supergroup          0 2016-08-10 22:25 /usr/local/sqoop
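(Side note: as the output itself warns, the hadoop dfs form is deprecated. The equivalent upload with the hdfs command, assuming the same paths as above, would be:

hdfs dfs -mkdir -p /usr/local
hdfs dfs -put /usr/local/sqoop /usr/local)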

Sure enough, this did solve the problem.
But why store Sqoop in HDFS? I have seen that some people can run this successfully without moving Sqoop into HDFS at all. Is something wrong with my configuration, or is this the only way to do it? HDFS is distributed storage and should hold data files, so how does a program end up stored in it? This really puzzles me, and I hope someone can explain.


Replies (1)
大家讲道理

Regarding this problem, I seem to have found an explanation. The error says that hdfs://192.168.1.128:9000/usr/local/sqoop/lib/mysql-connector-java-5.1.39-bin.jar does not exist. So my understanding is that it is not that Sqoop itself runs on HDFS, but that Sqoop's jars need to be available on HDFS, so we only need to upload the jars to HDFS.
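If that reading is right, then uploading just the missing driver jar to the exact path in the error message should also work, instead of putting the whole sqoop directory up. A minimal sketch, assuming the same local and HDFS paths as in the error above:

hdfs dfs -mkdir -p /usr/local/sqoop/lib
hdfs dfs -put /usr/local/sqoop/lib/mysql-connector-java-5.1.39-bin.jar /usr/local/sqoop/lib/

The likely reason the path ends up pointing at HDFS at all is that Sqoop runs the import as a MapReduce job, and the jar path gets resolved against the cluster's default filesystem (fs.defaultFS, here hdfs://192.168.1.128:9000) when the job's classpath is set up, so the job looks for the JDBC driver jar on HDFS rather than on the local disk.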
