Alex's Hadoop Tutorial for Beginners, Lesson 8: Sqoop1 Installation, Import, and Export
Damn! Sqoop2's documentation is far too sparse, and it doesn't even support HBase; it's really bare-bones. So I'm angrily abandoning Sqoop2 in favor of Sqoop1. Friends who followed my earlier tutorials, please don't throw bricks at me; I was an unsuspecting bystander too.
Uninstall Sqoop2
This step is optional. If you followed my earlier tutorial and already installed Sqoop2, you need to uninstall it first; if you never installed it, skip this step.
$ sudo su -
$ service sqoop2-server stop
$ yum -y remove sqoop2-server
$ yum -y remove sqoop2-client
Install Sqoop1
yum install -y sqoop
Run help to check whether it installed correctly:
# sqoop help
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/11/28 11:33:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

See 'sqoop help COMMAND' for information on a specific command.
Copy the driver into /usr/lib/sqoop/lib
MySQL JDBC driver download link

After downloading, extract the archive, find the driver jar inside, upload it to the server, and move it into place:
mv /home/alex/mysql-connector-java-5.1.34-bin.jar /usr/lib/sqoop/lib
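If you'd rather do the whole thing from the shell, here's a minimal sketch; the file name and version below are just what I happened to download, so substitute your own:

# assumes mysql-connector-java-5.1.34.tar.gz is already in the current directory
tar -xzf mysql-connector-java-5.1.34.tar.gz
sudo cp mysql-connector-java-5.1.34/mysql-connector-java-5.1.34-bin.jar /usr/lib/sqoop/lib/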
Import

Data preparation
Create a table in MySQL:

CREATE TABLE `employee` (
  `id` int(11) NOT NULL,
  `name` varchar(20) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
Insert a few rows:

insert into employee (id,name) values (1,'michael');
insert into employee (id,name) values (2,'ted');
insert into employee (id,name) values (3,'jack');
Import from MySQL to HDFS
Listing databases and tables

Let's not rush into the import; first we'll warm up with a few preparatory steps, which also make it easier to troubleshoot later.

List all the databases:
# sqoop list-databases --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 09:20:28 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 09:20:28 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 09:20:28 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
cacti
metastore
mysql
sqoop_test
wordpress
zabbix
Now use Sqoop to connect to the database and list all its tables:
# sqoop list-tables --connect jdbc:mysql://localhost/sqoop_test --username root --password root
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/11/28 11:46:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/11/28 11:46:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/11/28 11:46:11 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
employee
student
workers
This command works without specifying a driver class because Sqoop supports MySQL out of the box. If you do want to pass the JDBC driver class explicitly, use:
# sqoop list-tables --connect jdbc:mysql://localhost/sqoop_test --username root --password root --driver com.mysql.jdbc.Driver
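Incidentally, the WARN line in the output above is right: putting the password on the command line is insecure. The -P flag makes Sqoop prompt for it interactively instead, for example:

# same command, but Sqoop asks for the password instead of reading it from argv
sqoop list-tables --connect jdbc:mysql://localhost/sqoop_test --username root -P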
Import data into HDFS
sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --target-dir /user/test
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --target-dir /user/test
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 14:15:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 14:15:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 14:15:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 14:15:41 INFO tool.CodeGenTool: Beginning code generation
14/12/01 14:15:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 14:15:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 14:15:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/7b8091924ce8deb4f2ccae14c404a5bf/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/12/01 14:15:45 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/7b8091924ce8deb4f2ccae14c404a5bf/employee.jar
14/12/01 14:15:45 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/12/01 14:15:45 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/12/01 14:15:45 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/12/01 14:15:45 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/12/01 14:15:45 INFO mapreduce.ImportJobBase: Beginning import of employee
14/12/01 14:15:46 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/12/01 14:15:47 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/12/01 14:15:47 INFO client.RMProxy: Connecting to ResourceManager at xmseapp01/10.172.78.111:8032
14/12/01 14:15:50 INFO db.DBInputFormat: Using read commited transaction isolation
14/12/01 14:15:51 INFO mapreduce.JobSubmitter: number of splits:1
14/12/01 14:15:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406097234796_0019
14/12/01 14:15:52 INFO impl.YarnClientImpl: Submitted application application_1406097234796_0019
14/12/01 14:15:52 INFO mapreduce.Job: The url to track the job: http://xmseapp01:8088/proxy/application_1406097234796_0019/
14/12/01 14:15:52 INFO mapreduce.Job: Running job: job_1406097234796_0019
14/12/01 14:16:08 INFO mapreduce.Job: Job job_1406097234796_0019 running in uber mode : false
14/12/01 14:16:08 INFO mapreduce.Job: map 0% reduce 0%
14/12/01 14:16:19 INFO mapreduce.Job: map 100% reduce 0%
14/12/01 14:16:20 INFO mapreduce.Job: Job job_1406097234796_0019 completed successfully
14/12/01 14:16:21 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=99855
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=16
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=8714
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=8714
        Total vcore-seconds taken by all map tasks=8714
        Total megabyte-seconds taken by all map tasks=8923136
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=58
        CPU time spent (ms)=1560
        Physical memory (bytes) snapshot=183005184
        Virtual memory (bytes) snapshot=704577536
        Total committed heap usage (bytes)=148897792
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=16
14/12/01 14:16:21 INFO mapreduce.ImportJobBase: Transferred 16 bytes in 33.6243 seconds (0.4758 bytes/sec)
14/12/01 14:16:21 INFO mapreduce.ImportJobBase: Retrieved 2 records.
Take a look at the result:
# hdfs dfs -ls /user/test
Found 2 items
-rw-r--r--   2 root supergroup          0 2014-12-01 14:16 /user/test/_SUCCESS
-rw-r--r--   2 root supergroup         16 2014-12-01 14:16 /user/test/part-m-00000
# hdfs dfs -cat /user/test/part-m-00000
1,michael
2,ted
I have no idea why MySQL had 3 rows but only 2 came through after the import; anyone who knows, please explain. (Note that the counters above say Map input records=2, so the query itself only returned 2 rows at import time; maybe the third insert ran after the import did.)
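If you want to chase down a mismatch like this yourself, here is a rough checklist I'd try (just a sketch; adjust names and paths to your setup):

# 1. confirm how many rows MySQL actually holds right now
mysql -u root -p -e "select count(*) from sqoop_test.employee"
# 2. remove the old target dir; sqoop import refuses to write into an existing one
hdfs dfs -rm -r /user/test
# 3. re-run the import and compare "Retrieved N records" with the count from step 1
sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --target-dir /user/test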
Problems I ran into
If you run into the following problem:

14/12/01 10:12:42 INFO mapreduce.Job: Task Id : attempt_1406097234796_0017_m_000000_0, Status : FAILED
Error: employee : Unsupported major.minor version 51.0
Cause: Sqoop is compiled against JDK 1.7, so if ps aux | grep hadoop shows Hadoop running on JDK 1.6, Sqoop won't work. Note: CDH 4.7 and later is compatible with JDK 1.7, but if you upgraded from 4.5 you may find Hadoop still running on JDK 1.6. You need to switch the whole Hadoop stack over to JDK 1.7, which is also the officially recommended pairing.
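A quick way to see which JDK things are actually running on (a sketch; paths will differ per machine):

# the default java on the PATH
java -version
# the exact JVM binaries the running hadoop daemons were started with
ps aux | grep java | grep hadoop | awk '{print $11}' | sort -u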
How to change the JDK
The official docs give two methods:
http://www.cloudera.com/content/cloudera/en/documentation/cdh4/latest/CDH4-Requirements-and-Supported-Versions/cdhrsv_topic_3.html
This one tells you to create a symlink named default under /usr/java/ pointing at the JDK you want. I did that; it had no effect.
http://www.cloudera.com/content/cloudera/en/documentation/archives/cloudera-manager-4/v4-5-3/Cloudera-Manager-Enterprise-Edition-Installation-Guide/cmeeig_topic_16_2.html
This one tells you to add an environment variable. I did that too; also no effect.
In the end I went with the crude but effective route: stop all related services, delete that damned JDK 1.6, then restart everything. After that it finally picked up /usr/java/default.
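For reference, the two official methods come down to something like the following. The JDK path and version here are assumptions, and /etc/default/bigtop-utils is where CDH packages usually read JAVA_HOME from, so double-check on your own install:

# method 1: point /usr/java/default at the JDK you want (version is an example)
sudo ln -sfn /usr/java/jdk1.7.0_67 /usr/java/default
# method 2: set JAVA_HOME where the CDH init scripts can pick it up
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_67' | sudo tee -a /etc/default/bigtop-utils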
Commands to stop all Hadoop-related services:
for x in `cd /etc/init.d ; ls hive-*` ; do sudo service $x stop ; done
for x in `cd /etc/init.d ; ls hbase-*` ; do sudo service $x stop ; done
/etc/init.d/zookeeper-server stop
for x in `cd /etc/init.d ; ls hadoop-*` ; do sudo service $x stop ; done
Skip zookeeper, hbase, and hive if you never installed them. I'd suggest running ps aux | grep jre1.6 to find out which services are still on the old JDK, then shutting them down one by one: everything else first, Hadoop last.
Start everything back up
for x in `cd /etc/init.d ; ls hadoop-*` ; do sudo service $x start ; done
/etc/init.d/zookeeper-server start
for x in `cd /etc/init.d ; ls hbase-*` ; do sudo service $x start ; done
for x in `cd /etc/init.d ; ls hive-*` ; do sudo service $x start ; done
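Once everything is back up, it doesn't hurt to double-check that nothing is still running on the old JDK:

# should print nothing if the 1.6 JVM is really gone
ps aux | grep jre1.6 | grep -v grep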
Export from HDFS to MySQL
Data preparation continues from the example above.

Empty the employee table:

truncate employee;
Export the data to MySQL
# sqoop export --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --m 1 --export-dir /user/test
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 15:16:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 15:16:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 15:16:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 15:16:51 INFO tool.CodeGenTool: Beginning code generation
14/12/01 15:16:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 15:16:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 15:16:52 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/f4a75fdefe1eb604181d47d6bc827e48/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/12/01 15:16:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f4a75fdefe1eb604181d47d6bc827e48/employee.jar
14/12/01 15:16:55 INFO mapreduce.ExportJobBase: Beginning export of employee
14/12/01 15:16:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
14/12/01 15:16:57 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/12/01 15:16:57 INFO client.RMProxy: Connecting to ResourceManager at xmseapp01/10.172.78.111:8032
14/12/01 15:17:00 INFO input.FileInputFormat: Total input paths to process : 1
14/12/01 15:17:00 INFO input.FileInputFormat: Total input paths to process : 1
14/12/01 15:17:00 INFO mapreduce.JobSubmitter: number of splits:1
14/12/01 15:17:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406097234796_0021
14/12/01 15:17:01 INFO impl.YarnClientImpl: Submitted application application_1406097234796_0021
14/12/01 15:17:01 INFO mapreduce.Job: The url to track the job: http://xmseapp01:8088/proxy/application_1406097234796_0021/
14/12/01 15:17:01 INFO mapreduce.Job: Running job: job_1406097234796_0021
14/12/01 15:17:13 INFO mapreduce.Job: Job job_1406097234796_0021 running in uber mode : false
14/12/01 15:17:13 INFO mapreduce.Job: map 0% reduce 0%
14/12/01 15:17:21 INFO mapreduce.Job: Task Id : attempt_1406097234796_0021_m_000000_0, Status : FAILED
Error: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.Util.getInstance(Util.java:360)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    ... 8 more
14/12/01 15:17:29 INFO mapreduce.Job: Task Id : attempt_1406097234796_0021_m_000000_1, Status : FAILED
Error: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'sqoop_test'
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.Util.getInstance(Util.java:360)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    ... 8 more
14/12/01 15:17:40 INFO mapreduce.Job: map 100% reduce 0%
14/12/01 15:17:41 INFO mapreduce.Job: Job job_1406097234796_0021 completed successfully
14/12/01 15:17:41 INFO mapreduce.Job: Counters: 32
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=99542
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=139
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
    Job Counters
        Failed map tasks=2
        Launched map tasks=3
        Other local map tasks=2
        Rack-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=21200
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=21200
        Total vcore-seconds taken by all map tasks=21200
        Total megabyte-seconds taken by all map tasks=21708800
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=120
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=86
        CPU time spent (ms)=1330
        Physical memory (bytes) snapshot=177094656
        Virtual memory (bytes) snapshot=686768128
        Total committed heap usage (bytes)=148897792
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=0
14/12/01 15:17:41 INFO mapreduce.ExportJobBase: Transferred 139 bytes in 43.6687 seconds (3.1831 bytes/sec)
14/12/01 15:17:41 INFO mapreduce.ExportJobBase: Exported 2 records.
I don't know why that pile of exceptions shows up either?! In any case, checking MySQL afterwards, the 2 records were exported successfully. (My guess: the --connect URL says localhost, so each map task tries to talk to a MySQL on whatever node it lands on; attempts on nodes without the sqoop_test database fail with Unknown database, and the job only succeeds once YARN retries the task on the node that actually runs MySQL. The Rack-local map tasks=1 counter above fits that story.)
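If that guess is right, pointing --connect at the MySQL host's real address instead of localhost should make those retries disappear. A sketch, with <mysql-host> standing in for wherever your MySQL actually runs:

sqoop export --connect jdbc:mysql://<mysql-host>:3306/sqoop_test --username root --password root --table employee --m 1 --export-dir /user/test

And here is the check in MySQL showing the two rows made it: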
mysql> select * from employee;
+----+---------+
| id | name    |
+----+---------+
|  1 | michael |
|  2 | ted     |
+----+---------+
2 rows in set (0.00 sec)
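One last note: if you re-run this export against a table that already contains these rows, the inserts will fail with duplicate-key errors on the primary key. Sqoop's --update-key option turns the export into updates keyed on a column instead; a sketch I haven't run in this walkthrough:

sqoop export --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --update-key id --m 1 --export-dir /user/test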
OK, class dismissed!
