How to implement HDFS file upload and download in Java
1. pom.xml configuration
<!-- build properties -->
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <hadoop.version>3.1.3</hadoop.version>
</properties>
<!-- dependencies -->
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
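Once the dependencies above have resolved, a quick way to verify that the Hadoop jars are actually on the classpath is to print the library version. This is an optional sanity check (the class name `HadoopVersionCheck` is just an illustrative choice); it requires the hadoop-common jar but no running cluster.

```java
import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Prints the version of the hadoop-common jar on the classpath, e.g. 3.1.3
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
    }
}
```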
2. Create and delete
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.IOException;

public class CreateDelete {
    public static void main(String[] args) {
        // Initialize the Hadoop file system Configuration object
        Configuration conf = new Configuration();
        // Point the configuration at the HDFS NameNode
        conf.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        // Handle to the Hadoop file system
        FileSystem fs = null;
        try {
            // Obtain the file system handle from the configuration
            fs = FileSystem.get(conf);
            // Target path on HDFS
            final String PATH = "/test/kb16/hadoop/ratings.csv";
            Path path = new Path(PATH);
            if (fs.exists(path)) {
                // If the path already exists, delete it (recursively)
                System.out.println("DELETE " + fs.delete(path, true));
            } else {
                // Otherwise create it, closing the returned output stream
                FSDataOutputStream out = fs.create(path);
                out.close();
                System.out.println("CREATE " + path);
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // Release the handle if it has not been closed yet
            if (fs != null) {
                try {
                    fs.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
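Because `FileSystem` implements `java.io.Closeable`, the manual `finally` cleanup above can be replaced with try-with-resources. The following is a minimal sketch of the same create/delete logic, assuming the same NameNode address and path as in the article; it needs the Hadoop jars and a reachable cluster to run.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.IOException;

public class CreateDeleteTwr {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        Path path = new Path("/test/kb16/hadoop/ratings.csv");
        // try-with-resources closes the FileSystem handle automatically
        try (FileSystem fs = FileSystem.get(conf)) {
            if (fs.exists(path)) {
                System.out.println("DELETE " + fs.delete(path, true));
            } else {
                fs.create(path).close();
                System.out.println("CREATE " + path);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```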
3. File upload
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.File;
import java.io.IOException;

public class Upload {
    public static void main(String[] args) {
        // Local file to upload
        final String fromPath = "E:\\ratings.csv";
        // Abort if the local file does not exist
        if (!new File(fromPath).exists()) {
            System.out.println(fromPath + " doesn't exist");
            return;
        }
        // Initialize the Hadoop file system Configuration object
        Configuration conf = new Configuration();
        // Point the configuration at the HDFS NameNode
        conf.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        FileSystem fs = null;
        try {
            // Obtain the file system handle from the configuration
            fs = FileSystem.get(conf);
            // Destination directory on HDFS
            final String toPath = "/test/kb16/hive";
            Path to = new Path(toPath);
            // If the directory is missing, try to create it; abort if creation fails
            if (!fs.exists(to) && !fs.mkdirs(to)) {
                System.out.println(toPath + " doesn't exist and can't be created");
                return;
            }
            Path from = new Path(fromPath);
            // Copy the local file to HDFS
            fs.copyFromLocalFile(from, to);
            System.out.println("succeed in copying from " + fromPath + " to " + toPath);
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("FAILURE");
        } finally {
            // Release the file system handle if it is still open
            if (fs != null) {
                try {
                    fs.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
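The two-argument `copyFromLocalFile(src, dst)` used above fails if the destination file already exists. `FileSystem` also provides an overload with explicit `delSrc` and `overwrite` flags; the fragment below sketches how a re-runnable upload might use it, under the same cluster and path assumptions as the article.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.IOException;

public class UploadOverwrite {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        try (FileSystem fs = FileSystem.get(conf)) {
            Path from = new Path("E:\\ratings.csv");
            Path to = new Path("/test/kb16/hive");
            // copyFromLocalFile(delSrc, overwrite, src, dst):
            // keep the local source (delSrc=false), replace any existing copy (overwrite=true)
            fs.copyFromLocalFile(false, true, from, to);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```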
4. File download
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.File;
import java.io.IOException;

public class Download {
    public static void main(String[] args) {
        // Local download directory
        final String toPath = "C:\\Users\\Jialin\\Desktop";
        File to = new File(toPath);
        // If the local directory is missing, try to create it; abort if creation fails
        if (!to.exists() && !to.mkdirs()) {
            System.err.println(toPath + " doesn't exist and can't be created");
            return;
        }
        // Initialize the Hadoop file system Configuration object
        Configuration config = new Configuration();
        // Point the configuration at the HDFS NameNode
        config.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        FileSystem fs = null;
        try {
            // Obtain the file system handle from the configuration
            fs = FileSystem.get(config);
            // Source file on HDFS
            final String fromPath = "/test/kb16/hive/ratings.csv";
            Path from = new Path(fromPath);
            // Abort if the source file does not exist
            if (!fs.exists(from)) {
                System.err.println(fromPath + " doesn't exist");
                return;
            }
            Path localTo = new Path(toPath);
            // Copy the HDFS file to the local file system
            fs.copyToLocalFile(from, localTo);
            System.out.println("succeed in downloading from " + fromPath + " to " + toPath);
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("FAILURE");
        } finally {
            // Release the file system handle if it is still open
            if (fs != null) {
                try {
                    fs.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
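On Windows clients, `copyToLocalFile(src, dst)` writes through the checksumming local file system, which produces an extra `.crc` sidecar file and may require the `winutils` native binaries. `FileSystem` offers an overload that bypasses this via the raw local file system; the sketch below shows it, assuming the same paths as the article.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.IOException;

public class DownloadRaw {
    public static void main(String[] args) {
        Configuration config = new Configuration();
        config.set("fs.defaultFS", "hdfs://192.168.50.102:9000");
        try (FileSystem fs = FileSystem.get(config)) {
            Path from = new Path("/test/kb16/hive/ratings.csv");
            Path to = new Path("C:\\Users\\Jialin\\Desktop");
            // copyToLocalFile(delSrc, src, dst, useRawLocalFileSystem):
            // keep the HDFS source, and skip checksum output (no .crc file) locally
            fs.copyToLocalFile(false, from, to, true);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```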
The above is the detailed content of How to implement HDFS file upload and download in Java. For more information, please follow other related articles on the PHP Chinese website!
