Setting up a Hadoop development environment in IDEA
[Preface]
Connecting IntelliJ IDEA to Hadoop HDFS for local debugging
1 Install the Hadoop plugin in IDEA
Plugin download address
Install the plugin
Restart IDEA; a Hadoop entry appears in the menu bar.
Open Hadoop > Settings in the menu bar and fill in the connection parameters.
Connection successful.
Every time a file is modified, the operation may require logging in with the right user permissions, which is inconvenient. This can be changed by configuration, in hdfs-site.xml:
<property>
    <name>dfs.permissions</name>
    <value>false</value>
</property>
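Note that dfs.permissions is read by the NameNode, so HDFS normally has to be restarted for the change to take effect. An alternative that leaves the cluster configuration alone is to make the client act as the HDFS user. The fragment below is a minimal sketch, not from the original article; the user name "hadoop" and the address are taken from the demo in section 2.3, and it relies on the HADOOP_USER_NAME system property, which Hadoop honours in non-secure mode:

// Minimal sketch: act as the HDFS user from the client side instead of
// disabling the permission check on the NameNode. "hadoop" is the assumed HDFS user.
System.setProperty("HADOOP_USER_NAME", "hadoop");
FileSystem fs = FileSystem.get(new URI("hdfs://192.168.65.101:9000/"), new Configuration());

The demo code in section 2.3 achieves the same effect by passing the user name directly to FileSystem.get(uri, conf, user).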
2 Working with the HDFS cluster in the virtual machine
2.1 Create a Maven project
The code structure is as follows:
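With the standard Maven layout and the package and class names used below, the structure is roughly:

bigdata
├── pom.xml
└── src
    └── main
        └── java
            └── t_hdfs
                └── MyFirstHDFSDemo.java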
2.2 Add dependencies to pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.shane.hadoop</groupId>
    <artifactId>bigdata</artifactId>
    <version>1.0-SNAPSHOT</version>
    <repositories>
        <repository>
            <id>apache</id>
            <url></url>
        </repository>
    </repositories>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.3</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-dependency-plugin</artifactId>
                <configuration>
                    <excludeTransitive>false</excludeTransitive>
                    <stripVersion>true</stripVersion>
                    <outputDirectory>./lib</outputDirectory>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
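For reference, the maven-dependency-plugin configuration above takes effect when the copy-dependencies goal is run (for example mvn dependency:copy-dependencies); it copies the declared dependencies, with version numbers stripped, into ./lib.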
2.3 Write code to test the connection
package t_hdfs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MyFirstHDFSDemo {
    public static void main(String[] args) throws Exception {
        // Address of the NameNode running in the virtual machine.
        URI uri = new URI("hdfs://192.168.65.101:9000/");
        Configuration conf = new Configuration();
        // Connect as the "hadoop" user to avoid permission problems.
        String user = "hadoop";
        FileSystem fs = FileSystem.get(uri, conf, user);
        // fs.copyFromLocalFile(new Path(args[0]), new Path(args[1]));
        // Check that the root directory of HDFS is reachable.
        boolean exist = fs.exists(new Path("/"));
        if (exist) {
            System.out.println("success");
        } else {
            System.out.println("failed");
        }
        fs.close();
    }
}
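The commented-out copyFromLocalFile call hints at the next step, uploading a local file to the cluster. A minimal sketch to drop into main before fs.close(); both paths are made-up example values, not from the article:

// Upload a local file to HDFS (illustrative paths only).
fs.copyFromLocalFile(new Path("D:/tmp/hello.txt"), new Path("/user/hadoop/hello.txt"));
// Confirm the upload the same way the demo checks the root directory.
System.out.println(fs.exists(new Path("/user/hadoop/hello.txt")) ? "uploaded" : "missing");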
2.4 Result
If the connection succeeds, the program prints success to the console.