log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See for more info.
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://sandbox.hortonworks.com:8020/user/testdir, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:80)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:423)
    at org.apache.hadoop.fs.ChecksumFileSystem.mkdirs(ChecksumFileSystem.java:590)
    at com.hdfs.directory.CreateDir.main(CreateDir.java:19)

Running the program from the local machine produces the error above: with no HDFS configuration on the classpath, FileSystem.get(conf) returns the local file system (file:///), which rejects the hdfs:// path. Change the code to the following:

package com.hdfs.directory;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDir {

    /**
     * @param args
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        Path newPath = new Path("hdfs://sandbox.hortonworks.com:8020/user/testdir");
        // FileSystem fs = FileSystem.get(conf);  // returns the local FS here and triggers "Wrong FS"
        // Resolve the FileSystem from the path itself so the hdfs:// scheme is honored
        FileSystem fs = newPath.getFileSystem(conf);
        fs.mkdirs(newPath);
        fs.close();
    }
}
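A related workaround, not shown in the original post, is to set fs.defaultFS on the Configuration before calling FileSystem.get(conf), which also steers the client to HDFS instead of the local file system. A minimal sketch, assuming the same sandbox NameNode address (the class name CreateDirWithDefaultFs is hypothetical):

package com.hdfs.directory;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDirWithDefaultFs {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Point the client at the NameNode explicitly instead of relying on
        // core-site.xml being on the classpath (address assumed from the post).
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");
        FileSystem fs = FileSystem.get(conf);
        fs.mkdirs(new Path("/user/testdir"));
        fs.close();
    }
}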
Alternatively, place the Hadoop configuration files (typically core-site.xml and hdfs-site.xml copied from the cluster) in the project directory so they end up on the classpath:
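For reference, a minimal sketch of what such a core-site.xml could contain, assuming the sandbox NameNode address used in the post (the real file copied from the cluster will carry more properties):

<?xml version="1.0"?>
<configuration>
  <!-- Default file system; makes FileSystem.get(conf) resolve to HDFS -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://sandbox.hortonworks.com:8020</value>
  </property>
</configuration>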
After running the code above, the /user/testdir directory is created successfully.
Once the Hadoop configuration files have been added to the project, the following code also creates the directory successfully:
package com.hdfs.directory;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDirWithConf {

    /**
     * @param args
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {
        // With core-site.xml / hdfs-site.xml on the classpath, the default file
        // system is HDFS, so FileSystem.get(conf) no longer falls back to file:///
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        fs.mkdirs(new Path("hdfs://sandbox.hortonworks.com:8020/user/testdir"));
        fs.close();
    }
}
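To double-check the result on the sandbox, the directory can be listed with the standard HDFS shell (output omitted, not from the original post):

hdfs dfs -ls /user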