Re: Accessing Hadoop DFS for Data Storage and Retrieval Using Java
Yes, I think so.

2010/3/10 Miguel Ángel Álvarez de la Concepción <[EMAIL PROTECTED]>

>  I have installed Hadoop on CentOS (Linux) and the test code is running on
> Windows.
>
> Do I need cygwin to run the test code?
>
>
>
> *From:* Jeff Zhang [mailto:[EMAIL PROTECTED]]
> *Sent:* Wednesday, March 10, 2010 14:10
>
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: Accessing Hadoop DFS for Data Storage and Retrieval Using
> Java
>
>
>
> It seems you are running it on Windows, so you should install Cygwin and
> add C:/cygwin/bin to the Path environment variable.
>
>
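A quick way to confirm that chmod is reachable once C:/cygwin/bin is on the Path is to launch it the same way Hadoop's Shell class does, via ProcessBuilder. This is only a minimal diagnostic sketch, not code from the thread, and it assumes Cygwin's GNU chmod is installed:

    import java.io.IOException;

    public class ChmodCheck {
        public static void main(String[] args) throws InterruptedException {
            try {
                // Hadoop's RawLocalFileSystem shells out to "chmod"; if this fails
                // with CreateProcess error=2, the command is not on the Path.
                Process p = new ProcessBuilder("chmod", "--version").start();
                p.waitFor();
                System.out.println("chmod found on the Path (exit code " + p.exitValue() + ")");
            } catch (IOException e) {
                System.out.println("chmod not found: " + e.getMessage());
            }
        }
    }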
>  2010/3/10 Miguel Ángel Álvarez de la Concepción <[EMAIL PROTECTED]>
>
> Thanks!
>
>
>
> Now the error occurs when copying back the remote file I uploaded before:
>
>
>
> java.io.IOException: Cannot run program "chmod": CreateProcess error=2, The system can’t find the specified file
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
>         at org.apache.hadoop.util.Shell.run(Shell.java:134)
>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
>         at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
>         at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
>         at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
>         at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:208)
>         at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
>         at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1216)
>         at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1197)
>         at hadoop.testHadoop.main(testHadoop.java:53)
> Caused by: java.io.IOException: CreateProcess error=2, The system can’t find the specified file
>         at java.lang.ProcessImpl.create(Native Method)
>         at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
>         at java.lang.ProcessImpl.start(ProcessImpl.java:30)
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
>         ... 17 more
>
>
>
> Thanks again for your help.
>
>
>
> *From:* Jeff Zhang [mailto:[EMAIL PROTECTED]]
> *Sent:* Tuesday, March 9, 2010 17:13
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: Accessing Hadoop DFS for Data Storage and Retrieval Using
> Java
>
>
>
>
> add the ugi configuration like this:
>
> conf.set("hadoop.job.ugi",your_hadoop_user_name+","+your_hadoop_group_name);
>
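As a minimal sketch of where that setting goes before the file system is opened (the user and group names below are placeholders, not values from the thread):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class UgiExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://my.machine.com:9000");
            // Identity presented to the pre-security Hadoop cluster, as "user,group".
            conf.set("hadoop.job.ugi", "hadoopuser,supergroup");
            FileSystem fs = FileSystem.get(conf);
            System.out.println("Working directory: " + fs.getWorkingDirectory());
        }
    }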
> 2010/3/9 Miguel Ángel Álvarez de la Concepción <[EMAIL PROTECTED]>
>
> Hi,
>
>
>
> I tried to run the Java code and it doesn't work.
>
>
>
> I pasted the code below:
>
>
>
> public class testHadoop {
>     public static final String DIR_HADOOP = "hdfs://my.machine.com";
>     public static final String PORT_HADOOP = "9000";
>
>     public static void main(String[] args) {
>         Configuration config = new Configuration();
>         config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);
>
>         try {
>             FileSystem haddopFileSystem = FileSystem.get(config);
>
>             String directory = "test";
>             Path hadoopDirectory = new Path(haddopFileSystem.getWorkingDirectory() + "/" + directory);
>             haddopFileSystem.mkdirs(hadoopDirectory);
>
>             Path sourceDirectory = new Path("C://Windows/media/ringout.wav");
>             haddopFileSystem.copyFromLocalFile(sourceDirectory,

Best Regards

Jeff Zhang
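
Putting the two suggestions from this thread together (the hadoop.job.ugi setting, and running under Cygwin so chmod is on the Path), a self-contained sketch of the test program could look like the following. The copy-back destination, the second argument to copyFromLocalFile, and the user/group names are assumptions, since the original code is cut off in the archive; the host name and file paths are the ones quoted above.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class TestHadoop {
        public static final String DIR_HADOOP = "hdfs://my.machine.com";
        public static final String PORT_HADOOP = "9000";

        public static void main(String[] args) {
            Configuration config = new Configuration();
            config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);
            // Placeholder identity for the pre-security cluster ("user,group").
            config.set("hadoop.job.ugi", "hadoopuser,supergroup");

            try {
                FileSystem hadoopFileSystem = FileSystem.get(config);

                // Create a "test" directory under the HDFS working directory.
                Path hadoopDirectory = new Path(hadoopFileSystem.getWorkingDirectory() + "/test");
                hadoopFileSystem.mkdirs(hadoopDirectory);

                // Upload a local file, then copy it back; on Windows both steps
                // shell out to chmod, hence the Cygwin requirement.
                Path sourceFile = new Path("C://Windows/media/ringout.wav");
                hadoopFileSystem.copyFromLocalFile(sourceFile, hadoopDirectory);
                hadoopFileSystem.copyToLocalFile(new Path(hadoopDirectory, "ringout.wav"),
                        new Path("C://temp/ringout.wav"));
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }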