Hive user mailing list: FileAlreadyExistsException Parent path is not a directory


Abhishek Gayakwad 2013-02-25, 12:34
Arthur Boender 2013-02-25, 13:39
Re: FileAlreadyExistsException Parent path is not a directory
Run $HADOOP_HOME/bin/hadoop dfs -ls /a/b/c/d/log

You should see that /a/b/c/d/log is a file, not a directory.

The LOCATION you give in a CREATE TABLE statement must point to a
directory.
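For example, one way to fix it (just a sketch, assuming the existing file can
be moved; the directory name /a/b/c/d/logdir below is only an illustration,
not something from your setup):

  # /a/b/c/d/log is currently a plain file: create a directory and move the
  # file into it, then point the table's LOCATION at that directory
  $HADOOP_HOME/bin/hadoop dfs -mkdir /a/b/c/d/logdir
  $HADOOP_HOME/bin/hadoop dfs -mv /a/b/c/d/log /a/b/c/d/logdir/

After that, use LOCATION '/a/b/c/d/logdir' in the create table statement.
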
On Mon, Feb 25, 2013 at 6:04 PM, Abhishek Gayakwad <[EMAIL PROTECTED]> wrote:

> I am using Hive 0.9.0. While creating this external table:
>
> create external table if not exists table1
> (
> Id int,
> Name string,
> PName string,
> timestamp string
> )
> row format delimited fields terminated by '\t'
> LOCATION '/a/b/c/d/log';
>
>
> I get this error:
>
> FAILED: Error in metadata: MetaException(message:Got exception:
> org.apache.hadoop.fs.FileAlreadyExistsException Parent path is not a
> directory: /a/b/c/d/log log
>         at
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:1485)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2891)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2844)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2823)
>         at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:639)
>         at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:417)
>         at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44096)
>         at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
> )
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
>
> Does anybody know why this is happening?
>
> Thanks
>

--
Nitin Pawar