Hive >> mail # user >> alter table add partition error


RE: alter table add partition error
Looks like the standalone script works fine against the existing
partition:

./ThriftHiveMetastore-remote -h localhost:9080 get_partition_by_name default dummy datestamp=20100602/srcid=100/action=view/testid=10

Partition(parameters={'transient_lastDdlTime': '1276881277'},
tableName='dummy', createTime=1276881277, lastAccessTime=0,
values=['20100602', '100', 'view', '10'], dbName='default',
sd=StorageDescriptor(outputFormat='org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat',
sortCols=[], inputFormat='org.apache.hadoop.mapred.TextInputFormat',
cols=[FieldSchema(comment=None, type='string', name='partition_name'),
FieldSchema(comment=None, type='int', name='partition_id')],
compressed=False, bucketCols=[], numBuckets=-1, parameters={},
serdeInfo=SerDeInfo(serializationLib='org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe',
name=None, parameters={'serialization.format': '1'}),
location='hdfs://wilbur21.labs.corp.sp1.yahoo.com/user/pradeepk/dummy/20100602/100/view/10'))

[pradeepk@chargesize:~/dev/howl/src/metastore/src/gen-py/hive_metastore]

 

However, when I tried to add another partition with the Hive CLI using thrift:

hive -e "ALTER TABLE dummy add partition(datestamp = '20100602', srcid = '100', action='click', testid='10') location '/user/pradeepk/dummy/20100602/100/click/10';"

10/06/18 14:49:13 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively

Hive history file=/tmp/pradeepk/hive_job_log_pradeepk_201006181449_1158492515.txt

FAILED: Error in metadata: org.apache.thrift.TApplicationException: get_partition failed: unknown result

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

 

tail -30 /tmp/pradeepk/hive.log

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

 

2010-06-18 14:49:14,124 ERROR exec.DDLTask (SessionState.java:printError(277)) - FAILED: Error in metadata: org.apache.thrift.TApplicationException: get_partition failed: unknown result

org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.TApplicationException: get_partition failed: unknown result
        at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:778)
        at org.apache.hadoop.hive.ql.exec.DDLTask.addPartition(DDLTask.java:255)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:169)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:267)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.thrift.TApplicationException: get_partition failed: unknown result
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_partition(ThriftHiveMetastore.java:931)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_partition(ThriftHiveMetastore.java:899)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartition(HiveMetaStoreClient.java:500)
        at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:756)
        ... 15 more

2010-06-18 14:49:14,124 ERROR ql.Driver (SessionState.java:printError(277)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
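For readers wondering what "unknown result" means here: in Thrift-generated clients, the recv_ side of a call raises TApplicationException (code MISSING_RESULT) when the server's reply carries neither a return value nor one of the method's declared exceptions, which usually means the failure happened server-side. A simplified Python sketch of that client-side check; the struct and field names are illustrative stand-ins, not the actual generated code:

```python
class TApplicationException(Exception):
    """Simplified stand-in for thrift's TApplicationException."""
    MISSING_RESULT = 5  # the code real Thrift uses for "unknown result"


class GetPartitionResult:
    """Stand-in for the generated get_partition result struct."""
    def __init__(self, success=None, o1=None):
        self.success = success  # the Partition, if the call succeeded
        self.o1 = o1            # declared MetaException slot, if raised


def recv_get_partition(result):
    # Mirrors the generated client logic: return the value, re-raise a
    # declared exception, or fail with "unknown result" if neither is set.
    if result.success is not None:
        return result.success
    if result.o1 is not None:
        raise result.o1
    raise TApplicationException("get_partition failed: unknown result")
```

So a stack trace like the one above mostly says the client got an empty reply; the actual cause has to be read from the metastore server's own log.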

 

________________________________

From: Paul Yang [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 18, 2010 2:19 PM
To: [EMAIL PROTECTED]
Subject: RE: alter table add partition error

 

Looks like the Thrift Python libraries aren't on your Python path. Run:

 

export PYTHONPATH=<path-to-trunk>/build/dist/lib/py/

 

before trying the remote command.
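This works because Python resolves `from thrift.transport import TTransport` against the directories listed in PYTHONPATH, which seed `sys.path`. A self-contained illustration of that mechanism, using a throwaway stand-in package in a temp directory rather than the real Thrift libraries (the paths here are placeholders, not Hive's actual layout):

```python
import os
import subprocess
import sys
import tempfile

# Build a dummy "thrift.transport.TTransport" module in a temp dir, playing
# the role of <path-to-trunk>/build/dist/lib/py/ from the fix above.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "thrift", "transport")
os.makedirs(pkg)
for path in (os.path.join(root, "thrift", "__init__.py"),
             os.path.join(pkg, "__init__.py"),
             os.path.join(pkg, "TTransport.py")):
    open(path, "w").close()

probe = [sys.executable, "-c", "from thrift.transport import TTransport"]

# Without PYTHONPATH the import fails (unless a real thrift install is
# already importable), as in Pradeep's traceback.
env = {k: v for k, v in os.environ.items() if k != "PYTHONPATH"}
missing = subprocess.run(probe, env=env, capture_output=True, text=True)

# With PYTHONPATH pointing at the package's parent dir, the import succeeds.
env["PYTHONPATH"] = root
found = subprocess.run(probe, env=env, capture_output=True, text=True)

print(missing.returncode, found.returncode)
```

The same mechanism explains why the fix is an `export` rather than an install: the generated client only needs the Thrift runtime to be discoverable, wherever it lives.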

 

From: Pradeep Kamath [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 18, 2010 1:38 PM
To: [EMAIL PROTECTED]
Subject: RE: alter table add partition error

 

Sorry, I don't know Python, so I couldn't make sense of the following error when I ran the command you suggested:

[src/metastore/src/gen-py/hive_metastore] ./ThriftHiveMetastore-remote -h localhost:9080 get_partition_by_name default dummy datestamp=20100602/srcid=100/action=view/testid=10

Traceback (most recent call last):

  File "./ThriftHiveMetastore-remote", line 11, in ?

    from thrift.transport import TTransport

ImportError: No module named thrift.transport

 

Thanks,

Pradeep

 

________________________________

From: Paul Yang [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 18, 2010 12:10 PM
To: [EMAIL PROTECTED]
Subject: RE: alter table add partition error

 

Hi Pradeep,

 

In trunk/metastore/src/gen-py/hive_metastore/, there is a script called ThriftHiveMetastore-remote that can be used to test the thrift server independently of the CLI. As a quick test to narrow down the problem, after the partition is created, can you try running

 

ThriftHiveMetastore-remote -h localhost:9080 get_partition_by_name
default d