Hive user mailing list


Re: Export hive table format issue
Thanks a lot Nitin and all, that's the root cause. The field separator was the
default, i.e. ^A, combined with the issue you mentioned above. Thanks again :)
Stay blessed
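(For reference, a minimal sketch of the fix: a Sqoop export that declares Hive's default ^A delimiter explicitly. The connection string, table name, and HDFS path below are hypothetical placeholders, not taken from this thread.)

    # Hive text tables use ^A (octal \001) as the field delimiter by default;
    # declaring it lets Sqoop split each input line into columns.
    # Hypothetical connection string, table, and export dir.
    sqoop export \
      --connect jdbc:mysql://dbhost/eventsdb \
      --username etl -P \
      --table events_details \
      --export-dir /user/hive/warehouse/events_details \
      --input-fields-terminated-by '\001'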
On Wed, Jun 19, 2013 at 10:21 AM, Nitin Pawar <[EMAIL PROTECTED]> wrote:

> Jarek,
>
> Any chance that Hamza is hitting this one: SQOOP-188: Problem with NULL
> values in MySQL export <https://issues.cloudera.org/browse/SQOOP-188>?
>
> In that case I would recommend that he use
> --input-null-string "\\\\N"   --input-null-non-string "\\\\N"
>
> Hamza, can you try the above options?
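(A sketch of how these two flags fit into a full export command, again with hypothetical connection string, table, and path; after shell processing, single-quoted '\\N' is equivalent to the double-quoted "\\\\N" above.)

    # Hive writes \N for NULL in text-format tables; these flags tell Sqoop
    # to map it back to SQL NULL for string and non-string columns.
    sqoop export \
      --connect jdbc:mysql://dbhost/eventsdb \
      --table events_details \
      --export-dir /user/hive/warehouse/events_details \
      --input-fields-terminated-by '\001' \
      --input-null-string '\\N' \
      --input-null-non-string '\\N'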
>
>
>
> On Wed, Jun 19, 2013 at 5:14 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
>> Would you mind upgrading Sqoop to version 1.4.3?
>>
>> We've significantly improved error logging for the case when the input
>> data can't be parsed during export. You should get a state dump (exception,
>> input file, position in the file, entire input line) in the associated map
>> task log.
>>
>> Jarcec
>>
>> On Tue, Jun 18, 2013 at 03:14:52PM +0000, Arafat, Moiz wrote:
>> > Can you try using a default value, e.g. 0 or 9999999, instead of storing
>> > NULL in the numeric column on the Hive side?
>> >
>> > Thanks,
>> > Moiz Arafat
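(A sketch of this workaround as a HiveQL rewrite run through the hive CLI; the table and column names are hypothetical.)

    # Replace NULLs in the numeric column with a sentinel before exporting
    # (hypothetical table/column names).
    hive -e "
      INSERT OVERWRITE TABLE events_details_export
      SELECT event_id, COALESCE(event_count, 0) AS event_count
      FROM events_details;
    "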
>> >
>> > On Jun 18, 2013, at 9:14 AM, Hamza Asad <[EMAIL PROTECTED]> wrote:
>> >
>> > Nitin,
>> >        The issue is not with INT or BIGINT (I have verified both); the
>> > exception is the same. The issue is something else. Please help sort out
>> > a solution. The following exception is still being raised (the separator
>> > character in the input string is not visible in the terminal and shows as
>> > # when copied into a word processor, which I pasted below):
>> > java.lang.NumberFormatException: For input string: " 433649#1#534782#2"
>> >     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>> >     at java.lang.Long.parseLong(Long.java:441)
>> >     at java.lang.Long.valueOf(Long.java:540)
>> >     at dump_hive_events_details.__loadFromFields(dump_hive_events_details.java:949)
>> >     at dump_hive_events_details.parse(dump_hive_events_details.java:901)
>> >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>> >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:415)
>> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> >
>> >
>> >
>> > On Tue, Jun 18, 2013 at 5:53 PM, Nitin Pawar <[EMAIL PROTECTED]> wrote:
>> > Can you change your MySQL schema to use BIGINT instead of just INT?
>> > For more, you can refer to this:
>> > http://stackoverflow.com/questions/16886668/why-sqoop-fails-on-numberformatexception-for-numeric-column-during-the-export-fr
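(A sketch of that schema change via the mysql client; the database, table, and column names are hypothetical.)

    # Widen the column: MySQL INT tops out at 2147483647,
    # BIGINT at 9223372036854775807 (hypothetical names).
    mysql eventsdb -e "ALTER TABLE events_details MODIFY event_count BIGINT;"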

*Muhammad Hamza Asad*