I'm a beginner and I'm having trouble with MapReduce when trying to read currency values of the form 99.99.
I'm reading up as much as I can. I wanted to let Java's types guide my choice between DoubleWritable and FloatWritable, but after some research it seems the Java documentation recommends against using either float or double for representing currency, suggesting BigDecimal instead.
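For context, a minimal sketch of why the Java docs steer currency away from float/double: binary floating point cannot represent most decimal fractions exactly, while BigDecimal (constructed from a String) keeps the decimal digits exact. The class name below is just for illustration.

```java
import java.math.BigDecimal;

public class CurrencyPrecision {
    public static void main(String[] args) {
        // double accumulates binary rounding error on decimal fractions
        double d = 0.10 + 0.20;
        System.out.println(d); // prints 0.30000000000000004, not 0.3

        // BigDecimal built from Strings keeps the decimal digits exact
        BigDecimal b = new BigDecimal("0.10").add(new BigDecimal("0.20"));
        System.out.println(b); // prints 0.30
    }
}
```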
I'm using FileInputFormat (the default TextInputFormat, specifically), which I understand produces <LongWritable, Text> key/value pairs.
What Hadoop data type should I use in my Driver section/Job class to work with currency?
I've tried FloatWritable and DoubleWritable, and my output always ends up in the form String[tab]Number.0 in the output file ("something[tab]55.0", for example, when I know 55.0 is wrong).
Unless I'm missing something, I can't find a BigDecimalWritable class in Hadoop.
Hive appears to have such a class...
I'll keep searching...
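In the meantime, one common workaround I'm considering: since core Hadoop has no BigDecimalWritable, represent each amount as an exact integer number of cents and carry it through the job in a LongWritable, converting back only for the final output. A minimal sketch of the conversion helpers (class and method names are illustrative, not from any library):

```java
import java.math.BigDecimal;

// Sketch of the "scale to cents" workaround: currency travels through the
// job as a long (usable with Hadoop's LongWritable), and BigDecimal does
// the exact decimal conversion at the edges.
public class CentsCodec {

    // "99.99" -> 9999 cents; longValueExact() throws if any precision
    // would be lost (e.g. more than two decimal places in the input)
    public static long toCents(String amount) {
        return new BigDecimal(amount).movePointRight(2).longValueExact();
    }

    // 9999 cents -> "99.99" for writing the final output value
    public static String fromCents(long cents) {
        return BigDecimal.valueOf(cents).movePointLeft(2).toPlainString();
    }

    public static void main(String[] args) {
        long cents = toCents("99.99");
        System.out.println(cents);            // 9999
        System.out.println(fromCents(cents)); // 99.99
    }
}
```

In a job this would mean the mapper emits `new LongWritable(toCents(field))`, the reducer sums plain longs (exact integer arithmetic), and the final value is formatted with `fromCents` before writing, avoiding float/double entirely.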