I'm heading home but I can play with this tomorrow. On the positive side, I have D4M reading the data that I wrote from Java. So that's nice. :) On Thu, May 1, 2014 at 4:46 PM, Al Krinker <[EMAIL PROTECTED]> wrote:
I checked HDFS and it was there... the issue was, and I have to thank one of my friends who ran into it before:
When importDirectory runs it uses CachedConfiguration... so it was picking up my local config.
All I did to solve it was to add CachedConfiguration.setInstance(conf); right after I created conf and pointed it at my Hadoop HDFS.
Worked perfectly... I was able to create a new RFile and write it to a table in Accumulo. The code that I posted (plus the fix) works, for anyone interested.
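For anyone skimming the thread, here is a minimal sketch of the fix, assuming Accumulo 1.x with the CachedConfiguration utility class; the namenode address, paths, and table name are placeholders, not from the original post:

```java
import org.apache.accumulo.core.util.CachedConfiguration;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class BulkImportConfigFix {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the config at the remote HDFS instead of the local filesystem.
        // "hdfs://namenode:8020" is a hypothetical address for illustration.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        // The fix from the thread: without this, code paths that resolve the
        // filesystem through CachedConfiguration (such as importDirectory)
        // fall back to the default local configuration.
        CachedConfiguration.setInstance(conf);

        FileSystem fs = FileSystem.get(conf);
        // ... write RFiles to a directory on fs, then bulk import with
        // connector.tableOperations().importDirectory(table, dir, failDir, false);
    }
}
```

The key point is ordering: setInstance must be called after the Configuration is fully set up but before any Accumulo call that touches the filesystem.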
Anyway, that was it... and thank you Josh for your feedback! You are awesome :)
On Thu, May 1, 2014 at 5:52 PM, Josh Elser <[EMAIL PROTECTED]> wrote: