Re: About the hicc start in eclipse
You might want to check whether hicc.log is written somewhere on your
system. It defaults to /tmp/chukwa/logs, but it may be in a different
location when running in Eclipse. The log file might have more
information about populating the /chukwa/hicc directory on HDFS.
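
If the log does not tell you much, a quick sanity check you could run from
the same Eclipse project is something along these lines (only a rough
sketch; the class name is made up, and the core-site.xml path is the
HADOOP_CONF_DIR from your chukwa-env.sh below, so adjust it to your setup):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CheckHiccHdfs {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Without core-site.xml on the Eclipse classpath this comes back
        // null (or the file:/// default) instead of hdfs://localhost:9000.
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));

        // Workaround for the classpath problem: load the file explicitly.
        conf.addResource(new Path(
            "/home/rainerdun/project/hadoop-1.0.0/conf/core-site.xml"));
        System.out.println("after addResource = " + conf.get("fs.default.name"));

        // See what hicc has actually created under /chukwa/hicc (this fails
        // or prints nothing if the directory was never populated).
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/chukwa/hicc"))) {
          System.out.println(status.getPath());
        }
      }
    }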

regards,
Eric
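
P.S. To confirm that your Eclipse run configuration really carries the
variables from chukwa-env.sh, a check along these lines helps (again just a
sketch; it uses only the names that appear in your file, and anything the
run configuration is missing will print as null):

    public class CheckChukwaEnv {
      public static void main(String[] args) {
        // Variables defined only in chukwa-env.sh are invisible to an
        // Eclipse-launched JVM unless they are added to the run
        // configuration's Environment tab.
        String[] vars = {
            "JAVA_HOME", "HADOOP_CONF_DIR", "HBASE_CONF_DIR",
            "CHUKWA_LOG_DIR", "CHUKWA_DATA_DIR", "CHUKWA_IDENT_STRING"
        };
        for (String var : vars) {
          System.out.println(var + " = " + System.getenv(var));
        }
      }
    }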

On Wed, Jun 20, 2012 at 10:29 PM, Eric Yang <[EMAIL PROTECTED]> wrote:
> If you are running this in Eclipse, chukwa-env.sh is not executed.
> Therefore, you should define all of the environment variables declared in
> chukwa-env.sh in the Eclipse run configuration.
>
> regards,
> Eric
>
> On Wed, Jun 20, 2012 at 7:04 AM, mason deng <[EMAIL PROTECTED]> wrote:
>> Hi Eric,
>>
>> Thanks, you are right: it must be a .xml file rather than a directory,
>> and I have already set it according to your instructions. Now the
>> problem is that config.get("fs.default.name") returns null. Even when I
>> manually set fs.default.name to hdfs://localhost:9000, it only creates
>> an empty /chukwa/hicc/widgets directory structure on HDFS and does not
>> upload the "descriptors" files to HDFS.
>>    The key question is why the jar file cannot be found on the current
>> classpath.
>>    This is my chukwa-env.sh:
>>
>>
>> export JAVA_HOME=/home/rainerdun/project/jdk1.6.0_21
>>
>> #export HADOOP_HOME=/home/rainerdun/project/hadoop-1.0.0
>> export HADOOP_CONF_DIR=/home/rainerdun/project/hadoop-1.0.0/conf
>>
>> #export HBASE_HOME=/home/rainerdun/project/hbase-0.90.4
>> export HBASE_CONF_DIR=/home/rainerdun/project/hbase-0.90.4/conf
>>
>> export CLASSPATH=${CLASSPATH}:/home/rainerdun/project/hbase-0.90.4/conf:/home/rainerdun/project/hadoop-1.0.0/conf
>> # The location of chukwa data repository (in either HDFS or your local
>> # file system, whichever you are using)
>> export chukwaRecordsRepository="/chukwa/repos/"
>>
>> # The directory where pid files are stored. CHUKWA_HOME/var/run by default.
>> #export CHUKWA_PID_DIR=/tmp/chukwa/pidDir
>>
>> # The location of chukwa logs, defaults to CHUKWA_HOME/logs
>> #export CHUKWA_LOG_DIR=/tmp/chukwa/log
>>
>> # The location to store chukwa data, defaults to CHUKWA_HOME/data
>> #export CHUKWA_DATA_DIR="${CHUKWA_HOME}/data"
>>
>> # Instance name for chukwa deployment
>> export CHUKWA_IDENT_STRING=$USER
>>
>> export JAVA_PLATFORM=Linux-i386-32
>> export JAVA_LIBRARY_PATH=${HADOOP_HOME}/lib/native/${JAVA_PLATFORM}
>>
>> # Database driver name for storing Chukwa Data.
>> export JDBC_DRIVER=${TODO_CHUKWA_JDBC_DRIVER}
>>
>> # Database URL prefix for Database Loader.
>> export JDBC_URL_PREFIX=${TODO_CHUKWA_JDBC_URL_PREFIX}
>>
>> # HICC Jetty Server heap memory settings
>> # Specify min and max size of heap to JVM, e.g. 300M
>> export CHUKWA_HICC_MIN_MEM=
>> export CHUKWA_HICC_MAX_MEM=
>>
>> # HICC Jetty Server port, defaults to 4080
>> #export CHUKWA_HICC_PORT=
>>
>>       core-site.xml file:
>>
>> <configuration>
>>      <property>
>>          <name>fs.default.name</name>
>>          <value>hdfs://localhost:9000</value>
>>      </property>
>> </configuration>
>>
>>
>>       and the chukwa-collector-conf.xml file:
>>           ......
>>
>>  <property>
>>     <name>writer.hdfs.filesystem</name>
>>      <value>hdfs://localhost:9000</value>
>>     <description>HDFS to dump to</description>
>>   </property>
>> ......
>>
>>
>>
>>  Best Regards
>>   Mason
>>
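
Regarding the classpath question above: one way to see, from the same
Eclipse launch, whether the chukwa jar (and with it the bundled widget
descriptor files) is visible at all is a plain classpath lookup along these
lines. This is only a sketch; the class name and the "descriptors" resource
name are assumptions taken from the message above, not from the Chukwa
sources, so check the real path inside the jar first with jar tf:

    import java.net.URL;

    public class CheckClasspath {
      public static void main(String[] args) {
        // "descriptors" is only a guess at the resource name; adjust it to
        // whatever "jar tf chukwa-*.jar" actually shows.
        URL descriptors =
            CheckClasspath.class.getClassLoader().getResource("descriptors");
        // null here means the resource (and probably the jar) is not on the
        // Eclipse classpath.
        System.out.println("descriptors resource = " + descriptors);
      }
    }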