Does anyone know what I need to set in order for spark-submit to use the HDP
version of Spark and not the one bundled internally?

Currently I see:

export HADOOP_CONF_DIR=/ebs/kylin/hadoop-conf &&
and in the config files I see:
## Spark conf (default is in spark/conf/spark-defaults.conf)

It doesn't show, though, how I can change this to use the HDP spark-submit.
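My best guess so far is to point SPARK_HOME at the HDP client before starting Kylin, roughly like this (the /usr/hdp/current/spark-client path is an assumption based on a standard HDP layout; I haven't confirmed Kylin honors it for the submit step):

```shell
# assumption: standard HDP client layout -- adjust SPARK_HOME to your install
export SPARK_HOME=/usr/hdp/current/spark-client
export HADOOP_CONF_DIR=/ebs/kylin/hadoop-conf
# this is the spark-submit I want Kylin to end up calling
echo "$SPARK_HOME/bin/spark-submit"
```

Is that the right knob, or does Kylin ignore SPARK_HOME here?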

Also, HDP is on Spark 1.6.1 while Kylin internally uses 2.x.  Not sure if
that matters during submit.  I can't seem to get more than 2 executors to
run without it failing with other errors.  We have about 44 slots on our
cluster.
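For reference, these are the executor-related properties in kylin.properties that I've been experimenting with (the values shown are just what I currently have, not recommendations; the kylin.engine.spark-conf.* prefix should pass them through to spark-submit as --conf spark.executor.*):

```
# passed through by Kylin to spark-submit as spark.executor.* settings
kylin.engine.spark-conf.spark.executor.instances=2
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.executor.cores=1
```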

I also uncommented:
## uncomment for HDP
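After uncommenting, that section of my kylin.properties reads like this (property names are from the stock config shipped with Kylin; I'm not sure whether -Dhdp.version=current needs to be the exact HDP version string instead):

```
## uncomment for HDP
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
```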



See the attachment for the other properties I've set.