Drill >> mail # dev >> sqlline access to Drill
sqlline access to Drill
Hi Drillers,

I've just logged https://issues.apache.org/jira/browse/DRILL-72, a set of patches to implement the SQL front-end. It also includes a command-line based on the sqlline utility.

It's easy to use. Just build the project, then run the 'sqlline' script in the sandbox/prototype directory, like this:

$ cd sandbox/prototype/
$ mvn install
$ ./sqlline
Loaded signal handler: SunSignalHandler
/home/jhyde/.sqlline/sqlline.properties (No such file or directory)
sqlline version ??? by Marc Prud'hommeaux
sqlline> !connect jdbc:optiq:model=common/target/test-classes/donuts-model.json admin admin
Connecting to jdbc:optiq:model=common/target/test-classes/donuts-model.json

Connected to: Optiq (version 0.4.2)
Driver: Optiq JDBC Driver (version 0.4.2)
Autocommit status: true
Transaction isolation: TRANSACTION_REPEATABLE_READ
sqlline> !tables
+------------+--------------+-------------+---------------+----------+---------+
| TABLE_CAT  | TABLE_SCHEM  | TABLE_NAME  |  TABLE_TYPE   | REMARKS  | TYPE_CA |
+------------+--------------+-------------+---------------+----------+---------+
| null       | DONUTS       | DONUTS      | TABLE         | null     | null    |
| null       | metadata     | COLUMNS     | SYSTEM_TABLE  | null     | null    |
| null       | metadata     | TABLES      | SYSTEM_TABLE  | null     | null    |
+------------+--------------+-------------+---------------+----------+---------+
sqlline> !set outputformat csv
sqlline> select * from donuts;
'_MAP'
'{batters={batter=[{id=1001, type=Regular}, {id=1002, type=Chocolate}, {id=1003, type=Blueberry}, {id=1004, type=Devil's Food}]}, id=0001, name=Cake, ppu=0.55, sales=35, topping=[{id=5001, type=None}, {id=5002, type=Glazed}, {id=5005, type=Sugar}, {id=5007, type=Powdered Sugar}, {id=5006, type=Chocolate with Sprinkles}, {id=5003, type=Chocolate}, {id=5004, type=Maple}], type=donut}'
'{batters={batter=[{id=1001, type=Regular}]}, id=0002, name=Raised, ppu=0.69, sales=145, topping=[{id=5001, type=None}, {id=5002, type=Glazed}, {id=5005, type=Sugar}, {id=5003, type=Chocolate}, {id=5004, type=Maple}], type=donut}'
'{batters={batter=[{id=1001, type=Regular}, {id=1002, type=Chocolate}]}, id=0003, name=Old Fashioned, ppu=0.55, sales=300, topping=[{id=5001, type=None}, {id=5002, type=Glazed}, {id=5003, type=Chocolate}, {id=5004, type=Maple}], type=donut}'
'{batters={batter=[{id=1001, type=Regular}, {id=1002, type=Chocolate}, {id=1003, type=Blueberry}, {id=1004, type=Devil's Food}]}, filling=[{id=6001, type=None}, {id=6002, type=Raspberry}, {id=6003, type=Lemon}, {id=6004, type=Chocolate}, {id=6005, type=Kreme}], id=0004, name=Filled, ppu=0.69, sales=14, topping=[{id=5001, type=None}, {id=5002, type=Glazed}, {id=5005, type=Sugar}, {id=5007, type=Powdered Sugar}, {id=5006, type=Chocolate with Sprinkles}, {id=5003, type=Chocolate}, {id=5004, type=Maple}], type=donut}'
'{batters={batter=[{id=1001, type=Regular}]}, id=0005, name=Apple Fritter, ppu=1.0, sales=700, topping=[{id=5002, type=Glazed}], type=donut}'
5 rows selected (1.027 seconds)
sqlline> explain plan for select * from donuts;
'PLAN'
'EnumerableDrillRel
  DrillProjectRel(_MAP=[$0])
    DrillScan(table=[[DONUTS, DONUTS]])
'
1 row selected (0.053 seconds)
sqlline> select _map['ppu'] as ppu, _map['name'] as name from donuts;
'PPU','NAME'
'0.55','Cake'
'0.69','Raised'
'0.55','Old Fashioned'
'0.69','Filled'
'1.0','Apple Fritter'
5 rows selected (0.119 seconds)
sqlline> !quit
Closing: net.hydromatic.optiq.jdbc.FactoryJdbc41$OptiqConnectionJdbc41
$

Note that in SQL, each Drill table has a single column, called _MAP. You can access fields using "_MAP['fieldName']", and you can also use "CAST(... AS <datatype>)" to convert a field's value to the desired type.
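For example, continuing the session above, the ppu field could be cast to a numeric type (DOUBLE is an assumed target type here, not one shown in the patch; substitute whatever datatype the prototype accepts):

sqlline> select cast(_map['ppu'] as double) as ppu, _map['name'] as name from donuts;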

Julian