MapReduce user mailing list: rules engine with Hadoop


Luangsay Sourygna 2012-10-19, 19:25
Peter Lin 2012-10-19, 20:30
Luangsay Sourygna 2012-10-20, 14:24
Peter Lin 2012-10-20, 14:38
Luangsay Sourygna 2012-10-20, 18:03
Peter Lin 2012-10-21, 13:49

Re: rules engine with Hadoop
Unification in a parallel cluster is a difficult problem.  Writing very
large scale unification programs is an even harder problem.

What problem are you trying to solve?

One option would be that you need to evaluate a conventionally-sized
rulebase against many inputs.  Map-reduce should be trivially capable of
this.
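
For the first case, each map task can load the (conventionally-sized) rulebase once and evaluate it against every input record it receives; no reduce phase is needed. A minimal local sketch in plain Python, simulating the map phase outside Hadoop (the rule format, field names, and actions here are invented for illustration):

```python
# Toy rulebase: each rule is (name, predicate, action). The format is
# invented for this sketch; a real engine would compile a rule language.
RULES = [
    ("high_value", lambda rec: rec["amount"] > 1000, "flag_for_review"),
    ("new_user", lambda rec: rec["account_age_days"] < 30, "extra_checks"),
]

def map_record(record):
    """Map phase: evaluate the whole rulebase against one input record
    and emit (record_id, list of fired actions)."""
    fired = [action for name, pred, action in RULES if pred(record)]
    return record["id"], fired

# Local stand-in for running the mapper over many inputs.
inputs = [
    {"id": 1, "amount": 5000, "account_age_days": 400},
    {"id": 2, "amount": 50, "account_age_days": 5},
]
results = dict(map_record(r) for r in inputs)
# results == {1: ["flag_for_review"], 2: ["extra_checks"]}
```

Because every record is evaluated independently, this parallelizes across mappers with no coordination at all.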

Another option would be that you want to evaluate a huge rulebase against a
few inputs.  It isn't clear that this would be useful given the problems of
huge rulebases and the typically super-linear cost of resolution algorithms.

Another option is that you want to evaluate many conventionally-sized
rulebases against one or many inputs in order to implement a boosted rule
engine.  Map-reduce should be relatively trivial for this as well.
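
For the boosted-engine case, each map task can score an input with one of the small rulebases and emit an (input, vote) pair; a reducer then combines the votes per input. A hedged local sketch (the rulebase shapes and the simple majority-vote combination are invented; a real boosted ensemble would weight the voters):

```python
from collections import defaultdict

# Several small rulebases, each voting +1/-1 on an integer input.
RULEBASES = [
    lambda x: 1 if x % 2 == 0 else -1,   # "even" voter
    lambda x: 1 if x > 10 else -1,       # "large" voter
    lambda x: 1 if x % 3 == 0 else -1,   # "multiple of 3" voter
]

def map_phase(inputs):
    """Each (rulebase, input) pair yields an (input, vote) record,
    as independent map tasks would."""
    for rb in RULEBASES:
        for x in inputs:
            yield x, rb(x)

def reduce_phase(pairs):
    """Reduce: sum the votes per input; the sign is the ensemble decision."""
    totals = defaultdict(int)
    for x, vote in pairs:
        totals[x] += vote
    return {x: (1 if total > 0 else -1) for x, total in totals.items()}

decisions = reduce_phase(map_phase([4, 12, 7]))
# decisions == {4: -1, 12: 1, 7: -1}
```

The shuffle between map and reduce does the grouping by input for free, which is why this case maps onto map-reduce so directly.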

What is it that you are trying to do?

On Fri, Oct 19, 2012 at 12:25 PM, Luangsay Sourygna <[EMAIL PROTECTED]> wrote:

> Hi,
>
> Does anyone know of any (open source) project that builds a rules engine
> (based on RETE) on top of Hadoop?
> Searching a bit on the net, I have only seen a small reference to
> Concord/IBM but there is barely any information available (and surely
> it is not open source).
>
> Alpha and beta memories would be stored on HBase. Should be possible, no?
>
> Regards,
>
> Sourygna
>
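
The quoted proposal, keeping RETE alpha and beta memories in HBase, can be pictured with a plain dict standing in for the table; the fact format, condition keys, and the join field are all invented for illustration:

```python
# RETE alpha memories: facts that matched a single-condition pattern are
# stored under that condition's key, much as rows could be keyed in HBase.
alpha_memory = {}  # condition_key -> list of facts (dict stands in for HBase)

CONDITIONS = {
    "color=red": lambda f: f.get("color") == "red",
    "size=large": lambda f: f.get("size") == "large",
}

def assert_fact(fact):
    """Alpha network: test the fact against each single-condition pattern
    and append it to every matching alpha memory."""
    for key, test in CONDITIONS.items():
        if test(fact):
            alpha_memory.setdefault(key, []).append(fact)

def beta_join(left_key, right_key):
    """Beta node: join two alpha memories on a shared 'id' field; the
    resulting partial matches are what a beta memory would hold."""
    left = alpha_memory.get(left_key, [])
    right = alpha_memory.get(right_key, [])
    return [(l, r) for l in left for r in right if l["id"] == r["id"]]

assert_fact({"id": 1, "color": "red"})
assert_fact({"id": 1, "size": "large"})
assert_fact({"id": 2, "color": "red"})
matches = beta_join("color=red", "size=large")
# one partial match: the id=1 facts satisfy both conditions
```

Storing these memories in HBase is mechanically straightforward; the hard part, as noted above, is that beta joins require coordination across the cluster rather than independent per-record work.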
Peter Lin 2012-10-19, 20:37
Luangsay Sourygna 2012-10-20, 13:48
Peter Lin 2012-10-20, 14:22
Ted Dunning 2012-10-21, 02:07