Re: Query Regarding design MR job for Billing
Marcos Ortiz 2012-02-27, 20:31
On 02/27/2012 11:33 PM, Stuti Awasthi wrote:
> Hi Marcos,
> Thanks for the pointers. I am also thinking along similar lines.
> I have a doubt on one point:
> I will have separate data files for every interval. For example, suppose I have a 5-minute-interval file which contains data for 2 hours and 10 minutes. In this scenario I want to process the 2 hours of data with the hour job and the 10 minutes of data with the minute job. Since I will provide my data file as input to the MR jobs, I think the original file needs to be split into 2 files: HourFile and
> MinsFile. HourFile will contain the data for 2 hours and MinsFile will contain the data for 10 minutes.
Well, you can use Oozie (http://yahoo.github.com/oozie/) or
Cascading (http://cascading.org) for complex workflow programming.
1- For example, you can write a MapReduce job to split your data: one
file per hour, and one for the minutes. In your case, a simple output
would be one data file containing your 2 hours of data, and another
data file for your 10 minutes. I think this job could be Mapper-only.
2- Then you can write a different job for each interval
(HourIntervalJob, MonthIntervalJob, etc.), writing each job's output
to its own interval-specific location in HDFS.
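To make step 1 concrete, here is a minimal sketch in plain Python (my illustration, not code from the thread; a real implementation would live in a Mapper). The record layout with a tab separator, the timestamp format from the example at the end of the thread, and the assumption that records arrive time-ordered are all hypothetical:

```python
from datetime import datetime, timedelta

def split_records(records):
    """Partition time-ordered records into (hour_file, min_file).

    Whole hours measured from the first record go to hour_file;
    the trailing remainder of less than one hour goes to min_file.
    """
    if not records:
        return [], []

    def parse(rec):
        # Timestamp assumed in the thread's example format: "2/10/12 6:40 AM"
        return datetime.strptime(rec.split("\t")[0], "%m/%d/%y %I:%M %p")

    start, end = parse(records[0]), parse(records[-1])
    # Number of complete hours covered by the data.
    whole_hours = int((end - start).total_seconds() // 3600)
    cutoff = start + timedelta(hours=whole_hours)
    hour_file = [r for r in records if parse(r) < cutoff]
    min_file = [r for r in records if parse(r) >= cutoff]
    return hour_file, min_file
```

For the 2-hours-and-10-minutes example, records from the first 2 hours land in hour_file and the final 10 minutes in min_file.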
You can define your complete workflow, and then evaluate Oozie
or Cascading to control it.
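As a rough illustration of what such a workflow could look like in Oozie (the action names, the chaining order, and the omitted job configuration are all hypothetical, not taken from the thread):

```xml
<!-- Hypothetical sketch: names and configuration are illustrative only -->
<workflow-app name="billing-wf" xmlns="uri:oozie:workflow:0.2">
    <start to="split-job"/>

    <!-- Mapper-only job that writes HourFile and MinsFile to HDFS -->
    <action name="split-job">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- job configuration omitted -->
        </map-reduce>
        <ok to="hour-job"/>
        <error to="fail"/>
    </action>

    <action name="hour-job">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
        </map-reduce>
        <ok to="min-job"/>
        <error to="fail"/>
    </action>

    <action name="min-job">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Billing workflow failed</message>
    </kill>
    <end name="end"/>
</workflow-app>
```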
Remember that all these are suggestions; I'm not a MapReduce expert.
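The interval decomposition behind the six-job design quoted below (e.g. 40 days billed as 1 month plus 10 days) can be sketched greedily; the 30-day month, 365-day year, and 5-minute base interval here are my assumptions for illustration, not values from the thread:

```python
from datetime import timedelta

# Billing intervals from largest to smallest; each maps to one MR job.
# The concrete lengths (30-day month, etc.) are illustrative assumptions.
INTERVALS = [
    ("year", timedelta(days=365)),
    ("month", timedelta(days=30)),
    ("day", timedelta(days=1)),
    ("hour", timedelta(hours=1)),
    ("min", timedelta(minutes=5)),
]

def decompose(duration):
    """Return [(job_name, count), ...] covering `duration` greedily,
    largest interval first."""
    parts = []
    for name, size in INTERVALS:
        count = duration // size  # how many whole intervals fit
        if count:
            parts.append((name, count))
            duration -= count * size
    return parts
```

For 40 days this yields one month plus ten days, matching the billing example in the quoted message; for 2 hours 10 minutes it yields two hours plus two 5-minute units.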
> I have achieved the file splitting with a simple Java class, but I think there are too many I/O operations. If I can achieve this in MR or in some more efficient way, that would be good, because the original data files can be huge, and then the initial splitting of the files will itself take too much time.
> Please suggest.
> -----Original Message-----
> From: Marcos Ortiz [mailto:[EMAIL PROTECTED]]
> Sent: Sunday, February 26, 2012 7:40 PM
> To: [EMAIL PROTECTED]
> Cc: Stuti Awasthi
> Subject: Re: Query Regarding design MR job for Billing
> Well, first, you can design 6 MR jobs:
> 1- for 5 mins interval
> 2- for 1 hour
> 3- for 1 day
> 4- for 1 month
> 5- for 1 year
> 6- and a last for any interval
> If each interval requires a different calculation, this approach could be a solution (at least I think so).
> You can read the "design patterns" for MapReduce algorithms proposed by Jimmy Lin and Chris Dyer in their book "Data-Intensive Text Processing with MapReduce".
> On 02/27/2012 05:39 AM, Stuti Awasthi wrote:
>> No. The data will be at either a 5-minute interval, a 1-hour interval, a 1-day interval, and so on...
>> So suppose utilization is for 40 days; then I will charge 30 days according to the month billing job and the remaining 10 days with the day billing job.
>> -----Original Message-----
>> From: Rohit Kelkar [mailto:[EMAIL PROTECTED]]
>> Sent: Monday, February 27, 2012 4:06 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: Query Regarding design MR job for Billing
>> Just trying to understand your use case: you need an hour job to run on
>> data between 6:40 AM and 7:40 AM. Would it be like a moving window?
>> For ex. run hour jobs on
>> 6:41 AM to 7:41 AM
>> 6:42 AM to 7:42 AM
>> and so on...
>> On Mon, Feb 27, 2012 at 1:01 PM, Stuti Awasthi<[EMAIL PROTECTED]> wrote:
>>> Hi all,
>>> I have to implement a BillingEngine using MR jobs. My use case is like this:
>>> I will have data files of the format <TimeStamp> <Information for Billing>.
>>> These data files will contain timestamps at either a minute interval, hour interval, day interval, month interval, or year interval. Every type of interval has a different billing calculation, so basically a different job for every type of interval.
>>> Suppose I have a data file which contains minute-interval timestamps. I have a scenario where, if data is present for whole hours, it should be processed by the hourly job and the remainder should be processed by the minute job.
>>> Example :
>>> 2/10/12 6:40 AM <data for billing>
Marcos Luis Ortíz Valmaseda
Sr. Software Engineer (UCI)
An end to injustice, FREEDOM NOW FOR OUR FIVE COMPATRIOTS WHO ARE UNJUSTLY HELD IN U.S. PRISONS!