Flume, mail # user - Automatically upload files into HDFS

Re: Automatically upload files into HDFS
Mohammad Tariq 2012-11-19, 12:35
BTW, Alex has got a point. You could write a cron job or something similar,
since you just have to move data from your local FS to HDFS.
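
A minimal sketch of such a cron job might look like this (the source folder
and namenode address come from the question quoted below; the script path,
HDFS target directory, and schedule are assumptions):

    #!/bin/sh
    # Hypothetical helper: move newly generated files into HDFS.
    # Run it from cron, e.g.:  */5 * * * * /usr/local/bin/move_to_hdfs.sh
    SRC=/usr/datastorage
    DEST=hdfs://hadoop1.example.com:8020/user/kk/datastorage   # assumed to exist
    for f in "$SRC"/*; do
        [ -f "$f" ] || continue        # skip subdirectories
        # moveFromLocal copies the file into HDFS and deletes the
        # local copy once the copy has succeeded
        hadoop fs -moveFromLocal "$f" "$DEST/"
    done

Note that this moves whatever is sitting in the folder, so it is only safe
if files are already complete by the time the job runs.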

Regards,
    Mohammad Tariq

On Mon, Nov 19, 2012 at 6:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:

> I am so so sorry for the blunder. I was doing something with the twitter
> API and copied that link by mistake. Apologies. Please use this link:
> http://cloudfront.blogspot.in/2012/06/how-to-build-and-use-flume-ng.html
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Mon, Nov 19, 2012 at 6:00 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>
>> Thanks M. Tariq
>>
>> I have tried to visit the link, but I think it is not accessible, as it
>> generates the following error message:
>>
>>  Whoa there!
>>
>> The request token for this page is invalid. It may have already been
>> used, or expired because it is too old. Please go back to the site or
>> application that sent you here and try again; it was probably just a
>> mistake.
>>
>>    - Go to Twitter <http://twitter.com/home>.
>>
>>  You can revoke access to any application at any time from the Applications
>> tab <http://twitter.com/settings/applications> of your Settings page.
>>
>> By authorizing an application you continue to operate under Twitter's
>> Terms of Service <http://twitter.com/tos>. In particular, some usage
>> information will be shared back with Twitter. For more, see our Privacy
>> Policy <http://twitter.com/privacy>.
>>
>>
>>
>> Best regards,
>>
>> KK
>>
>>
>>
>>
>>
>> On Mon, Nov 19, 2012 at 10:50 AM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>>> Hello Kashif,
>>>
>>>     You can visit this link and see if it is of any help to you. I have
>>> shared some of my initial experience here.
>>>
>>> http://api.twitter.com/oauth/authorize?oauth_token=ndACNGIkLSeMJdeMIeQYowyzpjDtvvmqo5ja9We7zo
>>>
>>> You may want to skip the build part and download the release directly
>>> and start off with that.
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 4:14 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am generating files continuously in a local folder on my base machine.
>>>> How can I now use Flume to stream the generated files from the local
>>>> folder to HDFS? I don't know exactly how to configure the sources,
>>>> sinks, and HDFS.
>>>>
>>>> 1) location of the folder where files are generated: /usr/datastorage/
>>>> 2) name node address: hdfs://hadoop1.example.com:8020
>>>>
>>>> Please help me.
>>>>
>>>> Many thanks
>>>>
>>>> Best regards,
>>>> KK
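
For reference, a minimal Flume NG agent configuration for the setup in the
original question might look like the following sketch (the agent, source,
channel, and sink names are made up; the spooling directory source requires
Flume 1.3+ and expects files to be complete before they land in the spool
directory; the HDFS target path is an assumption):

    # flume.conf -- hypothetical single-agent setup
    agent1.sources  = spool1
    agent1.channels = ch1
    agent1.sinks    = hdfs1

    # Watch the local folder from the question for new files
    agent1.sources.spool1.type     = spooldir
    agent1.sources.spool1.spoolDir = /usr/datastorage
    agent1.sources.spool1.channels = ch1

    # Buffer events in memory between source and sink
    agent1.channels.ch1.type     = memory
    agent1.channels.ch1.capacity = 10000

    # Write events to HDFS on the namenode from the question;
    # DataStream keeps the output as plain text rather than SequenceFiles
    agent1.sinks.hdfs1.type          = hdfs
    agent1.sinks.hdfs1.channel       = ch1
    agent1.sinks.hdfs1.hdfs.path     = hdfs://hadoop1.example.com:8020/user/kk/flume
    agent1.sinks.hdfs1.hdfs.fileType = DataStream

The agent would then be started with something like:

    flume-ng agent -n agent1 -c conf -f flume.conf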