We get click data through API calls. I now need to send this data to our
Hadoop environment. I am wondering if I could open one SequenceFile and
write to it until it reaches a certain size. Once it's over the specified
size, I can close that file and open a new one. Is this a good approach?
The only thing I worry about is what happens if the server crashes before I
can cleanly close the file. Would I lose all the data written so far?
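To make the idea concrete, here is a minimal sketch of the rotation logic I have in mind. It uses plain local files rather than Hadoop's actual SequenceFile.Writer API (the class and file names here are made up for illustration); the point is the roll-at-a-size-threshold pattern, plus a flush/fsync after each record so a crash can only lose whatever was in flight, not the whole file:

```python
import os

class RollingWriter:
    """Append records to a file; roll to a new file once it exceeds max_bytes.

    Plain local files stand in for SequenceFiles here. Flushing and fsyncing
    after each record bounds how much data a crash can lose.
    """

    def __init__(self, directory, max_bytes):
        self.directory = directory
        self.max_bytes = max_bytes
        self.index = 0
        self.f = None
        self._open_next()

    def _open_next(self):
        # Close the current file (if any) and start a numbered successor.
        if self.f:
            self.f.close()
        path = os.path.join(self.directory, f"clicks-{self.index:05d}.log")
        self.f = open(path, "ab")
        self.index += 1

    def write(self, record: bytes):
        self.f.write(record + b"\n")
        self.f.flush()
        os.fsync(self.f.fileno())   # make the record durable before returning
        if self.f.tell() >= self.max_bytes:
            self._open_next()       # size threshold reached: roll over

    def close(self):
        self.f.close()
```

Usage would look like `w = RollingWriter("/data/clicks", 128 * 1024 * 1024)` followed by `w.write(...)` per click event. With real SequenceFiles the sync step would presumably be the writer's own sync/flush mechanism rather than `os.fsync`.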