First, I think you need to step back and think about why you want to use Hadoop and HBase in the first place.
Second, you need to think about your data and how you are planning to use it.
Beyond that, we can only give you somewhat generic answers...
1) You can create a table with 600 columns; whether you should depends on what you are trying to do. HBase stores rows sparsely, so columns you never write cost nothing, and there are some limitations you have to consider in your design. However, for the specific use case you stated... they are not applicable.
2) You can have tables with multiple column families, but again it depends on what you are trying to do. Each column family is stored in its own set of files and is flushed and compacted together, so you generally want a small number of families and should group columns that share an access pattern.
However, in your example... customer address... that's not a good use of a column family. An address is read and written along with the rest of the customer record, so it belongs as a set of columns (qualifiers) inside the same family, not as a family of its own.
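To make that concrete, here's a rough sketch in the HBase shell. All the table, family, and column names here are made up for illustration: one 'customer' table, a single 'd' (data) family holding the 600+ attributes as qualifiers, with the address fields as plain qualifiers rather than their own family:

```
# one family 'd' for the whole customer record; names are hypothetical
create 'customer', {NAME => 'd', VERSIONS => 1}

# address lives as qualifiers inside 'd', not as a separate family
put 'customer', 'cust-00001', 'd:name',        'John Smith'
put 'customer', 'cust-00001', 'd:addr_street', '1 Main St'
put 'customer', 'cust-00001', 'd:addr_city',   'Springfield'

get 'customer', 'cust-00001'
```

A second family only earns its keep when a group of columns has a clearly different access pattern from the rest of the row, e.g. large, rarely-read data you want flushed, compacted, and cached separately.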
I was going to do a schema design course at a Hadoop conference next year, but it got turned down because it was considered too 'basic'. Maybe I'll propose it for the Hadoop conference in Amsterdam... sorry, I digressed.
Have you thought about putting a schema layer on top of HBase? At a minimum Avro, or possibly WibiData's Kiji? (Not that I'm plugging Aaron's project. ;-)
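If you go the Avro route, the idea is to serialize a structured record into a cell instead of scattering hundreds of bare qualifiers. A minimal, purely hypothetical schema for just the address piece might look like this (field names invented for illustration):

```json
{
  "type": "record",
  "name": "Address",
  "namespace": "com.example.customer",
  "fields": [
    {"name": "street", "type": "string"},
    {"name": "city",   "type": "string"},
    {"name": "zip",    "type": ["null", "string"], "default": null}
  ]
}
```

The win is that the schema evolves in one place (Avro handles forward/backward compatibility via defaults) rather than being implicit in whatever qualifier names your writers happen to use.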
I am also curious... this isn't the first time this question has come up on the lists... class project?
On Nov 26, 2012, at 1:04 AM, Ramasubramanian Narayanan <[EMAIL PROTECTED]> wrote:
> I have a requirement of physicalising the logical model... I have a
> client model which has 600+ entities...
> Need suggestion how to go about physicalising it...
> I have few other doubts :
> 1) Whether is it good to create a single table for all the 600+ columns?
> 2) To have different column families for different groups or can it be
> under a single column family? For example, customer address can we have as
> a different column family?
> Please help on this..