Accumulo >> mail # dev >> Hadoop 2 compatibility issues

Re: Hadoop 2 compatibility issues
Response to Benson inline, but additional note here:

It should be noted that the situation will be made worse by the
solution I was considering for ACCUMULO-1402, which would move the
Accumulo artifacts, classified by the hadoop2 variant, into the
profiles... meaning they will no longer resolve transitively, as they
did before. I can go into details on that ticket, if needed.
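For context, a downstream dependency on such a classified artifact would look roughly like this (coordinates and version are illustrative, not the final ones from ACCUMULO-1402). A classifier selects an extra file under the same GAV, but the artifact still shares the main POM, so its dependency list is not variant-specific:

```xml
<!-- Hypothetical sketch: consuming a hadoop2-classified artifact.
     The classified jar shares the main artifact's POM, which is why
     transitive resolution of its hadoop2 dependencies suffers. -->
<dependency>
  <groupId>org.apache.accumulo</groupId>
  <artifactId>accumulo-core</artifactId>
  <version>1.5.0</version>
  <classifier>hadoop2</classifier>
</dependency>
```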

On Tue, May 14, 2013 at 7:41 PM, Benson Margulies <[EMAIL PROTECTED]> wrote:
> On Tue, May 14, 2013 at 7:36 PM, Christopher <[EMAIL PROTECTED]> wrote:
>> Benson-
>> They produce different byte-code. That's why we're even considering
>> this. ACCUMULO-1402 is the ticket under which our intent is to add
>> classifiers, so that they can be distinguished.
> whoops, missed that.
> Then how do people succeed in just fixing up their dependencies and using it?

The specific differences are things like changes from an abstract class
to an interface. Apparently importing these does not produce
compatible byte-code, even though the method signature looks the same.

> In any case, speaking as a Maven-maven, classifiers are absolutely,
> positively, a cure worse than the disease. If you want the details
> just ask.

Agreed. I just don't see a good alternative here.

>> All-
>> To Keith's point, I think perhaps all this concern is a non-issue...
>> because as Keith points out, the dependencies in question are marked
>> as "provided", and dependency resolution doesn't occur for provided
>> dependencies anyway... so even if we leave off the profiles, we're in
>> the same boat. Maybe not the boat we should be in... but certainly not
>> a sinking one as I had first imagined. It's as afloat as it was
>> before, when they were not in a profile, but still marked as
>> "provided".
>> --
>> Christopher L Tubbs II
>> http://gravatar.com/ctubbsii
>> On Tue, May 14, 2013 at 7:09 PM, Benson Margulies <[EMAIL PROTECTED]> wrote:
>>> It just doesn't make very much sense to me to have two different GAVs
>>> for the very same .class files, just to get different dependencies in
>>> the poms. However, if someone really wanted that, I'd look to make
>>> some scripting that created this downstream from the main build.
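That downstream scripting could be as small as re-attaching the already-built jar under a classifier with the deploy plugin's deploy-file goal (all coordinates and the repository URL here are placeholders, not a real setup):

```shell
# Sketch: publish an existing jar as a classified artifact after the
# main build, instead of producing it from a profile.
mvn deploy:deploy-file \
  -Dfile=target/accumulo-core-1.5.0.jar \
  -DgroupId=org.apache.accumulo \
  -DartifactId=accumulo-core \
  -Dversion=1.5.0 \
  -Dclassifier=hadoop2 \
  -Dpackaging=jar \
  -Durl=https://repo.example.org/releases \
  -DrepositoryId=example-releases
```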
>>> On Tue, May 14, 2013 at 6:16 PM, John Vines <[EMAIL PROTECTED]> wrote:
>>>> They're the same currently. I was requesting separate GAVs for Hadoop 2.
>>>> It's been on the mailing list and jira.
>>>> Sent from my phone, please pardon the typos and brevity.
>>>> On May 14, 2013 6:14 PM, "Keith Turner" <[EMAIL PROTECTED]> wrote:
>>>>> On Tue, May 14, 2013 at 5:51 PM, Benson Margulies <[EMAIL PROTECTED]> wrote:
>>>>> > I am a Maven developer, and I'm offering this advice based on my
>>>>> > understanding of the reason why that generic advice is offered.
>>>>> >
>>>>> > If you have different profiles that _build different results_ but all
>>>>> > deliver the same GAV, you have chaos.
>>>>> >
>>>>> What GAV are we currently producing for hadoop 1 and hadoop 2?
>>>>> >
>>>>> > If you have different profiles that test against different versions of
>>>>> > dependencies, but all deliver the same byte code at the end of the
>>>>> > day, you don't have chaos.
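The safe variant of profiles, then, only swaps the version that is compiled and tested against, leaving one GAV and one set of class files (property name and versions below are illustrative):

```xml
<!-- Illustrative: profiles that vary only the dependency version used
     for compiling and testing. The published artifact and its GAV stay
     the same, so there are never two different jars under one
     coordinate. -->
<profiles>
  <profile>
    <id>hadoop-1</id>
    <activation><activeByDefault>true</activeByDefault></activation>
    <properties><hadoop.version>1.0.4</hadoop.version></properties>
  </profile>
  <profile>
    <id>hadoop-2</id>
    <properties><hadoop.version>2.0.4-alpha</hadoop.version></properties>
  </profile>
</profiles>
```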
>>>>> >
>>>>> >
>>>>> >
>>>>> > On Tue, May 14, 2013 at 5:48 PM, Christopher <[EMAIL PROTECTED]> wrote:
>>>>> > > I think it's interesting that Option 4 seems to be most preferred...
>>>>> > > because it's the *only* option that is explicitly advised against by
>>>>> > > the Maven developers (from the information I've read). I can see its
>>>>> > > appeal, but I really don't think that we should introduce an explicit
>>>>> > > problem for users (that applies to users using even the Hadoop version
>>>>> > > we directly build against... not just those using Hadoop 2... I don't
>>>>> > > know if that point was clear), to only partially support a version of
>>>>> > > Hadoop that is still alpha and has never had a stable release.
>>>>> > >
>>>>> > > BTW, Option 4 was how I had achieved a solution for