
actors and the "inheritance anomaly"

12 replies
Russ P.
Joined: 2009-01-31
I had an email discussion recently with someone who claims that object-oriented inheritance is a bad idea due to something called the "inheritance anomaly." I googled it and found this paper:

http://wisnesky.net/anomaly.pdf

which says,

"The Inheritance Anomaly has been a thorn in the side of the concurrent object-oriented language community for 15 years. Simply put, the anomaly is a failure of inheritance and concurrency to work well with each other, negating the usefulness of inheritance as a mechanism for code-reuse in a concurrent setting."

The paper does not discuss actors, but I am wondering if actors solve this problem. I don't need to know this for my current work, but I am curious. Does anyone know about this topic? Thanks.

--Russ P.

--
http://RussP.us
Danielk
Joined: 2009-06-08
Re: actors and the "inheritance anomaly"
Ok, so I didn't know about this topic, and to be honest I've only had time to glance at the paper you referred to and at some other papers (Classifying Inheritance Mechanisms in Concurrent Object-Oriented Programming and The inheritance anomaly: ten years after). But it seems the issue is that when inheriting from a class containing synchronization, you often have to override many more methods than you would if synchronization were not necessary.

So this is a problem that arises when you want to implement a class A' that adds capabilities to an existing class A, and want to use inheritance to achieve code reuse. Now, if you can assume each instance of the class is *always* used by "one thread at a time" (with appropriate happens-before relationships in jvm-speak), you don't need to use synchronization in A or A', meaning you won't suffer from the problem.

Now, using actors is a disciplined way to build concurrent programs without using shared mutable state, where each instance is used by "one thread at a time". So actors can certainly help solving the problem, in the sense that your A and A' are perfectly usable in actor-based programs without any internal synchronization.

Best regards,
Daniel





Lex
Joined: 2010-02-28
Re: actors and the "inheritance anomaly"

It seems to me the term is deliberately named to be sensational. If
anything, it should be called "Synchronization Issues", not
"Inheritance Anomaly". One has to be careful with buzzwords.


Randall R Schulz
Joined: 2008-12-16
Re: actors and the "inheritance anomaly"


A very (surely overly) quick glance at that paper makes me think the
whole problem they're addressing is non-monotonic extensions in
subclasses. That is, subclasses that wish to retract some of the
semantics of the superclass, which is clearly a violation of the
Liskov Substitution Principle (LSP).

So sure, unrestricted, undisciplined inheritance can
introduce "anomalies" (otherwise known as bugs). Removing language
constructs that can potentially be used to create such anomalies will
leave us with very little to work with.

Randall Schulz

Russ P.
Joined: 2009-01-31
Re: actors and the "inheritance anomaly"
Thanks for the replies. If anyone else is more familiar with the "inheritance anomaly," I'd still like to know what they think about it and whether actors solve it.

By the way, this query started when I googled quotes of Dijkstra and came across this little gem:

"Object-oriented programming is an exceptionally bad idea which could only have originated in California."

This quote appears all over the place and is used by opponents of object-oriented programming. However, I can find very little if any elaboration from Dijkstra on exactly what he meant and why he said it. Does anyone know why Dijkstra rejected OOP?

--Russ P.

Florian Hars 3
Joined: 2011-05-08
Re: actors and the "inheritance anomaly"

On Sat, Jun 04, 2011 at 10:30:56AM -0700, Randall R Schulz wrote:
> So sure, unrestricted, undisciplined inheritance can
> introduce "anomalies" (otherwise known as bugs).

Quite disciplined mixin inheritance in Scala can do that, too, if
concurrency is involved. The part of the "Tour of Scala" that is
now http://www.scala-lang.org/node/117 used to contain another trait
"SyncIterator" that is still there on page 12 of
http://www.scala-lang.org/docu/files/ScalaOverview.pdf, with the
description:

To obtain rich, synchronized iterators over strings, one uses
a mixin composition involving three classes:

StringIterator(someString) with RichIterator[char]
with SyncIterator[char]

The result of that is of course broken: the foreach method mixed
in from RichIterator still has a race condition between the calls to
the (now synchronized) methods hasNext and next. Any class that mixes in
RichIterator and SyncIterator must also override foreach with
a synchronized call to super.foreach to get the desired semantics.
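For concreteness, here is a runnable sketch of the broken composition and the repair. The trait bodies are reconstructed from the ScalaOverview description (with modern Char instead of the old char); SyncForeach is my name for the fixing trait, not something from the paper:

```scala
abstract class AbsIterator[T] {
  def hasNext: Boolean
  def next(): T
}

class StringIterator(s: String) extends AbsIterator[Char] {
  private var i = 0
  def hasNext: Boolean = i < s.length
  def next(): Char = { val c = s.charAt(i); i += 1; c }
}

trait RichIterator[T] extends AbsIterator[T] {
  // Race: hasNext and next are two separate calls. Even if each is
  // individually synchronized, another thread can consume the last
  // element between them.
  def foreach(f: T => Unit): Unit = while (hasNext) f(next())
}

trait SyncIterator[T] extends AbsIterator[T] {
  abstract override def hasNext: Boolean = synchronized { super.hasNext }
  abstract override def next(): T = synchronized { super.next() }
}

// The repair: re-override foreach so the whole traversal holds the lock
// (the monitor is reentrant, so the inner synchronized calls are fine).
trait SyncForeach[T] extends RichIterator[T] {
  override def foreach(f: T => Unit): Unit = synchronized { super.foreach(f) }
}

val it = new StringIterator("abc") with RichIterator[Char] with SyncIterator[Char] with SyncForeach[Char]
```

Note that this is exactly the anomaly: nothing forces a class mixing in RichIterator and SyncIterator to also mix in SyncForeach; forgetting it compiles cleanly and races at runtime.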

- Florian.

Meredith Gregory
Joined: 2008-12-17
Re: actors and the "inheritance anomaly"
Dear All,
i've been mumbling about this on-list for years, now. Having been one of the designers of one of the first high performance actor-based languages, we ran into this issue early. It's usually called the inheritance-synchronization anomaly in the literature. If you specialize and relax synchronization constraints relied upon in inherited code, you will likely break the inherited code. The remedy is to ensure that you only ever narrow synchronization constraints in specializations. Language-based inheritance constructs in popular languages don't provide any support for this. So, programmers have to be aware and guarantee this on their own.
It's yet another of many reasons to abandon inheritance as a language-based mechanism for reuse. Actors, themselves, are also not a remedy for what ails you in concurrency. Of course, it's much worse when 'actors' doesn't actually mean actors, but is some loosey-goosey catch-all phrase for a programming convention that has a variety of semantics. So, 'actors' plus inheritance is guaranteed to provide a rollicking good time when it comes to programming correctness. It is definitely a combination squarely in synch with the programmer full employment act.
Best wishes,
--greg




--
L.G. Meredith
Managing Partner
Biosimilarity LLC
7329 39th Ave SW, Seattle, WA 98136

+1 206.650.3740

http://biosimilarity.blogspot.com
Danielk
Joined: 2009-06-08
Re: actors and the "inheritance anomaly"
Hi Greg,

What is the problem with combining 'actors' and inheritance (I'm assuming we're not talking about inheriting actors here, but simply calling methods in classes implemented using inheritance from actors)? Or do you mean that it's not a solution because avoiding shared mutable state when using actors in Scala relies on convention, as opposed to being enforced by the compiler?

Best regards,
Daniel


Russ P.
Joined: 2009-01-31
Re: actors and the "inheritance anomaly"
On Sat, Jun 4, 2011 at 3:17 PM, Meredith Gregory <lgreg [dot] meredith [at] gmail [dot] com> wrote:
Dear All,
i've been mumbling about this on-list for years, now. Having been one of the designers of one of the first high performance actor-based languages, we ran into this issue early. It's usually called the inheritance-synchronization anomaly in the literature. If you specialize and relax synchronization constraints relied upon in inherited code, you will likely break the inherited code. The remedy is to ensure that you only ever narrow synchronization constraints in specializations. Language-based inheritance constructs in popular languages don't provide any support for this. So, programmers have to be aware and guarantee this on their own.

Thanks for that explanation, but it's a bit over my head. If you have time to elaborate and perhaps provide a small example, that would be helpful to me.
 

It's yet another of many reasons to abandon inheritance as a language-based mechanism for reuse.

Interesting, but that statement invites a couple of questions.  What are some of the other reasons to abandon inheritance? Also, are you implying that inheritance provides benefits other than re-use, or are you implying that it should just be avoided completely?

--Russ P

 
Actors, themselves, are also not a remedy for what ails you in concurrency. Of course, it's much worse when 'actors' doesn't actually mean actors, but is some loosey-goosey catch-all phrase for a programming convention that has a variety of semantics. So, 'actors' plus inheritance is guaranteed to provide a rollicking good time when it comes to programming correctness. It is definitely a combination squarely in synch with the programmer full employment act.
Best wishes,
--greg

Meredith Gregory
Joined: 2008-12-17
Re: actors and the "inheritance anomaly"
Dear Daniel,
Thanks for your query. i meant the latter. The issue is that you have a pair of language features (inheritance and a concurrency construct) that now invite you to make mistakes. Not the best combo.
Best wishes,
--greg

Meredith Gregory
Joined: 2008-12-17
Re: actors and the "inheritance anomaly"
Dear Russ,
The list of places to go wrong with inheritance is pretty well explored, from the fragile base class (arguably a variant of which just reared its ugly head in combination with variance issues in Set vs List in Scala) to the synchronization inheritance anomaly. At the core of this issue is that is-a relationships almost always hold only in context. Inheritance sublimates the context. This is exactly the opposite of what is needed for maximal reuse -- and actually at odds with O-O principles. Instead, reify the context, give it an explicit computational representation, and you will get more reuse.
Thus, in the Set vs List case, it's only in some context that we agree that a Set[A] may be interpreted as a function A => Boolean. The context has very specific assumptions about what a function is and what a Set is.[1] If we reify that context, we arrive at something like
trait SetsAreFunctions[A,B] { /* you could make this implicit */ def asFunction( s : Set[A] ) : A => B }
Notice that one "specialization" of this SetsAreFunctions[A,Float] is a candidate for representing fuzzy sets. Another "specialization" SetsAreFunctions[A,Option[Boolean]] is a candidate for various representations of partial functions. In this design our assumptions about the meaning of Set and function have been reified. Further, the act of specialization is much more clear, much more explicit and more highly governed by compiler constraints. If you compare it to what has been frozen into the Scala collections, it has a wider range, is mathematically and computationally more accurate and doesn't introduce an anomalous variance difference between collections that mathematical consistency demands be the same (List comes from the monad of the free monoid, Set is an algebra of this monad guaranteeing idempotency and commutativity of the operation -- variance should not differ!).
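Spelled out a bit, the reified context and the two specializations might look like this (the trait name is from the post; the concrete instances are illustrative, not a worked-out design):

```scala
// The reified context: an explicit witness that, under some agreed
// interpretation, a Set[A] can be read as a function A => B.
trait SetsAreFunctions[A, B] {
  def asFunction(s: Set[A]): A => B
}

// The classical context: membership as a total function to Boolean.
class CrispSets[A] extends SetsAreFunctions[A, Boolean] {
  def asFunction(s: Set[A]): A => Boolean = s.contains
}

// "Specialization" to A => Float: the crisp embedding of fuzzy sets
// (a genuinely fuzzy instance would return intermediate degrees).
class FuzzySets[A] extends SetsAreFunctions[A, Float] {
  def asFunction(s: Set[A]): A => Float =
    a => if (s.contains(a)) 1.0f else 0.0f
}

// "Specialization" to A => Option[Boolean]: a partial-function reading,
// where None marks elements outside the domain of discourse.
class PartialSets[A](domain: Set[A]) extends SetsAreFunctions[A, Option[Boolean]] {
  def asFunction(s: Set[A]): A => Option[Boolean] =
    a => if (domain.contains(a)) Some(s.contains(a)) else None
}
```

Here the specializations are new instances of the context, checked by the compiler against the trait's signature, rather than subclasses silently reinterpreting an inherited is-a relationship.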
As for an example of inheritance synchronization anomaly, let's see if i can construct off the top of my head the standard example from the literature (which can be had by googling inheritance synchronization anomaly). We might imagine a family of Buffer classes with (atomic) put and get methods together with an accept predicate satisfying the following conditions:
  • b : Buffer & accept( b, get ) => size( b ) > 0
  • b : Buffer & accept( b, put ) => size( b ) < Buffer.maxSize
These represent synchronization constraints that allow for harmonious engagement between a Buffer and many clients. If a specialization of Buffer provides a consumer behavior getting more than 1 element at a time on a get, the synchronization constraints will be violated. Successful specializations have to narrow the constraints.
While this example might seem contrived (and doubly so since i'm just pulling it off the top of my head), versions of this very example occur more naturally in the wild if you consider the relationship between the alternating bit protocol and the sliding window protocol.
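A rough Scala rendering of that Buffer family (put, get, accept and maxSize are from the description above; everything else is assumed for illustration):

```scala
import scala.collection.mutable

object Buffer { val maxSize = 4 }

class Buffer[T] {
  private val q = mutable.Queue.empty[T]
  def size: Int = synchronized { q.size }

  // The synchronization constraints:
  //   accept(b, get) => size(b) > 0
  //   accept(b, put) => size(b) < Buffer.maxSize
  def acceptGet: Boolean = synchronized { q.size > 0 }
  def acceptPut: Boolean = synchronized { q.size < Buffer.maxSize }

  def put(x: T): Boolean = synchronized {
    if (acceptPut) { q.enqueue(x); true } else false
  }
  def get(): Option[T] = synchronized {
    if (acceptGet) Some(q.dequeue()) else None
  }
}

// A specialization that consumes two elements per get.
class GreedyBuffer[T] extends Buffer[T] {
  // BROKEN: acceptGet (size > 0) licenses only ONE removal, so the
  // second dequeue can fail. A correct specialization must narrow the
  // inherited constraint (size > 1) before removing two elements.
  override def get(): Option[T] = synchronized {
    val first = super.get()
    super.get() // may be None: the inherited guard was too weak
    first
  }
}
```

The subclass type-checks and inherits all the synchronization machinery, yet silently weakens the contract the base class's clients rely on; that mismatch is the anomaly.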
Best wishes,
--greg
[1] This sentence actually summarizes a major content of the interaction of computing and mathematics over the last 70 years. A real understanding of the various proposals of what a function is has been at the core of the dialogue between mathematics and computing. Category Theory, arguably, giving the most flexible and pragmatic account to date -- but seriously lacking (imho) in a decent account of concurrent computing.

Russ P.
Joined: 2009-01-31
Re: actors and the "inheritance anomaly"
Thanks for that explanation, Greg. Just to be sure I am "seeing the big picture," let me ask a basic question. If I understand you correctly, you are saying that inheritance and concurrency can be successfully combined if done carefully. Is that correct, or are you saying that inheritance and concurrency should never be used together? Or are you saying something else altogether?

--Russ P.


Meredith Gregory
Joined: 2008-12-17
Re: actors and the "inheritance anomaly"
Dear Russ,
Thanks for your patience and diligence in communicating! Yes, i am saying they can be combined. In Rosette we looked at the mechanism of enabled sets. The semantics of our become method allowed the programmer to specify which methods were "accepted" (to use the language of the previous email). Method invocations not in the accepted set remained undispatched in the mailbox until they were in the accepted set. However, though we had inheritance, we did not have language-level support to allow compiler validation that specializations narrowed constraints.
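A toy, single-threaded sketch of that enabled-set mechanism as described above (all names are illustrative, not Rosette's): messages whose tag is not currently enabled simply stay in the mailbox until a state change enables them.

```scala
import scala.collection.mutable

sealed trait Msg
final case class Put(x: Int) extends Msg
case object Get extends Msg

// A one-place cell: when empty, only Put is enabled; when full, only Get.
class BoundedCell {
  private val mailbox = mutable.Queue.empty[Msg]
  private var contents: Option[Int] = None
  val log = mutable.Buffer.empty[String]

  // The enabled set, expressed as a predicate over the current state.
  private def enabled(m: Msg): Boolean = (m, contents) match {
    case (Put(_), None)    => true  // empty cell accepts Put
    case (Get,    Some(_)) => true  // full cell accepts Get
    case _                 => false
  }

  def send(m: Msg): Unit = { mailbox.enqueue(m); dispatch() }

  // Dispatch any enabled message; disabled ones wait in the mailbox.
  private def dispatch(): Unit =
    mailbox.dequeueFirst(enabled).foreach { m =>
      m match {
        case Put(x) => contents = Some(x); log += s"put $x"
        case Get    => log += s"got ${contents.get}"; contents = None
      }
      dispatch() // the state change may have enabled a waiting message
    }
}
```

The anomaly shows up again here: a subclass that overrides enabled to admit more messages than the handlers can safely process breaks the invariant, and nothing in the language checks that overrides only narrow the set.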
Greg Lavender, who was one of the most active members of the Rosette team (and has headed up many groups at Sun and now Oracle), argued that you could use a variant of enabled sets to overcome the inheritance synchronization anomaly. i don't remember if he worked out automated support. i'm sure his papers on the subject (along with his excellent reviews of the various actor-based and concurrent languages -- from ABCL to POOL-T) come up if you execute the google search i suggested.
Though you didn't ask, i will share that these and other considerations launched my 10 year investigation of types for concurrency. Based on my experience, types for concurrency is the most directionally correct response to the issues facing us in developing high performance concurrent and distributed programs.
Best wishes,
--greg

On Sun, Jun 5, 2011 at 12:22 PM, Russ Paielli <russ [dot] paielli [at] gmail [dot] com> wrote:
Thanks for that explanation, Greg. Just to be sure I am "seeing the big picture," let me ask a basic question. If I understand you correctly, you are saying that inheritance and concurrency can be successfully combined if done carefully. Is that correct, or are you saying that inheritance and concurrency should never be used together? Or are you saying something else altogether?

--Russ P.


On Sat, Jun 4, 2011 at 8:46 PM, Meredith Gregory <lgreg [dot] meredith [at] gmail [dot] com> wrote:
Dear Russ,
The list of places to go wrong with inheritance is pretty well explored from the fragile base class (arguably a variant of which just reared it's ugly head in combination with variance issues in Set vs List in Scala) to the synchronization inheritance anomaly. At the core of this issue is that is-a relationships are almost always in context. Inheritance sublimates the context. This is exactly the opposite of what is needed for maximal reuse -- and actually at odds with O-O principles. Instead, reify the context, give it an explicit computational representation and you will get more reuse. 
Thus, in the example the Set vs List case, it's only in some context that we agree that a Set[A] may be interpreted as a function A => Boolean. The context has very specific assumptions about what a function is and what a Set is.[1] If we reify that context, we arrive at something like
trait SetsAreFunctions[A,B] { /* you could make this implicit */ def asFunction( s : Set[A] ) : A => B }
Notice that one "specialization" of this SetsAreFunctions[A,Float] is a candidate for representing fuzzy sets. Another "specialization" SetsAreFunctions[A,Option[Boolean]] is a candidate for various representations of partial functions. In this design our assumptions about the meaning of Set and function have been reified. Further, the act of specialization is much more clear, much more explicit and more highly governed by compiler constraints. If you compare it to what has been frozen into the Scala collections, it has a wider range, is mathematically and computationally more accurate and doesn't introduce an anomalous variance difference between collections that mathematical consistency demands be the same (List comes from the monad of the free monoid, Set is an algebra of this monad guaranteeing idempotency and commutativity of the operation -- variance should not differ!).
As for an example of the inheritance synchronization anomaly, let's see if i can construct off the top of my head the standard example from the literature (which can be had by googling "inheritance synchronization anomaly"). We might imagine a family of Buffer classes with (atomic) put and get methods together with an accept predicate satisfying the following conditions:
  • b : Buffer & accept( b, get ) => size( b ) > 0
  • b : Buffer & accept( b, put ) => size( b ) < Buffer.maxSize
These represent synchronization constraints that allow for harmonious engagement between a Buffer and many clients. If a specialization of Buffer provides a consumer behavior getting more than 1 element at a time on a get, the synchronization constraints will be violated. Successful specializations have to narrow the constraints.
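The constraints above can be sketched with plain JVM monitors. This is a minimal illustration, not any library's API: the names Buffer, PairBuffer, and getTwo are hypothetical, and the accept predicates become the wait-loop guards:

```scala
import scala.collection.mutable

// accept(b, get) => size(b) > 0 ; accept(b, put) => size(b) < maxSize
class Buffer[A](val maxSize: Int) {
  protected val items = mutable.Queue.empty[A]

  def size: Int = synchronized { items.size }

  def put(a: A): Unit = synchronized {
    while (items.size >= maxSize) wait() // block until accept(put)
    items.enqueue(a)
    notifyAll()
  }

  def get(): A = synchronized {
    while (items.isEmpty) wait()         // block until accept(get)
    val a = items.dequeue()
    notifyAll()
    a
  }
}

// A specialization that consumes two elements at a time. It cannot
// simply reuse get's guard (size > 0): after one element is taken the
// buffer may be empty and a second naive get would block while holding
// expectations the parent never made. The subclass must re-express the
// synchronization with the *narrowed* guard size > 1 -- which is the
// anomaly: the synchronization code does not inherit cleanly.
class PairBuffer[A](maxSize: Int) extends Buffer[A](maxSize) {
  def getTwo(): (A, A) = synchronized {
    while (items.size < 2) wait()        // narrowed: accept(getTwo) => size > 1
    val a = items.dequeue()
    val b = items.dequeue()
    notifyAll()
    (a, b)
  }
}
```

Note that getTwo had to reach into the protected state and rebuild the guard from scratch; inheriting put and get bought essentially no reuse for the synchronization logic.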
While this example might seem contrived (and doubly so since i'm just pulling it off the top of my head), versions of this very example occur more naturally in the wild if you consider the relationship between the alternating bit protocol and the sliding window protocol.
Best wishes,
--greg
[1] This sentence actually summarizes a major theme of the interaction of computing and mathematics over the last 70 years. A real understanding of the various proposals of what a function is has been at the core of the dialogue between mathematics and computing. Category Theory arguably gives the most flexible and pragmatic account to date -- but one seriously lacking (imho) in a decent account of concurrent computing.

On Sat, Jun 4, 2011 at 5:04 PM, Russ Paielli <russ [dot] paielli [at] gmail [dot] com> wrote:
On Sat, Jun 4, 2011 at 3:17 PM, Meredith Gregory <lgreg [dot] meredith [at] gmail [dot] com> wrote:
Dear All,
i've been mumbling about this on-list for years, now. Having been one of the designers of one of the first high-performance actor-based languages, we ran into this issue early. It's usually called the inheritance-synchronization anomaly in the literature. If you specialize and relax synchronization constraints relied upon in inherited code, you will likely break the inherited code. The remedy is to ensure that specializations only ever narrow synchronization constraints. Language-based inheritance constructs in popular languages don't provide any support for this, so programmers have to be aware and guarantee it on their own.

Thanks for that explanation, but it's a bit over my head. If you have time to elaborate and perhaps provide a small example, that would be helpful to me.
 

It's yet another of many reasons to abandon inheritance as a language-based mechanism for reuse.

Interesting, but that statement invites a couple of questions. What are some of the other reasons to abandon inheritance? Also, are you implying that inheritance provides benefits other than reuse, or are you implying that it should just be avoided completely?

--Russ P

 
Actors, themselves, are also not a remedy for what ails you in concurrency. Of course, it's much worse when 'actors' doesn't actually mean actors, but is some loosey-goosey catch-all phrase for a programming convention with a variety of semantics. So, 'actors' plus inheritance is guaranteed to provide a rollicking good time when it comes to programming correctness. It is definitely a combination squarely in synch with the programmer full employment act.
Best wishes,
--greg




--
L.G. Meredith
Managing Partner
Biosimilarity LLC
7329 39th Ave SW, Seattle, WA 98136

+1 206.650.3740

http://biosimilarity.blogspot.com



--
http://RussP.us




Copyright © 2012 École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland