
Re: Questions about axiomatic programming (was: Native Compiler for Scala)

3 replies
Naftoli Gugenheim

A program is-a mathematical construct? Uh-oh, that sounds like inheritance. :)
Seriously though, mathematics may see a program as a kind of mathematical construct, but that doesn't mean it's the only way to understand what a program is.
Some may believe the universe itself is a mathematical construct, which again may be true in a mathematical sense, but there's much more to it than mathematics can express.

-------------------------------------
Daniel Sobral wrote:

Roland, you can't dissociate programs from math, because programs are
mathematical constructs, whether you realize it or not.

Now, the question is not so much about theorem proving -- and, by the way,
any program you compile with a statically typed language is already proving a
lot of things, no matter how many lines it has -- as about *soundness*. By
soundness I mean the ability to write code with the confidence that it will
do what you expect, as long as you correctly follow a set of rules.
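
As a small aside on the "compiling proves things" point: by the Curry-Howard correspondence, a type signature is a proposition and a type-checked implementation is a proof of it. A minimal Scala sketch (the method name k is just illustrative, after the K combinator):

    // Type-checking this method checks a proof of A => (B => A),
    // the K axiom of propositional logic.
    def k[A, B](a: A): B => A = (b: B) => a

    // Parametricity gives a similar guarantee: the only total,
    // side-effect-free implementation of this signature returns its argument.
    def identity[A](a: A): A = a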

For example, we expect == to be reflexive, symmetric, and transitive. A
sound language enables us to define such an operation over a set of
values with the confidence that these rules will hold. It just so happens
that in the presence of inheritance no such guarantee can be made. You may
write your code correctly, but external code, outside your control, might
make that untrue.

That means the foundations of the language are unsound.
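
To make that concrete, here is a minimal Scala sketch (Point and ColoredPoint are hypothetical names) of external code breaking the symmetry we expect of ==:

    class Point(val x: Int, val y: Int) {
      override def equals(other: Any): Boolean = other match {
        case p: Point => p.x == x && p.y == y
        case _        => false
      }
    }

    // External code, outside the original author's control:
    class ColoredPoint(x: Int, y: Int, val color: String) extends Point(x, y) {
      override def equals(other: Any): Boolean = other match {
        case cp: ColoredPoint => super.equals(cp) && cp.color == color
        case _                => false
      }
    }

    val p  = new Point(1, 2)
    val cp = new ColoredPoint(1, 2, "red")
    p == cp  // true:  Point#equals ignores color
    cp == p  // false: symmetry is broken by the subclass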

So, yes, you can write software without caring about mathematics. But if the
*language designer* doesn't care about mathematics, you'll be given a flawed
tool to work with.

On Wed, Nov 18, 2009 at 6:49 AM, Roland Kuhn wrote:

> What you describe below coincides quite nicely with the typical learning
> showcase of OOP. This is a tool like many others, and I completely agree
> with the "choose the best tool for the job" mantra, especially when
> programming for money. So, as a side remark, when you don't need to model
> the world, then don't do it.
>
> On the other hand, there have been advances in computer science towards an
> axiomatic view of programming. Mathematics, as you probably know, has
> nothing to do with the real world in the sense that it does not even need a
> universe to exist (that is the reason I studied physics instead). In that
> sense, mathematics is abstraction in its pure form. The ability to solve
> problems without reference to our daily experiences allows such wondrous
> techniques as transferring a problem from one domain (say geometry) to
> another (say algebra) in order to prove it. So, one might be inclined to
> take this to the extreme and transform computer programming into theorem
> proving. Not wanting to put words into other people's mouths, I presume
> that this point of view might help in understanding some of the previous
> posts in this thread.
>
> I have one question, though, which has been on my mind for quite some time.
> While I accept that a program can be transformed and analyzed under a set of
> axioms, what is the point? To my knowledge, the largest program ever proven
> consists of around 8000 lines of code, and that was a major effort. This
> seems to indicate that this technique is not ready for the real world, where
> every non-trivial project exceeds this size by a large factor.
>
> One other question comes to mind. If you can prove that a program adheres
> to its specification for all possible inputs, does that not imply that the
> set of such programs must be restricted somehow? If not, I would have
> difficulties reconciling this with the answer to the halting problem.
>
> If someone could enlighten me, I'd be grateful even for some links.
>
> Regards,
>
> Roland
>
> On Nov 18, 2009, at 06:30 , Naftoli Gugenheim wrote:
>
> Sure, interfaces are also an is-a relationship. They're a limited form of
>> multiple inheritance.
>> I think this thread shows that people approach programming paradigms
>> differently.
>> Some people view programming from a mathematical/functional angle. So
>> inheritance is an odd specialization of a function.
>> Some people take the approach that it's all just a matter of tools, and
>> whatever covers your immediate needs best should be used. If you need a
>> simple derivation use inheritance; for a more complex one use other
>> techniques.
>> My approach is that programs are a virtual representation of the world.
>> OOP is about modelling the world, but I don't take that to mean only
>> physical objects; it means modelling real-world concepts. A transaction, a
>> material, or a button is a concept in the real world. A Role or a Request,
>> although not tangible, are real concepts that exist. A program models those
>> concepts, and the closer it adheres to the way the world is structured, the
>> more comprehensible and coherent that program is, and the more likely it is
>> to be designed well, because you can reason about it better.
>> Thus, it makes the most sense to declare Animal as a class and Cat as a
>> subclass of Animal. Because a cat is indeed an animal, anything that applies
>> to animals automatically applies to cats, unless Cat overrides it. This is
>> not a convenience for delegation. It's the way it really is.
>> Traits make sense, because besides a cat being a mammal, it also has a
>> tail (maybe not the best use of a trait but I think it gets the point
>> across). Mammals and fish (don't know if this is true but you get the point)
>> may or may not have tails. So a cat is-a Mammal and it is-a TailedAnimal.
>> That is how it is in the real world, and therefore by sticking as close to
>> this hierarchy as possible you have increased ability to reason about the
>> program.
>> Of course there may be many corner cases where it's not obvious whether A
>> is a B. You have to use your judgement. At the end of the day the ancestor's
>> contract should best model the real-world concept it represents, so that
>> descendants don't break it.
>> So I postulate that the differences of opinion in this thread stem from
>> these different programming-worldviews.
>> Eagerly awaiting any thoughts on my wild philosophising!
>>
>>
>> -------------------------------------
>> Raoul Duke wrote:
>>
>> On Tue, Nov 17, 2009 at 6:39 AM, Ricky Clarkson wrote:
>>
>>> The reason inheritance is detrimental when used to model is-a, is that
>>> everyone has a different idea of is-a, and a compiler cannot help you
>>> to restrict it to any safe subset. A square is a rectangle.
>>>
>>
>> if i have an interface, and i have things which implement that
>> interface, and i pass different instances of those implementations
>> into the same code at different times, that seems to me to also be
>> assuming a particular meaning about how the things relate, about what
>> it means to implement that interface, and that doesn't seem on the
>> face of it to me much different than is-a via inheritance. cowboy,
>> artist, draw()!
>>
>> (i'm not wed to anything, i'm just trying to learn.)
>>
>> sincerely.
>>
>
> --
> I'm a physicist: I have a basic working knowledge of the universe and
> everything it contains!
> - Sheldon Cooper (The Big Bang Theory)
>
>
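
As a minimal Scala sketch of the Animal/Cat/TailedAnimal modelling described in the quoted post (the members are illustrative assumptions):

    trait Animal {
      def name: String
    }

    trait TailedAnimal extends Animal {
      def tailLength: Double
    }

    trait Mammal extends Animal

    // A cat is-a Mammal and is-a TailedAnimal; traits express both at once.
    class Cat(val name: String, val tailLength: Double)
      extends Mammal with TailedAnimal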

Ricky Clarkson
Re: Questions about axiomatic programming (was: Native Compiler for Scala)

Always bear in mind that there is an infinite amount of mathematics
that we don't know about.

2009/11/18 Naftoli Gugenheim:
> A program is-a mathematical construct? Uh-oh, that sounds like inheritance. :)
> [...]

ounos
Re: Questions about axiomatic programming (was: Native Compiler for Scala)
Plus an infinite number of true statements that cannot be proven mathematically :)

2009/11/18 Ricky Clarkson <ricky [dot] clarkson [at] gmail [dot] com>
Always bear in mind that there is an infinite amount of mathematics
that we don't know about.

2009/11/18 Naftoli Gugenheim <naftoligug [at] gmail [dot] com>:
> A program is-a mathematical construct? Uh-oh, that sounds like inheritance. :)
> [...]



--
Ricky Clarkson
Java and Scala Programmer, AD Holdings
+44 1565 770804
Skype: ricky_clarkson
Google Talk: ricky [dot] clarkson [at] gmail [dot] com
Google Wave: ricky [dot] clarkson [at] googlewave [dot] com

dcsobral
Re: Questions about axiomatic programming (was: Native Compiler for Scala)
Ok, let me rephrase that, then.

Any program that can be run by a computer is a mathematical construct. Programs that cannot in any way be translated to and run on a computer are free not to be mathematical constructs.

Or, to put it another way: when you pick up apples at the grocery store, you may choose not to see any relation between your actions and mathematics. Yet, each time you pick an apple, the number of apples in the basket increases according to basic arithmetic laws.

Now, back to programs running on computers... as long as you don't want your program to run reliably -- and by "reliably" I mean give the results you expect -- you really don't need to care about mathematics. But _only_ mathematics will give you reliability.
And, to put it in still another way: math is about symbol manipulation according to rules. If you are manipulating symbols, you are using math. If you do it intuitively, you'll get intuitive results -- i.e., results which look right, and may or may not be right.
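
A tiny Scala illustration of intuitive symbol manipulation that looks right but isn't (assuming IEEE-754 Doubles, as on the JVM):

    0.1 + 0.2 == 0.3                         // false: the sum is 0.30000000000000004
    (0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3)   // false: + on Double is not associative
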
On Wed, Nov 18, 2009 at 12:56 PM, Naftoli Gugenheim <naftoligug [at] gmail [dot] com> wrote:
A program is-a mathematical construct? Uh-oh, that sounds like inheritance. :)
[...]


--
Daniel C. Sobral

Veni, vidi, veterni.
