Daniel Lepage on 28 Apr 2003 22:43:01 -0000


Re: [spoon-discuss] Why we need OO

On Monday, April 28, 2003, at 09:51  AM, Rob Speer wrote:

On Mon, Apr 28, 2003 at 06:19:11AM -0400, bd wrote:
Okay, either 'Root' or 'Class' (note the uppercase). I just think 'Thingy'
is... bad.

There's already a Class class. It defines the properties that classes
have. Every class is an instance of it. Most instances will not in fact
be classes themselves.
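This "Class class" is exactly the role a metaclass plays in Python, where `type` is the class of every class: every class is an instance of `type`, but most instances of classes are not themselves classes. A quick illustration (standard Python, nothing assumed beyond the language itself):

```python
# The "Class class" in Python is type: every class is an instance of it.
class Spoon:
    pass

assert isinstance(Spoon, type)   # the class Spoon is an instance of type
assert isinstance(type, type)    # type is even an instance of itself

# But an ordinary instance is not a class:
s = Spoon()
assert isinstance(s, Spoon)
assert not isinstance(s, type)   # most instances are not classes themselves
```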

Again, the name of a class describes what its derived objects are, not
what the class itself is. Here are some examples of Thingies:

* the number 17
* my last spoon-business post
* the 'b' at the end of my name
* Unbridled Hostility
* failing diplomacy with Thermodynomic
* me
* you
* a Ford Prefect Gnome

In fact, the first five are Concepts as well. The last three should be
defined as Objects eventually. But all of them are Thingies, because
they exist in some form within the game. Would you prefer that they
should all be called Roots?

It seems to me that some of these don't need to be defined like this. An Integer, for example, is simply a Number whose value is an integer; in that case, I see no reason not to use the integers defined in standard English, rather than a special Integer class that we define as being a standard English integer. AFAICT, there's no reason the ClassList can't start with what you're currently calling an 'Object' and go from there...

I'll consider "Thing" instead of "Thingy", but I wanted to capture the
vagueness of that class and the fact that anything at all can be in it.
I don't want to rename "Object", since this is what people are almost
always thinking of when they say "object". If I call the root class
Object, then someone may make a rule involving "an object" and suddenly
the number 3 is able to do stuff.
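The naming argument above can be sketched in Python (the class names come from this thread; the hierarchy itself is my assumption, not a spec): if the root class is called Thingy and Object is a narrower subclass, then a rule quantifying over "objects" cannot accidentally pick up the number 3.

```python
class Thingy:
    """Root class: anything that exists in any form within the game."""

class Concept(Thingy):
    """Abstract game entities, e.g. numbers or posts (hypothetical)."""

class Object(Thingy):
    """Entities that can act or be acted upon, e.g. players (hypothetical)."""

three = Concept()    # stand-in for the number 3
player = Object()    # stand-in for a player

# A rule about "an object" only catches Objects, not every Thingy:
assert isinstance(player, Object)
assert not isinstance(three, Object)   # the number 3 can't "do stuff"
assert isinstance(three, Thingy)       # but it still exists in the game
```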

What's wrong with everything being a first-class object (says the Python Programmer)?

In some languages (Ruby comes to mind) it's legal to say 3.times { blurf } and the like; I never really saw the point (it seems to overwork the interpreter/compiler, forcing it to distinguish an integer followed by a method call from a floating-point literal), but it has been done.
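For comparison, Python integers are already first-class objects with a class and methods, no special wrapper needed, which is the point being made. A small demonstration (standard Python only):

```python
# In Python, 3 is a full object out of the box:
assert isinstance(3, object)
assert (3).__class__ is int
assert (3).bit_length() == 2   # integers carry methods; 3 is 0b11

# The parsing wrinkle mentioned above is real here too: "3.bit_length()"
# without parentheses is a syntax error, because the tokenizer reads "3."
# as the start of a float literal. Writing (3).bit_length() avoids it.
```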

But anyway, why do we need to define our own version of '3'? 3 is an object, but it does not need to be defined as one in the context of the Nomic, as it is already defined as one in the larger scope of Real Life.


spoon-discuss mailing list