Be lazy. Machines should serve human beings. Often programmers serve machines unconsciously. Let machines serve you. Do everything you can to make you lazy. (Yukihiro Matsumoto)

Object Orientation and Ruby

Ruby is an object oriented programming language.

When I was in college in the early 90s, OO was a peripheral subject. We learned about object oriented programming, but it was regarded as a sideshow. There were special cases and certain applications that benefited from OO, but it wasn’t the main event. We were focused on procedural programming, in highly structured languages like Pascal, or in lower-level languages like C, where we could get real speed advantages.

OO has now become a fundamental approach in most high level programming languages. Ruby was developed, in part, to integrate object orientation throughout the programming language. Everything in Ruby is an object, for better and worse.

Even a number.

So you can write code like:

1.+(2)
This takes the object “1”, which is an instance of the Fixnum class, and runs the “+” method on it, which adds the argument, in this case “2”. The result is “3”.

Now, that’s kinda funny, so you can also just write:

1 + 2

and all works just fine. I suppose that is a convenience when dealing with numbers, but it is good to always remember that everything is implemented as an object.
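Because even integer literals are full objects, they respond to ordinary method calls. A quick sketch (all of these are standard Ruby core methods):

```ruby
# Integers are objects, so method-call syntax works on them directly.
puts 1.+(2)      # operator syntax is really a method call: 3
puts 2.even?     # predicate method on an integer: true
puts (-3).abs    # absolute value as a method: 3
puts 10.to_s(2)  # binary string representation: "1010"
```

The first line is the same expression as 1 + 2; the operator form is just sugar for the method call.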

Generally speaking, objects contain both their data structures and their code. In the parlance, the data are called “attributes” or “properties” and the pieces of code are called “methods.” On occasion, it is useful to reference the instance of the object in a method, in order to operate on the instance itself. For that, you use the keyword self. Generally, self refers to the current object. But self can also refer to the current class. Ruby uses context to distinguish what exactly self means. So, for example:

class Person

    @@count = 0

    def initialize
        @name = ''
        @@count += 1
    end

    def name
        @name
    end

    def self.count
        @@count
    end
end

In the example above, the name method returns the instance variable @name. In that case, self refers to the instance created by initialize. However, in the method declaration def self.count, self now refers to the class, since using self in a method declaration makes the method a class method.

This is both confusing and intuitive. It’s confusing because the word ‘self’ is being used to refer to two different things. But it is also intuitive, so long as you consider the context. When you are defining a class, at the top level of that class, self should refer to the class itself, like in the count method example. But, within an instance method, self naturally refers to the instance of the class, since the method will be used and ‘owned’ by instances.
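To make the two meanings concrete, here is a small self-contained sketch; the Counter class and its method names are made up for illustration:

```ruby
class Counter
  @@count = 0

  def initialize
    @@count += 1          # runs once per instance
  end

  def whoami
    self                  # inside an instance method, self is this instance
  end

  def self.count          # self in the declaration makes this a class method
    @@count
  end
end

c = Counter.new
puts c.whoami.equal?(c)   # true: self was the instance itself
puts Counter.count        # 1: called on the class, not an instance
```

The same keyword appears in both places, but the receiver it names depends entirely on where it appears.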

Well, ok, maybe it isn’t super intuitive. And there is a good chance that I got the details a bit wrong here….But that’s the idea.