To Paraphrase: I'm Sorry I Invented The Phrase Object-Oriented Programming!

Today is October 15th, 2007. Very few people celebrate this day, but I think it's an important day to remember... because today is the ninth anniversary of the day that Alan Kay, inventor of the term "object-oriented programming" (OOP), apologized for inventing the phrase on the Squeak developers' list.

Apparently he was a bit sad to see how the term had become abused:

I'm sorry that I long ago coined the term "objects" for this topic because it gets many people to focus on the lesser idea. The big idea is "messaging"...

The key in making great and growable systems is much more to design how its modules communicate rather than what their internal properties and behaviors should be. Think of the internet -- to live, it (a) has to allow many different kinds of ideas and realizations that are beyond any single standard and (b) to allow varying degrees of safe interoperability between these ideas.

There's a theory in linguistics (Sapir-Whorf hypothesis) that says your ability to understand something is severely limited by the words you use to describe it... Some Amazon tribes don't have words for numbers, which limits their ability to understand how much they have...

For developers, who spend a lot of time in a 100% abstract world, the words you use severely limit how you can think about an abstract problem. By definition, abstract problems do not lend themselves well to existing language structures... and if you live and breathe abstract problems, naming things significantly restricts how you can think about them. Now, sometimes it's a great idea to limit abstract thought with concrete words... such as verbose variable names. Other times, it can be severely misleading or impair one's ability to understand the actual problem... such as verbose variable names.

There are a lot of linguistic traps in programming that affect how you can solve problems... you're dealing with a world that cannot be easily visualized or verbalized. This means that in order to communicate ideas between developers, you're always using bad analogies. These analogies get worse and worse as you try to communicate them to non-programmers. As long as everybody agrees that words like "objects" and "layers" and "logic" are necessary evils, we're OK... but when people start obsessing about the right way to do "pure object-oriented programming", or they start talking about "business logic" as a layer, then we have major problems.

A lot of the Java object-oriented patterns books make people unlearn what they have learned... the right ways to use factory objects, interfaces, inheritance, etc. all seem to be attempts at messaging: either informing somebody of what they can do (inheritance, strategy), or informing somebody how their behavior can be modified (inversion of control, dependency injection).
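
To make that concrete, here's a minimal Java sketch (all class names are made up for illustration): the strategy interface tells callers what an Archiver can do, and constructor injection tells them how its behavior can be changed.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

// The interface is the "message vocabulary": it tells callers what any
// compressor can do, without pinning down how.
interface Compressor {
    byte[] compress(byte[] data);
}

class GzipCompressor implements Compressor {
    public byte[] compress(byte[] data) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(data);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return out.toByteArray();
    }
}

class Archiver {
    private final Compressor compressor;

    // Constructor injection tells callers how Archiver's behavior can be
    // modified: hand it a different Compressor, don't edit Archiver itself.
    Archiver(Compressor compressor) {
        this.compressor = compressor;
    }

    byte[] archive(byte[] data) {
        return compressor.compress(data);
    }
}

public class StrategyDemo {
    public static void main(String[] args) {
        Archiver archiver = new Archiver(new GzipCompressor());
        byte[] packed = archiver.archive("hello hello hello".getBytes());
        System.out.println(packed.length + " bytes after compression");
    }
}
```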

The latest pony in this show is probably Google Guice: a slimmed-down version of the somewhat bloated Spring Framework for Java. They claim "pure dependency injection", so much so that you never need factory objects or static methods ever again! Sadly, my sources say the truth is vastly different.

Of course, Python and Ruby fans would mock people for using a language like Java that needs so many patterns in the first place... whereas with scripting languages and data-driven programming, such ideas are nearly intuitive...

Regardless, I'm glad things are moving in the right direction... All this "giving things names" does help when it comes to communicating how software works, plus people seem to be aware of the inherent dangers of verbalizing code... the "patterns for patterns' sake" crowd seems smaller than the "objects for objects' sake" crowd, which is A Good Thing ©

Anyway, happy Ninth "I'm Sorry I Invented The Term Object-Oriented Programming" Anniversary!

I wonder what the software world will be like on the tenth anniversary?

comments

Guice

You definitely still need factories with Guice, just fewer of them. With plain Java, you have to decide up front: do I call the constructor directly or do I write a factory? If I call the constructor, and I decide later that I need more abstraction, I have to write the factory and then go back and change all the usages. If I write the factory up front, I'll probably write a lot of unnecessary abstraction. With Guice, you can have an object injected instead of calling a constructor. Later, you can replace that object's class with an interface and introduce a factory without having to go back and change any of the usages.
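
For illustration, here's a minimal sketch of that workflow. The Guice calls (Guice.createInjector, AbstractModule, @Inject, bind/to) are the real API; the domain classes are invented, and the sketch shows the end state where the injected dependency has already become an interface.

```java
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Inject;
import com.google.inject.Injector;

interface Mailer {
    void send(String message);
}

class SmtpMailer implements Mailer {
    public void send(String message) {
        System.out.println("smtp: " + message);
    }
}

class OrderProcessor {
    private final Mailer mailer;

    @Inject
    OrderProcessor(Mailer mailer) { // injected, never constructed by hand
        this.mailer = mailer;
    }

    void process() {
        mailer.send("order shipped");
    }
}

public class GuiceDemo {
    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new AbstractModule() {
            @Override
            protected void configure() {
                // Swapping implementations means editing this one binding,
                // not every call site that needs an OrderProcessor.
                bind(Mailer.class).to(SmtpMailer.class);
            }
        });
        injector.getInstance(OrderProcessor.class).process();
    }
}
```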

'bout the same as Spring

I've heard good things about Guice... mainly that it does dependency injection with a lot less overhead when compared to Spring. I'm working on a project right now that uses dependency injection, but I'll be doing it with a custom framework.

Anyway, the video I linked to in the blog post was a Google Tech Talk about Guice from (I think) its inventors. They were bragging about how you no longer need Spring or static methods with Guice. I'm sure it's possible to get rid of all factory objects, but it's probably not desirable.

I invented Guice. :)

You definitely still need factories, but you write a lot less code, and you can separate scoping code from object creation/resolution code. The video you linked to is really for people who are already sold on Guice. You might try this more introductory video which compares Guice to the factory pattern directly.

well, shut my word hole...

Thanks for the extra video, Crazy Bob! I'll take a peek when I get a chance.

A few questions: do you agree that dependency injection patterns are mainly a Java attempt to be flexible like a scripting language? Do you think it's a good idea to do that with Java, or is it merely prolonging the pain of switching to something else?

I think there's a lot of life left in Java, and the Java 1.6 addition of scripting language support is a good step... but picking JRuby as the reference implementation might not be such a hot idea. I would have gone with Jython: less black magic. Django can keep up with Rails any day of the week ;-)
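
For reference, the Java 6 scripting support mentioned here is the javax.script API (JSR 223). A minimal sketch with the bundled JavaScript engine; a Jython or JRuby engine would plug in the same way, given the right jar on the classpath:

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class ScriptDemo {
    public static void main(String[] args) throws ScriptException {
        // Discover an engine by name; Java 6 bundles a JavaScript engine.
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");

        // Java objects are visible to the script as ordinary variables...
        engine.put("name", "world");

        // ...and the script's result comes back as a plain Java object.
        Object result = engine.eval("'hello, ' + name");
        System.out.println(result); // hello, world
    }
}
```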

Scripting languages are a non-starter for me.

I like static type safety too much. I do think we should solve some of these problems at the language level, but it's certainly safer to experiment more at the library level before casting an idea in stone at the language level.

I think Guice takes away one common reason for switching to a scripting language. You no longer need to write tons of factory code up front just in case you need some abstraction later. Now, if we could just get closures, I'd be all set. ;)

flexibility injection

I can see why you'd think that way... scripting people like highly mutable objects and choices between static libraries. Dependency injection appears to try to achieve the same flexibility goals, but without needing to leave the strongly typed universe. However, with Java you still need to anticipate which objects to make mutable before all future needs are known.

My big beef with "flexibility" is that people try to anticipate tomorrow's needs today. That either leads to overly complex code, or flexibility in the wrong place... to fully compete with a scripting language, you need a way to add flexibility at any future date. Guice alone cannot do that... you need two things:

  1. A dependency injection mechanism (Spring, Guice)
  2. A custom classloader that automatically binds to every object, so you can override a class and insert new flexibility (see the sketch below)
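
Here's a rough sketch of what #2 might look like, under heavy assumptions: patched classes are compiled ahead of time into an "overrides" directory, and a child-first loader prefers them over the originals. A production version would need security checks, caching, and probably bytecode tooling, but the shape is roughly this:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative only: prefer a patched .class from overridesDir, otherwise
// fall back to normal parent-first delegation.
public class OverridingClassLoader extends ClassLoader {
    private final Path overridesDir;

    public OverridingClassLoader(ClassLoader parent, Path overridesDir) {
        super(parent);
        this.overridesDir = overridesDir;
    }

    @Override
    protected synchronized Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        Class<?> loaded = findLoadedClass(name);
        if (loaded != null) {
            return loaded;
        }
        // Child-first for overridden classes: a future developer drops a
        // patched .class here and every existing call site picks it up.
        Path file = overridesDir.resolve(name.replace('.', '/') + ".class");
        if (Files.exists(file)) {
            try {
                byte[] bytes = Files.readAllBytes(file);
                Class<?> c = defineClass(name, bytes, 0, bytes.length);
                if (resolve) {
                    resolveClass(c);
                }
                return c;
            } catch (IOException e) {
                throw new ClassNotFoundException(name, e);
            }
        }
        return super.loadClass(name, resolve);
    }
}
```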

If today's developer doesn't use Guice the right way, a future developer can add it the right way later, without disturbing the existing code base... IMHO, the goal is flexibility injection, not dependency injection.

If somebody could make such a beast that's easy, secure, and fast, then Java could keep up pretty well with the flexibility of Python...

I am a ruby guy myself since

I've been a Ruby guy myself for 5 years, and for me, not only the beauty of the language but also the thinking patterns within it are extremely great.

However, I feel that for a language to focus on OOP and _especially_ on messages, we would need a new language designed around exactly this. And I think it would have to be about prototype objects, not about class-based structure.

Maybe Alan Kay could help design a new "scripting language" that includes a beautiful syntax (like in Ruby and Python), a huge focus on behaviour, patterns, and messages, and a strong focus on future development (unlike Perl, which seems to have ground to a halt everywhere).

Python with patterns?

The beauty of late-binding languages like Ruby and Python is that you can modify the functionality of somebody else's library, or somebody else's object. In some cases, you can modify the core language itself.

This is great if you are a good programmer, or if you are the only user of the program you create. In a large code base, or if you have novice programmers in your team, this can cause serious problems...

Breaking encapsulation should be done with care, and only if you understand the full implications of what you are doing. Java and C# force you to use interfaces and subclasses to do that stuff -- which I don't care for so much -- but at least it's "safe." Tools like Guice help you inject new subclasses deep into existing codebases, which can be both good and bad, but you need to Guice-enable your codebase first.
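
As a contrast with monkey-patching, here's the "safe" Java route in miniature (class names invented for the example): the changed behavior lives in a new subclass, so existing code holding the original type is untouched until someone explicitly wires in the new one.

```java
// A library class you don't control...
class LibraryFormatter {
    public String format(String s) {
        return s.trim();
    }
}

// ...extended rather than modified in place: the override is a new type,
// unlike a Ruby/Python monkey-patch that rewrites the original for everyone.
class ShoutingFormatter extends LibraryFormatter {
    @Override
    public String format(String s) {
        return super.format(s).toUpperCase();
    }
}
```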

Frankly, I don't know what the solution is... maybe have Python "modes" where you can turn off late binding? Maybe force people to "sign" their classes and methods, and have a database of "trusted hackers" that have passed their "late binding 101" exam?
