Enterprise 2.0: Ignore the Fads, Follow the Trends

A few years back, Andrew McAfee "coined" the term "Enterprise 2.0." Recently, he's been criticized on the web (here, here, and here) for his definition... Critics say it is outdated, unhelpful, and flawed. Some of this criticism is a tad harsh, but a lot of it is valid. McAfee responded by restating what E2.0 is:

Enterprise 2.0 is the use of emergent social software platforms within companies, or between companies and their partners or customers.

Kind of light on the details, eh? He continued to define related terms like "social software", "platforms", "emergent", and "free-form"... which fleshed out the definition a bit... but still, I'm left with a big question: how is any of this actually helpful??? It doesn't mention technologies... it doesn't mention purpose... it doesn't mention value. Based on this definition alone, there's not really a compelling reason for anybody to get excited about it. Luckily, because of the Web 2.0 Kool-Aid, anything with a 2.0 after it will generate buzz, so people latched on.

Let's contrast this with the definition of ECM by AIIM:

Enterprise Content Management (ECM) is the strategies, methods and tools used to capture, manage, store, preserve, and deliver content and documents related to organizational processes. ECM tools and strategies allow the management of an organization's unstructured information, wherever that information exists.

It's not perfect, but it should be pretty dang clear to any businessperson what problems ECM solves, and what everyday tasks will be easier if it is done right. It also makes it obvious that it's about strategies and methods, not just tools and technologies.

I frequently lament that anybody is trying to define what Enterprise 2.0 is, before we even know what it is. The 2.0 clearly means that it is intended for the "next generation" of enterprise software... but what is the next generation of enterprise software? If it's nothing more than enterprise social software -- which is what McAfee says -- then why on earth do we also need the term "Enterprise 2.0"? If it's just blogs, wikis, and next generation collaboration tools, then we already have a term: Web 2.0. In either case, the phrase "Enterprise 2.0" is useless.

Now, if Enterprise 2.0 is truly meant to define the "next generation" of enterprise software tools, then the term will one day become useful. However, since these tools are still being envisioned and designed as we speak, a definition is still fairly useless... since we don't know what Enterprise 2.0 is yet!

If anything, the definition of "Enterprise 2.0" should reflect the trends in enterprise software, not just the fads. Ignore blogs and wikis. Shun social software. Instead, take a good, hard look at the broad trends that will have a major effect over the next 10 years. Here is a small sample:

  • The never-ending increase in computer power: storage, network bandwidth, processor speed, and cloud computing... there will soon be another tipping point like there was in the early 1990s.
  • Retiring baby boomers, who are taking a lot of institutional knowledge with them en masse.
  • The millennials, who have never known a world without the internet, and who are natives to online collaboration.
  • Globalization: more competition means you need better tools to test out innovations. Companies need to fail faster, and learn better if they are to survive.

What do all these trends mean for Enterprise 2.0 software? It's hard to say for sure... but what is clear is that more and more of the most important data and software will emerge on the "edge" of your networks. Why have a central repository at all when the average laptop is powerful enough to run its own content management system? The average user now has tremendous power to create content, and run easy-to-install collaboration tools. The genie is out of the bottle, my friends... all we can do now is try to control the damage. Identity management, enterprise search, and distributed information management can help with security and content... but for the application proliferation problem, I'd bet on enterprise mashups.

As the baby boomers retire, you can forget the idea of teaching them new software so they can share their knowledge. No way, no how, ain't going to happen. Instead, you need a new system for capturing "people" knowledge as effortlessly as possible. My idea is to just rip off Robert Scoble. He made a name for himself with nothing more sophisticated than a camcorder and some editing software. You want knowledge from technophobes? Why not engage them in one-on-one taped interviews? Low-tech, people-oriented solutions are frequently the best option for capturing content and context, although you will need something like an enterprise YouTube for consumption.

As the millennials enter the work force -- what some people call the "gamer generation" -- what will their needs be? The obvious solution is that they want something like Facebook for the enterprise. News flash: there already is Facebook for the enterprise... it's called Facebook. More compelling is the idea that employee management and business process management will evolve into enterprise simulation software. Something like "SimCity Enterprise Version". Software like this will need to be seeded with a ton of historical data, information about your processes and employees, and information about the current market. Then, you can run a simulation on the "what if" scenarios in a world of interdependent agents. This may seem far-fetched, but there is a lot of software out there right now that solves one specific piece of this puzzle... it's just that nobody has put all the pieces together yet.

We don't need another word for Enterprise Social Software... nor do we need to ride the coattails of Web 2.0 to sell the same old application with a Wiki bolted on. However, we do need to be aware that the enterprise will change a lot in the next 10 years: and not because of fads, but because of trends.

Web Form Tip: Add Excel-Like Calculation To Input Fields

When I'm filling out web forms -- especially ones with financial data -- I find myself frequently missing the ability to use Excel-like math syntax. For example, you could type this into an Excel field:

= 111 + 222

And the moment you moved to another cell, it would calculate the answer, and place 333 in the cell for you. This is extremely handy, but alas, very few web sites allow this feature. So much power in web browsers, and yet this little touch of usability is relatively absent. It reminds me of the scene from Futurama, where Philip J. Fry and Bender the surly robot are trying to hammer out their monthly budget:

    BENDER: Now to figure out how much money I'm raking in off those twerps!  
                (Scribbles out some numbers with a pencil and paper) 
                Awwwwww, I need a calculator.
    FRY:    You are a calculator!
    BENDER: I mean a good calculator.

In an effort to help make web sites more like "good" calculators, I'd suggest adding some simple JavaScript to turn any number field into an Excel-calculator field. It's pretty simple, really... just capture the "onBlur" and "onKeyPress" events. If the user moves to a new field or hits "return", the code checks whether the field begins with a "=" character, and if so, evaluates the math. The relevant source code follows.


    // create the always useful 'trim' function
    function trim(str) {
        return str.replace(/^\s+|\s+$/g, "");
    }

    // check to make sure the input is 'safe' math (no letters)
    function isMathProblem(str) {
        return !str.match(/[a-zA-Z]/);
    }

    // check to see if the 'return' key was pressed
    function isReturnKeyPressed(e) {
        var characterCode = 0;
        if (e.keyCode)
            characterCode = e.keyCode;
        else if (e.which)
            characterCode = e.which;
        return (characterCode == 13);
    }

    // if the field starts with '=', evaluate the math; ignore errors
    function doExcelMath(field, e) {
        // on a key press, only act when 'return' was hit
        if (typeof e != "undefined" && !isReturnKeyPressed(e))
            return;
        var val = trim(field.value);
        if (val.charAt(0) == '=' && isMathProblem(val)) {
            val = val.substring(1);
            try {
                field.value = eval(val);
            } catch (ignore) {
                // invalid math: leave the field unchanged
            }
        }
    }

<form name="excel-webform" method="get" action="#">
<b>Value 1</b> <input type="text" name="value1" onBlur="doExcelMath(this)" onKeyPress="doExcelMath(this, event)" /><br />
<b>Value 2</b> <input type="text" name="value2" onBlur="doExcelMath(this)" onKeyPress="doExcelMath(this, event)" /><br />
<b>Value 3</b> <input type="text" name="value3" onBlur="doExcelMath(this)" onKeyPress="doExcelMath(this, event)" /><br />
</form>

If this doesn't catch on, I might have to make a Greasemonkey script instead to cram this code into every web site I use...

NOTE: it can be risky to pass arbitrary user input to the JavaScript 'eval' function. A user could accidentally munge up their page, or insert cookies into their browser. In most cases this will not lead to a successful cross-site scripting attack... but depending on what you keep in your cookies, you should run some penetration tests to make sure nothing bad can happen.
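If you want to keep the convenience without handing raw input straight to 'eval', one slightly safer sketch is to whitelist the allowed characters and evaluate in strict mode. (The helper below, safeEvalMath, is a hypothetical addition of mine, not part of the snippet above; it only allows simple arithmetic.)

```javascript
// A stricter guard than isMathProblem(): instead of rejecting letters,
// only *allow* digits, basic operators, parentheses, decimal points,
// and spaces. (safeEvalMath is a hypothetical helper, not from the
// original snippet.)
function safeEvalMath(expr) {
    if (!/^[0-9+\-*\/(). ]+$/.test(expr)) {
        return null; // reject anything outside the whitelist
    }
    try {
        // Function() avoids giving the expression access to local scope
        return Function('"use strict"; return (' + expr + ');')();
    } catch (ignore) {
        return null; // invalid math, e.g. "1 + * 2"
    }
}
```

With this in place, doExcelMath could call safeEvalMath(val) instead of eval(val), and skip the update whenever it returns null.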

I'm Moving To Seattle!

Yes, the rumors are true... Michelle and I have decided to move from chilly Minneapolis, to the temperate and coffee-infested Northwest.

We've been planning this move for a number of years... the timing made sense because of the housing market crash. This has helped make the Seattle real estate market go from insanely expensive to just laughably expensive...

We'll both be keeping the same jobs, as well. That's the beauty of the interwebs... you can do anything from anywhere, as long as you have the correct technology.

I'm going to miss a lot about Minneapolis: my family, my friends, competent yet non-cocky software developers... but I've always wanted to live somewhere else. Seattle was always one of my top 5 favorite US cities, along with San Francisco, Chicago, Boston, and Minneapolis. I've met some great people out there, so I know I'll be OK.

I'm certain I'll be coming back frequently. I have several clients and business ventures here in Minneapolis, and I'll need to keep an eye on them. But the big question is, will Luke drop my blog from Central Standard Tech because I'm now splitting my time in two cities??? I hope not... ;-)

Webcast: Site Studio Performance Tuning

UPDATE: My presentation on Site Studio Performance Tuning is now posted online.

It will be hosted by the Independent Oracle Users Group (IOUG). I'm going to be talking about general web site performance challenges, some of which will probably surprise you. I'll also cover what kinds of hardware and software will speed things up, as well as little-known Site Studio features that you can take advantage of.

If you can't make it, don't worry! We'll be posting it on the IOUG archived webcasts page as well.

Quote of the Day

"... the problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding a banana AND the entire jungle." -- Joe Armstrong

Ahhh... them Lisp bigots never change! Until they turn into Erlang bigots... but the guy has a valid point.

Folks who write Ruby and Python based web frameworks kind of understand the "banana problem" and work very hard to overcome it. They use principles like late binding, code injection, Don't Repeat Yourself, etc... But a lot of Java frameworks for web sites seem to not care about the banana problem at all... which leads to these kinds of layers:

  1. Database table
  2. XML Schema definition of table
  3. Auto generated Java objects with attributes matching table columns
  4. EJB Business Object Interchange Layer
  5. EJB Business objects
  6. GUI Interchange Layer
  7. HTML Scripting Form
  8. Finished HTML Page

That's pretty much like saying... "Well, as long as we dragged the entire Jungle along with us, we might as well make you swing from every tree before we give you a banana!"

Oh, and watch out for the Gorilla...

Inventor of Oracle UCM Named "Oracle Innovator"

This is nice to see... my former boss Sam White was named an "Oracle Innovator:"

That's pretty cool... only 23 people in all of Oracle are "innovators," which is pretty impressive considering the company probably has 100,000 employees. That's like the top 0.02%!

I personally thought this product was innovative because of how Sam took a very holistic view of the problem. Way back in the 1990's he saw how people were misusing Java as an "applet" platform... when it was really excellent at being a server. He invented this component plug-in architecture that was way ahead of its time. Only in the past few years have people recognized this model and labeled it as the "Inversion of Control" pattern... but few still understand its power. Not to mention that the entire system was based on web services, before there was even a word for it!

Sam also eschewed "object oriented programming" in favor of "data-driven programming," which was also against the grain. People sometimes look at me sideways when I bash object oriented philosophy... but once you work a lot with "pure" objects, you see what a maintenance mess it can be. In addition, Alan Kay -- the inventor of object oriented programming -- agrees that the current philosophy is not what he intended, and easily leads people into the weeds.

Congratulations, Sam!

Book Review: A Whole New Mind

In three words? Awful. Awful. Awful.

The central premise of this book is that people with slightly more right-brain skills will dominate the work force in the 21st century... or at least be much more important than the past 20 years. I bought this book because that is a premise I agree with, so I was curious to see how he demonstrated his point, or any advice he could offer.

What I got was page after page of uninformed conjecture, hyperbole, cliches, and self-important blather. His premise? Left-brained work has dominated the industrial age, and the information age, but the next "phase" of human development is what he decided to label the "conceptual age," where right-directed people will dominate.

Really? You really think that? That's a hell of a statement... I hope you can back it up with some hard data... statistics, job growth numbers, etc? Anything?


He claims the drivers towards the "Conceptual Age" are Abundance, Asia, and Automation. That's it. No further proof. Let's take these one at a time:

Firstly, because of "Abundance", people are looking for better designs, even for ordinary household tools... thus designers become important. And apparently this is a new idea??? He treats the words of the latest CEO of GM -- who said his job is to produce works of art that people drive -- as somehow monumental. Oh my god! GM designs cars! They now care about "form" as well as "function!"

Really??? You really think that's a new thing? So I guess then those fins on a 1956 Chevy are there for aerodynamic purposes... and the mountains of chrome were there to make it more visible at night. Apparently the author is equally ignorant of the real drivers of the "left-brained" industrial revolution in the 19th century: the production of cheap textiles for clothing. YEP! The industrial revolution existed for the benefit of fashion designers and other "right brained" people who were tired of the ordinary abundance of the tunic. And how much of the computer revolution existed because people wanted a more "personalized" computer experience for their home or business? Ever hear of the iMac???

The author should try to do some research once in a while...

Secondly, because of "Asia," a lot of left-brain jobs -- computer programming, accounting, and legal -- are moving to Asia. Whereas right-brain jobs that require artistic design, communication, empathy, play, and meaning stay right in the USA. Since these jobs are "high-touch" jobs, they can't be outsourced.

Really??? You really think right-brain jobs can't be outsourced? I got a graphic designer in the Philippines who says differently. I got a dozen "empathy hotlines" you can call if you're feeling like killing yourself, and they'll do a hell of a lot better talking you down from the ledge than your friends or family. I also know of some really good customer support centers in India who are highly trained in empathic communication. Ever hear of teleconferencing or telepresence? Right-brain jobs are just as easily outsourced with the right technology.

Jobs are moving to Asia for one basic reason: SUPPLY AND DEMAND. Nothing more. Most American businesses prefer American workers, simply because cultural differences, currency exchange rates, and time zones are a pain to deal with... but Asian workers are so much cheaper that they are worth the extra pain. However, these wages are only low in Asia because Asian industries are not big enough to demand local software developers, lawyers, and accountants. Once Asia becomes more industrialized, Asian businesses will be demanding "left-brained" Asian talent... which decreases their supply... which drives Asian wages up... which makes American talent more attractive to American businesses again.

This is just cyclical unemployment on a global scale: no more. Again... some research by the author would have been nice...

Thirdly, because of "Automation," those who just follow a well-defined process will be easily replaced by robots, computers, or Asians (apparently). In other words... technology eliminates low-skill jobs. SHOCKER! But of course, this isn't actually true. As any economics professor will tell you, technology is disruptive, but it doesn't eliminate jobs in the long run. The simple fact is that workers who learn how to use the new technology become more productive, and therefore more valuable to their employers! Yes, job responsibilities shift around a bit, but overall productivity increases, which creates more jobs in the medium term.

Then the author goes on to the second section of the book, which contains anecdotes about what skills will be important in the 21st century: design (agree), story (maybe), "symphony" (give me a break...), empathy (big agreement there), play (agree), and meaning (agree). The stories are good reading, but they are never supported by any hard data. There is evidence of a fad, but no evidence of a trend.

The single saving grace of this book is the right-brain exercises. They are pretty fun ways for a left-brain-leaning person to step out of their comfort zone and flex the right brain a little. If you find this book in the bargain bin for $5, then it's worth it just for the exercises.

Otherwise, you'll probably want to avoid it...

Blog, or Blorphan?

Blogs, Wikis, and other Web 2.0 goodies have an important place in a broader enterprise content management strategy... but some caution is advised. As I mentioned in last year's talk "Enterprise 2.0: How You Will Fail," I think it might be more important to focus on the practical realities.

The free-flow of information is great and all, but does it translate into actual productivity? Or are you just creating faddish tools that will eventually be abandoned by users, after the novelty wears off?

Let's take blogs for example... Technorati's State Of The Blogosphere 2008 report claims that there are 133 million blogs in the world... Sounds great so far... but only 7.4 million (5.6%) of these blogs posted an article in the last 4 months! A mere 1.5 million (1.1%) posted an article within the last week, and about 900,000 posted in the last day (0.68%).

If these numbers are reflective of what you would find in a corporate blogging initiative, the outlook is fairly bleak. Assume you have a large push to get your employees blogging, and you succeed in getting 1000 bloggers in your company. If these statistics hold, that means that only 6 blogs out of 1000 will have useful, up-to-date information! Another 50 may have useful information, but it could be up to 4 months old... and possibly stale.
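For the record, here's the arithmetic behind that projection. The Technorati figures are from the report quoted above; the 1000-blog rollout is a hypothetical:

```javascript
// Back-of-envelope: apply the Technorati activity rates to a
// hypothetical rollout of 1000 corporate blogs
var technoratiTotal = 133000000; // total blogs tracked
var rollout = 1000;              // hypothetical corporate blogs

// 900,000 posted in the last day; 1.5M in the last week;
// 7.4M in the last 4 months
var activeLastDay     = Math.floor(rollout * 900000 / technoratiTotal);  // 6
var activeLastWeek    = Math.floor(rollout * 1500000 / technoratiTotal); // 11
var activeLast4Months = Math.floor(rollout * 7400000 / technoratiTotal); // 55
```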

The rest of them could very well languish as "blorphans." One or two posts initially... but then only updated when the author is bored. In general, these posts will be tiny gems of knowledge strewn about your enterprise; usually outdated, and frequently without context.

So much for using blogs to measure the "pulse" of your company!

If you start a corporate blogging initiative, please do not attempt it without a strategy for giving people the tools and encouragement they need to keep going:

  • Have lots of helpful info on how to blog, perhaps coupled with a training program.
  • Give incentives for blogging... not monetary, but have public rankings of hot topics, hot bloggers, most linked content, most forwarded content, and the like.
  • Have a "blog for blogs," where people can exchange tips on blogging, and teach each other on the benefits of blogging.
  • Have a "president's club" for bloggers, elected by their peers, to honor those who genuinely helped them. This could be for the best tips and tricks, best breaking news, or the best analysis.
  • Use blogging tools that are easy to use, and which allow people to track their popularity, and how people tag their blog.
  • Rate improvement in blogging skills on yearly employee review forms, and be sure to give them time to blog.

Most people agree that public blogs help companies by making them more "transparent." Even if customers love your products, they will always have the fear that you might "go away" and not be able to help them in the future. Blogs from real people with real passion can help your customers feel more connected to the "pulse" of your company... even if that "pulse" is filled with stale information.

However... for internal people, the best way to keep everybody up-to-date is likely a more formal knowledge sharing process. Or you can just stick to rumors and innuendo, since company rumors are 80% correct anyway...

How To Make A Decision

I recently finished "How We Decide", which tries to answer the question: how do people make decisions? Contrary to popular belief, human decisions are rarely -- if ever -- "rational." Almost all of the decision-making process lies in our "emotional mind."

How can this be? Our minds are incredibly powerful when it comes to reasoning and logic... Why is it not engaged when it comes to making a decision?

In essence, the rational part of our brain is relatively new -- in the evolutionary sense. As such, it cannot compete with the emotional side, which has been making critical decisions for millions of years. The emotional side is capable of tremendously complex analysis of systems with hundreds of variables in a split second... for example, should I jump out of the path of a moving car? Yes! The rational mind can make these complex calculations as well... however, it's much slower. This is because the rational mind can only juggle a few variables at a time, and it is lousy at "knowing" which variables are important and which aren't... Which direction should I jump? Should I plant my right foot first, or my left? When I land, should I roll to avoid injury? What if I get my shirt dirty?

Too much logic leads to a condition called "analysis paralysis." It is common when people try to make a decision with too much data. It is also common amongst people with damage to the orbitofrontal cortex -- which is critical in generating emotions. Far from having Vulcan-like decision making powers, these poor individuals can take hours just to decide what to eat!

The "emotional" part of our brain has served us well for millions of years... and it is great at helping us quickly make decisions about survival and self-preservation. It does so through a complex series of neural pathways and dopamine receptors that make us feel good when we make a "right" decision. The feeling is similar to what we feel when we exercise, have a healthy meal, or have sex. It is "good" in an evolutionary sense to not only do these things, but also to decide to do these things. When a person has lots of experience making certain kinds of decisions -- such as a stockbroker picking stocks, or a fireman putting out fires -- they engage the emotional mind to get a "feel" for what the "right" decision might be. In fact, it's much faster to go with your gut than to try to weigh all possible variables.

Your emotions are excellent at sifting through tons of information, noticing the important details, and ignoring the noise. These hard-wired pathways are lightning fast at making decisions, because they piggy-back on a system that had to be fast to ensure our survival. With the proper training, a person can "feel" the correct decision, even if they can't explain it.

However, this can also be a problem... using the emotional mind means you will be suppressing details that "feel" irrelevant. However, that is only because those details were irrelevant in the past. It is perfectly possible to "feel" right, even when you are 100% wrong.

The key is to know how to engage the rational mind, and how to engage the emotional mind. Only then will you consistently make good decisions.

"Simple" Problems Require Logic

The rational brain is easily overwhelmed... if you give it too many variables, then it won't be able to keep track of which variables are more important than others. Depending on the mind, you may be limited to as few as 4, or as many as 9 independent variables. This can increase with lots of training, but it's a good rule of thumb.

So, we should not engage the left-brain in the decision-making process unless the problem is "simple." By this, we mean that the problem can be reduced to a mathematical formula with an obvious "right" and "wrong" answer. The term "simple" is misleading, because it includes problems like the NASA shuttle launch checklist, and mathematical problems so complex you'd need a PhD to solve them... nevertheless, since we can formalize the problem into a mathematical formula, it is "simple."

When a problem can be reduced to a formula, you should avoid instinct as much as possible... it can lead us astray on simple problems. Like: should I play the lottery? Should I rebalance my stock portfolio? Is the shuttle ready for launch?

Once you do the math, the solution is clear... so keep feelings out of it!

"Complex" Problems Require Emotion

Now... choosing between different breakfast cereals? That's a hard one!!! How many factors will play into this kind of equation? Obviously, price, nutritional data, and the ingredients can be plugged into a formula... but there are other factors like flavor, brand, and novelty that can't be easily quantified. Also, if you don't "feel" good about your decision, you'll wind up regretting it. So how should you choose?

The rule of thumb is that the rational mind is only good with a handful of variables. Therefore, it is perfectly fine to engage logic when purchasing things that are all very similar. When buying a can opener or paper towels, it really comes down to quantifying "price" and "quality."

However, when making decisions about something with dozens of hard-to-quantify variables -- like strawberry jam or cold cereal -- you're better off trusting your "gut" instinct.

Numerous studies have shown that when people try to choose their favorite kind of jam, or their favorite work of art, or the best house in their price range, they usually agree with the "experts." If they are told to just go with their gut, they are frequently right on. However... if they are asked for their opinion plus an explanation, the whole thing falls apart!

The mere act of engaging the left brain clouds the issue, and they suddenly make bad decisions. In order to make good "complex" decisions, it's important to distract the left brain so it doesn't try to hijack the process. Experts are usually capable of explaining their decision-making process, but the untrained should stick with their gut.

New And Novel Problems Require Reason

The emotional mind is good at making quick decisions, but it is limited to past data. Only the logical mind can look into the future to plan; only the logical mind can properly put new information into the proper context.

Therefore, when faced with a new kind of problem, you will be tempted to "go with your gut". However, that can lead to problems, since your gut is only aware of instinct and your own history.

When you "feel" the right decision, you must remind yourself that this is a new kind of problem, which likely requires a new kind of solution. Use your rational brain to look for errors in judgement, and force yourself to explain your "gut" feelings. It is likely that you will uncover important data that you discounted as "irrelevant."

Certainty is the Surest Sign you Made an Error

The world is a big giant mess... and the brain is very, very uncomfortable with uncertainty. When your brain is face-to-face with new data that contradicts existing information patterns, it has a tough time feeling "certain" about it. In order to ensure proper right-brain functions, you need to work through this new data, and be ok with being "uncertain" for a while.

Unfortunately... your brain finds it a heck of a lot easier to just invent a "rational" reason why the new data can be ignored... so usually it does exactly that! It just feels soooooooooo good to be "certain", that our brains crave that feeling, even if we're dead wrong!

This problem is most notable when it comes to political partisans. Drew Westen scanned the brains of voters in the run-up to the 2004 US elections. He had three groups: hard-core Republicans, hard-core Democrats, and independents. He then showed them four clips: two when Bush obviously contradicted himself, and two when Kerry obviously contradicted himself. The independents noticed both sets of contradictions... however, the hard-core believers only saw the contradictions of their political enemy! They were dead-certain that their guy made perfect sense, and the "other guy" was illogical.

The brain scans of the subjects were more interesting still... The independents engaged the rational parts of their brains the whole time, which is why they spotted the logical contradictions. However, the hard-core believers did not. They used purely their emotions when asked about the contradictions. They only engaged the rational parts of their brains to help "reason away" the obvious contradictions made by their candidate. As Ben Franklin said:

"So convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do."

Philip Tetlock performed an even more interesting study about the predictions of political "experts." He would give them a question about the future, with three possible answers. Then, years later, he would see how often they were correct. It turns out, not very often! In fact, the "experts" were correct less than 33% of the time!

That's right, folks... a drunken dart-throwing monkey is a better predictor of world events than most professional "analysts." To make matters worse, the most popular pundits were wrong the most often!

Why? Because of "certainty." These people felt like experts... they felt they knew the right decision... and they are literally addicted to the rush of feeling "right." No way they will be giving that up any time soon.

Once you "feel" certain that you have made the right decision -- especially about a new problem -- then I guarantee that you have ignored important data.

The Best "Deciders" Analyze Their Decision-Making Process

The Tetlock study did highlight which experts were actually useful... almost uniformly, the best experts create "testable hypotheses." This means that they were quite aware of the limits of their own decision-making powers. They knew they had biases. They knew they had incomplete information. They knew that this was a new kind of problem with new kinds of solutions.

Therefore, they used their gut most of the time, but they used their rational minds when analyzing their own limitations.

When a good decision maker feels certain, he will stop and say, "wait a moment, I'm under-thinking this." That is the time to look at what seems so certain, and try to poke holes in it. Good decision makers create testable hypotheses about what they "know," and then re-evaluate their position at a later date.

Good decision makers are also aware of emotional traps that can prevent you from making the right decision. These include things like loss aversion, which means that losses "feel" worse than equivalent wins. You'll never be a good stockbroker or a good general unless you are aware of how loss aversion can prevent you from making the right decision.


When making decisions about "simple" problems, you need to engage your rational mind. Be aware of emotional traps, and try to distill the problem down to a mathematical formula. Simple pro/con lists frequently help when data is hard to quantify.

When making decisions about "complex" problems, you need to distract your logical mind. What house should I buy? What furniture should I buy? Which candidate should I vote for? Put the question in your head, and then engage the right brain. Go see a play, watch a movie, listen to music, or take a nap. Then, make a "snap" decision based on emotion. Odds are, you'll be much happier with the result.

If you want to be an expert, the most important things are experience and humility. Certainty is the enemy: there are no 100% guarantees in this world, so stop thinking like there are. Go with your gut at first, but constantly engage logic to validate the decision-making process. When you are satisfied that your decision-making process is as good as it can be, then it's time to engage the right brain again. Distract the left brain, and then make a "snap" decision.

If you do this, then odds are you'll be right more often than a drunken dart-throwing monkey... in which case you're doing better than most paid "experts."

Comments Turned Off For A While...

I've been getting WAAAAAAAAAAY too much comment spam lately, so I'm shutting off the comments feature for a while. I experimented with numerous kinds of CAPTCHAs, all of which failed. I'm not sure if this is human comment spam, or what.

I'm going to experiment with a few other options... If you'd like to leave a comment, register, and send me an email. Only approved registered members can comment at the moment.

Or, you can always link back to my post from your own blog -- or Twitter account -- and leave a message that way...

New Site: samplecode.oracle.com

For a long time, folks have been asking for a SourceForge-like site for Oracle consultants where they could share free code snippets. I've been trying to get one of these going for a while... but I knew it would be nothing without Oracle branding and an internal push. Well, Oracle recently announced just such a site: samplecode.oracle.com.

You need to be either an Oracle employee, or an OTN member in order to use it. It's backed with Subversion (yay!), so you'll need that to contribute. The number of projects is still fairly small at the moment... and it doesn't have a category for Oracle UCM yet. However, once it does, I'm sure we could get the number of projects there up to 20 or so ;-)

A Modest Proposal For Bug Bounty Bonuses

Every software manager wants to reduce the number of bugs in their code... but debugging code is a painful process, and developers don't enjoy it. So what is a software manager to do?

A common mistake is to put a "bounty" on bugs... say, $10 for every bug found. This is a rookie mistake, because it won't take long before developers intentionally insert bugs so they can be found, fixed, and collect their bounty. Other managers tried to game the system differently... such as only giving the bonus to testers. But, this leads to back-channel markets where developers tip off the testers, and get a kickback.

In general, development managers use non-monetary encouragement to get their team to fix bugs. This can be public shame -- who broke the build, who has the most bugs in their code, etc. -- or it can be public praise -- who has the least bugs, who fixed the most bugs, etc.

But part of me thinks that there has to be a way to make a bug bounty game that is less prone to abuse... so I came up with this:

  1. Have a code review about once per quarter; set aside two weeks or so. Do not tell the developers when this code review will happen.
  2. Set aside a fixed bonus that each member of the team will get that quarter, say $1,000.
  3. During code review, you get $10 for each severe bug you find in somebody else's code... add this to the bonus.
  4. If somebody finds a bug in your code, you lose $20 from your bonus.
  5. If you fix a bug in your own code, you only lose $10.
  6. This process will continue until the median bonus on your team hits some kind of trigger value, say $900.
  7. After the trigger value is hit, there would be some kind of minimum bonus that you would give. If some new code had a ton of bugs in it, then the developers would get the minimum... say $800.
  8. After the code review is complete, everybody gets their bonus, along with a chart showing who got what for a bonus.

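Under those rules, the payout logic is simple enough to sanity-check with a quick script. This is just a sketch of my own straw-man numbers above -- the function and variable names are mine, not any real system:

```python
# Sketch of the proposed bug-bounty payout rules above.
# All dollar amounts are the illustrative numbers from the post.
from statistics import median

BASE, FIND, LOSS, SELF_FIX = 1000, 10, 20, 10
TRIGGER, MINIMUM = 900, 800

def payouts(found_in_others, found_in_yours, self_fixed):
    """Each argument is a list with one entry per developer:
    bugs you found in others' code, bugs others found in yours,
    and bugs you fixed in your own code."""
    raw = [BASE + FIND * f - LOSS * b - SELF_FIX * s
           for f, b, s in zip(found_in_others, found_in_yours, self_fixed)]
    if median(raw) > TRIGGER:
        return [0] * len(raw)        # trigger never hit: no bonus this quarter
    return [max(b, MINIMUM) for b in raw]

# One energetic fixer (dev 0) finds 6 bugs in each of 9 lazy teammates' code.
found = [54] + [0] * 9               # bugs each dev found in others' code
losses = [0] + [6] * 9               # bugs others found in each dev's code
print(payouts(found, losses, [0] * 10))
```

Note one quirk the code makes explicit: a developer's "running" bonus can go below the minimum during the review, but the floor is applied at payout time.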
The idea here is to inspire some kind of healthy competition to fix other people's bugs... but also to minimize the number of bugs in your code, and to gain familiarity with the code of others. Naturally, developers will always come up with some kind of strategy to game the system... so let's look at how a dev team of 10 might behave:

One strategy is everybody does the minimum... they each fix 10 of their own bugs, the median drops to $900, and everybody gets a nice bonus. This is fine with management, because 100 bugs got fixed.

Another strategy is not to play at all... Well, then the bonuses never fall to the $900 trigger, so nobody gets a bonus. The cash rolls into next quarter's code review.

But... what if one person tries a different strategy? Say, everybody else is lazy and does nothing, so the one bug fixer fixes 6 bugs in each of the other 9 developers' code. That's an extra $540 for him, and it drops everybody else to $880, below the $900 trigger. He gets $1,540, everybody else gets $880... and everybody knows this. Hopefully, this will encourage a good mix of lazy and energetic bug fixers... which beats the alternative.

Will this discourage people from writing tricky code before a code review? For example, let's say somebody is working very hard on some Big New Feature, and it won't be complete for another month after the code review. Everybody gangs up on him to find bugs, since that part of the code is rife with them. They each find 10 bugs, dragging his bonus down by $1,800, to negative $800!!! Of course, this puts the median below the $900 trigger, so everybody gets a bonus. The poor developer of the Big New Feature gets the minimum $800 bonus, and everybody else gets $1,100.

That's a slight disincentive... but you can fix it by holding the code review at a random time in the quarter, so nobody can predict it. This will discourage people from writing code modules that take 3 months to complete... but I see that as a positive side effect.

What if two developers conspire: one makes the bugs, the other finds them? One of the commenters brought this up. I could make bugs that are hard to find/fix, then tell you how to do it. You find 50 bugs in my code, and drive my bonus to $0. You get an extra $500, and the median falls below $900 for the team... Total for you is $1,500, total for me is the $800 minimum. We split the difference, and each walk away $350 richer than the minimum. That would work once... maybe twice... but then you'd have to explain to your boss in your yearly review why your code is so loaded with ghastly, obscure bugs! In which case, it's highly doubtful you'd get a nice raise. If you want to make $350 once, and sacrifice a $3,500 raise a year from now, be my guest!!!

There is a slight risk that developers would hoard bugs. Say, they would find a nice juicy bug in somebody else's code, but not tell anybody until the next code review. No problem... as long as the bug gets fixed. That's why you want these quarterly. However, there isn't much advantage to hiding bugs in your own code... but there is a slight advantage to sneaking a bug into somebody else's code. But, as long as you have source control, it's easy to track down who made what code addition... so unless you have team members hacking into each other's accounts, this really isn't an issue.

Finally, some developers might find it demeaning. If the same people keep getting the minimum bonus, that might affect their morale and productivity. However, getting the minimum is a sign that this person probably needs help. A senior developer should mentor the poor newbies, show them how to write bug-free code, and help them find bugs in other people's code. Maybe even help them track down 5 bugs in other people's code, and hang on to them until the next code review. This is where bug hoarding can play a positive role for morale... And besides, if the developer consistently gets the minimum bonus even with mentoring, then you might want to reconsider whether this person is a positive addition to the team...

So what do you think? Is this a bug bounty bonus game that might actually work???

The W3C Kills XHTML in Favor of HTML 5!

File this one under it's about frigging time!

The W3C has announced that it is dropping support for the XHTML 2 working group, in favor of HTML 5. Now, I'm a big fan of HTML5, but I don't see this as necessarily good news...

In the past, the W3C tried to dump HTML entirely in favor of XHTML... Here's an idea! You know all those web sites you just made? Well get ready to do it all over again... because with new and improved XHTML your rewritten pages will look and act exactly the same! Obviously, XHTML utterly failed to gain popularity. Probably because the W3C is just plain awful at making specifications that are actually useful...

Not to be left out, the HTML faithful decided to create the Web Hypertext Application Technology Working Group (WHATWG) to try to continue development... they were upset that HTML was neglected and stuck on version 4.01 since 1999! This group included folks from Apple, Mozilla, and Opera, with some help from Google. That working group came up with some great ideas and actual functioning technology, so the W3C finally caved and made a working group for HTML 5 back in 2007... Fast forward 2 years, and the W3C finally realizes that XHTML 2 was going nowhere, so it was time to ditch it in favor of HTML 5.

Now... part of me feels this is bad... because now that they can no longer ruin XHTML, the W3C can just focus all of its energies on ruining HTML. I'm also irked by the justifications the W3C put forward about dropping XHTML:

HTML and XHTML 2 working groups were formed by W3C in March 2007. "Basically, two years ago we chartered two working groups to work on similar things, and that created confusion in the marketplace," said Ian Jacobs, W3C representative.

Yeah, right... they were doing "similar things." Not even close, Ian. Allow me to translate: the W3C doesn't have a clue what people need the web to do, so we're going to allow Apple and Mozilla to figure it out for us, then we'll claim credit!


Let's hope the good folks at WHATWG can see through this garbage, and don't let the W3C ruin HTML 5.

Email Patterns Can Predict The Health Of Your Company

As I mentioned previously and in my latest book, data mining your corporate email can yield some pretty interesting information... even if you don't read the contents. My angle is that by analyzing who emails whom and when, you can get a sense of who is "friends" with whom... and by doing so you can hit the ground running with any Enterprise 2.0 social software initiatives.

One nugget that I never thought of was how the emergence of email "cliques" can determine whether or not your company is in serious trouble... Two researchers -- Ben Collingsworth and Ronaldo Menezes -- recently analyzed the email patterns at Enron to see if there were any predictors of the impending doom. Initially, they thought they would find interesting changes immediately prior to a large crisis... However, what they found was that the biggest change in email patterns happened one full month prior to the crisis!

For example, the number of active email cliques, defined as groups in which every member has had direct email contact with every other member, jumped from 100 to almost 800 around a month before the December 2001 collapse. Messages were also increasingly exchanged within these groups and not shared with other employees... Menezes thinks he and Collingsworth may have identified a characteristic change that occurs as stress builds within a company: employees start talking directly to people they feel comfortable with, and stop sharing information more widely [prior to a crisis].

Interesting stuff... although this is only one data point. The increase of "active email cliques" is probably a good indicator of the amount of stress and negative rumors in your company, or in a specific division. However, as an actual predictor, it might not work so well. It will be difficult to know for sure, because it's really difficult for researchers to get access to random corporate emails.
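
For the curious, detecting these "cliques" is easy to sketch. Here's a toy version -- the names and the contact graph below are invented for illustration; the real study ran over the Enron email corpus, and the interesting signal was the clique count growing over time:

```python
# Toy sketch: find email "cliques" -- groups in which every member
# has had direct email contact with every other member.
# The people and edges below are made-up example data.
from itertools import combinations

# edge (a, b) means a and b have exchanged email directly
edges = {("ann", "bob"), ("ann", "cat"), ("bob", "cat"),
         ("cat", "dan"), ("dan", "eve")}
people = sorted({p for e in edges for p in e})

def connected(a, b):
    return (a, b) in edges or (b, a) in edges

def cliques_of_size(k):
    # brute force: keep each group of k people who are all pairwise connected
    return [g for g in combinations(people, k)
            if all(connected(a, b) for a, b in combinations(g, 2))]

print(cliques_of_size(3))   # only ann/bob/cat form a triangle here
```

Brute force like this is fine for a toy graph, but it blows up exponentially; for a real corpus you'd want a proper maximal-clique algorithm such as Bron-Kerbosch.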

Also, if you institute any kind of email data mining system, people will alter their behavior. These email cliques will simply go offline if they think that big brother is watching... they will probably leave some kind of a trail, but it will be more subtle, and lead to lots of false positives.

Ultimately, as a manager you're probably better off just talking with your employees to see if they are demoralized... because spying on them might only make matters worse.

(Hat Tip: Nat Torkington)

Boy... I can't Wait To Use This W3C Standard...

Thanks to Reddit, I came across a new-ish W3C standard. It's an XML syntax for the XQuery language, called XQueryX. I'm not a huge fan of XQuery, so I was looking forward to seeing how they might have made it simpler... but alas, XQueryX is vastly worse.

Their section on XQueryX examples nearly floored me... Let's say I had a big XML document containing books, publishers, and dates. Let's say I wanted to ask, "What books were published by Addison-Wesley after 1991?" The existing XQuery way to do it would look like this:

  for $b in doc("http://bstore1.example.com/bib.xml")/bib/book
  where $b/publisher = "Addison-Wesley" and $b/@year > 1991
  return
    <book year="{ $b/@year }">
      { $b/title }
    </book>
A little odd looking, but fairly readable. Now, let's see what I would have to type in using the new XQueryX standard:

<?xml version="1.0"?>
<xqx:module xmlns:xqx="http://www.w3.org/2005/XQueryX"

AAAAAAAAAAAAAAAAAAGH! This has got to be one of the dumbest ideas the W3C has EVER come up with! And trust me, that category has some stiff competition! As jib noted:

What's the point of a ridiculously verbose textual language which isn't designed to be touched by humans? It's combining the worst features of human-readable and non-human-readable formats.

Amen, jib... This standard almost tops the idiocy of the W3C standard on emotions...


HTML 5 Versus Flash/Flex

There's been some chatter lately about how HTML 5, the next version of HTML, might make Flash irrelevant. And not only Flash, but also Adobe Flex, Microsoft Silverlight, and Oracle JavaFX might similarly become useless.

This is because the latest version of HTML has a lot of features that were previously confined to advanced animation plug-ins... the three I like the most are:

  • The <audio> and <video> elements, which allow for embedding rich media directly into the browser; the #1 use case of Flash.
  • The <canvas> element, which allows for images and vector-graphics to be directly rendered with JavaScript, which allows simple animations; the #2 use case of Flash.
  • Offline data storage so your users can keep a 5 MB database offline, manipulate data, and re-sync it later; an uncommon use case, but vital for rich internet applications that you can use on an airplane.
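
To make the first two concrete, here is roughly what they look like in markup. This is a hypothetical sketch -- the file name, element IDs, and sizes are made up -- but it all runs in a plain HTML 5 page with no plug-ins:

```html
<!-- hypothetical example: movie.ogg and the sizes are made-up -->
<video src="movie.ogg" controls width="320" height="240">
  Sorry, your browser does not support the video element.
</video>

<canvas id="box" width="320" height="240"></canvas>
<script type="text/javascript">
  // draw a red rectangle directly on the page -- no Flash required
  var ctx = document.getElementById("box").getContext("2d");
  ctx.fillStyle = "red";
  ctx.fillRect(10, 10, 100, 50);
</script>
```

Compare that to embedding a Flash movie: no <object>/<embed> boilerplate, no plug-in version sniffing, and the fallback text for older browsers is right there in the element.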

These features have been necessary for a long time... and even though HTML 5 is not yet a finished standard, most of it is already supported in major browsers: Firefox 3, Internet Explorer 8, and Safari 4. This means that you can create a HTML 5 application right now! Probably the most famous HTML 5 application out there is Google Wave for email, which we are all just dying to try out!

I feel that this kind of competition will be healthy... I'd wager that 90% of what people currently use Flash for could just as easily be done in HTML 5. Also, by being standards compliant, you'll have fewer concerns about vendor lock-in. What happens if Adobe gets into trouble, then is bought out by Computer Associates? No more Flash for you!

However, there is still a problem... currently HTML 5 compliant browsers are only 60% of the market... I know quite a few enterprises that are still on IE 6, fer crying out loud... Flash has the distinct advantage of working on older browsers, and has about a 95% market penetration. Although, last year at this time only 5% of users had a HTML 5 compliant browser, so maybe by May 2010 HTML 5 will be as popular as Flash?

Hard to say...

UPDATE 1: Well, it's now May 2010, so I redid the numbers... and according to the browser numbers from W3Schools about 75% of the market is using HTML5 compliant browsers. Now that Google has dropped support for IE6, I'd wager this number will be close to 95% in May 2011...

This question came up recently in the content management universe... a few weeks back EMC/Documentum unveiled their latest UI at the Gartner conference on Portals and Collaboration... and it was a pretty slick Flex-based UI. A daring move... However, slick UIs don't need Flex. Billy and I got a demo from Jason Bright of Media Beacon's latest app. It was very flashy, and uses pure HTML, CSS, and JavaScript. As Jason told CMS Watch:

"Flex, like ActiveX, Silverlight, and Java Applets before them are, in a sense, replacements to the browser. Each replaces the web browser in a proprietary way. While I love Flex as a technology, I do not think it is a good strategic decision to throw out the traditional browser for a new client-server model no matter how attractive"

The problem boils down to this: there are millions of people dedicated to making the web better; but only one small part of Adobe is dedicated to making Flash better. The same holds true for Silverlight and JavaFX.

If I were writing a one-off rich internet application, I might choose something like Flex, because Flex development time is half what it would be for a similar HTML/CSS/JavaScript app. There are so many browser bugs, and oddities in JavaScript, that it's always a long slog to debug it. With the possible exception of the Google Web Toolkit, there really are no good ways to easily design a flashy HTML/CSS/JavaScript application... whereas designing an application with Flex is relatively simple.

But... if I were making an application for resell, or one that I intended to have other people maintain, I'd be more hesitant to use anything but web standards. HTML 5 is right around the corner; product development cycles are long; and HTML 5 browsers could reach 90% market saturation in 12 months.

All things considered, the best option now is HTML 5...

UPDATE 2: in case you have been living in a cave and missed the launch of Apple's new iPad: the iPad will not support Flash or Flex. I'm uncertain whether this new device will really take the world by storm, but if it does, it will be one more reason to switch to an HTML 5 code base.

UPDATE 3: it appears that Steve Jobs has gone on record about why the iPad and iPod will NEVER support Flash. Steve-o brings up a few more reasons I did not cover here: Flash is a power hog, it doesn't support "touch" interfaces, and it crashes a lot. Steve Jobs ends with a plea: Adobe should use its brainpower to make a cross-platform IDE for HTML5, and stop trying to cram Flash down our throats. If they don't, then the "next Adobe" certainly will...

The Revolution Will Not Be Televised; It Will Be Tweeted -- Epic Win

This photo is of an Iranian protester helping evacuate an injured cop, getting him away from an angry mob... As Sullivan said:

How To Tell Who The Good Guys Are? They're the ones who sometimes rescue a beleaguered riot policeman.

Skip the mainstream media... Go to Andrew Sullivan's blog, The Big Picture, or #iranelection on Twitter... something unbelievable is happening...

Joel on Platform Vendors

A while back I blogged about the lack of Oracle UCM "vertical applications". A vertical application is an add-on to an existing product or platform, but one that is industry specific. A lot of Oracle UCM consultants have created very general add-ons, and have sold them along with their services.

On occasion, Oracle implements one of these general features, and the add-on product becomes obsolete. Unsellable... and this can cause some grumpiness... but it doesn't have to be this way.

Joel on Software has recently had a similar rant about people who make add-ons to platforms... but in this case, he's referring to the iPhone. Similar to Oracle UCM, the iPhone is a platform... so you'll get some folks who just "fill the gaps," and others who create entirely new markets. A lot of gap-fillers had their profits crushed when the new iPhone OS rendered their add-ons obsolete. Some quotes:

A good platform always has opportunities for applications that aren’t just gap-fillers. These are the kind of application that the vendor is unlikely ever to consider a core feature, usually because it’s vertical — it’s not something everyone is going to want. There is exactly zero chance that Apple is ever going to add a feature to the iPhone for dentists. Zero.

Or, more succinctly, as Dave Winer once said:

Sometimes developers choose a niche that’s either directly in the path of the vendor, or even worse, on the roadmap of the vendor. In those cases, they don’t really deserve our sympathy.

Yes... if you make a general add-on to Oracle UCM, you have a wider possible audience... but that doesn't mean you'll be able to sell to all of it! You'll have a tiny bit of market penetration, and then one day Oracle will just write a clone of what you did.

When it comes to add-ons to platforms, verticals are almost always more profitable. The market might be smaller, but it is much easier to highlight the need to your market, and there is less competition. If you make something good, odds are you'll be able to sell it for a looooong time.

"Web 2.0" is the Millionth English Word???

Well, isn't this convenient... according to the Global Language Monitor, the phrase "Web 2.0" has become the one-millionth word in the English language... narrowly beating out "Noob," "Slumdog," and "Cloud Computing."

Firstly... yes, English does have more words than any other language. The British Empire kind of spread English everywhere... and unlike French and Spanish, English acts like a sponge, absorbing every word it can find! Taboo, Tattoo, Tortilla, you get the picture.

But... I call shenanigans. I think this thing was rigged to get maximum press coverage. "Web 2.0" is not a word, it's a phrase. Also, it has been around for about 7 years now, and was hugely popular in the technology field for the past 5. It is a much more common phrase than "Cloud Computing." The word count folks claim that it needs to be mentioned 25,000 times before it's an "official" word... But the New York Times alone mentioned it on 2,700 occasions! I'm sure a survey of other sites would demonstrate that this word hit the 25,000 sweet spot many years ago...

Others are likewise skeptical:

Part of what makes determining the number of words in a language so difficult is that there are so many root words and their variants, said Sarah Thomason, president of the Linguistic Society of America and a linguistics professor at the University of Michigan... Thomason called the million-word count a "sexy idea" that is "all hype and no substance."

I'll agree there...

How Software Engineers Think...

A Software Engineer, a Hardware Engineer, and a Departmental Manager were on their way to a meeting in Switzerland. They were driving down a steep mountain road when suddenly the brakes on the car failed. The car careened almost out of control down the road, bouncing off the crash barriers, until it miraculously ground to a halt scraping along the mountainside. The car's occupants, shaken but unhurt, now had a problem: They were stuck half way down a mountain in a car with no brakes. What were they to do?

"I know," said the Departmental Manager. "Let's have a meeting, propose a Vision, formulate a Mission Statement, define some Goals, and by a process of Continuous Improvement find a solution to the Critical Problems, and be on our way."

"No, no," said the Hardware Engineer. "That will take far too long, and besides, that method has never worked before. I've got my Swiss Army knife with me, and in no time at all I can strip down the car's breaking system, isolate the fault, fix it, and we'll be on our way."

"Well," said the Software Engineer, "before we do anything, I think we should push the car up back up the road and see if it happens again..."


(Hat tip Dreaming in Code by Scott Rosenberg)
