So I was reading up on John Newton's impressions of the Enterprise 2.0 conference a few weeks back... he was frustrated by the lack of a unifying definition of just what it was:
this doesn't mean that there was a lot of clarity on the meaning of the term Enterprise 2.0 at the conference. Although Web 2.0 had no less than Tim O'Reilly and John Battelle to define what that term means (barely), Enterprise 2.0 has no such authority. Consensus says that it is just Web 2.0 for the enterprise. However, researching the concept a couple of years ago, E2.0 is about taking the social aspects of Web 2.0, collaboration, social networks, user contribution, wisdom of crowds and social tagging and voting and applying it to information, documents and content in the enterprise
Interesting... blogs allow anybody to speak on a topic, and report news... Wikis allow anybody to take part in creating an authoritative knowledge repository... social networks allow people to bypass hierarchy structures and get things done by making "friends" and "connections" that want to help you.
Web 2.0 fundamentally means the end of the expert, but it took two "experts" to define that.
How deliciously ironic...
In contrast, since there is no accepted "expert" telling us what Enterprise 2.0 is, and since we're all just a bunch of amateurs fumbling towards the right answer, then Enterprise 2.0 is actually more Web 2.0 than Web 2.0. We know a fundamental change is occurring, we just aren't quite sure what it will look like when we're done.
Well, then... I guess the right thing to do is sit back, and let these guys fight it out for a while. Let the self-anointed ones battle for mindshare, let the answer present itself, and then come up with a definition.
I'll throw in my two cents next week...
I love the topic of energy... I always thought environmentalists got it wrong about energy. The problem isn't overconsumption, it's unsustainability. So, go ahead and drive your Hummer, as long as it runs biodiesel from sources like algae or bacteria. If Big Oil were sharp, they would stop denying global warming, and embrace new carbon-negative oil technologies before the high tech venture capitalists steal all their business...
To add insult to injury, it seems that some prominent scientists want to put Big Oil on trial for global warming. At first, I believed that these kinds of trials would go exactly nowhere. Until I found out about one case backed by a dream team of trial lawyers: Steve Berman and Steve Susman.
The former was the lead lawyer representing 13 states against Big Tobacco in their historic defeat in the 1990s. The latter was the man who defended Big Tobacco. Now, they have teamed up and are taking on Big Oil, with pretty much the same strategy...
The Atlantic outlines the logic of the case quite well. There have been dozens of lawsuits against Big Tobacco, dating as far back as the 1950s. The plaintiffs were all the same -- people who got addicted to cigarettes, and got health problems, and were now suing the tobacco industry for selling an unsafe product. Early anti-tobacco lawsuits all ended the same way: the judge would declare that every consumer product has some danger, but it's not the judge's responsibility to decide an acceptable level of safety.
Defining what is an "acceptable level of safety" is up to Congress... who are always on top of things...
This of course led Big Tobacco in the past -- just like Big Oil right now -- to funnel millions of dollars to "skeptical" scientists, and use them to pass off PR as genuine research... and use that to influence congress and the media into inaction. Not to mention the millions in campaign contributions, free trips, lobbyist jobs, etc. etc. etc.
Unfortunately for Big Tobacco, there was a flaw in that plan:
- When you pass off PR as genuine scientific research, it is a lie.
- When you lie about consumer products you sell, it is fraud.
- When you defraud consumers, class action lawsuits are not far behind.
- When you get sued, you have to produce old memos, emails, and data relevant to the case... which are usually very incriminating.
The Steves' plan is not to claim that oil is causing "too much harm." The plan is to prove that Big Oil used both licit and illicit means to downplay the actual harm of their product, whatever that harm may be. Essentially, when companies engage in fraud, they make it impossible for a consumer to make a reasonable choice about whether or not to use their product... and Congress has a long list of laws against that...
Essentially, even if oil is 90% safe, if the Steves can prove that Big Oil claimed it was 95% safe, and that Big Oil downplayed evidence to the contrary, then Big Oil is guilty of both fraud, and conspiracy to commit fraud. That exact tactic brought down Big Tobacco, and it seems like it would be pretty easy to do the same to Big Oil...
I, for one, am curious to see how all this pans out...
The third installment of Where The Hell Is Matt is available on YouTube... and it's the best one yet:
I first heard about this from April, who works at Google with Matt's girlfriend. This guy Matt has a dorky little "annoying dance" that he would use from time to time. One day, he quit his job, and traveled around the world with some friends. At the request of a buddy, he did his annoying dance on the streets in Vietnam, and he filmed it.
Then he kept doing it... all... over... the... world...
He put his first video up on YouTube, and it slowly became a huge hit. Stride Gum offered to pay for his plane tickets, and let him make a second video. I especially like the outtakes. As you can see, he took a slightly different approach for the third one... I liked India and Korea the best.
AIIM sent me an email about their new social networking site for ECM folks, named Information Zen. It's built on top of Ning, like a lot of other community sites I belong to. Mancini is on there, and it's probably only a matter of hours before Billy is up there too.
I like it a lot more than the standard AIIM site... I hope they move more of their content over. They have videos, groups, and forums, all broken down by ECM aspect: records management, enterprise search, content management, eDiscovery, etc.
Should be a good place to get community help with strategic ECM questions... it also might be good for unbiased information about ECM vendors: how tough is it to set up, deploy, maintain, customize, etc.
Seems to be growing fast... I joined, then I wrote this blog post, and in that time they got 6 new members! Over 600 members in a few hours... not bad, AIIM!
So I was finally updating my Tortoise SVN client for Subversion... that annoying little window has been popping up for months, so I finally clicked on it. It took me to their blog... which at first confused me a bit:
Notice the inherent problems with using Google Ads on a site with poor usability! This is supposed to be the splash page for the upgrade, and what happens? I'm greeted with two links to download Subversion clients, neither of which are Tortoise SVN! Once you click the link to read more about the blog post, you get some helpful download links.
But I gotta say, those two other subversion clients probably steal a LOT of traffic and downloads from Tortoise SVN.
Is it just me, or are URLs totally backwards? For example, take this email address:
Nothing too odd... the email is going to bob, who works in finance at the company. Not many folks do email addresses like this; they might instead do email@example.com, but I did it that way to compare it against a typical URL:
Nothing too odd there, eh? You are going to the blog for the company, the article named my-hands-are-bananas, published in June 2008.
What always bugged me is how they mixed up the order. A URL is supposed to be directions to find information... and directions always start off general (head east on I-94) and end up very specific (turn off the paved road and stop at the fifth pink trailer home).
But URLs totally mix up the order:
Putting directions in that order makes about as much sense as these directions: turn left at reception, go to this company, go to France, then make a right.
A properly consistent URL should actually be structured like so:
Adding to the oddness... things like .com and .org are called top-level domains. Yeah... it really makes sense to call something "top" when actually it's on the "bottom."
Louis in the comments suggested that maybe this would be even better:
That would sure make type-ahead URL matching a hell of a lot easier...
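To make the idea concrete, here's a quick sketch (my own, with made-up hostnames in the spirit of the post): reversing the host's components puts every URL in general-to-specific order, which is exactly what makes sorting and type-ahead prefix matching trivial. It's the same trick behind Java package names like com.company.blog.

```python
def reverse_host(url: str) -> str:
    """Flip 'blog.company.com/2008/06/...' into
    'com.company.blog/2008/06/...' -- general to specific."""
    host, _, path = url.partition("/")
    flipped = ".".join(reversed(host.split(".")))
    return flipped + ("/" + path if path else "")

urls = [
    "blog.company.com/2008/06/my-hands-are-bananas",
    "mail.company.com/inbox",
    "www.company.com/about",
]

# Once flipped, everything at the same company shares one prefix,
# so type-ahead is a plain startswith() check:
for u in sorted(map(reverse_host, urls)):
    print(u)
# com.company.blog/2008/06/my-hands-are-bananas
# com.company.mail/inbox
# com.company.www/about
```

Web archivists do essentially this (they call it SURT form) for exactly the sorting reasons above.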
Attention internet: please change.
Science Daily reported on a new paper on this topic called “The Hidden Perils of Career Concerns in R&D Organizations”. The problem can be summed up like this:
- Great developers want to demonstrate their talent, so they create highly sophisticated code that does amazingly complex things, which might not actually be what the customer needs...
- Terrible developers want to mask their incompetence, so they create enormously obfuscated code that nobody can understand, so they must be called upon to make any changes...
Well that sucks... according to the authors, both groups apparently have a strong incentive to make overly complex code! At least they do so in the majority of software development firms. I'd add a third point:
- High maintenance developers need validation that they solved a tricky problem, so they force other developers and users to do complex configuration and initialization of their code, just so everybody appreciates the complexity of the problem...
They say the solution is more short-term incentives tied to the success of a project. I got no problem with that... but when it comes to code complexity, "success" cannot be determined for years after product delivery. Only after people need to patch, upgrade, and modify the solution can you really tell how successful you were.
I'm a fan of the old fixes: peer review, customer usability tests, and no code ownership. All three encourage simplicity, and discourage needless complexity.
UPDATE: Lively debate in the comments thread... so I wanted to update with my latest revelation. Great developers only write overly complex code when they don't get recognition of their talent. If they don't get verbal or monetary recognition from their manager and/or peers, they will seek out ways to prove their excellence. In other words: bafflingly complex code. They also do so because of honest mistakes: their code makes sense to them, so they believe it makes sense to others. The curse of knowledge, if you will.
So what's the ultimate solution?
- Peer review all code for new developers: both great and not so great. The time you spend up front will more than pay for itself in easier code maintenance down the road.
- Have a training program in place to mentor the less skilled developers. Make sure they know they add value to the team, it's just that they don't have enough experience yet to solve the tough problems.
- Make sure your highly skilled developers get the recognition they deserve... especially if they are working on a project that is beneath their skill level.
- Let highly skilled developers spend some spare time helping out open-source side projects, if their current task is too tedious or too simple to occupy 100% of their time. That will give them the recognition they need, and you still get your project on time.
- Transfer ownership of code about every six months. This will ensure that code makes sense to everybody.
- Force the developers to watch people try to use their solutions. In silence. Let them see hard proof of how tough their systems are to use, maintain, or customize, to encourage them to solve the usability problem as well.
This post is already too long... I may expand on this at a later date.
All right... the Twitterverse is all up in arms about how crashy it is, and the lack of a business model... well, at least Jake and Radar are... so I figured I'd throw in my 2 cents, and solve both problems at the same time:
How to make Twitter crash less:
- Ditch Rails.
- Ditch Ruby.
- Rewrite it for Python / Django.
- Use Google App Engine for hosting.
Done and done. Pownce has proven that it's easy to redo everything Twitter did (but better) using Django... and in a remarkably short amount of time. Plus, if you use Django, you can port your entire system to Google App Engine, and get insane scalability and uptime for cheap. Google might even be a willing partner for such a high-profile client with such widely known scalability problems...
I always thought Rails was the wrong tool for Twitter... I'm sure the pragmatic programmers would be all up in arms if Twitter ditched their favorite tool... but who cares? Using the same tool for every job is woefully unpragmatic. "But Rails can do it! Rails can do it!" Ugh... At times like this I let Chris Rock do the talking:
Sure, you can do it, but that doesn't mean it should be done! You can drive your car with your feet if you want to, that doesn't make it a good idea!
Now, regarding the business model, there are these options:
- Charge $10 per year for people who tweet more than 5 times per day.
- Engage businesses, sell them "Twitter Appliances," and train them how presence can boost communication and productivity.
Seems pretty damn straightforward to me... at least, that's what I'd do if I had a brand like Twitter. Move into more of an evangelist model, teach people to collaborate with presence, and get into the enterprise before somebody else beats you to the punch. Heck, they could even sell enterprisey books, and be the first "sexy" enterprise app. I'm baffled why they haven't already done so.
In the meantime, I've moved on. Check me out on FriendFeed.
UPDATE: Garrick posted on another Twitter business model... The scalability problem is not due to the number of tweets per day, but in the number of followers you have. Some people have thousands of followers, so one tweet per day from a popular person consumes more resources than a friendless one tweeting every hour. Therefore, perhaps you should charge people to be followers? I'm not 100% sold, because that would discourage popularity. It's also vulnerable to Twitter syndicators like FriendFeed... Why should everybody pay $10 to follow Scoble on Twitter? Just follow his FriendFeed instead.
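Garrick's point is easy to see with a little arithmetic (toy numbers of my own invention, not Twitter's actual architecture): in a fan-out design, every tweet gets copied to every follower's timeline, so delivery cost is tweets times followers.

```python
# Toy cost model: deliveries = tweets per day * followers.
def deliveries(tweets_per_day: int, followers: int) -> int:
    return tweets_per_day * followers

# One daily tweet from a popular account...
popular = deliveries(tweets_per_day=1, followers=20_000)
# ...versus hourly tweets from a friendless one.
chatterbox = deliveries(tweets_per_day=24, followers=10)

print(popular, chatterbox)  # 20000 240
```

The popular account costs over 80x more per day, despite tweeting 24x less often, which is why a per-tweet fee misses the real cost.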
I first heard that Larry Ellison was the inspiration for Iron Man from Fake Steve... apparently Robert Downey Jr. studied videotapes of Larry in order to develop his billionaire persona... complete with goatee, mussed hair, Jesus hands, and everything! Skeptical? You can view video evidence yourself.
Well... it now seems that Oracle is getting in on this reality blur as well...
In my mailbox today, I got the Oracle partner newsletter about a cross-promotional campaign with Marvel. They are promoting the new Marvel Trilogy, starring Iron Man. The tagline is Hardware by Marvel, Software by Oracle.
Since Marvel did the graphics, the advert looks pretty nifty. It's a nice deviation from the standard Oracle marketing material: red, white, and boring... but this is just gonna make conspiracy nuts suspicious.
So what do you think Larry really does in his spare time?
I wouldn't be surprised if he had his own flying suit... but I'd be pretty shocked if it turned out he used it to battle warlords in Afghanistan...
I just love the exasperated, one-eye-open gaze...
Oracle is now doing a quarterly customer webcast to keep folks up to date about the latest changes in the product line. The next one will be June 5, 2008 at 9:00 a.m. Pacific Time. If you'd like to attend, you need to register with Intercall:
It's for customers and partners only... so be sure to use your company email address... you also might want to read more about getting Stellent ECM announcements...
Apologies for the esoteric post, folks... but this is kind of important... Two folks from Yahoo, plus two folks from UCLA, have just released a paper through the ACM about a new kind of parallel algorithm: Map-Reduce-Merge.
If you don't know about MapReduce, it's the algorithm that makes most of Google possible. It's a simple algorithm that allows you to break a complex problem into hundreds of smaller problems, use hundreds of computers to solve them, then stitch the complete solution back together. Google says it's excellent for:
"distributed grep, distributed sort, web link-graph reversal, term-vector per host, web access log stats, inverted index construction, document clustering, machine learning, statistical machine translation..."
bla bla bla... but MapReduce can't do joins between relational data sets. In other words, it's great for making a search engine, but woefully impractical for virtually every business application known to man... although some MapReduce-based systems are trying anyway (CouchDB, Hadoop, etc.)
UPDATE: Some Hadoop fans mentioned in the comments that MapReduce can do joins in the Map step or the Reduce step... but it's highly restrictive in the Map step, and sometimes slow in the Reduce step... joins are possible, but sometimes impractical.
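Here's roughly what those commenters mean, sketched in Python (a toy illustration with made-up tables, not actual Hadoop code): tag each record with its source table, use the join key as the MapReduce key, and pair rows up in the reduce step. It works, but every row of both tables gets shuffled, which is exactly why it can be slow.

```python
# Toy "reduce-side join": join employees to departments on the
# department name, MapReduce style.
from collections import defaultdict

employees = [("bob", "finance"), ("sue", "engineering")]
departments = [("finance", "bldg 1"), ("engineering", "bldg 2")]

# Map: emit (join_key, (table_tag, row))
mapped = [(dept, ("emp", name)) for name, dept in employees] + \
         [(dept, ("dept", bldg)) for dept, bldg in departments]

# Shuffle: group everything by the join key
groups = defaultdict(list)
for key, tagged in mapped:
    groups[key].append(tagged)

# Reduce: within each group, cross the rows from the two tables
joined = []
for dept, tagged in groups.items():
    emps = [v for t, v in tagged if t == "emp"]
    bldgs = [v for t, v in tagged if t == "dept"]
    joined.extend((name, dept, bldg) for name in emps for bldg in bldgs)

print(sorted(joined))
# [('bob', 'finance', 'bldg 1'), ('sue', 'engineering', 'bldg 2')]
```

Note that both entire tables flow through the shuffle just to match up a few rows; a database with an index on the join key skips all of that.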
Well... this latest twist from the Yahoo folks fixes that: they claim Map-Reduce-Merge now supports table JOINs. No proof as of yet, but there are a lot of folks staking their reputations on this... so it's a fair bet. The Hadoop folks seem to be experimenting with Map-Reduce-Merge... so if they spit out some new insanely fast benchmarks, my guess is that this is for real...
What does this mean for relational databases like Oracle? Uncertain... but I did hear a juicy rumor about 15 months back: some guy from Yahoo sat down in a room with Oracle's math PhDs, and spent a day discussing an algorithm for super-fast multidimensional table joins... like sub-second performance on 14-table relational queries, with no upper limit. My sources told me the Oracle dudes were floored, and started making immediate plans to integrate some new stuff into their database. The Yahoo connection made me think this might be the Map-Reduce-Merge concept...
Coincidence? Perhaps... but a juicy rumor nonetheless.
Well, this is unfortunate... CMS Watch is reporting a rumor about an Oracle Wiki incident. An Oracle partner named Sten Vesterli posted some less than positive feedback about WebCenter on the Oracle Wiki... was promptly flamed by an Oracle product manager, then had his postings removed:
I placed some of the description and the pro/con discussion from my upcoming paper comparing Oracle development tools on the Oracle Wiki. And just like when I posted something not unambiguously positive about Oracle WebCenter on the Wiki, I was immediately flamed by an Oracle product manager, and any trace of negativity edited out of one of my pages.
Oops... looks like a Web 2.0 malfunction.
Sten Vesterli is an Oracle ACE Director, like me. That means we have multiple channels for criticism if we don't like the feature set of the product. We're expected to extend Oracle some level of professional courtesy when we give criticism. I occasionally point out the flaws in Oracle products, but I almost always offer a workaround, and I don't put them on places as high profile as the Oracle Wiki... Naturally, some folks at Oracle would feel Sten was being a tad rude...
But ultimately, a wiki is the wrong place for criticism. Criticism almost always contains judgment, which by definition violates the neutral point of view policy that is on all wikis -- even Wikipedia. As Justin Kestelyn says:
A wiki is not the place for opinion, because opinion does not invite editing, only response.
The wiki was probably the wrong forum for Sten. Want to rant about WebCenter? Then your text belongs on a blog. Oracle's policy should simply be that: criticism belongs on your blog, not on our wiki, or any wiki. Then they should monitor pages that are "hot topics," and delete anything that looks like a rant. Clean and simple.
Hopefully Oracle doesn't try to lock down access to the wiki because of this drama...
UPDATE: Justin got in touch with Sten to figure out what really happened; it didn't seem to involve WebCenter, and CMS Watch blew it all out of proportion... The wiki is thankfully back to business as usual.
Microsoft has been pushing a new XML standard for word processing, OOXML. It's generally regarded as unnecessary, not to mention overly complex and weird... so much so that not even Microsoft Office 2007 passes conformance tests.
Anyway, the world was a bit shocked when Norway voted YES to make it an ISO standard... OOXML looked dead in the water, until this shocker gave it new life... so one guy on the 30-person committee decided to give the inside scoop:
...Halfway through the proceedings, a committee member had asked for (and received) assurance that the Chairman would take part in the final decision, as he had for the DIS vote back in August. It now transpired that the BRM participants had also been invited to stay behind. 23 people were therefore dismissed and we were down to seven. In addition to Standard Norway’s three, there were four “experts”: Microsoft Norway’s chief lobbyist, a guy from StatoilHydro (national oil company; big MS Office user), a K185 old-timer, and me. In one fell swoop the balance of forces [about rejecting OOXML] had changed from 80/20 to 50/50 and the remaining experts discussed back and forth for 20 minutes or so without reaching any agreement...
...The VP thereupon declared that there was still no consensus, so the decision would be taken by him. And his decision was to vote Yes. So this one bureaucrat, a man who by his own admission had no understanding of the technical issues, had chosen to ignore the advice of his Chairman, of 80% of his technical experts, and of 100% of the K185 old-timers. For the Chairman, only one course of action was possible.
Sounds like election fraud to me... if true, this could cause a pretti nasti backlash.
UPDATE: It looks like Brazil, India, and South Africa are challenging this vote. OOXML looks even deader:
Microsoft said last week that it does not expect to make its current generation of office productivity software, Office 2007, compliant with the ISO/IEC version of the OOXML standard... Instead it will issue a patch allowing that software to read and write files compatible with the rival OpenDocument Format, which has already been adopted as standard ISO/IEC 26300.
I warned you... them moose can get pretti nasti...
So there's this whiny little gated community here in the Twin Cities called North Oaks. When I was a kid, I had a white hot hatred of North Oaks... and now I feel justified. Apparently, they're suing Google to keep them off the maps.
Who do they think they are? Area 51?
Apparently, all the roads are privately owned, and there's a no trespassing notice for anybody who ventures into the town of 4,500 people. Google does remove maps for military bases, but this is the first time an entire frigging town has requested removal.
I'm curious to see how this pans out...
Ever wonder why erroneous loudmouths get more airplay than the rest of us? I'm not just talking about radio shock jocks and political pundits, but technology bloggers as well. When you let your emotions run wild, and make crazy (probably false) posts, you usually get a bigger fan base.
Why the heck would that happen? Why do blogs with valid, rational discourse languish, whereas those who are wrong, wrong, just plain wrong get lots of viewers and comments?
Jeff Atwood over at Coding Horror had a recent epiphany along these lines. His technology blog is a little haphazard, filled with lots of good nuggets -- as well as plenty of corrections in the comments. Jeff conforms to the philosophy of strong opinions, loosely held. He says he's not an "expert," he's an amateur. But since software is such a new industry, pretty much everyone is an amateur... And, unlike most folks in the software industry, he's not afraid to admit it.
I think it goes a bit deeper than that...
When Jeff says something that is just plain wrong, it makes people angry, which makes them do something. They try to be the first to correct him in the comments, or it starts a conversation on other blogs that link back to him. His writing is humorous, and I've linked to some of his more controversial posts (such as Rails Is For Douchebags), but that doesn't mean his opinions are valid...
You don't get a popular blog by being correct: you get it by being wrong in a way that makes people react. If you're right, you'll get a decent reputation, but your zone of influence will be smaller. People easily forgive you for being wrong... but they never forgive you for being right.
Linkbait, flamebait, trollbait, whatever you want to call it... it works wonders to boost popularity.
UPDATE: Just to be clear, I like Coding Horror, as I mention in the comments below. I just wanted to make the observation that generating an emotional response seems to be the better path to blog popularity... for what it's worth.
CMS Watch has some interesting reflections on EMC World and Documentum... apparently, EMC still has decent Enterprise Content Management products, but there's a real lack of enthusiasm in EMC about the whole thing:
Under the covers there remains some good technology and some good technologists, but there just doesn't seem to be the enthusiasm in the rest of EMC to really get behind it. One way of classifying these two groups is that they consist of the remnant Documentum products (built and acquired) over the years. We see many elements of the collaborative DM that Documentum majored on in the past in today's Knowledge Worker division, alongside the updated eRoom offering. In the Interactive Media group we see the old Bulldog DAM products given a fresh coat of paint. Both looked fine in the demo, but in talking to broader EMC sales staff, there was little interest or knowledge of these areas.
The CMS Watch article is also an interesting intro to Content Management and Archiving (CMA)... which seems to be the path that a lot of Enterprise Content Management vendors are taking. Oracle's plan to achieve CMA is with a nice blend of Stellent and their Universal Online Archive... I'll go into more depth in my next book ;-)
As Billy noted with some statistics, archiving is a big deal for a complete ECM solution... It seems like some folks at Documentum "get it," but the jury is out whether the EMC folks will listen...
I first heard this haunting song in Donnie Darko:
Painfully melancholy, but powerful... enjoy.
A few weeks ago I gave a talk about Communication For Geeks at the Minneapolis MinneBar conference. I strongly believe that the majority of software failures are communication failures, and if geeks want to be a part of fun, successful projects, they had damn well better learn how to communicate... because most managers clearly can't.
It was a surprisingly popular talk: I had twice as many attendees as I had handouts...
Anyway, on Friday, I got an interesting call from one of the attendees, Kelly Coleman. He was excited to tell me about a situation where he used one of my tips to better communicate with one of his friends... Kelly took to heart one of the most important lessons geeks need to learn: use empathy before education! I was really happy to hear about it, so I thought I'd repeat the lesson here in case others might benefit:
Empathy is not Sympathy!
A lot of people confuse empathy and sympathy... I do it myself a lot. Sympathy is feeling what somebody else feels through you. When you are being sympathetic, you're not really helping much, because you're making the situation about you... In contrast, empathy is feeling what somebody else feels through them. You keep the focus on them, until you're certain they've expressed themselves fully.
To illustrate, the following would be sympathy:
Bob: I just got fired...
Joe: Wow, that sucks... but don't worry, you'll be fine! I got fired a few years back, and there's always work available for talented guys like us, right?
Joe genuinely thinks he is being helpful... Joe is not being helpful! Joe isn't listening to Bob at all. Joe is rambling on about his own past, and about his theories of the job market. He's trying to connect with Bob, but he's using sympathy. Sympathy is dangerous, because it leaves Joe open for this:
Bob: What the hell do you know, Joe? That was years ago! You didn't have a house! You didn't have a wife and a kid to support! The job market was completely different back then! You have no clue about my problems! Get the hell away from me!
Joe: ...I was only trying to help...
Bob is clearly in a lot of pain. He's afraid of a lot of things, and his good buddy Joe clearly isn't listening. So Bob lashes out, and wisely tells Joe to get the hell away from him. Then Joe gets defensive, and says something even stupider. With luck, they'll be friends again in a few weeks... but you never know.
In contrast, empathy is almost always better... it would look something like this:
Bob: I just got fired...
Joe: Wow, that sucks... you must be feeling pretty scared right now, huh?
Ding! Ding! Ding! Ding! Give Joe a cookie!
See the difference? Joe didn't make it about himself... he kept his focus on Bob. He asked Bob how he was feeling, and after Bob answers, Joe should keep asking. He should let Bob vent about his situation: his wife, his kid, his house, the job market, whatever. Even if Joe knows a guy who might give Bob a job, Joe should shut the hell up until Bob's finished venting. This may only take five minutes, or it might take a whole hour. Either way, it's an important part of the process. Bob will not listen to what Joe has to say, unless Bob feels Joe fully understands his situation.
Empathy before education. Always.
How does Joe know when Bob's finished venting? He'll hear something different in Bob's voice: hope. When Bob is open for suggestions, he'll say something like, "what do you think I should do?" or "have you ever been in this situation before?" Only after Joe hears this, is Bob ready to listen to new ideas, new possibilities, and new ways of fixing this problem. Only after Joe hears hope, or a direct request for help, is Bob ready to hear what Joe wants to say. If Joe wants to help Bob, Joe needs patience.
Now... empathy is not easy, and it's extraordinarily difficult for engineers.
Most technical people have been brainwashed by years of "education" into believing that there's a "right way" to do everything, and that it's our job to fix it. When something is "wrong," we want to dive in and tell everybody how to make it "right" again. It's a trained compulsion. This is why engineers make lousy lovers, but excellent terrorists. In both cases, it's a lack of empathy that dooms us to this fantasy world of absolute right and wrong, making it impossible to see things from another perspective.
Sound like anybody you know?
As such, it will be difficult for software engineers to learn empathy... but they need lots of practice before they can move on to even more advanced forms of communication... which I'll be talking about at a later date ;-)
Should We Always Use Empathy?
That's a tricky question... empathy takes a lot of time, and sometimes you don't have the luxury. However, it is important to understand what empathy is, so when people "fall off the wagon" you won't take it personally...
For example, blogger James McGovern decided to practice empathy, which got some props from Billy... however, James' blogging style does not lend itself well to empathy... He's snarky, and enjoys inciting fights, so he can better understand who has the better position. If everything were puppies and rainbows, I'd probably stop reading his blog. No surprise that James then went back to his old self after about 23 hours...
And let's not forget Brock Samson's mystic journey in The Venture Bros.... where in a dream he learns the value of empathy and feels great... but then is confronted by his former special ops trainer:
Brock Samson: What about uhhh, humanity and empathy and all that garbage?
Hunter: You're a tool, boy, a tool! Built for a single purpose by the United States who shut your third god damned eye for a good f$%&ing reason! You can't teach a hammer to love nails, son. That dog won't hunt!
Yep... an army of empaths sure would be cool... but in the meantime, we live in a world of conflict... so until everybody understands the power of empathy, it's probably best to know multiple ways to deal with conflict. In order, I prefer empathic communication, principled negotiation, then Brock Samson.
In the meantime... practice giving and getting empathy. It's far more powerful than you realize.