Today I finally made it to a session... this one was on SOA and Oracle's Enterprise Service Bus. I have several concerns with ESBs, and it was nice to get some straight talk about the pitfalls. BPEL isn't the best language to tie together business processes -- as Lonneke Dikmans says, it's tricky to deal with external events.
Plus, BPEL is a declarative XML language. I like declarative languages for simple things... but they get pretty complex for large processes. A scripting language is almost always superior for orchestrating complex tasks -- compare Ruby's Rake to Apache Ant, for example -- but a declarative language is easier to modify with GUI tools. A non-programmer can configure a complex application if it's in a declarative language... but declarative languages do seriously bind a developer's hands.
A scripting language with annotations would be the best of both worlds... the developer exposes what is "tweakable" with code annotations... and a non-programmer can modify the annotations. A workflow designer's primary use is to spit out boilerplate code... similar to how Scaffolding works with Ruby on Rails. The boilerplate jump-starts the development, and exposes standard annotations... Done and done.
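To sketch what I mean (a Python sketch, with all names invented for illustration): the developer marks which settings are "tweakable," and a GUI tool only ever touches the registry of annotations, never the orchestration logic itself:

```python
# Registry of settings a non-programmer is allowed to modify.
# A workflow designer GUI would read and write this; the code stays untouched.
TWEAKABLE = {}

def tweakable(**defaults):
    """Mark a step's keyword arguments as safe for non-programmers to edit."""
    def decorate(func):
        TWEAKABLE[func.__name__] = dict(defaults)
        def wrapper(*args, **overrides):
            # Current annotation values win unless explicitly overridden.
            settings = {**TWEAKABLE[func.__name__], **overrides}
            return func(*args, **settings)
        return wrapper
    return decorate

@tweakable(retries=3, timeout_seconds=30)
def publish_step(document, retries, timeout_seconds):
    # The orchestration logic itself stays in the scripting language...
    return f"publishing {document} (retries={retries}, timeout={timeout_seconds}s)"

print(publish_step("report.pdf"))
```

The boilerplate a designer spits out would be exactly these decorated stubs... the non-programmer edits the decorator arguments, and nothing else.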
Anyway, the next generation of Oracle's ESB will be based more on Coherence... which means better support for events, and it will no longer be one single logical bus: it will be cloud-based. That means more flexible, more robust, and faster... in theory. I'm anxious to play with it next year.
Next was on to the keynote by Michael Dell. I'm really impressed with the green initiatives at this conference. Yesterday Intel was freaking out about power consumption, and today Michael Dell was presenting his latest line of low-power computers. They also started a worldwide free computer recycling program. These data center guys are really freaking out about power consumption. At present, about half of the Fortune 500 spend more to power their servers than they do on the computers themselves! Data centers consume about 1.5% of the power in the US... and it's going to get a whole lot worse.
Let me remind you: data storage is growing faster than Moore's Law. Many computer folks have to think 10 years ahead... so they know that we're going to be in trouble soon. I think computer companies will soon be a huge force for finding cheap, clean, energy alternatives... otherwise, ten years from now, their business model of bigger-better-faster-more will be in jeopardy.
Between Dell's and Larry's keynotes, there was an interesting "crowd sourced marketing" stunt... They showed a big board to the crowd, asking what your thoughts were on the value of integrated IT. You could send a text message to Dell, and your message would be displayed on the big board. The messages varied, from "better ROI" and "job security," to "integrated cowbell."
Later, it was off to the customer appreciation night, with 30,000 other folks. The highlight was seeing Lenny Kravitz in concert, who played a great set... He joked, "Man, you people throw some big parties. Y'all must be doing well!"
heh... Larry certainly is ;-)
Missed all the sessions again today. I went to a publisher's workshop, which was pretty good. A bunch of project managers showed us what was super cool in Oracle's latest products, trying to tempt us into writing a book on it. I probably learned more there than in the average session anyway...
In the keynote, the CEO of Intel showed off their latest chips. They're extremely low power; I believe they offer a 70% increase in performance per watt compared to the previous generation. They also introduced the first lead-free CPU. The demos of the chips went OK... the power was low, but one of the computers crashed. However, it turned out to be a really good demo of the failover power of Oracle Coherence... which I'll talk about later.
Then Thomas Kurian showed off some nifty Fusion products, including the next generation of web content management: Open WCM. Wicked cool... it solves so many problems. It's been in the works for almost 3 years, and they will finally be releasing it sometime next year. I'm trying to get a beta version...
I hung out in the demo-pods most of the day again, checking out how the Fusion Middleware apps could possibly work together... Service-oriented architectures are good, but they can't do everything. To do everything, you need to layer an event-driven architecture on top of the service-oriented one... ideally all tied together with a Turing-complete scripting language. But that means you can't "orchestrate" apps by drawing boxes and lines. Plus, you can't have too many events, otherwise your code turns into a big giant mess.
Still looking for the answer...
I didn't get a chance to go to any sessions today... I had a meeting with a publisher about a potential second book -- cross your fingers! I'll know this week if they think it has a market. Afterwards, I spent most of my time at the demo pods playing with different kinds of Oracle software. Then it was a quick drink with some Content Management customers, then back to the demo lab to help set up computers for presentations tomorrow.
Seriously... it's 11:30 pm, and I'm blogging from the hands-on lab. Mondays suck.
The number of Oracle products is a bit overwhelming... especially in the applications stack. There's a ton of overlap in the JD Edwards, PeopleSoft, and Siebel product lines... not to mention the dozen different ways to implement any project with Fusion Middleware. If your goal is to always use the most innovative solution, you're in for quite a long selection process... and by the time you're done, somebody will have released something better!
Software innovation happens so fast that your best option is to satisfice... you're better off with a good relationship with the people in a mediocre product line than a mediocre relationship with the people in a good product line.
Tomorrow I have a half-day meeting, then I'm speaking twice. I hope to catch at least one session, but I might get another goose egg like today :-(
Before closing... I feel that I should mention that a man was murdered outside Oracle Open World last night... shot five times outside the movie theater between two of the Open World buildings. I have no glib comment here... I just find it totally bizarre that this would happen, despite dozens of police within sight in every direction. I'm not frightened... just sad and confused...
Sunday... registration, and a long meeting with the other Oracle ACE Directors in the community. I probably don't make this point very clear: I do not work for Oracle. My ACE title is honorary, because of my work in the developer community. Anyway, it was nice to finally meet Lonneke Dikmans, Jason Jones, Frans Thamura, and other blogless folk...
Anyway, Oracle was kind enough to give us the inside scoop on a lot of new technology features. I'm still unclear about what I'm allowed to talk about, so I'm gonna play it safe and wait for official announcements before I open my big yap. Overall, I'm really impressed with Coherence, cautiously optimistic about Web Center Suite, more realistic about BPEL Process Manager, and am slowly being won over by SOA Suite.
I'd also recommend that all Oracle attendees check out the Oracle Mix web site. This site was thrown together at the last second, but is a pretty cool social app for Open World conference goers...
Well, I'm hitting the road a bit early for the Oracle Open World conference... I'm a presenter for the following three sessions:
- S291738: 50 Ways to Integrate with Oracle Universal Content Management, Tuesday 3:15-4:15
- S292624: Hands-on Lab: Building an Enterprise Web Site from Scratch, Monday 12:30-1:30 and Tuesday 4:45-5:45
- S292625: Hands-on Lab: Experience Oracle Universal Content Management, Wednesday 9:45-10:45
The 50 Ways talk will be similar to the Intro to Integration talk from Crescendo this year... but I have some different examples... For this talk I integrated Ruby On Rails with the Content Server, with a clever application of JRuby, Content Integration Suite (CIS), and the obligatory voodoo magic. If you're curious, be sure to attend ;-)
For the hands-on labs, I'm only going to show up if the speaker needs extra help... so no promises!
Anyway, there are over 1500 sessions this year, so it's a little hard to choose which events to attend... plus so many of them sound exactly the same. No big deal... I usually learn more by networking with fellow geeks than by reading PowerPoint slides...
Personally, I'm curious about where they are going with Identity Management, Business Intelligence, and SOA Suite. I'm going to sit in on a few of those talks, and pepper the presenters with trick questions. I'm mildly curious about Web Center Suite, BPEL, and Secure Enterprise Search... but I've already seen a lot about those.
It's a little tough to schedule my day. Usually conferences send you a brochure with all the tracks neatly color-coded and logically grouped by time slot. However, Open World 2007 is trying to be "paperless," so they launched an online Open World schedule builder. Unfortunately, its usability is significantly clunkier than the dead-tree version...
Oh well, there are always glitches when you do something for the first time. I'm pretty sure next year's scheduler will be much better. Otherwise, I might have to make a Greasemonkey script...
See you there!
InfoWorld put together a very complimentary article about the latest version of Oracle's Enterprise Content Management suite: Oracle Universal Content Management Lives Up to the Name.
It covers a bunch of new features that the Oracle folks crammed into this version... improved SharePoint integration, better records management, the Site Manager for web content management, exposing the content refinery as a service-oriented architecture, and lots of Web 2.0 bells and whistles.
The review went decently in-depth, but a lot of the truly cool features in 10gR3 are difficult to explain. File Store Provider, some new Schema goodies, better support for super-high-volume sites, a better architecture for developing components... those take a while to appreciate.
Makes ya proud... even tho I don't work there anymore ;-)
I've heard many a complaint from Stellent customers about Oracle's MetaLink site... It's a lot clunkier than Stellent's old support site, and it's pretty hard to find a specific patch, sample, or extra. Especially considering that the product names appear to be in flux, so you often don't know what to look for... ECM? UCM? WCM? URM? WTF?
Anyway, here's a list of links I've collected for finding the good stuff straight from updates.oracle.com:
- All Universal Content Management Updates, sorted by release date. Includes Site Studio, Refinery, Tracker, and most add-ons.
- UCM Patches for just 10.1.3.3.1 (latest as of Nov 2007)
- UCM Patches for all 10.1.3 (should get most of what you need)
- All Universal Records Management Updates
- Information Rights Management
- Outside-In Conversion Updates
Please note, you need to use your MetaLink password here... which means only Oracle customers and partners are allowed in!
Also... as of Nov 4th, there are no updates for anything besides UCM on updates.oracle.com... those last 3 links have no results. Hopefully this is where they will be placed, otherwise I'll need to tweak those URLs in the future. For now, they are a good starting point for finding the patches you need.
Yes... the legends are true... Oracle finally released the long-awaited product: Content Folios. For those who don't know, it's like Folders on steroids. Essentially, it's a way to group content items, folders, forms, images, web pages -- anything -- into a group called a "folio."
Of course, you've always been able to link items together through metadata and tags... but now it's been formalized... and this bad boy has a very flashy interface.
It's not a separate product, nor its own download... it snuck into patch 6602355, which is the Content Server 10.1.3.3.1 Update Bundle. It's only available via Oracle MetaLink, so only customers and partners can get it.
I'll be showing off more of its features next week... as will Billy Cripe, I'm sure.
UPDATE: as mentioned in the comments below, you should probably use patch 6907073, which is the Content Server update for 10.1.3.3.2.
An "oldie" but goodie: Information Revolution, on how old assumptions about organizing physical data need to be rethought, in favor of what is possible in the digital world:
I'm not 100% in agreement... sure, multidimensional "tagonomies" are superior to rigid "taxonomies," but that's been known for ages. It's certainly not unique to the digital world. Ask any microbiologist if they are happy with the scientific classification of life forms into kingdom, phylum, class, order, family, genus, species. How is a microbiologist supposed to use such a rigid system? Some of the life forms they discover appear to be both a plant and an animal.
People have preferred flexible "tagonomies" to rigid taxonomies for decades... even these new-fangled "hyperlinks" are nothing new... unless everybody has forgotten the ancient art of cross-referencing and footnotes.
No... The digital revolution has added nothing magical or mystical when it comes to content management... It has helped findability, but a side effect has been more useless data to sift through... The primary benefit of the revolution is that it enables everyone to be "experts!" The key to propelling the digital revolution is in engaging more experts, and improving as many dialogs as possible.
The problem of "content management" is relatively the same; now begins the problem of "expert management"...
But that's merely my expert opinion ;-)
A little known fact... you can finally use Google Book Search in order to search through my book on Stellent/Oracle Enterprise Content Management. You can restrict your search by starting from the Google page for my book, or just use the form below:
I placed a copy of this form on the official page for the Stellent Content Server book. The search results aren't as good as having a PDF version of the book (available from Apress), but it's ten times better than the index supplied in the "dead-tree" version...
Apparently, the JDeveloper crew over at Oracle thought this startup tip was relevant:
An angel food cake will slice neatly without crumbling if you freeze it first, then thaw it.
Cute... that's exactly what I want from a Java IDE... Apparently another tip was this:
Aluminum foil pierced by a fork on a bed of ball bearings is not microwave safe.
hoooookay... I need to close my eyes and reboot before my laptop transforms into Loki.
James posted five Enterprise Content Management (ECM) questions on his blog... hoping that either John Newton or myself would reply. This time he takes the interesting approach of calling me smart instead of stupid...;-)
So let's get started...
Why do some ECM systems have a tight coupling with user data? Stellent/Oracle does not do this, but others do. I can't say for sure, but I'd bet access control list (ACL) performance is a major issue. IMHO, compared to other ways of adding security, ACLs do not scale. That makes stuff like project-level collaboration software very tricky to do right. Making it flexible, secure, fast, and enterprise-scalable is nearly impossible.
If you have N employees, you have about 2^N possible ACLs... 10 employees means 1,024 possible ACLs, but 20 employees means over a million! For 1000 employees, you wouldn't be able to enumerate all the possible combinations on any currently existing digital storage device...
Do you really want to allow average users to design project spaces with ACLs on the fly? For workgroup-level servers that's fine, but enterprise-wide??? Holy crap, no! You'll probably need an in-memory cache of lots of user data, and some form of pre-compiled ACL expression-matching objects, not to mention the problems with latency... this means a tight coupling between users and content is the logical sacrifice to make.
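The back-of-the-envelope math, as a quick sketch: any subset of employees could in principle share an ACL, so the count doubles with every new hire.

```python
# With N employees, every subset of them is a possible ACL,
# so there are 2**N access control lists you might have to handle.
def possible_acls(n_employees: int) -> int:
    return 2 ** n_employees

print(possible_acls(10))   # 1024
print(possible_acls(20))   # 1048576 -- over a million
# For 1000 employees, the count is a number with over 300 digits:
print(len(str(possible_acls(1000))))   # 302
```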
Decoupling is probably not a huge issue, but performance will suffer like the dickens...
SOAP versus REST? These days, I say nuts to them both. I'm more in favor of building service-oriented architectures with JSON web services. Ditch the entire SOAP stack, and go back to basics. However, not so basic as REST, and not so idiotic as XML-RPC. Mark Masterson suggested Atom and the Atom Publishing Protocol... I think those are great options for content delivery, but it's too narrow a pipe for content management. I don't want to tunnel context-sensitive metadata display, or business process management, or several layers of customizations through AtomPub.
If I ever got behind an ECM standard, it would have to be a JSON-based web service. Simple, open, cross-platform, extensible, flexible, discoverable, mashup-ready. I would accept nothing less.
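For flavor, here's a rough sketch of the kind of JSON web service I'm picturing (the service name and fields are invented for illustration, not any vendor's actual API):

```python
import json

# A minimal "content service": one coarse-grained request in, JSON out.
# Any client -- browser mashup, script, or another server -- can consume it.
def handle_service(request: dict) -> str:
    if request.get("service") == "GET_DOC_INFO":
        doc = {"id": request["id"], "title": "Q3 Report", "author": "bex",
               "formats": ["pdf", "html"]}
        return json.dumps({"status": "ok", "document": doc})
    return json.dumps({"status": "error",
                       "message": "unknown service: %s" % request.get("service")})

response = json.loads(handle_service({"service": "GET_DOC_INFO", "id": "1234"}))
print(response["document"]["title"])
```

No WSDL, no envelope, no toolkit... just name/value pairs that any language can parse.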
And speaking of standards, the next two questions were kind of the same... Why don't people talk about interoperability at AIIM conferences? That's a good question... as is the question why do ECM people seem ambivalent about standards?
Pie guy has weighed in on this several times... as has Billy Cripe, and... um... I don't know. I can tell you why people rarely talk to me about standards: I have a seething hatred for 90% of software standards and I'm unfortunately gifted at intimidating people.
Don't get me wrong, I love the RFCs! Those built the internet. But ECMA, JSR, OASIS, and the W3C can collectively suck an egg for all I care. Those committees don't innovate: they merely bully and claim credit for those who do. They're accountants and lawyers, not creators. Gimme a well-documented API over clunky, overengineered, committee-driven, ill-conceived "standards" any day...
I have a longer anti-standards rant in the works... which coincidentally includes anti-rules engines rants, and anti-portal server rants... it was just too ungodly long to include here...
So, the last question: where the heck are the ECM patterns? Answer: in my noodle. I got 'em rattling around in my brain like a dried pea in an empty tuna can.
Maybe some day I'll let 'em out...
(apologies to Dave Barry)
Both James Governor and James McGovern -- or is it the other way around? -- have been chatting about compliance-oriented architectures... or a way to add records management and retention management as a service to the enterprise as a whole. It should be an infrastructure component, and not require your records to be migrated to your monolithic records management system.
All I can say is, been there, done that, yawn.
I'd advise them both to check out Oracle Records Management Agents. Stellent's Records Management Team envisioned those three years ago, made them two years ago, and they are a big part of Oracle's Universal Records Management strategy.
You don't need to put your data into a records management repository to manage it like a record! You just need an "agent" that runs in your remote system -- email archiving server, file system, 3rd party CMS -- that "calls back" to the content server via SOA when specific events occur.
For example, Oracle's URM will block somebody from deleting an email from the archive if the retention policy won't allow it. Likewise, it will force a delete, if the retention policy enforces it.
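To sketch the agent idea (policies and names here are invented for illustration, nothing like Oracle's actual API): the remote system keeps the content, but defers every delete decision to the central retention service.

```python
# Invented retention policies, keyed by record category.
RETENTION_POLICY = {
    "contract": {"hold_years": 7},    # must be kept for 7 years
    "newsletter": {"hold_years": 0},  # free to delete
}

def may_delete(category: str, age_years: float) -> bool:
    """The 'callback' a remote agent makes to the central retention service."""
    policy = RETENTION_POLICY.get(category, {"hold_years": 0})
    return age_years >= policy["hold_years"]

class EmailArchiveAgent:
    """Runs inside the remote system; the content never leaves its repository."""
    def delete(self, message_id: str, category: str, age_years: float) -> str:
        if not may_delete(category, age_years):
            return f"BLOCKED: {message_id} is under a retention hold"
        return f"DELETED: {message_id}"

agent = EmailArchiveAgent()
print(agent.delete("msg-1", "contract", age_years=2))    # blocked by policy
print(agent.delete("msg-2", "newsletter", age_years=1))  # allowed
```

The key design point: the policy lives in one place, while enforcement happens in N remote systems.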
If you don't want to move all your content into one single repository, fine! But you do need a single-point for defining retention policies... especially for large organizations with multiple email archiving systems.
Of course, innovative architectures are nothing new for the Stellent crew... we had SOA about eight years before there was a name for it. I guess we likewise had "COA" at least three years before anybody else knew it was important... and then there's the stuff I'm not allowed to talk about.
But, big snaps to the Stellent Records Management team... your architecture is finally being dubbed as the standard for others to follow. The names of the developers I'll protect so they don't get spammed, but they know who they are ;-)
Let's say I have a web site... should I add new features, or follow the law? Usually this isn't an either/or proposition... typically people don't have to choose between adding cool features to their products and following the law... but when it comes to web sites, it ain't so simple.
On the one side is shiny new technology: AJAX, Mashups, Adobe AIR, Microsoft Silverlight, etc. As new innovative buzzwords come about, customers demand them.
On the other side is the law: accessibility standards for the visually and physically handicapped. This is vastly more important than people understand... The web has done more than the wheelchair to empower the handicapped. Shopping, research, finding friends, being a part of a community, helping others, even building a home in Second Life. Thus, finding a site that you can't use is vastly worse than a building without a ramp... it's more like a building with a disappearing ramp...
Unfortunately, a lot of these shiny new technologies break existing standards. Most of the standards about clear labels and navigation are easy to implement... but others aren't. Screen readers need you to refresh the web page in order to "trigger" the event that something has changed... however most of the new technology focuses on changing the page without a refresh. What to do?
Well, you all know how I feel about standards... screw standards! Focus on the goal -- empower the handicapped -- and innovate your way out of the problem.
Now... there has been plenty of work on finessing Web 2.0 technologies to barely follow the accessibility standards... AJAX Patterns has several. I prefer the more direct approach. New Web 2.0 technologies are not accessible because screen reader technology is almost a decade old. The laws are written to conform to ridiculously outdated software... but people will stick to it until somebody has a better idea.
My solution? Firevox! It's a free, open source screen reader extension to Firefox. It is a native plug-in, and thus anything running on the page -- including Flash -- can be configured to "trigger" a redraw event in the screen reader. It's lacking a coherent set of patterns and standards for its usage, but that just takes time.
After this, Web 2.0 technologies will help the visually handicapped even more than average users! Most of us have never used a screen reader... so trust me on this one. It's unbelievably painful to have to refresh and reread a web page when only 10 words have changed. If done right, Web 2.0 could empower the blind to surf the web much faster than before.
If you're a big giant company with plenty of cash (ie, Oracle), concerned about web sites and accessibility (ie, Oracle), and no love of Microsoft (ie, everybody), then let your developers help out on the Firevox project. $100k, 6 months, problem solved.
Of course, this means that sites may need to say Best Viewed with Firevox on them... but I won't shed a tear.
I got a question from a friend who is helping a client manage a fairly large repository of content: about 10 million full-text items. Naturally, they're concerned about search performance. They don't think they'll even need to scan all 10 million items for an open-ended search... but they're interested in a worst-case scenario.
The client seems to be following most of the best-practices for this kind of system:
- Do not allow open-ended searches for average users.
- Use customized query pages that pre-fill metadata to pre-narrow the search.
- Direct users to fill in additional metadata as needed.
- Allow open-ended searches only for special users (administrators, auditors, etc.)
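Those customized query pages boil down to something like this sketch (field names and query syntax invented): the page bakes in its metadata constraints before the user's terms ever reach the search engine.

```python
# "Pre-narrowed" search: the query page fills in metadata constraints
# up front, so average users can never fire an open-ended query.
def build_query(user_terms: str, presets: dict) -> str:
    clauses = [f'{field}:"{value}"' for field, value in sorted(presets.items())]
    clauses.append(user_terms)
    return " AND ".join(clauses)

# An "HR policies" search page would bake in its own constraints:
print(build_query("vacation carryover",
                  {"department": "HR", "doc_type": "policy"}))
# department:"HR" AND doc_type:"policy" AND vacation carryover
```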
But this got my friend thinking: how many search results is ideal? You get massively diminishing returns on search, even with Google. Wikipedia is replacing Google for many kinds of information research...
The rule of thumb I used is that if my full-text query returned more than 200 results, then I didn't construct a proper query. I learned this after years of searching for academic research papers before the web existed... You need to use statistically improbable word combinations to tease good results out of the chaos. Don't search for computer game, search for evil clown anchor catapult.
No, "evil clown anchor catapult" is not a computer game... but it should be.
Naturally, this means you need to have a very good idea of what you're looking for in order to find it. Such advice is worthless for window shopping... Metadata and tags help tremendously for general browsing... but then again, you're trusting somebody to use the same keywords you would use. If taggers aren't professionals, or at least passionate about findability, then you'll miss out on a great deal.
It's not uncommon for researchers and auditors to perform an exhaustive search, get thousands of responses, and then read everything they find... but the rest of us get bored quickly and need help.
That help usually takes the form of good information architecture, which requires a pretty good understanding of your content and your audience. The vast majority of the content in a 10-million-item repository is of little interest to the average user. It's of great interest to the content creator, and to the 3 or 4 people who need that specific report, but for everybody else it's noise.
Without information architects, email becomes the most used search engine for your site!
If people can't find things by browsing, they will phone, message, mail, or otherwise annoy others to get what they need. If you think you can't justify an information architect, count up how much money your organization wastes emailing links to people. I bet you'll be surprised...
I usually advise people to purchase Information Architecture for the World Wide Web, which is a good introduction to the field and a very good grounding in the subject. After that, I'd suggest Don't Make Me Think and Ambient Findability.
But, even if you're well grounded in the theory, it's still worthwhile to hire a specialist.
Jake is framing the dispute by saying he's Web 2.0, and Billy is Enterprise 2.0... thus Jake and his applications are bottom-up... whereas Billy and his Enterprise Content Management are top-down. Jake's about people, Billy's about data.
I love top-down versus bottom-up discussions... but Jake's entering this conversation a bit late...
First, I'd like to say that Billy's been in Content Management waaaaaaay before that pesky E got pre-pended. I recall the good old days back at Stellent when the web was young... we had to convince people that this internet thing was useful, and that HTML might be a good format for their content.
Those were surprisingly tough sales...
Thus, most of the initial sales were specific apps for specific departments. That's bottom-up all the way: get the information that people need to them in the proper format. Purchasing Content Management only became a top-down decision once the web really took off, and some companies had several hundred unmanaged web sites... it was totally impossible to find anything, let alone properly secure and manage it.
Enter ECM... then we finally (if rarely) were able to talk to CIOs and demonstrate to them a need.
Enter Web 2.0... or more specifically, the hype and buzz about blogs, wikis, RSS, and social apps... Guess what? It's the same damn problem. Over 2 years ago, many Stellent employees were warning their customers quite bluntly: your company will be using blogs and wikis in a few years whether you endorse them or not. Social apps were less of a concern: the combination of a critical mass of adoption plus shrink-wrapped solutions was several years out... say 2008 or so...
Thus, I feel obligated to point out Jake's point of view is hardly a surprise to the ECM folks at Oracle... in fact, they've been quite well versed in this problem for over eight years.
Now... regarding the 2.0 stuff, I'd argue it's not about data, nor is it about people... it's about knowledge.
Noodle on that...
I may have read too much into James' criticism yesterday... he clarified his point that WSDLs in the ECM space are awkward, which I agree with.
I feel this is probably because most Enterprise Content Management vendors are still pretty green when it comes to the value of web services, and service-oriented architectures (SOA). Stellent was using SOAs nine years ago, before there was even a name for it. It helped us solve specific problems, and add features rather quickly. Adding an XML/SOAP interface was a natural extension to the existing "content services"... so, provided Oracle learns through osmosis, it's safe to say Oracle "gets it" when it comes to SOA and ECM.
I'm confused when he claims I'm pro-ReST, because I'm quite clearly a critic of ReST for enterprise systems... although I like its simplicity. And Stellent/Oracle ECM services are both clearly coarse-grained and stateless. I cover that in chapter 2 of my book on Stellent, which I'd encourage him to read before judging. And, as I said earlier, custom security integrations are fairly simple... so "deferring authorization" is possible in a number of ways.
Here's a quick sample of what Stellent web services looked like nine years ago:
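Roughly, it was one coarse-grained IdcService call per URL -- something in this spirit (a Python sketch; the server name and parameter values are illustrative, not the original sample):

```python
from urllib.parse import urlencode

# The classic URL-based style: the whole request is one service name
# plus name/value parameters -- coarse-grained and stateless.
params = {
    "IdcService": "GET_SEARCH_RESULTS",
    "QueryText": "dDocTitle <substring> `budget`",
    "ResultCount": "20",
}
url = "http://myserver/idcplg?" + urlencode(params)
print(url)
```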
And here's what they look like via SOAP:
And that's just one of the ways to get SOAP output... Note: it ain't perfect, and never will be. I'd like Oracle to make a handful of changes so it's easier for the consumer to know what to do with the response... but that limitation is mainly due to the fact that Stellent was several orders of magnitude smaller than the competition, and thus comprehensive developer documentation was a lower priority...
However, it's a genuine web service, and not a lame wrapper around an existing fine-grained API... which I believe is what James was complaining about.
James is at it again... this time inciting me by claiming I support crappy WSDLs as an Enterprise Content Management (ECM) standard.
I like SOAP and SOA... but I hope it won't come as a shock that I hate WSDLs, and I'm hardly a Java fanboy. So the odds are quite low that I'd support a crap WSDL standard that names elements arg0, uses a session ID, demands a plethora of complex types, or is too Java-y.
As I've said before in my anti-ReST rants, WSDLs and the Microsoft WS-* stack ruined the simplicity of SOAP on purpose. Why? According to Tim O'Reilly, a Microsoft architect confessed that they did it to force people to use Microsoft tools to SOAP-enable their applications. Typical... unfortunately the Java folks followed suit, and now anything other than bare-bones SOAP is a nightmare to code without an IDE.
Even if that weren't true, I still don't like the idea of WSDLs because it lulls people like James into a false sense of security... once we have WSDLs, it will be easy to get all of our systems to communicate!
They said the same damn thing about everybody using databases, everybody using portal servers, everybody using XML data files, etc... and I'm not amused. Integrating systems always requires a thoughtful hand, unless the applications are nearly worthless commodities... such as a toaster and an electrical plug.
The problem is not lazy developers; the problem is the fundamental difference between syntax and semantics, and its effect on the human/computer interface. Even in mind-numbingly well-defined web service problems -- such as stock trading data -- people call the same thing by different names. Why? Because it's useful. It adds value. And that causes people who create web services for stock market info to give different variable names to the same exact thing... even though the terminology of the stock market has been the same for decades.
So... let's assume they eventually get together and agree on terms, and now all APIs for stock market info are exactly the same. Great! Now their service is a worthless commodity. In order to survive, they need to innovate. What does that mean? Inventing new ways of looking at the same data... then others will follow... again, using different words for the same damn thing. Suddenly, their APIs no longer mesh.
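The stock-quote situation boils down to this (both payloads invented): agreement on syntax doesn't buy you semantics, so somebody still has to maintain a field mapping by hand... and re-maintain it every time a vendor innovates.

```python
# Two invented stock-quote services describing the exact same trade:
vendor_a = {"sym": "ORCL", "last": 21.50, "vol": 1200000}
vendor_b = {"ticker": "ORCL", "lastTradePrice": 21.50, "shareVolume": 1200000}

# Both are perfectly valid JSON-style payloads... but identical syntax
# doesn't make them interchangeable. A human writes the mapping:
FIELD_MAP_B_TO_A = {"ticker": "sym", "lastTradePrice": "last",
                    "shareVolume": "vol"}

def normalize(quote_b: dict) -> dict:
    """Translate vendor B's vocabulary into vendor A's."""
    return {FIELD_MAP_B_TO_A[k]: v for k, v in quote_b.items()}

print(normalize(vendor_b) == vendor_a)   # True -- until one of them innovates
```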
Anybody recall when linguists decided to invent Esperanto? Gosh, imagine it! A universal language that everybody can communicate with! All language problems were instantly solved overnight! Hmm... then why on earth did they also have to invent Interlingua, Ido, and other IALs? Because they were solving the wrong damn problem.
Now, I'm not saying all standards are crap, I'm saying that there are already four of them that treat ECM like a worthless commodity. There's absolutely no point in demanding a fifth one that's exactly the same, except based on fancy-new-buzzword-X. That is, unless there's something specifically wrong with the existing ones that can only be solved with a WSDL. Otherwise, what's the point? His complaints thus far can be solved with a tiny bit of refactoring, and better documentation.
And finally, nothing gets done without incentives. Plenty of ECM vendors have done quite well by delivering what the business users want. Some anticipated demand, some created demand, but most were followers. Unless somebody actually sees or anticipates a demand for a fifth, sixth, or seventh ECM standard based on WSDLs, it's relatively pointless to give people the same mistake in a different wrapper.
SOA is still not being done well in the enterprise... people need to fail a little before best practices emerge. Until other apps are as SOA-enabled as (certain) ECM systems, it's a bit pointless to push so hard. I stand by my original statement that a decent ECM standard won't be worthwhile until 2009... but my current money is on something similar to WebDAV or APP; it won't be a Java-specific standard (JSR170 or JSR283), nor will it be anything as pointlessly complex as WSDL... plus it won't take off until Microsoft endorses it.
So ultimately, James and I agree. A WSDL standard for ECM would be a nightmare... but not because of limitations in the ECM market: rather because of limitations and broken promises on the part of WSDLs.
UPDATE: I may have read a little too much into James' post... see my re-response.
So I was out having pasta last Thursday with a buddy of mine... his job, and it has been for many years, is setting up Active Directory Federation Services (ADFS) at Microsoft HQ... When stuff doesn't work right, he's one of the guys who gets to tweak the system till it does. When things go really wrong, he gets to phone a real live Active Directory developer, shake him out of bed, and see if the developer knows the voodoo incantations to get things working again.
Fun job... and I'm sure a lot of Active Directory admins would give their left arm for his rolodex...
Under the hood, ADFS uses Kerberos (plus voodoo) for authentication, and a SAML token for authorization -- a.k.a. entitlement management... he's helped set up federated access between Microsoft and several partners (such as Intel). He said it's a whole lot easier now than it used to be. It's still far from simple to configure and manage, but setting up certificates is a breeze compared to the early betas.
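For the curious, the SAML token at the core of all this is just signed XML asserting who you are and what you're entitled to. Here's a minimal sketch of reading one -- the assertion below is hand-rolled, unsigned, and stripped down to nothing (a real one carries signatures, conditions, issuers, and more), so treat it as purely illustrative:

```python
import xml.etree.ElementTree as ET

# A stripped-down, unsigned SAML 2.0-style assertion -- purely illustrative.
ASSERTION = """
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Subject>
    <saml:NameID>alice@example.com</saml:NameID>
  </saml:Subject>
  <saml:AttributeStatement>
    <saml:Attribute Name="entitlement">
      <saml:AttributeValue>extranet-partner</saml:AttributeValue>
    </saml:Attribute>
  </saml:AttributeStatement>
</saml:Assertion>
"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def read_assertion(xml_text: str) -> tuple:
    """Return (subject, entitlement) from a SAML-style assertion."""
    root = ET.fromstring(xml_text)
    subject = root.findtext("saml:Subject/saml:NameID", namespaces=NS)
    entitlement = root.findtext(
        "saml:AttributeStatement/saml:Attribute/saml:AttributeValue",
        namespaces=NS)
    return subject, entitlement

print(read_assertion(ASSERTION))  # ('alice@example.com', 'extranet-partner')
```

The point being: the relying party never sees a password, just an assertion from a trusted issuer... which is exactly what makes federation across company boundaries possible.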
I told him about my reservations regarding SAML (noted by certain bloggers)... I like the goals and all, but it was so complex I just didn't think it was (yet) worth the well-known maintenance effort. I preferred a "wait and see" approach. If I saw it hit a critical mass, then I'd bite. Then he said, "don't you understand? SAML completely eliminates the concept of the extranet!"
Then it hit home...
I've been doing web content management (WCM) for so long I'm stuck in the internet/intranet/extranet thought mode, and I just assumed that people would keep doing it that way... but a SAML integration would mean that one single logical server could satisfy the security needs for all audiences.
As Alec says: that's the way the internet used to be, and it's about time it went back.
Not that such a pipe dream would necessarily happen... you might currently have dozens of content silos. However, if they are bound to a SAML-enabled user repository, you could allow access to that content from your extranets without fear. Extracting data from a silo is a whole separate issue, naturally... but at least when you do so, it's secure.
Of course, the extranet as a concept isn't fully eliminated... I'd like my partner Company Foo to see a Company Foo branded site... and my other partner Company Bar to see a Company Bar branded site... but that's more personalization than anything.
Naturally, the devil's in the details. Just because it's possible to do it all with one system, that doesn't mean it'll happen. Setting up flexible personalization, reusable content, and getting everybody to agree on an "Entitlement Management" system won't be a picnic... The three p's always rear their heads: politics, paranoia, and performance. From a security standpoint, some may consider SAML to be brittle and not defensible -- a single point of failure, in other words. And it's probably economically infeasible to force everybody onto one logical system... much to the chagrin of IT.
I'm still not sold, but I'm warming up to SAML...
Content silos -- the sworn enemy of enterprise content management -- are perhaps inevitable... because consolidating all that information is a never ending task. Consolidation helps mitigate the negative effects of silos, but even better are tools and systems that make consolidation unnecessary...
James McGovern was busy this weekend... one of his blogs asks the simple question why do software vendors sell insecure products... another asks specifically why Enterprise Content Management (ECM) people avoid talking about it, and another laments how few ECM blogs are out there.
Well, not to be boastful, but I personally blogged twice about security holes in ECM products in the past 2 weeks: security holes in JSR170, and security holes in Web 2.0 apps. I don't know why other bloggers in the ECM field are being so lazy ;-)
Now... regarding why software vendors sell bad security products... I believe the Nobel Prize-winning economist George Akerlof covered that back in 1970 with his paper about buying used cars: The Market for Lemons. You can thank Bruce Schneier for bringing this to everybody's attention.
I blogged about lemons in security software when Schneier's original article came out... basically the problem is one of information asymmetry: the seller knows much more than the buyer. When it comes to used cars, a decent used car will cost $10,000, and a lemon costs $5,000. Unfortunately, the buyer isn't sure if that expensive used car is a lemon. Thus, to save money on the off chance that the car sucks, buyers typically pay the average price in situations with information asymmetry problems... This creates an economic situation where decent used cars don't sell well... thus fewer people bother to sell decent used cars... and eventually the used car lot is nothing but a bunch of lemons.
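The arithmetic of why the good cars leave the lot, as a back-of-the-napkin sketch -- using the $10,000/$5,000 numbers above plus an assumed 50/50 chance of a lemon:

```python
GOOD_PRICE = 10_000   # what a decent used car is worth
LEMON_PRICE = 5_000   # what a lemon is worth
P_LEMON = 0.5         # assumed: the buyer thinks half the lot is lemons

# Under information asymmetry, a rational buyer offers the expected value,
# since they can't tell a good car from a lemon before buying.
offer = P_LEMON * LEMON_PRICE + (1 - P_LEMON) * GOOD_PRICE
print(offer)  # 7500.0

# Sellers of good cars won't take $7,500 for a $10,000 car, so they exit...
# the share of lemons rises, the rational offer drops further, and the
# market unravels until only lemons remain.
assert LEMON_PRICE < offer < GOOD_PRICE
```

Swap "used car" for "security product" and the same spiral applies... which is Schneier's point.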
You can fix the information problem with economic signals, such as having a trusted mechanic look over the vehicle before you purchase it... that way even in a bad market, you can find good products.
The same holds true about security software: the seller knows much more than the buyer... vastly more in fact. Thus the market ensures that most people buy security software "lemons". To make matters worse, the economic signals are either worthless (such as endorsements by analysts), or prohibitively expensive (such as a complete penetration test by a security firm).
Thus, until the market situation changes, people will continue to purchase mediocre security systems... even if decent ones exist.
It's a bit unfair to blame the enterprise, or even the software vendors, for this market reality... they're all just people at a used car lot trying to do their best. Until there are genuine economic incentives for having secure systems (discounts on liability insurance), and there are decent economic signals about which products are actually secure (endorsements from insurance companies), it doesn't make business sense to purchase expensive security software when you have no evidence that it is actually secure.
As Schneier has said many times, the best economic option is for enterprises to talk big, but do as little as possible.
I'm sure that's frustrating as hell for the guys in Information Technology... but that's why we need Enterprise Architects...