It’s a Bird … It’s a Plane … It’s ACM! (Advanced Case Management)

ECM and BPM evildoers beware!  The days of creeping requirements … endless application rollout delays … one-size-fits-all user experiences … and blaming IT for all of it are over!

Advanced Case Management is here to save us.  Long before this superhero capability arrived from a smarter planet, we’ve had to use a bevy of workflow and BPM technologies to address the needs of case-centric processes.  In most cases, this has not worked well.  That’s because case-centric processes are different.

Traditional BPM processes tend to be straight-through and transactional with the objective of completing the process in the most efficient way and at the lowest possible cost and risk.

Case-centric processes are not straight-through.  They are ad-hoc, collaborative and involve exceptions … sometimes, lots of exceptions.  In certain cases, these processes are so ad-hoc or collaborative that it is not realistic or possible to map them.  That’s because the objective is to make the best decision (within the context of the case) and the path to the right decision may not be known.  Speed and cost are always important but take a backseat to achieving the best outcome … which usually involves customers, partners, employees or even citizens / patients.  You get the idea.

Why should you care?  Most “C” level surveys these days list Reinventing Customer Relationships as a top priority.  The same goals are seen again and again:

  • Get closer to customers (top theme)
  • Better understand what customers need
  • Deliver unprecedented customer service

From a technology perspective … this means we need new tools to build solutions that enable us to get closer to, better understand and deliver optimal service to our customers.  Most customer-oriented processes are case-centric and involve human interactions.  They tend not to be straight-through.

The traditional BPM model which depends on (1) process modeling, (2) process automation and (3) process optimization works fine for the straight-through processes … not so much for case management.

As such, a big gap exists today to build solutions that drive better case outcomes.  To close this gap, new tools that bring people, process and information together in the context of a case are needed when:

  • Processes are collaborative and ad-hoc
  • Activities are event-driven
  • Work is knowledge intensive
  • Content is essential for decision making
  • Outcomes are goal-oriented
  • The judgment of people impacts how the goal is achieved
  • Process is often not predetermined
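For illustration only (the trait names and threshold below are my own shorthand, not part of any product), the checklist above could be sketched as a simple screening function:

```python
# Illustrative screen for "is this process case-centric?"
# Trait names mirror the checklist above; the threshold is arbitrary.
CASE_TRAITS = {
    "collaborative_adhoc",     # collaborative and ad-hoc
    "event_driven",            # activities are event-driven
    "knowledge_intensive",     # work is knowledge intensive
    "content_driven",          # content is essential to decisions
    "goal_oriented",           # outcomes are goal-oriented
    "judgment_dependent",      # people's judgment shapes the outcome
    "path_not_predetermined",  # process is often not predetermined
}

def is_case_centric(traits, threshold=3):
    """Flag a process as case-centric when enough traits apply."""
    return len(CASE_TRAITS & set(traits)) >= threshold
```

A claims intake that is event-driven, content-driven and judgment-dependent would qualify; a fixed payroll run would not.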

The discipline of case management is deeply rooted in industries like healthcare, public sector and the legal profession.  Case management concepts are being applied across all industries – and though organizations describe case management differently – they consistently describe the lack of tools needed for their knowledge workers to get their jobs done.  Some organizations may describe their challenges as complaint / dispute management, investigations, interventions, claims processing or other forms of business functions that have a common pattern or problem but not a straight-through process.  Cases also typically involve invoices, contracts, employees, vendors, customers, projects, change requests, exceptions, incidents, audits, electronic discovery and more.

Faster than a speeding bullet!

Yesterday’s BPM development tools simply don’t work for case management applications.  By the time you build the application, too much time has passed, requirements have changed and IT usually gets the blame.  Time-to-value suffers.  I have nothing against BPM application development tools.  I just wouldn’t use a screwdriver to hammer a nail … and neither should you.  Case management solutions require a new kind of development environment and tools.  We need tools that are easy to use and allow a business user (not just IT) to very quickly build a solution.  They should be able to address the comprehensive nature of all case assets and provide a 360 degree view of a case.  They should leverage templates for a fast start and represent industry best practices.  In the end, they need to significantly shorten time-to-value relative to other approaches.

More powerful than a locomotive!

Since the objective is to empower case-based decision making, we need user experiences that are more robust and flexible than those of the past.  We need those experiences to be role-based and personalized so the end user gets exactly the information they need to progress the case.  The user experience needs to be flexible and extensible … not to mention configurable, to meet unique business, case or user requirements.  The user experience should provide deep contextual data for case work and eliminate disjointed jumping between applications.  It must bring people, process and information together to drive case progression and optimal outcomes.  That way, a single case worker has all the information they need to improve case outcomes.

Able to leap tall buildings in a single bound!

Proactively advising case workers of best practices, historical outcomes, fraud indicators and other relevant insight is also needed.  Leveraging analytics to detect and surface trends, patterns and deviations contributes to better and more consistent outcomes.  In other words, we need powerful analytics for better case outcomes.  Comprehensive reporting and analysis gives case managers visibility across all information types to assess and act quickly.  Real-time dashboards help case managers understand issues before they become problems.  Unique content analytics can discover deeper case insight.  Bottom line … case managers need insight in order to impact results.

Anatomy of a superhero

Before being rocketed to Earth as some new problem-solving superhero technology … a combination of capabilities is needed to address the needs of case management solutions.  Under the cape and tights of any case management superhero technology, you will find six core capabilities in a seamlessly integrated environment:

1 – Content.  By placing the case model in the content repository, information and other artifacts associated with cases are not only selected and viewed but also managed in the context of the case over its lifecycle.  These include collaborations, process steps, and the other associated case elements.

2 – Process.  Cases may follow static processes that are prescribed for certain business situations.  They may also follow more dynamic paths based on changes to information associated with a case.  Straight-through, transactional processes can be called, as can more collaborative processes.

3 – Analytics.  Analytics help case workers make the right decisions in cases involving fraudulent insurance claims, social benefit coverage, eligibility for welfare programs and more.  Analytics also help detect patterns within or across cases, or simply streamline overall case handling to optimize case outcomes.

4 – Rules.  Many decisions in a case depend on set values, e.g. interest rates for loans based on credit rating, approval authority for transaction amounts, etc. By separating rules from process the case handling becomes much more agile as rules can change in lockstep with market changes.

5 – Collaboration.  Finding the right subject matter expert is often critical to making the ad-hoc decisions required to bring a case to an optimal closure.  Collaboration in the form of instant messaging, presence awareness, and team rooms enables an organization and its case workers to work together to drive outcomes.

6 – Social Software.  Dynamic To Do Lists that are role based help case workers establish conversations and actions that must take place to close cases and link to information about the people that can help.  Users can brainstorm on appropriate solutions and actions and create wikis linked to particular case types to assist colleagues in their case work.
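To make capability 4 concrete, here is a minimal sketch of rules externalized from the process code so that limits can change without redeploying the process; the roles and amounts are invented for illustration:

```python
# Hypothetical rule table kept outside the process logic, so it can be
# updated in lockstep with market or policy changes.  Limits ascend.
APPROVAL_RULES = {
    "clerk": 1_000,        # may approve transactions up to this amount
    "supervisor": 10_000,
    "manager": 100_000,
}

def required_role(amount):
    """Return the least-privileged role allowed to approve this amount."""
    for role, limit in APPROVAL_RULES.items():  # insertion order (Python 3.7+)
        if amount <= limit:
            return role
    return "executive"  # anything above all configured limits escalates
```

Changing an approval limit is then a data edit, not a process redeployment.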

If you can’t do those six things … seamlessly … you aren’t very super … or advanced … and you certainly can’t meet the demands of case management solutions.

Advanced Case Management is now saving the world one case and solution at a time.

So “up, up and away” to better case management solutions and outcomes.  As always leave me your thoughts and comments here.

Content in Motion: The Voice of Your Customer

Do you listen to your customers?

No, really!  Of course, everyone answers “yes” when asked this question.  So much so … that the question really isn’t worth asking anymore.  The real question to ask is “What are you doing about it?”

Your customers write about your services, prices, product quality and their experiences with you in social media.  They write you letters (yes, letters on paper do exist), they send you emails, they call your call centers and even participate in surveys you conduct … Again I ask, what are you doing about it?

How are you translating all that information across all those input channels into action?  All of that content (you already have) in the form of customer interactions is just waiting to be leveraged (hhmmmm).

In three separate “C” Level studies (CIO, CFO, CEO) … the number one executive imperative was to “Reinvent Customer Relationships”.  Across the three studies, key findings were to:

  • Get closer to customers (top need)
  • Better understand what customers need
  • Deliver unprecedented customer service

Can anyone think of a better way to accomplish this than by examining all of that customer interaction based content to enable you to do something about it?  I bet there are loads of trends, patterns and new insights just waiting to be explored and discovered in those interactions … something demanding your attention and needing action.  This is one of the thoughts I had in mind when I blogged about “Content at Rest or Content in Motion? Which is Better?” a few weeks ago.  Clearly, identifying customer satisfaction trends about products, services and personnel is critical to any business.

The Hertz Corporation is doing this today.  They are using IBM Content Analytics software to examine customer interaction based content to better identify car and equipment rental performance levels for pinpointing and making the necessary adjustments to improve customer satisfaction levels.  Insights derived from enterprise content enable companies like Hertz to drive new marketing campaigns or modify their products and services to meet the demands of their customers.

“Hertz gathers an amazing amount of customer insight daily, including thousands of comments from web surveys, emails and text messages. We wanted to leverage this insight at both the strategic level and the local level to drive operational improvements,” said Joe Eckroth, Chief Information Officer, the Hertz Corporation.

Hertz isn’t just listening … they are taking action … by putting their content in motion.

Again I ask, what are you doing about it?  Why not test drive Hertz’s idea in your business?  You’ve already got the content to do so.

I welcome your input as always.  I recently bylined articles on Hertz and IBM Content Analytics for ibm.com and CIO.com entitled  “Insights into Action – Improving Service by Listening to the Voices of your Customers”.  For a more detailed profile on ICA at Hertz visit: http://www-03.ibm.com/press/us/en/pressrelease/32859.wss

IBM … 100 Years Later

Nearly all the companies our grandparents admired have disappeared.  Of the top 25 industrial corporations in the United States in 1900, only two remained on that list at the start of the 1960s.  And of the top 25 companies on the Fortune 500 in 1961, only six remain there today.  Some of the leaders of those companies that vanished were dealt a hand of bad luck.  Others made poor choices. But the demise of most came about because they were unable simultaneously to manage their business of the day and to build their business of tomorrow.

IBM was founded in 1911 as the Computing Tabulating Recording Corporation through a merger of four companies: the Tabulating Machine Company, the International Time Recording Company, the Computing Scale Corporation, and the Bundy Manufacturing Company.  CTR adopted the name International Business Machines in 1924.  The distinctive culture and product branding has given IBM the nickname Big Blue.

As you read this, IBM begins its 101st year.  As I look back at the last century, the path that led us to this remarkable anniversary has been both rich and diverse.  The innovations IBM has contributed include products ranging from cheese slicers to calculators to punch cards – all the way up to game-changing systems like Watson.

But what stands out to me is what has remained unchanged.  IBM has always been a company of brilliant problem-solvers.  IBMers use technology to solve business problems.  We invent it, we apply it to complex challenges, and we redefine industries along the way.

This has led to some truly game-changing innovation.  Just look at industries like retail, air travel, and government.  Where would we be without UPC codes, credit cards and ATM machines, SABRE, or Social Security?  Visit the IBM Centennial site to see profiles on 100 years of innovation.

We haven’t always been right though … remember OS/2, the PCjr and Prodigy?

100 years later, we’re still tackling the world’s most pressing problems.  It’s incredibly exciting to think about the ways we can apply today’s innovation – new information based systems leveraging analytics to create new solutions, like Watson – to fulfill the promise of a Smarter Planet through smarter traffic, water, energy, and healthcare.  This promise of the future … is incredibly exciting and I look forward to helping IBM pave the way for continued innovation.

Watch the IBM Centennial film “Wild Ducks” or read the book.  IBM officially released a book last week celebrating the Centennial, “Making the World Work Better: The Ideas that Shaped a Century and a Company”.  The book consists of three original essays by leading journalists.  They explore how IBM has pioneered the science of information, helped reinvent the modern corporation and changed the way the world actually works.

As for me … I’ve been with IBM since the 2006 acquisition of FileNet and am proud to be associated with such an innovative and remarkable company.

Content at Rest or Content in Motion? Which is Better?

I really wish I’d thought of this concept but I didn’t.  It’s such a simple idea when you think about it … that there are two fundamental types of content … enterprise content at rest and enterprise content in motion.

Content at Rest = Cost / Risk

Enterprise content at rest is sitting around just taking up space.  At rest implies not being accessed … not being used … not doing anything of value.  (Hhhmm … this sounds a lot like my Uncle Leo around the holidays.  I have this mental image of him asleep on my couch one Thanksgiving surrounded by several beer cans.  Sorry for sharing.)  Anyway … when at rest, this content usually includes duplicates and near-duplicates making the problem worse.  Content at rest drives significant and unnecessary costs in the form of storage, power, system administration and more. Worse yet, all this unnecessary content is ruining our search experiences.  We can’t find anything because of all this useless content just hanging around gumming up our search results.  Boy this sounds dumb.  Maybe we should start disposing of some of this stuff?
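A first pass at the duplicate problem can be as simple as hashing content.  This sketch catches only byte-identical copies (near-duplicates need fuzzier techniques such as shingling), and the repository contents are invented:

```python
import hashlib

def find_exact_duplicates(documents):
    """Return ids of documents whose content repeats an earlier one."""
    seen = {}        # content digest -> first document id seen with it
    duplicates = []
    for doc_id, content in documents.items():
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        if digest in seen:
            duplicates.append(doc_id)
        else:
            seen[digest] = doc_id
    return duplicates

repo = {
    "contract_v1.doc": "Terms and conditions ...",
    "contract_copy.doc": "Terms and conditions ...",  # identical repeat
    "invoice_42.pdf": "Invoice total: $1,200",
}
```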

Content in Motion = Value / Reward

On the other hand, enterprise content in motion is highly valuable and rewarding.  Content that is part of a business process or case management enabling better decisions and outcomes … or content that is community and social oriented driving better collaborative experiences and outcomes … or content that is being analyzed to unlock business insight across large amounts of unstructured data.  Sounds awesome, doesn’t it?  Let’s put all that content to work for us! (Reminds me of my Aunt Marge … who never stops cleaning, cooking, running errands and taking care of the family’s critical stuff.  How Leo and Marge have stayed married all these years is beyond me).

What To Do

If we’re agreed that it’s far more valuable to activate content, then how do we go about it? … and more importantly, how do we pay for it?

Today, over 80% of most IT budgets are already allocated to managing existing “stuff” … programs, systems, storage including all that costly content at rest.  With information expected to grow 44 times by 2020, this is a failure scenario.  IT budgets are flat or declining in most organizations so at current course and speed we’ll increasingly be spending 83%, 88%, 95% and eventually all of our IT budget on managing existing “stuff”.  This leaves very little or no money to invest in new ECM initiatives that drive value … like those that activate content and put content in motion.

And those who say … “but storage is always getting cheaper, so no big deal” should probably stop reading here because you won’t like what is coming next.  Storage may indeed be getting cheaper but the people, power, maintenance and physical space it requires are not.  It was a dumb argument yesterday, a ridiculous one today and an untenable one going forward.  Most IT budgets already spend 17% on storage (yikes), which ought to be plenty.

Action Plan to Activate Content and Drive Value

Let’s just stop the madness and put much more focus, energy and budget on delivering value through content in motion!  Here are some basic steps you can take right now:

1. Think and act differently … it’s really about the communities, the processes and the insight related to your content.  Use and value come from activity, not stagnation.

2. Defensibly dispose of everything you can, including retiring old content-centric apps, abandoned SharePoint sites and unused file shares as soon as you can … except what you are obligated to keep for business, regulatory or legal purposes.  Big hint: This will free up loads of resources and budget that can be reallocated to new projects, like activating content.

3. Work with your line-of-business execs in three areas to activate content:

Case Management:  Automate and improve those workflows and processes that are case centric where people, process and content are essential to the outcome.  The more ad-hoc and exception oriented processes drive maximum content value … think claims processing, dispute resolution, customer inquiry, investigation, onboarding and more.

Responsible Social Content:  Enable a true social content experience for knowledge workers where projects, activities, instant collaboration, tasks and ECM services are the norm.  Think Facebook + ECM for the enterprise … combine ECM, social software and a more responsible approach to content collaboration.

Content Analysis:  Leverage and exploit your content by understanding the trends, patterns, anomalies and deviations of your business that are currently trapped in your content. Think Business Intelligence for content and detect fraud, predict outcomes, find new opportunities, hear the voice-of-the-customer and more.
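Step 2’s rule, dispose of everything except what you are obligated to keep, can be sketched as a simple filter; the hold reasons below are placeholders, not a real records schedule:

```python
# Illustrative disposition screen.  Real programs apply a records
# retention schedule and legal-hold checks; these reasons are invented.
HOLD_REASONS = {"legal_hold", "regulatory_retention", "active_business_use"}

def disposal_candidates(inventory):
    """Return the items carrying no hold reason, i.e. safe to dispose."""
    return [name for name, reasons in inventory
            if not HOLD_REASONS & set(reasons)]

inventory = [
    ("old_sharepoint_site", []),
    ("litigation_emails", ["legal_hold"]),
    ("abandoned_file_share", []),
    ("tax_records_2019", ["regulatory_retention"]),
]
```

Everything the filter returns is budget waiting to be reclaimed; everything it holds back is your defensibility.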

I think it’s obvious by now which is better … or in other words, we should all be more like my Aunt Marge and not my Uncle Leo (he snores when he naps, too).

Toby Bell (Gartner ECM analyst) made the “content at rest” and “content in motion” remarks which got me thinking along these lines.  I’ve taken Toby’s idea and added my own perspective.  I also discussed this concept at this week’s Managing Electronic Records Conference in Chicago and received positive feedback from the audience and a few individuals afterwards.

As always … leave me your thoughts and ideas here. I’ll be discussing this topic and other ECM topics at the upcoming Boston and Toronto UserNet events. Hope to see you there.

IBM at 100: SAGE, The First National Air Defense Network

This week was a reminder of how technology can aid in our nation’s defense as we struck a major blow against terrorism.  Most people don’t realize IBM contributed to our nation’s defense in the many ways it has.  Here is just one example from 1949.

When the Soviet Union detonated their first atomic bomb on August 29, 1949, the United States government concluded that it needed a real-time, state-of-the-art air defense system.  It turned to Massachusetts Institute of Technology (MIT), which in turn recruited companies and other organizations to design what would be an online system covering all of North America using many technologies, a number of which did not exist yet.  Could it be done?  It had to be done.  Such a system had to observe, evaluate and communicate incoming threats much the way a modern air traffic control system monitors flights of aircraft.

This marked the beginning of SAGE (Semi-Automatic Ground Environment), the national air defense system implemented by the United States to warn of and intercept airborne attacks during the Cold War.  The heart of this digital system—the AN/FSQ-7 computer—was developed, built and maintained by IBM.  SAGE was the largest computer project in the world during the 1950s and took IBM squarely into the new world of computing.  Between 1952 and 1955, it generated 80 percent of IBM’s revenues from computers, and by 1958, more than 7000 IBMers were involved in the project.  SAGE spun off a large number of technological innovations that IBM incorporated into other computer products.

IBM’s John McPherson led the early conversations with MIT, and senior management quickly realized that this could be one of the largest data processing opportunities since winning the Social Security bid in the mid-1930s.  Thomas Watson, Jr., then lobbying his father and other senior executives to move into the computer market quickly, recalled in his memoirs that he wanted to “pull out all the stops” to be a central player in the project.  “I worked harder to win that contract than I worked for any other sale in my life.”  So did a lot of other IBMers: engineers designing components, then the computer; sales staff pricing the equipment and negotiating contracts; senior management persuading MIT that IBM was the company to work with; other employees collaborating with scores of companies, academics and military personnel to get the project up and running; and yet others who installed, ran and maintained the IBM systems for SAGE for a quarter century.

The online features of the system demonstrated that a new world of computing was possible—and that, in the 1950s, IBM knew the most about this kind of data processing.  As the ability to develop reliable online systems became a reality, other government agencies and private companies began talking to IBM about possible online systems for them.  Some of those projects transpired in parallel, such as the development of the Semi-Automated Business Research Environment (Sabre), American Airlines’ online reservation system, also built using IBM staff located in Poughkeepsie, New York.

In 1952, MIT selected IBM to build the computer to be the heart of SAGE.  MIT’s project leader, Jay W. Forrester, reported later that the company was chosen because “in the IBM organization we observed a much higher degree of purposefulness, integration and ‘esprit de corps’ than in other firms,” and because of “evidence of much closer ties between research, factory and field maintenance at IBM.”  The technical skills to do the job were also there, thanks to prior experience building advanced electronics for the military.

IBM quickly ramped up, assigning about 300 full-time IBMers to the project by the end of 1953. Work was centered in IBM’s Poughkeepsie and Kingston, NY facilities and in Cambridge, Massachusetts, home of MIT.  New memory systems were needed; MITRE and the Systems Development Corporation (part of RAND Corporation) wrote software, and other vendors supplied components.  In June 1956, IBM delivered the prototype of the computer to be used in SAGE.  The press release called it an “electronic brain.”  It could automatically calculate the most effective use of missiles and aircraft to fend off attack, while providing the military commander with a view of an air battle. Although this seems routine in today’s world, it was an enormous leap forward in computing.  When fully deployed in 1963, SAGE included 23 centers, each with its own AN/FSQ-7 system, which really consisted of two machines (one for backup), both operating in coordination.  Ultimately, 54 systems were installed, all collaborating with each other. The SAGE system remained in service until January 1984, when it was replaced with a next-generation air defense network.

Its innovative technological contributions to IBM and the IT industry as a whole were significant.  These included magnetic-core memories, which worked faster and held more data than earlier technologies; a real-time operating system (a first); highly disciplined programming methods; overlapping computing and I/O operations; real-time transmission of data over telephone lines; use of CRT terminals and light pens (a first); redundancy and backup methods and components; and the highest reliability of computer systems (uptime) of the day.  It was the first geographically distributed, online, real-time application of digital computers in the world.  Because many of the technological innovations spun off from this project were ported over to new IBM computers in the second half of the 1950s by the same engineers who had worked on SAGE, the company was quickly able to build on lessons learned in how to design, manufacture and maintain complex systems.

Fascinating to be sure … the full article can be accessed at http://www.ibm.com/ibm100/us/en/icons/sage/

IBM at 100: The 1401 Mainframe

In my continuing series of IBM at 100, I turn to our data processing heritage with the IBM 1401 Data Processing System (which was long before my time).

While the IBM 1401 Data Processing System wasn’t a great leap in power or speed, that was never the point. “It was a utilitarian device, but one that users had an irrational affection for,” wrote Paul E. Ceruzzi in his book, A History of Modern Computing.

There were several keys to the popularity of the 1401 system. It was one of the first computers to run completely on transistors—not vacuum tubes—and that made it smaller and more durable. It rented for US$2500 per month, and was touted as the first affordable general-purpose computer. It was also the easiest machine to program at the time. The system’s software, wrote Dag Spicer, senior curator at the Computer History Museum, “was a big improvement in usability.”

This more accessible computer unleashed pent-up demand for data processing. IBM was shocked to receive 5200 orders for the 1401 computer in just the first five weeks after introducing it—more than was predicted for the entire life of the machine. Soon, business functions at companies that had been immune to automation were taken over by computers. By the mid-1960s, more than 10,000 1401 systems were installed, making it by far the best-selling computer to date.

More importantly, it marked a new generation of computing architecture, causing business executives and government officials to think differently about computing. A computer didn’t have to be a monolithic machine for the elite. It could fit comfortably in a medium-size company or lab. In the world’s top corporations, different departments could have their own computers.

A computer could even wind up operating on an army truck in the middle of a forest. “There was not a very good grasp or visualization of the potential impact of computers—certainly as we know them today—until the 1401 came along,” said Chuck Branscomb, who led the 1401 design team. The 1401 system made enterprises of all sizes believe a computer was useful, and even essential.

By the late 1950s, computers had experienced tremendous changes. Clients drove a desire for speed. Vacuum-tube electronics replaced the electro-mechanical mechanisms of the tabulating machines that dominated information processing in the first half of the century. First came the experimental ENIAC, then Remington Rand’s Univac and the IBM 701, all built on electronics. Magnetic tape and then the first disk drives changed ideas about the accessibility of information. Grace Hopper’s compiler and John Backus’s FORTRAN programming language gave computer experts new ways to instruct machines to do ever more clever and complex tasks. Systems that arose out of those coalescing developments were a monumental leap in computing capabilities.

Still, the machines touched few lives directly. Installed and working computers numbered barely more than 1000. The world, in fact, was ready for a more accessible computer.

The first glimpse of that next generation of computing turned up in an unexpected place: France.  “In the mid-1950s, IBM got a wake-up call,” said Branscomb, who ran one of IBM’s lines of accounting machines at the time.  French computer upstart Machines Bull came out with its Gamma computers, small and fast compared to goliaths like the IBM 700 series.  “It was a competitive threat,” Branscomb recalled.

Bull made IBM and others realize that entities with smaller budgets wanted computers. IBM scrambled together resources to try to make a competing machine. “It was 1957 and IBM had no new machine in development,” Branscomb said. “It was a real problem.”

During June and July 1957, IBM engineers and planners gathered in Germany to propose several accounting machine designs.  The anticipated product of this seven-week conference was known thereafter as the Worldwide Accounting Machine (WWAM), although no particular design was decided upon.

In September 1957, Branscomb was assigned to run the WWAM project.  In March 1958, after Thomas Watson, Jr. expressed dissatisfaction with the WWAM project in Europe, the Endicott proposal for a stored-program WWAM was given formal approval as the company’s approach to meeting the need for an electronic accounting machine.  The newly assigned project culminated in the announcement of the 1401 Data Processing System (although, for a time, it carried the acronym SPACE).

The IBM 1401 Data Processing System—comprising a variety of card and tape models with a range of core memory sizes, and configured for stand-alone use and peripheral service for larger computers—was announced in October 1959.

Branscomb’s group set a target rental cost of US$2500 per month, well below a 700 series machine, and hit it. They also decided the computer had to be simple to operate. “We knew it was time for a dramatic change, a discontinuity,” Branscomb added. And indeed it was. The 1401 system extended computing to a new level of organization and user, driving information technology deeper into everyday life.

The full article can be accessed at http://www.ibm.com/ibm100/us/en/icons/mainframe/

Watson and The Future of ECM

In the past, I have whipped out my ECM powered crystal ball to pontificate about the future of Enterprise Content Management.  These are always fun to write and share (see Top 10 ECM Pet Peeve Predictions for 2011  and Crystal Ball Gazing … Enterprise Content Management 2020).  This one is a little different though …  on the eve of the AIIM International Conference and Expo at info360, I find myself wondering … what are we going to do with all this new social content … all of these content based conversations in all of their various forms?

We’ve seen the rise of the Systems of Engagement concept and a number of new systems that enable social business.  We’re adopting new ways to work together leveraging technologies like collaborative content, wikis, communities, RSS and much more.  All of this new content being generated is text based and expressed in natural language.  I suggest you read AIIM’s report Systems of Engagement and the Future of Enterprise IT: A Sea Change in Enterprise for a perspective on the management aspects of the future of ECM.  It lays out how organizations must think about information management, control, and governance in order to deal with social technologies.

Social business is not just inside the firewall though.  Blogs, wikis and social network conversations are giving consumers and businesses a voice and power they’ve never had before … again based in text and expressed in natural language.  This is a big deal.  770 million people worldwide visited a social networking site last year (according to a comScore report titled Social Networking Phenomenon) … and amazingly, over 500 billion impressions annually are being made about products and services (according to a new book, Empowered, written by Josh Bernoff and Ted Schadler).

But what is buried in these text based natural language conversations?  There is an amazing amount of information trapped inside.  With all these conversations happening between colleagues, customers and partners … what can we learn from our customers about product quality, customer experience, price, value, service and more?  What can we learn from our internal conversations as well?  What is locked in these threads and related documents about strategy, projects, issues, risks and business outcomes?

We have to find out!  We have to put this information to work for us.

But guess what?  The old tools don’t work.  Data analysis is a powerful thing but don’t expect today’s business intelligence tools to understand language and threaded conversations.  When you analyze data … a 5 is always a 5.  You don’t have to understand what a 5 is or figure out what it means.  You just have to calculate it against other numeric indicators and metrics.

Content … and all of the related conversations aren’t numeric.  You must start by understanding what it all means, which is why understanding natural language is key.  Historically, computers have failed at this.  New tools and techniques are needed because content is a whole different challenge.  A very big challenge.  Think about it … a “5” represents a value, the same value, every single time.  There is no ambiguity.  In natural language, the word “premiere” could be a noun, verb or adjective.  It could be a title of a person, an action or the first night of a theatre play.  Natural language is full of ambiguity … it is nuanced and filled with contextual references.  Subtle meaning, irony, riddles, acronyms, idioms, abbreviations and other language complexities all present unique computing challenges not found with structured data.  This is precisely why IBM chose Jeopardy! as a way to showcase the Watson breakthrough.
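To make the ambiguity point concrete, here is a deliberately naive sketch (my own illustration, not how Watson or any real NLP system works) that tries to guess the part of speech of the word “premiere” from its neighboring words. Even this toy shows why text, unlike a “5”, cannot be interpreted without context:

```python
# Toy illustration of lexical ambiguity: the word "premiere" can be a
# verb, a noun, or an adjective-like modifier depending on context.
# Real systems use statistical models trained on large corpora; this
# hand-written rule set exists only to show that context is required.

def guess_pos(sentence, target="premiere"):
    """Naively guess the part of speech of `target` from its neighbors."""
    words = sentence.lower().replace(".", "").split()
    i = words.index(target)
    prev = words[i - 1] if i > 0 else ""
    nxt = words[i + 1] if i + 1 < len(words) else ""
    # "will premiere", "to premiere" -> verb
    if prev in {"will", "to"}:
        return "verb"
    # "the premiere <noun>" -> adjective-like use (e.g., "premiere league")
    if prev in {"the", "a"} and nxt not in {"", "of", "was", "is"}:
        return "adjective"
    return "noun"

print(guess_pos("The film will premiere on Friday"))      # verb
print(guess_pos("We attended the premiere of the play"))  # noun
print(guess_pos("She joined the premiere league team"))   # adjective
```

Of course, hand-written rules like these collapse immediately on real language (irony, idioms, abbreviations), which is exactly the gap that statistical NLP of the kind behind Watson is built to close.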

IBM Watson (DeepQA) is the world’s most advanced question answering machine that uncovers answers by understanding the meaning buried in the context of a natural language question.  By combining advanced Natural Language Processing (NLP) and DeepQA automatic question answering technology, Watson represents the future of content and data management, analytics, and systems design.  IBM Watson leverages core content analysis, along with a number of other advanced technologies, to arrive at a single, precise answer within a very short period of time.  The business applications for this technology are limitless starting with clinical healthcare, customer care, government intelligence and beyond.

You can read some of my other blog postings on Watson (see “What is Content Analytics?, Alex”, 10 Things You Need to Know About the Technology Behind Watson and Goodbye Search … It’s About Finding Answers … Enter Watson vs. Jeopardy!) … or better yet … if you want to know how Watson actually works, hear it live at my AIIM / info360 main stage session IBM Watson and the Impact on ECM this coming Wednesday 3/23 at 9:30 am.

BLOG UPDATE:  Here is a link to the slides used at the AIIM / info360 keynote.

Back to my crystal ball … my prediction is that natural language based computing and related analysis is the next big wave of computing and will shape the future of ECM.  Watson is an enabling breakthrough and the start of something big.  With all this new information, we’ll want to understand what is being said, and why, in all of these conversations.  Most of all, we’ll want to leverage this newfound insight for business advantage.  One compelling and obvious example is to answer age-old customer questions like “Are our customers happy with us?” “How happy?” “Are they so happy, we should try to sell them something else?” … or … “Are our customers unhappy?” “Are they so unhappy, we should offer them something to prevent churn?”  Understanding the customer trends and emerging opportunities across a large set of text based conversations (letters, calls, emails, web postings and more) is now possible.
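To give a feel for the shape of that kind of analysis, here is a tiny, hypothetical sketch of lexicon-based sentiment counting over customer conversations. The word lists and messages are invented for illustration, and this is nowhere near Watson’s NLP, but it shows the basic idea of turning a pile of text into a “happiness” signal:

```python
# Hypothetical lexicon-based sentiment tally over customer messages.
# Real systems handle negation, sarcasm, context, etc.; this only
# counts matches against small hand-picked word lists.

POSITIVE = {"happy", "great", "love", "excellent", "satisfied"}
NEGATIVE = {"unhappy", "terrible", "hate", "poor", "cancel"}

def sentiment(text):
    """Return (positive_hits, negative_hits) for one message."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE), len(words & NEGATIVE)

# Invented sample conversations
conversations = [
    "I love the new service, excellent support!",
    "Terrible experience, I want to cancel my account.",
    "Mostly satisfied, but shipping was poor.",
]

pos = sum(sentiment(c)[0] for c in conversations)
neg = sum(sentiment(c)[1] for c in conversations)
print(f"positive mentions: {pos}, negative mentions: {neg}")
```

Scaling this idea from keyword counting to genuine language understanding is precisely the leap that content analytics and Watson-style NLP represent.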

Who wouldn’t want to understand their customers, partners, constituents and employees better?  Beyond this, Watson will be applied to industries like healthcare to help doctors more effectively diagnose diseases, and this is just the beginning.  Organizations everywhere will want to unlock the insights trapped in their enterprise content and leverage all of these conversations … in ways we haven’t even thought of yet … but I’ll save that for the next time I use my ECM crystal ball.

As always … leave me your thoughts and ideas here, and I hope to see you Wednesday at The AIIM International Conference and Expo at info360 http://www.aiimexpo.com/.