Disrupt … or Be Disrupted

Established companies are being disrupted faster than ever before, according to the Academy for Corporate Entrepreneurship. The academy also projects that 75% of the S&P 500 will be replaced by 2027.

That is a mind-blowing statistic.

Small businesses are not immune to this potential disruption either. When you consider that there are approximately 30 million small businesses in the US today, employing almost half of the US workforce, this warrants a closer look.

What the heck is going on?

In case you hadn’t heard, we are entering a new era of technology innovation. Some claim we are on the cusp of another industrial revolution. History shows that this has happened only three times before. The first industrial revolution (beginning ~1784) was typified by mechanical production, steam power, the railroads and the telegraph. The second industrial revolution came in the next century (beginning ~1870) with electrical power, mass production, radio, tabulating systems and the advent of the assembly line. The third industrial revolution came in the century after that (beginning ~1960) with automated production, programmable computers, electronics, video recorders and eventually the Internet.

With every industrial revolution:

  • New industries were created and rapid growth ensued
  • Fortunes were made and fortunes were lost
  • New jobs were created and outmoded jobs eliminated

This fourth industrial revolution is being called the “Age of Intelligence”. Data is exploding and flows from every device in unprecedented volumes, variety, and complexity. Traditional analytics and other decision-support approaches are unable to fully exploit its value … driving a need for new innovation in many areas. New business models, a growing digital economy, an aging workforce and global skills shortages are all driving this same need for smarter systems in all facets of life.

We have never seen so many disruptive technologies arrive at the same time. It’s an overused cliché, but this is a perfect storm of technology-based innovation. Artificial Intelligence (AI, or Cognitive Computing) is leading the way. AI is a game changer. When combined with cloud, mobile, social, the Internet of Things (sensors), nanotechnology, robotics, drones, 3D printing, new business models and more … it’s a lot to get your head around. It’s nearly impossible to fully understand the impact of this revolution right now … but let’s look at a few areas …

If you’ve worked as a video rental clerk, travel agent, assembly-line worker, 411 operator, ticketing agent, stock-broker, department store or telephone book advertising salesperson you already know what I mean.

According to a recent McKinsey report … we can expect artificial intelligence technologies to play an increasingly large role in everyday life. Their potential effect on the workplace has, unsurprisingly, become a major focus of research and public concern. The report also explores which job roles will or won’t be replaced by machines. It can be accessed here.

As for me … I believe the adoption of AI (and other disruptive technologies) will indeed impact our lives in a big way. It will take some time … and will create new roles/jobs and eliminate the need for others.

I am not old enough to remember when “Computer” was a job title and not a machine. I do know that the role consisted of people who manually computed and/or counted things, and that their primary tool was the slide rule. Before that, it was the abacus.

Today, those same people who became “Computers” in the 1950s and ’60s are more likely to be Accountants, Financial Planners, Controllers or even Programmers … which by comparison are certainly higher-value and higher-paid roles.

The Information Age (or 3rd revolution) birthed an entire industry (Information Technology).  It impacted corporate structures/strategies/governance and brought us household names like Amazon, Apple, eBay, Facebook, Google, Microsoft, Oracle, Yahoo and too many small businesses to count. It seduced us into wondering what is REALLY possible. At present, Evans Data estimates there are 18.2 million software developers worldwide, a number that is due to rise to 26.4 million by 2019 (a 45% increase). On the whole, that’s a huge amount of positive change and human advancement.

There will always be doom-and-gloomers who worry that robots will take their jobs. Sure, some statistics show that a large percentage of all employment roles will be impacted by machines within the next two decades. This impact will be good for some and bad for others.

It seems to me that history will repeat itself, and the same outcomes will occur again, in both large and small businesses:

  • Some roles will be eliminated.
  • Some roles will evolve forward and upward.
  • Many new roles will be created.

My point is … now is the time to take action and get ahead of this. The McKinsey report does a good job of detailing the roles and workloads that are most likely to be impacted. I think the more important question is: what are you going to do to make sure your job or company benefits from this? You don’t want to be the person left wondering “what happened?”

IBM brought this into the mainstream when its cognitive system Watson beat the best human competitors at Jeopardy! in 2011. That was the starting gun … and the race is on.

My experience with AI/Cognitive Computing (so far) has taught me the following:

  • Information is exploding at such a rate that it is impossible to read, assimilate and apply except in small volumes. Technologies to assist us with decision-making are now mandatory.
  • There is so much new information being generated that it is also impossible for doctors, lawyers or any information-based professional to keep up with their professional learning obligations. Do you want to be treated by a doctor who is up-to-date on the most recent medical information, or by one who hasn’t kept up with the literature?
  • Too much information is creating numerous bottlenecks to decision-making and process execution. Many information-based processes are actually getting slower, and the resulting delays can cause more errors.
  • Any situation where a human has to read, research, explore, find and/or learn new information before making a decision is ripe to benefit from artificial intelligence or other new decision-support tools.
  • This is particularly the case when unstructured text or documents are involved. This form of data is typically “dark” and not easily locatable, and it takes longer to learn from text-based information.
  • There can often be so much information (hundreds of pages, many documents) that the time required to read and assimilate it further bottlenecks decisions from being made … exacerbating the problem.
  • If video or audio is involved, one can spend countless hours looking or listening for snippets of relevant content. The time-invested-to-reward ratio is so poor that most people simply skip video and audio altogether when searching for information.
  • Net-net … any situation where human expertise or knowledge is being applied (regardless of information type) could probably benefit from a system that makes cognitive (AI) assistance available. These systems observe, reason, apply, recommend and learn from outcomes … eventually optimizing those outcomes as guided by humans. They don’t get tired, go on vacation, have a bad day or introduce personal bias and emotion … the very things that subvert optimal decisions and outcomes.

The era of Cognitive Computing (or Artificial Intelligence, if you prefer) is here NOW. Like disruptions of the past, there will be winners and losers. Robots and artificial-intelligence-based tools will certainly transform the nature of work. I personally think for the better.

But what are you doing about it?

Heed this call to action – whether you are involved with a big business or a small business!

I will be delivering a keynote address and exploring this topic in much more detail at the upcoming Loudoun Small Business Conference on May 15, 2017. This event is the brainchild of the folks who run The George Mason Enterprise Center in Loudoun County, Virginia. Event details can be found here. If you are local, I hope to see you there.

As always, leave me your comments below and check out the following resources and organizations who will be at the event:


Transparency and Trust in the Cognitive Era

There are a lot of points of view emerging on artificial intelligence. Elon Musk, Mark Cuban and others have been vocal about the responsible use and governance of artificial intelligence. IBM’s version could really be called augmented intelligence, because it takes the more responsible approach of augmenting human decision-making rather than simply replacing it … as some would have it.

We all have to prepare for a future in which Artificial Intelligence (AI) plays a growing role. The White House released a report on future directions and considerations for AI called Preparing for the Future of Artificial Intelligence. This report surveys the current state of AI, its existing and potential applications, and the questions that progress in AI raises for society and public policy. The report also makes recommendations for specific further actions. A companion National Artificial Intelligence Research and Development Strategic Plan is also being released, laying out a strategic plan for Federally-funded research and development in AI.

We are in the early days of a promising new technology, and of the new era to which it is giving birth.  This technology is as radically different from the programmable systems that have been produced by the IT industry for half a century as those systems were from the tabulators that preceded them.

Commonly referred to as Artificial Intelligence, this new generation of technology and the cognitive systems it helps power will soon touch every facet of work and life – with the potential to radically transform them for the better.  This is because these systems can ingest and understand all forms of data, which is being produced at an unprecedented rate.

Cognitive systems like IBM’s Watson can reason over this data, forming hypotheses and judgments.  Most importantly, these systems are not simply programmed, they learn – from their own experiences, their interactions with humans and the outcomes of their judgments.

As with every prior world-changing technology, this technology carries major implications.  Many of the questions it raises are unanswerable today and will require time, research and open discussion to answer. It is both pragmatic and wise to establish principles to guide the evolution and adoption of AI.  IBM is establishing the following principles for the Cognitive Era:

Purpose: The purpose of AI and cognitive systems developed and applied by the IBM company is to augment human intelligence. The technology, products, services and policies will be designed to enhance and extend human capability, expertise and potential.  IBM’s position is based not only on principle but also on science. Cognitive systems will not realistically attain consciousness or independent agency. Rather, they will increasingly be embedded in the processes, systems, products and services by which business and society function – all of which will and should remain within human control.

Transparency: For cognitive systems to fulfill their world-changing potential, it is vital that people have confidence in their recommendations, judgments and uses. Therefore, the IBM company will make clear:

* When and for what purposes AI is being applied in the cognitive solutions IBM develops and deploys.

* The major sources of data and expertise that inform the insights of cognitive solutions, as well as the methods used to train those systems and solutions.

* The principle that clients own their own business models and intellectual property and that they can use AI and cognitive systems to enhance the advantages they have built, often through years of experience.  IBM will work with its clients to protect their data and insights, and will encourage its clients, partners and industry colleagues to adopt similar practices.

Skills: The economic and societal benefits of this new era will not be realized if the human side of the equation is not supported. This is uniquely important with cognitive technology, which augments human intelligence and expertise and works collaboratively with humans.  Therefore, the IBM company will work to help students, workers and citizens acquire the skills and knowledge to engage safely, securely and effectively in a relationship with cognitive systems, and to perform the new kinds of work and jobs that will emerge in a cognitive economy.

IBM believes that its experience over more than a century, and its daily work with clients from every industry and sector around the world, have taught it that transparency and principles that engender trust are important for both business and society.  However, IBM also recognizes that there is much learning ahead for all of us. In that spirit, it is hoped that the publication of these tenets can spark an industry-wide – indeed, a society-wide – dialogue on the fundamental questions that must be answered in order to achieve the economic and societal potential of a cognitive future.

Industry organizations like the Cognitive Computing Consortium have been out front on these kinds of issues.

As always, leave me your thoughts and comments below.