by Ryan Harrington

On October 17, 1991, Walt Mossberg published the first installment of his weekly column for the Wall Street Journal, Personal Technology.  In his first words, he succinctly summarized popular sentiment at the time towards personal computers:

Personal computers are just too hard to use and it isn’t your fault.  The computer industry boasts that its products can help everyone become more productive. Maybe so. But many people can’t afford the time and money needed to get the most out of PCs.

Walt Mossberg interviews Steve Jobs at the D8 Conference (photo by Joi Ito).

In the two and a half decades since Mossberg shared these sentiments, many of the computer industry’s boasts have proven true. Personal computers have become significantly easier to use. People have become more productive. They spend both the time and money needed to get the most out of their computers.

As Mossberg now prepares to retire, the sentiments he shared so many years ago ring true once more, this time about artificial intelligence rather than personal computing:

Artificial intelligence is just too hard to use and it isn’t your fault.  The data science industry boasts that its products can help everyone become more productive. Maybe so. But many people can’t afford the time and money needed to get the most out of AI.

Let’s dissect each statement.

Artificial intelligence is just too hard to use and it isn’t your fault.

For most small companies, artificial intelligence feels more like a far-off goal than a current reality. The technology has large barriers to entry – some real and some perceived. The name itself implies something complicated. We’re used to hearing “artificial intelligence” in the context of sci-fi, not as an application we can readily use. There is confusion about what artificial intelligence even is, when in reality it is largely a collection of old techniques made new again by better data processing.

The data science industry boasts that its products can help everyone become more productive. Maybe so.

Like personal computers nearly thirty years ago, data science makes large claims about productivity (though generally for companies rather than individuals). Access to the insights hidden in data allows companies to make better decisions as they are needed. The algorithms and technologies that power artificial intelligence typically surface information buried in the data – information not easily seen by a human. In its current state, artificial intelligence augments the human decision-making process, giving us information we have never had access to before.

But many people can’t afford the time and money needed to get the most out of AI.

The perception that artificial intelligence is a complicated endeavor – one that requires a great deal of resources – widens the gap between large and small companies. Larger companies, led by Google, Facebook, and Microsoft, are making large bets on artificial intelligence. As they move towards an AI-first paradigm, they are hiring some of the world’s best talent in the field, leaving a vacuum of talent in their wake. For smaller companies, this makes investing in artificial intelligence expensive, and that expense is the largest roadblock to adopting artificial intelligence techniques. The talent gap, and with it the cost, must close. That will only happen as more students are drawn to data science as a career path.

Walt Mossberg’s sentiments from October 17, 1991, are proving to be as prescient for artificial intelligence as they were for personal computing. The democratization of artificial intelligence and the influx of talent into the field will ultimately lead to productivity gains for companies and better experiences for consumers. In the meantime, the data science community must concentrate on ways to improve access to artificial intelligence resources, making it possible for smaller companies to compete with larger ones. Only time will tell whether AI will become as pervasive as personal computing. We’re willing to bet that it will.
