AI becomes useful for the first time
It’s 1986. PCs and pocket calculators are commonplace, but CD-ROMs have just exploded onto the scene – a single disc holds more data than the average person’s PC hard drive, so moving large amounts of data around has never been easier. Hardware is advancing quickly, and interest in AI resurfaces.
The 1980s gave rise to a new AI paradigm, and business embraced it for the first time. Why? Because AI had finally reached a milestone: it was useful. Researchers had all but given up on a complete “electronic brain”, but hardware kept improving. AI resurfaced as a tactical tool for business – solving problems in small, specific domains by operating on rules created by industry experts.
The most famous of these applications was an expert system called XCON, a program created at Carnegie Mellon for DEC. By 1986, XCON was saving DEC 25 million dollars annually. The success was a smash – other companies built in-house expert systems, and inevitably a supporting industry sprang up around them. It may as well have been 2018.
AI still relied heavily on human input. Expert systems like XCON were impressive within their domains, but they were difficult to maintain, and updating them was a nightmare. Adopting AI meant hiring specialized personnel to do this work. Early adopters of expert systems had to procure, train, and allocate that staff themselves, all while learning on the fly. Data science, a field exploding in popularity today, was born out of this experiment.
Despite the success of expert systems in the corporate domain, funding for pure AI research remained scarce until the early ’80s. In 1982, Japan allocated $850 million to its ambitious Fifth Generation Computer project, aiming to create AI that could converse, translate languages, and reason – far beyond expert system capability. This investment spurred other nations to open the floodgates for a second time, so as not to be left behind in the AI revolution. But as in all races, the promises surrounding AI spiraled out of control once again.
Meanwhile, more powerful and adaptable desktop computers from Apple and IBM were arriving with ever-lower startup and maintenance costs. The specialized hardware that expert systems ran on was far more expensive, and justifying it became harder. When that hardware market collapsed in 1987, the flaws of XCON and other successful expert systems became more apparent. By 1993, most of the industry had gone bankrupt or been acquired, and the first commercial wave of AI had ended. This is known as the second AI winter.