May 8th, 2017
The Race for Intelligence: How AI is Eating Hardware – Towards an AI-defined hardware world
Category: GPU
By: Sri Ambati
With the AI arms race reaching a fever pitch, every data-driven company is (or at least should be) evaluating its approach to AI as a means to make its own datasets as powerful as they can possibly be. In fact, any business that’s not currently thinking about how AI can transform its operations risks falling behind its competitors and missing out on new business opportunities entirely. AI is becoming a requirement. It’s no longer a “nice to have.”
It’s no secret that AI is hot right now. But the sudden surge in its popularity, both in business and the greater tech zeitgeist, is no coincidence. Until recently, the hardware required to process immense, complex datasets just didn’t exist. Hardware, until now, has always dictated what software was capable of — in other words, hardware influenced software design.
The emergence of graphics processing units (GPUs) has fundamentally changed how people think about data. AI is data hungry — the more data you feed your AI, the better it can perform. But this appetite brings steep computational requirements, namely substantial memory (storage) and processing power. On highly parallel workloads, today’s GPUs can be as much as 100x faster than CPUs, making analysis of massive datasets possible. Now that GPUs can process data at this scale, the potential for AI applications is virtually limitless. Previously, the limits of hardware dictated software design. Today, the opposite is true: AI is influencing how hardware is designed and built.
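The GPU advantage comes from data parallelism: a large computation splits into many independent pieces that run at the same time. A minimal sketch of that decomposition, with a hypothetical `parallel_sum` helper (Python threads here illustrate the structure of the split, not real GPU speedup, which comes from thousands of hardware lanes working on chunks simultaneously):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    # Split the data into independent chunks; each partial sum could run
    # on a separate core or GPU lane, since no chunk depends on another.
    chunk = max(1, len(data) // workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Reduce the independent partial results into one answer.
        return sum(pool.map(sum, parts))

values = list(range(1_000_000))
total = parallel_sum(values)  # same result as sum(values), computed in chunks
```

The same split-then-merge pattern underlies GPU reductions and the data-parallel training loops that modern AI frameworks run on them.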
Here are the three macro-level trends enabling AI to eat hardware:
1.) AI is Eating Software
The old paradigm, in which business intelligence software relied on rule-based engines, no longer applies. The model has evolved: artificial intelligence software now relies on statistical data training, or machine learning. As statistical training matures, it is feasting on rules-based engines. This transformation, however, requires an immense amount of data to train the cognitive function, and AI is reshaping hardware design to make that training feasible. In short, AI is not only influencing hardware design, as evidenced by the rise of GPUs, but also eating the traditional rules-based software that has long been the hallmark of business intelligence.
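The contrast can be made concrete with a toy sketch (made-up transaction data, hypothetical fraud-flagging helpers): in the rule-based engine a human writes the decision threshold by hand, while in the statistically trained version the threshold is learned from labeled examples.

```python
# Rule-based engine: a human encodes the decision logic directly.
def rule_based_alert(amount):
    return amount > 10_000  # fixed, hand-written rule

# Statistical data training: the "rule" (here, a single threshold) is
# fit to labeled examples instead of being hand-written.
def train_threshold(amounts, labels):
    # Pick the cutoff that classifies the training data best.
    def accuracy(t):
        return sum((a > t) == bool(y) for a, y in zip(amounts, labels))
    return max(sorted(set(amounts)), key=accuracy)

amounts = [100, 250, 9_000, 12_000, 15_000, 50_000]
labels  = [0,   0,   0,     1,      1,      1]  # 1 = flagged by analysts
learned_t = train_threshold(amounts, labels)

def learned_alert(amount):
    return amount > learned_t
```

With this data the learned cutoff lands at 9,000, so a 9,500 transaction is flagged by the trained model but slips past the hand-written rule — the learned behavior tracks the data rather than the programmer’s guess.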
What does this mean in practical terms? It means businesses can now use AI to address specific problems and, in a sense, “manufacture” intelligence. For example, creating a human doctor involves roughly 30 years of training, from birth through medical school and residency to a first job. With AI, we can now create a “doctor” without those 30 years: on a single chip encoded with AI, a self-learning “doctor” can be trained in 11 days on petabytes of data. Not only that, you can install this “doctor” in a million places by replicating that chip, so long as there’s a device and connectivity.
This may be an extreme example, but it illustrates just how quickly AI is advancing our ability to extract understanding from data.
2.) The Edge is Becoming More Intelligent
Another major trend supporting AI’s influence over hardware is the democratization of intelligence. In the 1980s, mainframes were the only devices powerful enough to handle large datasets. At the time, nobody could have possibly imagined that an invention like the personal computer would come along and give the computing power of a mainframe to the masses.
Fast forward 30 years, and history is repeating itself. The Internet of Things is making it possible for intelligence to be distributed even further from centralized mainframes, to literally any connected device. Today, tiny sensors have computing power comparable to that of a PC, meaning many more types of devices can process data. Soon, IoT devices of all sizes will be far more powerful than today’s smartphones.
This means intelligence is headed to the edge, away from big, centralized systems like the mainframe. The cloud connects edge and center, so with smart devices on the edge, information can travel rapidly between any number of devices.
3.) Everything is Dataware
AI constantly seeks data, and business intelligence is actionable only when the AI has a steady diet of it. Thanks to the hardware movement and the shift of intelligence to the edge, there are more points of data collection than ever. But the hardware movement is not just about collecting and storing data; it is about continuously learning from data and monetizing those insights. In this future, power sits at the edge, and the power of each individual device will keep increasing. As those devices continue to process data, the monetization of that data will make the edge ever more powerful.
AI presents us with a distributed view of the world. Because data is analyzed at the edge, where models learn continuously, knowledge is not only accumulating at the edge but also flowing back to the center. Everything is now dataware.
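One way to picture knowledge flowing from edge to center: each device condenses its local data into a small summary, and the center merges those summaries into a global view without ever touching the raw records. A toy sketch with hypothetical helper names, using a simple running mean as the “knowledge” being shared:

```python
# Each edge device summarizes its local readings (count, mean) instead
# of shipping raw records to the center.
def local_summary(values):
    n = len(values)
    return n, sum(values) / n

# The center merges edge summaries into a global count and mean,
# never seeing the underlying data points.
def merge(summaries):
    total = sum(n for n, _ in summaries)
    mean = sum(n * m for n, m in summaries) / total
    return total, mean

edge_data = [[1.0, 2.0, 3.0], [10.0, 20.0], [5.0]]
global_n, global_mean = merge([local_summary(v) for v in edge_data])
```

The merged mean equals the mean over all raw readings, but only six numbers per edge-count pair crossed the network — the same shape of idea that federated approaches scale up to model updates.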
As the demands for data processing power increase across businesses, AI is transforming how enterprises shape their entire data strategy. Software is changing as a result. Gone are the days when rules-based computing was sufficient to analyze the magnitude of available data; statistical data training is required to handle the load. But CPUs can handle only a fraction of the demand, so the demands of AI are influencing the way hardware is designed. As hardware becomes ubiquitous via IoT, intelligence and data are moving to the edge, and the balance of power is shifting to the masses.