NVIDIA’s GPU hardware, going back to late 2011, was one of the most important factors in making AI accessible to developers
While we hear a lot about how Artificial Intelligence will change the world as we know it, we don’t often hear about how video game technology contributed to the biggest breakthroughs in AI and deep learning. The term AI was coined back in 1956, but it wasn’t until the 2012 ImageNet challenge that a deep learning algorithm outperformed hand-coded software. Then, in 2015, deep learning projects from both Microsoft and Google went a step further and surpassed human-level accuracy at image recognition.
Two questions naturally follow:
What changed in 2012 to give us the processing power needed for deep learning and neural networks, especially since there were no quantum leaps in CPU technology at the time?
And what does any of this have to do with video games?
In 1999, NVIDIA debuted the GeForce 256, which it billed as the world’s first GPU: a “single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.” In simpler terms, GPUs are great at doing enormous amounts of low-level math in parallel, which is exactly what’s required to build the realistic virtual worlds we see in video games today. It just so happens that the same technology that puts together pixels and polygons is also ideal for training deep learning models, as the sketch below illustrates.
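As a rough, hypothetical illustration of that overlap (this is not NVIDIA code, and the sizes are made up), the NumPy sketch below shows that rotating a batch of 3D vertices for rendering and computing a dense neural-network layer both reduce to the same operation: one big matrix multiply.

```python
# Illustrative sketch: the same matrix multiply that transforms 3D vertices
# for rendering also drives a neural-network layer.
import numpy as np

# --- Graphics: rotate a batch of 3D vertices around the z-axis ---
theta = np.pi / 4
rotation = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
vertices = np.random.rand(10_000, 3)           # 10,000 points in a mesh
transformed = vertices @ rotation.T            # one big matrix multiply

# --- Deep learning: one dense layer applied to a batch of inputs ---
inputs = np.random.rand(10_000, 3)             # 10,000 training examples
weights = np.random.rand(3, 128)               # a layer with 128 units
activations = np.maximum(inputs @ weights, 0)  # matrix multiply + ReLU

print(transformed.shape, activations.shape)    # (10000, 3) (10000, 128)
```

A GPU runs thousands of these multiply-adds at once, which is why the same silicon happily serves both workloads.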
The Big Bang
Getting back to 2012 and the big AI breakthrough: in 2011, the Google Brain project had learned to recognize cats from YouTube videos, which was quite an achievement at the time. The only problem was that it required 2,000 CPUs. While this wasn’t really a problem for a giant like Google, it made deep learning pretty much inaccessible for everyone else. This is when the big eureka moment happened: Bryan Catanzaro from Nvidia Research and Andrew Ng’s team at Stanford decided to collaborate and use GPUs instead of CPUs for deep learning. The rest, of course, is history.
Close your eyes for a moment and imagine a giant data center with hundreds of servers powering and cooling over 2,000 CPUs. It took just 12 Nvidia GPUs to deliver the same deep learning performance. Now that’s what you call a quantum leap! That’s what led to the 2012 breakthrough, in which Alex Krizhevsky of the University of Toronto designed a neural network called AlexNet that outperformed hand-coded software. He did it by training the network on a million different images, which required trillions of mathematical operations, all done, of course, on Nvidia GPUs.
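To get a feel for that scale, here is a rough back-of-envelope; the operations-per-image figure is an assumption chosen for illustration, not a number from the article.

```python
# Back-of-envelope: why training on about a million images means trillions of operations.
# The per-image figure below is an illustrative assumption, not a measured value.
images = 1_000_000             # roughly the size of the training set described above
ops_per_image = 1_000_000_000  # assume ~1 billion multiply-adds per forward/backward pass
passes = 1                     # even a single pass over the data

total_ops = images * ops_per_image * passes
print(f"{total_ops:.1e} operations")  # 1.0e+15, i.e. well past the trillions
```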
GPUs and Deep Learning
Deep learning enables a computer to learn by itself from massive amounts of data. Unlike traditional software, which has to be coded by hand, all you need here are millions of examples to feed the neural network. This has led to unprecedented levels of accuracy in image and speech recognition, which has, in turn, led to a revolution in chatbots and virtual assistants like Siri, Google Assistant, and Alexa. Amazon’s Alexa, one of the most popular machine learning services, ran all of its text-to-speech (TTS) workloads on GPU instances until November 2020.
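As a toy illustration of learning from examples rather than hand-written rules (the data and the “hidden” rule below are invented for the example), this sketch trains a tiny logistic model to pick up a pattern purely from labeled samples:

```python
# Toy illustration: instead of hand-coding the rule "is y greater than x?",
# we let a tiny model learn it purely from labeled examples.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(1000, 2))   # random 2D points
y = (X[:, 1] > X[:, 0]).astype(float)    # the "hidden" rule; never coded into the model

w = np.zeros(2)   # parameters start at zero and are learned from the examples
b = 0.0
lr = 0.5

for _ in range(2000):                    # plain gradient descent on the logistic loss
    preds = 1 / (1 + np.exp(-(X @ w + b)))
    grad = preds - y                     # gradient of cross-entropy w.r.t. the logits
    w -= lr * (X.T @ grad) / len(X)
    b -= lr * grad.mean()

accuracy = (((X @ w + b) > 0) == (y == 1)).mean()
print(f"accuracy learned purely from examples: {accuracy:.1%}")  # typically above 95%
```

The rule is never written into the program; the model infers it from the examples, which is the essence of the shift deep learning brought about.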
From self-driving cars to natural language processing (NLP), facial recognition, healthcare, entertainment, and even election predictions, the real-world applications of deep learning are virtually endless. In addition to powering AI in all kinds of mobile apps, from e-commerce to stock trading and banking, GPUs and deep learning algorithms are also being used to navigate drones. When it comes to end-user experience in particular, a top priority for organizations today, deep learning is critical for tasks like sentiment analysis and opinion mining, which work by continually feeding a model customer opinions, reviews, and social media posts; the sketch below shows the idea in miniature.
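Here is a minimal sentiment-analysis sketch along those lines, assuming scikit-learn is installed; the handful of reviews is made up for illustration, whereas a real system would train on vast amounts of text.

```python
# Minimal sentiment-analysis sketch (illustrative only): a classifier learns
# positive vs. negative tone from example reviews rather than hand-written rules.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

reviews = [
    "loved the product, works great",
    "excellent service and fast delivery",
    "terrible quality, waste of money",
    "awful experience, would not recommend",
]
labels = [1, 1, 0, 0]   # 1 = positive, 0 = negative

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(reviews)     # bag-of-words features

model = LogisticRegression()
model.fit(features, labels)

new_posts = ["loved it, excellent product", "awful, terrible waste of money"]
print(model.predict(vectorizer.transform(new_posts)))   # likely output: [1 0]
```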
Silicon Serendipity
It’s a strange story, to say the least, and it gets even stranger. In recent news, there has been a shortage of GPUs, owing largely to their use in mining cryptocurrencies. Mining cryptocurrency boils down to an enormous number of simple, repetitive calculations, exactly the kind of work we already know GPUs are best suited for (a toy sketch of the idea appears at the end of this piece). As of April 1, 2022, Nvidia had announced that it is building two supercomputers: one called Eos, for its own use in climate science and digital biology, and another for Taiwan’s National Health Research Institutes (NHRI). It has also announced a partnership with the US Air Force to build an autonomous vehicle computer.
The list goes on, from ventures in quantum computing to software that turns 2D images into 3D scenes, and even Omniverse Cloud. Jensen Huang, CEO of Nvidia, was quoted as saying, “At no time in the history of our company have we been at the center of such large markets.” Not bad for a company that originally set out to make the world a better place for gamers, and one that some people are now calling the new Intel.
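Circling back to the mining aside above: to see why cryptocurrency mining maps so neatly onto GPU-style hardware, here is a toy proof-of-work loop (the message and difficulty are invented for the example); real miners run this same cheap hash calculation in parallel across thousands of cores.

```python
# Toy proof-of-work loop (illustrative only): mining is one cheap calculation,
# a hash, repeated enormously many times until the result meets a target.
import hashlib

def mine(message: str, difficulty: int = 4) -> int:
    """Find a nonce whose SHA-256 hash of message+nonce starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(mine("block data"))   # takes roughly 16**4 ≈ 65,000 attempts on average
```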