Google’s Deep Learning Game
Updated: May 21
by Jason Costa
Google has been particularly good at getting in front of technology vectors and riding waves as they materialize. The company saw the explosion of content on the internet and made it organized and searchable. It then saw the opportunity in mobile and acquired an operating system, Android, to ensure its services were readily available the moment a user pulled out their phone. Google declared itself a “mobile first” company in 2010, and in 2017 announced that it would become an “AI first” company.
While it’s still very early with respect to deep learning as a technology vector, the field is evolving quickly and the market opportunity will be enormous. Openings like this let new entrants emerge, much as the internet wave enabled Google itself. But something about the deep learning wave feels different: it’s as if Google will be the Google of the deep learning game. The company is setting up all of the pieces to ensure that this is the case, and in aggregate those pieces will let Google compound its improvements over time.
Missed Open-Source Opportunities
Open source done right can really set an industry “standard”. In fact, it can be a wedge into a huge market for companies that orchestrate the process effectively.
Take the big data wave that’s been happening for the past decade-plus. Google published the MapReduce paper in 2004, describing a blueprint for processing and generating large data sets on a cluster. The name has been genericized over the years, though it originally referred to Google’s closed, internal work. If you think of MapReduce as the implementation inside Google, then Apache Hadoop is its open-source Java counterpart (and Hadoop in turn became the foundation for Hive, later built at Facebook). Hadoop grew out of work at Yahoo, and over the years it has been adopted by countless web-scale players such as Twitter, eBay, Pinterest, and many others.
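The programming model the MapReduce paper described can be illustrated with a toy, single-process word count. This is a sketch of the model only, not Google’s or Hadoop’s distributed implementation; the function names and sample documents are invented for illustration:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group all intermediate values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: collapse each word's list of counts into a total."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts["the"] == 3, counts["fox"] == 2
```

In the real systems, the map and reduce steps run in parallel across a cluster and the shuffle happens over the network; the structure of the computation, though, is exactly this.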
Google lost the big data “standard” game by not open-sourcing MapReduce, and Hadoop pulled away in the race. It’s remarkable, then, that just over a decade later Google open-sourced TensorFlow, and it has become *the* framework of choice for deep learning. Even back in 2016, many folks were already predicting that among the available deep learning frameworks, TensorFlow had in many ways already won.
What was clear, though, was that Google had learned from the MapReduce vs. Hadoop situation. Not only did the company open-source TensorFlow, it made the framework incredibly accessible to the wide array of developers looking to jump into deep learning. Google could afford to open-source the framework because it has the data (which won’t be open-sourced). In effect: commoditize the complement.
Keras & Easier Abstraction
While TensorFlow has the broader brand recognition and the bigger community, it’s still a relatively complex framework for most beginners to get up and running. That’s where Keras comes into play. Keras is a high-level API built on top of TensorFlow, and it’s far easier to approach than its big brother.
While TensorFlow offers more functionality and flexibility, Keras is much easier to start rapid prototyping with. One can imagine that computer science students just getting started with deep learning will opt for Keras, and that many of those engineers will naturally go on to use TensorFlow: in a sense, Keras is yet another feeder into the TensorFlow ecosystem. This builds developer mindshare.
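To give a sense of that approachability, here is a minimal Keras model definition, a sketch only: the layer sizes and input shape are illustrative placeholders, not a recommendation for any particular task.

```python
from tensorflow import keras

# A small image classifier for 28x28 inputs, defined in a few lines.
# Layer sizes here are arbitrary, chosen purely for illustration.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Training would then be a single call:
# model.fit(x_train, y_train, epochs=5)
```

Expressing the same model against TensorFlow’s lower-level graph APIs takes considerably more ceremony, which is precisely the gap Keras closes for newcomers.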
Node.js Support
Amazon Could Push Many into the Arms of GCP
While Google is pushing ahead with GCP, it still trails platforms like SageMaker in adoption. Google’s Cloud AI is not nearly as easy to use as SageMaker, and Amazon has been executing quickly, as the fairly recent launch of the IDE for SageMaker shows.
But many SMBs and large retailers are terrified of Amazon, and with good reason. Amazon is the dark star of commerce, and it continues to extend its grip; commerce companies in particular fear Amazon eating their lunch. One has to wonder whether many of these companies aren’t internally mandating a migration off of AWS to ensure they aren’t feeding the competition.
In the meantime, many of these entities migrating off of AWS will play right into the hands of Google Cloud. There’s the rub: Google will just end up acquiring more customers on their Cloud platform, where many of them will get easy access to TPUs, begin to run training jobs in the cloud, and almost certainly start using TensorFlow (if they’re not already).
What It Means for Google & the Cloud Wars
There are really only three players in the deep learning game as it pertains to the cloud: Google, Microsoft, and Amazon. Right now GCP is a distant #3 in the cloud race, but as Google builds out an impressive deep learning stack for developers, it’s important to note that the ecosystem will funnel all kinds of valuable feedback back to the company. That feedback will keep Google several steps out in front of the entire field. Every time Google releases new DL tooling, feedback will come in from all corners: a dev spots a bug, realizes something in TensorFlow needs to be tuned or calibrated, finds a better way to train models, or observes strange behavior in a model’s output. All of it flows back to Google, which can then make adjustments and optimizations to the tech stack. Bug auditing, improved predictability, bias detection, and much more will all be readily available to Google via the vast TensorFlow developer ecosystem.
Why is this critical? Google will get access to all of those improvements first, and will be able to incorporate them directly into its GCP AI offerings. Said differently, the infrastructure for doing deep learning will become (and already is) commoditized. What you can’t commoditize is the compounding effect of community. Community matters, and it will ultimately pick the winner, because everyone gets to leverage the incremental gains of tooling improvements.
This advantage could keep Google out at the frontier of deep learning for years to come. The wildcard variables that could cost Google the game are customer obsession and platform DNA, both of which are far stronger cultural elements at Microsoft and Amazon. These are critical variables for an enterprise platform play, and could be Google’s Achilles’ heel. To be fair, it’s possible that the new GCP leadership is changing that: it’s too soon to know. From a technology perspective, though, developers will be more inclined to build, train, and deploy their models on GCP’s DL platform than on Amazon’s SageMaker or Azure’s Machine Learning Studio. When developers are working on deep learning models, they’re going to have to take a very hard look at GCP. This will have a massive impact on the long-term outcome of the cloud wars.