• Google wants to democratize AI and operationalize machine learning (ML) with the release of Google Cloud Machine Learning Engine, a platform that includes developer-friendly APIs and pre-trained data models.
• But what the company really needs isn’t just data, algorithms, or even data scientists; it’s a new breed of developer who builds software that anticipates outcomes.
It’s always the same at the end of a company’s keynote address. After all of the important messages have been conveyed and all of the product announcements have been made, a mid-level corporate mouthpiece will take the stage and provide the audience with some positive reinforcement of what went before. It’s like the closing credits of a film, something that may contain a nugget of interest to the cinephile. More often, it serves as filler, a thematic soundtrack to accompany attendees as they make for the exits.
But the closing credit scroll at this week’s Google Cloud Next 2017 keynote was different, at least for me. After Fei-Fei Li, Chief Scientist, Google Cloud AI and Machine Learning, had left the stage, having delivered one of the most compelling and well-written speeches of the day, Alphabet’s Executive Chairman Eric Schmidt walked to the podium, pulled a well-crinkled crib sheet out of his jacket, and proceeded to blow my mind.
He of course dutifully reinforced what we’d just heard, adding a few words about how, financially speaking, companies should lift and shift to the cloud straightaway. (Last year’s message, if you’re keeping score, was about meeting customers where they were in their migration efforts.) And then, just as in a Marvel superhero film, he tossed out a post-credits scene that shone a light on exactly where Google wants and needs to go if it is to capitalize on its considerable investments in AI and big data.
Earlier, Fei-Fei Li had mentioned that in order to democratize AI, Google would need to focus on four things.
It’s clear from the long list of product announcements made during Google Cloud Next this year that the company is executing on these four pillars. It rolled out an end-to-end ML platform in the Google Cloud Machine Learning Engine, introduced the Cloud Video Intelligence API, a new TensorFlow-powered API that operationalizes common search algorithms for video content, and announced the acquisition of Kaggle, a community of more than 80,000 ML professionals.
However, Eric’s post-credits scene appended this simple thought to the conversation:
“AI changes how you write programs. It’s all about building apps that learn outcomes.”
Up to this point, the ML and big data industry has been focused on a single, fundamental outcome: put big data to work within the business process itself and add some ML secret sauce to that mix, improving things like operational efficiency through predictive models that are capable of learning from the constantly shifting sands of big data. That notion simply builds on and improves an idea we’ve been kicking around for the last 30 years with predictive analytics.
For me, however, what Eric pointed at here is something entirely different. He’s not talking about making developers data-savvy or giving them the tools to build a better predictive mousetrap. He’s saying that we need developers who envision the entire user experience as a constantly shifting set of outcomes derived from ML and big data. An application that learns any of “n” possible outcomes doesn’t just branch from one expected result to another. The results, and the path toward those results, should shift and move in lockstep with the user, the app and the contextual sea of change in which they both reside.
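To make that contrast concrete, here is a toy sketch in plain Python. It uses no Google API, and every name in it (`branching_app`, `LearningApp`, the feed names) is invented purely for illustration: the first function enumerates its outcomes up front, while the second adjusts which outcome it serves as it observes what the user actually does.

```python
def branching_app(user_clicked_sports: bool) -> str:
    """Classic approach: the developer hard-codes every outcome
    and the app merely branches between them."""
    return "sports_feed" if user_clicked_sports else "news_feed"


class LearningApp:
    """Toy outcome-learning app: tracks a running click-through
    rate per feed and serves whichever feed the user has actually
    responded to best, so its behavior shifts with the user."""

    def __init__(self, feeds):
        # feed name -> [clicks, views]
        self.stats = {f: [0, 0] for f in feeds}

    def serve(self) -> str:
        # Pick the feed with the highest observed click rate;
        # unseen feeds default to 1.0 so they get tried at least once.
        def rate(feed):
            clicks, views = self.stats[feed]
            return clicks / views if views else 1.0
        return max(self.stats, key=rate)

    def observe(self, feed: str, clicked: bool) -> None:
        # Record one view and whether it produced a click.
        self.stats[feed][1] += 1
        self.stats[feed][0] += int(clicked)


app = LearningApp(["news_feed", "sports_feed"])
# The user ignores the news feed but clicks the sports feed,
# so the app's chosen outcome drifts toward sports.
app.observe("news_feed", clicked=False)
app.observe("sports_feed", clicked=True)
print(app.serve())  # → sports_feed
```

The point of the sketch is only the shape of the code: in the second version, the path to the result is not written down anywhere; it emerges from the data the app has seen.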
This sort of shift in thinking is akin to the move from waterfall to agile software development, or from sequential to parallel processing. Once we recognize a new way to write code, what we write with that code will never be the same. To realize the potential of AI, therefore, we need more than data and algorithms. We need a new class of developer who sees software as something mutable, something able both to accommodate change and to anticipate outcomes. If Google Cloud Next were itself an ML-informed app, I think it might move Eric’s closing soliloquy to an earlier, more consequential time slot.