
Disrupted Search Engines – Google counters Facebook and puts its own AI source in the public domain


Facebook’s genius AI move finally countered by Google – is Google getting slow?

iDisrupted Commentary

There’s a bar room brawl going on in Silicon Valley – Google vs Apple vs Facebook – and it has reached the point of a knife fight. Earlier this year – and we suspect it was because Facebook was behind Google in AI development – Facebook made the genius move of open sourcing its AI code, making a whole world of developer resources available.

Now Google has done the same thing – partially turning its AI code over to the open source community with its TensorFlow product – here’s an outline description from Wired. The race to the top continues. . .

From Wired: Tech pundit Tim O’Reilly had just tried the new Google Photos app, and he was amazed by the depth of its artificial intelligence.

O’Reilly was standing a few feet from Google CEO and co-founder Larry Page this past May, at a small cocktail reception for the press at the annual Google I/O conference—the centerpiece of the company’s year. Google had unveiled its personal photos app earlier in the day, and O’Reilly marveled that if he typed something like “gravestone” into the search box, the app could find a photo of his uncle’s grave, taken so long ago.

[Image] Disrupted AI – TensorFlow – from 9to5Google


The app uses an increasingly powerful form of artificial intelligence called deep learning. By analyzing thousands of photos of gravestones, this AI technology can learn to identify a gravestone it has never seen before. The same goes for cats and dogs, trees and clouds, flowers and food.

The Google Photos search engine isn’t perfect. But its accuracy is enormously impressive—so impressive that O’Reilly couldn’t understand why Google didn’t sell access to its AI engine via the Internet, cloud-computing style, letting others drive their apps with the same machine learning. That could be Google’s real money-maker, he said. After all, Google also uses this AI engine to recognize spoken words, translate from one language to another, improve Internet search results, and more. The rest of the world could turn this tech towards so many other tasks, from ad targeting to computer security.

Well, this morning, Google took O’Reilly’s idea further than even he expected. It’s not selling access to its deep learning engine. It’s open sourcing that engine, freely sharing the underlying code with the world at large. This software is called TensorFlow, and in literally giving the technology away, Google believes it can accelerate the evolution of AI. Through open source, outsiders can help improve on Google’s technology and, yes, return these improvements back to Google.

“What we’re hoping is that the community adopts this as a good way of expressing machine learning algorithms of lots of different types, and also contributes to building and improving [TensorFlow] in lots of different and interesting ways,” says Jeff Dean, one of Google’s most important engineers and a key player in the rise of its deep learning tech.


In recent years, other companies and researchers have also made huge strides in this area of AI, including Facebook, Microsoft, and Twitter. And some have already open sourced software that’s similar to TensorFlow. This includes Torch—a system originally built by researchers at New York University, many of whom are now at Facebook—as well as systems like Caffe and Theano. But Google’s move is significant. That’s because Google’s AI engine is regarded by some as the world’s most advanced—and because, well, it’s Google.

“This is really interesting,” says Chris Nicholson, who runs a deep learning startup called Skymind. “Google is five to seven years ahead of the rest of the world. If they open source their tools, this can make everybody else better at machine learning.”

To be sure, Google isn’t giving away all its secrets. At the moment, the company is only open sourcing part of this AI engine. It’s sharing only some of the algorithms that run atop the engine. And it’s not sharing access to the remarkably advanced hardware infrastructure that drives this engine (that would certainly come with a price tag). But Google is giving away at least some of its most important data center software, and that’s not something it has typically done in the past.

Google became the Internet’s most dominant force in large part because of the uniquely powerful software and hardware it built inside its computer data centers—software and hardware that could help run all its online services, that could juggle traffic and data from an unprecedented number of people across the globe. And typically, it didn’t share its designs with the rest of the world until it had moved on to other designs. Even then, it merely shared research papers describing its tech. The company didn’t open source its code. That’s how it kept an advantage.

With TensorFlow, however, the company has changed tack, freely sharing some of its newest—and, indeed, most important—software. Yes, Google open sources parts of its Android mobile operating system and so many other smaller software projects. But this is different. In releasing TensorFlow, Google is open sourcing software that sits at the heart of its empire. “It’s a pretty big shift,” says Dean, who helped build so much of the company’s groundbreaking data center software, including the Google File System, MapReduce, and BigTable.

Open Algorithms

Deep learning relies on neural networks—systems that approximate the web of neurons in the human brain. Basically, you feed these networks vast amounts of data, and they learn to perform a task. Feed them myriad photos of breakfast, lunch, and dinner, and they can learn to recognize a meal. Feed them spoken words, and they can learn to recognize what you say. Feed them some old movie dialogue, and they can learn to carry on a conversation—not a perfect conversation, but a pretty good conversation.
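
To make that idea concrete, here is a minimal from-scratch sketch of a tiny neural network learning a task purely from examples. This is plain NumPy, not Google’s code; the XOR task, layer sizes, and training settings are all illustrative assumptions.

```python
# Illustrative sketch only: a tiny two-layer neural network trained with
# gradient descent on the XOR problem. Plain NumPy, not TensorFlow or any
# Google code; the task and sizes are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Training examples: inputs and the target outputs we want the net to learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights for one hidden layer of 8 units.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: compute predictions from the current weights.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error w.r.t. every weight.
    grad_p = (p - y) * p * (1 - p)
    grad_h = (grad_p @ W2.T) * h * (1 - h)

    # Gradient-descent update: this is the "learning" the article describes.
    W2 -= lr * (h.T @ grad_p); b2 -= lr * grad_p.sum(axis=0)
    W1 -= lr * (X.T @ grad_h); b1 -= lr * grad_h.sum(axis=0)

print(np.round(p, 2))  # predictions converge towards [0, 1, 1, 0]
```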

Typically, Google trains these neural nets using a vast array of machines equipped with GPU chips—computer processors that were originally built to render graphics for games and other highly visual applications, but have also proven quite adept at deep learning. GPUs are good at processing lots of little bits of data in parallel, and that’s what deep learning requires.


But after they’ve been trained—when it’s time to put them into action—these neural nets run in different ways. They often run on traditional computer processors inside the data center, and in some cases, they can run on mobile phones. The Google Translate app is one mobile example. It can run entirely on a phone—without connecting to a data center across the ‘net—letting you translate foreign text into your native language even when you don’t have a good wireless signal. You can, say, point the app at a German street sign, and it will instantly translate into English.

TensorFlow is a way of building and running these neural networks—both at the training stage and the execution stage. It’s a set of software libraries—a bunch of code—that you can slip into any application so that it too can learn tasks like image recognition, speech recognition, and language translation.
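
As a rough illustration of what slipping those libraries into an application looks like, here is a minimal sketch in the graph-and-session style TensorFlow launched with (on TensorFlow 2.x the same calls live under tf.compat.v1). The tiny linear model is an invented example, not one of Google’s released models.

```python
# Minimal sketch of TensorFlow's original "build a graph, then run it" style.
# On TensorFlow 2.x, use tf.compat.v1 for these calls; the model itself is a
# placeholder example, not Google's code.
import tensorflow as tf

# Define the computation graph: an input placeholder and trainable weights.
x = tf.placeholder(tf.float32, shape=[None, 3], name="inputs")
w = tf.Variable(tf.zeros([3, 1]), name="weights")
y = tf.matmul(x, w, name="predictions")

# Nothing runs until the graph is executed inside a session.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```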

Google built the underlying TensorFlow software with the C++ programming language. But in developing applications for this AI engine, coders can use either C++ or Python, the most popular language among deep learning researchers. The hope, however, is that outsiders will expand the tool to other languages, including Google Go, Java, and perhaps even JavaScript, so that coders have more ways of building apps.

According to Dean, TensorFlow is well suited not only to deep learning, but to other forms of AI, including reinforcement learning and logistic regression. This was not the case with Google’s previous system, DistBelief. DistBelief was pretty good at deep learning—it helped win the all-important Large Scale Visual Recognition Challenge in 2014—but Dean says that TensorFlow is twice as fast.
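
For a sense of what a non-deep-learning model looks like in the same framework, here is a hedged sketch of logistic regression written against that early graph API. The feature count, synthetic data, and hyperparameters are all assumptions made up for the example.

```python
# Illustrative logistic regression in the early TensorFlow graph API
# (tf.compat.v1 on TensorFlow 2.x). Shapes, data, and hyperparameters are
# invented for the example.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 10])       # 10 input features
labels = tf.placeholder(tf.float32, [None, 1])   # binary labels

w = tf.Variable(tf.zeros([10, 1]))
b = tf.Variable(tf.zeros([1]))
logits = tf.matmul(x, w) + b

# Cross-entropy loss and plain gradient descent.
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Synthetic data: label is 1 when the features sum to more than 5.
    X = np.random.rand(200, 10).astype(np.float32)
    Y = (X.sum(axis=1, keepdims=True) > 5).astype(np.float32)
    for _ in range(500):
        sess.run(train_step, feed_dict={x: X, labels: Y})
    print(sess.run(loss, feed_dict={x: X, labels: Y}))
```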

In open sourcing the tool, Google will also provide some sample neural networking models and algorithms, including models for recognizing photographs, identifying handwritten numbers, and analyzing text. “We’ll give you all the algorithms you need to train those models on public data sets,” Dean says.

The rub is that Google is not yet open sourcing a version of TensorFlow that lets you train models across a vast array of machines. The initial open source version only runs on a single computer. This computer can include many GPUs, but it’s a single computer nonetheless. “Google is still keeping an advantage,” Nicholson says. “To build true enterprise applications, you need to analyze data at scale.” But at the execution stage, the open source incarnation of TensorFlow will run on phones as well as desktops and laptops, and Google indicates that the company may eventually open source a version that runs across hundreds of machines.

A Change in Philosophy

Why this apparent change in Google philosophy—this decision to open source TensorFlow after spending so many years keeping important code to itself? Part of it is that the machine learning community generally operates in this way. Deep learning originated with academics who openly shared their ideas, and many of them now work at Google—including University of Toronto professor Geoff Hinton, the godfather of deep learning.

But Dean also says that TensorFlow was built at a very different time from tools like MapReduce and GFS and BigTable and Dremel and Spanner and Borg. The open source movement—where Internet companies share so many of their tools in order to accelerate the rate of development—has picked up considerable speed over the past decade. Google now builds software with an eye towards open source. Many of those earlier tools, Dean explains, were too closely tied to the Google infrastructure. It didn’t really make sense to open source them.

“They were not developed with open sourcing in mind. They had a lot of tendrils into existing systems at Google and it would have been hard to sever those tendrils,” Dean says. “With TensorFlow, when we started to develop it, we kind of looked at ourselves and said: ‘Hey, maybe we should open source this.’”

That said, TensorFlow is still tied, in some ways, to the internal Google infrastructure, according to Google engineer Rajat Monga. This is why Google hasn’t open sourced all of TensorFlow, he explains. As Nicholson points out, you can also bet that Google is holding code back because the company wants to maintain an advantage. But it’s telling—and rather significant—that Google has open sourced as much as it has. . . more from Wired. . .

 


