Eric Knorr
Contributing writer

The rise of machines that learn

analysis
Aug 11, 2014 | 4 mins

A new big data analytics startup, Adatao, reminds us that we're just at the beginning of a new phase of computing, one in which systems become much, much smarter

When quantity reaches a certain level, it makes a qualitative difference. “Remember artificial neural networks?” Christopher Nguyen asked me. “When you have enough memory and compute, a funny thing happens. It comes alive.”

Nguyen, former engineering director for Google Apps, was referring to a slice of the technology behind his startup, Adatao, which just received $13 million in funding from Andreessen Horowitz. Adatao’s value proposition comes in two parts: pInsights, a document-based visualization layer that provides end-users with simple, real-time querying of vast data sets; and pAnalytics, a monster data processing engine built on Hadoop and Apache Spark. All of this, including the ANN (artificial neural network) component, is made possible by the huge memory and processing power that, today, has become a commodity.
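Adatao hasn’t published the internals of its ANN component, but the basic idea Nguyen alludes to is decades old: a neural network is little more than chained matrix multiplications with a nonlinearity in between, and the weight matrices grow with the size of the problem. The sketch below is a generic illustration of that math, not Adatao’s implementation; all names and sizes here are invented for the example.

```python
import numpy as np

# A minimal two-layer feedforward network -- the same matrix math that
# powered 1980s-era ANNs. What changed is scale: the weight matrices
# (and therefore memory and compute) grow with layer sizes, which is
# why cheap commodity RAM and CPU make such networks practical today.
rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyANN:
    def __init__(self, n_in, n_hidden, n_out):
        # Randomly initialized weights; a real system would train these.
        self.w1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, n_out))

    def forward(self, x):
        # Forward pass: input -> hidden layer -> output layer.
        hidden = sigmoid(x @ self.w1)
        return sigmoid(hidden @ self.w2)

net = TinyANN(n_in=4, n_hidden=8, n_out=3)
out = net.forward(np.ones(4))
print(out.shape)  # -> (3,)
```

Nothing in the forward pass changes as the layers grow from dozens of units to millions; only the hardware bill does, which is the sense in which AI turned out to be largely a hardware problem.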


Adatao’s mission is to bring big data analytics to the masses, enabling people to collaborate on Google Apps-like documents that incorporate charts derived from huge data sets. I saw a demo, and with a cluster of eight eight-core servers (each with 30GB of RAM) hosted on Amazon Web Services, queries of a multi-terabyte data set were blazingly fast. To deliver the promised ease of use, Adatao relies on its ANN to identify data objects on the fly in response to queries entered in plain English. According to Nguyen, the system can recognize as many as 20,000 objects.

If Adatao is successful, it could well be a game-changer. But what excited me most was the artificial intelligence aspect.

Immediately after the demo, I called my friend Miko Matsumura, vice president of marketing for Hazelcast, who has a master’s degree in computational neuroscience from Yale. I told him that, according to my highly limited understanding, artificial intelligence has largely turned out to be a hardware problem rather than a software problem, and Adatao’s ANN implementation seemed to provide a fresh example.

Miko immediately referred me to the work of Paul and Patricia Churchland, who once noted that those who deny the possibility of artificial intelligence are like a man waving a magnet in a dark room and declaring that magnetism cannot create light — when he simply wasn’t waving it fast enough to induce the current to light a bulb. Today you could argue that we have the huge memory and computing capacity necessary to begin lighting up artificial intelligence all over the place.

In fact, it’s already happening, and the main practical application is big data analytics. As James Kobielus noted earlier this year, “Machine learning is so pervasive that we can often assume its presence in big data applications.”

“Our warm and creepy future” is how Miko refers to the first-order effect of applying machine learning to big data. In other words, through artificially intelligent analysis of whatever Internet data is available about us — including the much more detailed, personal stuff collected by mobile devices and wearables — websites and merchants of all kinds will become extraordinarily helpful. And it will give us the willies, because it will be the sort of personalized help that can come only from knowing us all too well.

Somehow, it’s not surprising that the first objective of machine learning on top of big data is to induce customers to spend more money and stay loyal. But the potential extends across every conceivable discipline, from healthcare to climatology. Thanks to the cheap, enormous computing resources that are making new intelligent systems possible, we are now entering a qualitatively different phase of computing. To deny that is to wave a magnet in the dark.

This article, “The rise of machines that learn,” originally appeared at InfoWorld.com. Read more of Eric Knorr’s Modernizing IT blog. And for the latest business technology news, follow InfoWorld on Twitter.

Eric Knorr

Eric Knorr is a freelance writer, editor, and content strategist. Previously he was the Editor in Chief of Foundry’s enterprise websites: CIO, Computerworld, CSO, InfoWorld, and Network World. A technology journalist since the start of the PC era, he has developed content to serve the needs of IT professionals since the turn of the 21st century. He is the former Editor of PC World magazine, the creator of the best-selling The PC Bible, a founding editor of CNET, and the author of hundreds of articles to inform and support IT leaders and those who build, evaluate, and sustain technology for business. Eric has received Neal, ASBPE, and Computer Press Awards for journalistic excellence. He graduated from the University of Wisconsin, Madison with a BA in English.