Galen Gruman
Executive Editor for Global Content

Interview with Woz: To innovate, get personal

analysis
Feb 4, 2014 | 7 min read

The Apple co-founder explains why being human is key to good tech and why technology alone won't fix our schools


Many of us in the tech press say that we’ve hit a lull in technology innovation, after an amazing run of truly disruptive new technologies from cloud computing to social networking, from mobile devices to voice services, in the last decade. Steve “Woz” Wozniak is not so sure, but he does believe that innovations can’t be scheduled or even predicted with any certainty. They take off only when many factors come together.

Wozniak, of course, knows a little something about innovation. The 63-year-old engineer is a co-founder of Apple and helped invent personal computing in the forms of the Apple II and Macintosh. He has been an adviser to and sounding board for Apple over much of its history, as well as a consultant to other tech companies. He founded CL 9, the company that in 1987 produced the first programmable universal TV remote control. He has been an elementary school teacher. Now he is chief scientist at Fusion-io and speaks on technology and innovation throughout the world; this week, he’s a featured speaker at the Apps World conference in San Francisco.

There’s no question that companies are doggedly pursuing the next big thing in technology, whatever that may be. For example, “everyone is talking about wearable computing. There are about 30 companies that seem to be doing the same thing. But nothing seems to be pointing to the right way,” Wozniak says. One reason is simple: “You tend to deal with the past,” replicating what you know in a new form. Consider the notion of computing eyewear like Google Glass: “People have been marrying eyewear with TV inputs for 20 years.”

What it takes for innovation to take root
What does it take for technology innovation to flower in the same way the PC did in the mid-1980s, the Internet in the early 2000s, smartphones in the late 2000s, cloud computing and social networking in the early 2010s, and tablet computing in the mid-2010s? For one, “the enabling technology has to become cheap enough,” Wozniak says.

For example, you can buy tiny projectors today, but they’re only useful if you have a blank wall to project onto, which limits their usefulness — they’re basically refinements of projectors we’ve used for decades. But imagine if a tiny projector, perhaps built into your smartphone, could project holographic images, à la the “Star Wars” movies. That would be a big shift, as then you could project anywhere you are, whether to show video or conduct a virtual face-to-face meeting. “It’s too expensive to do that today — we need to wait until it becomes affordable. But you can’t predict when that happens.” That’s why companies like Apple, Google, Microsoft, and IBM have all sorts of research projects going on, to see if enabling technologies can be made affordable, then pounce on them when they are.

Of course, having the enabling technology is not enough. Wozniak reminded me of the Segway scooter, promised at its 2001 debut as the next revolution in urban transportation: a personal vehicle that could go greater distances than a bicycle but was easier to use than a motorcycle and took much less road space than a car. Segways are used today only in limited ways, such as for city tours. They never quite clicked, so society didn’t adopt them or create the pressure to change some of the other key factors, such as making them street-legal or providing standard insurance.

The future of technology is about being more human
There’s another factor: being more human, more personal. Wozniak points out that people use technology more the less it feels like technology. “The software gets more accepted when it works in human ways — meaning in noncomputer ways. … The mouse is a good example. Using it works like how we see things in space; you’re not having to think that you move 5 inches but instead move your hand,” and the mouse follows along, with the computer’s software interpreting the distance in context. Still, “people don’t use a mouse in real-world activities,” he notes, which is why the touch interface became so popular so quickly once the technology became cheap enough and sophisticated enough to feel natural in devices like the iPhone and iPad.

Wozniak cites Apple’s Newton MessagePad as another example: a device that took standard handwriting instead of arcane command-line inputs that made people have to think about what they were doing instead of just doing it. Today, voice-based assistants such as Apple’s Siri and Google’s voice recognition are making a very human form of interaction increasingly normal on computing devices. Wozniak foresees a day when the combination of natural language recognition, artificial intelligence-like analysis and transaction systems, and easy connectivity will enable technology to be almost a companion for people, working with them very much on their own terms.

Wozniak cites Google Search as an example of that notion today. “Search engines replace a smart person” in terms of finding things, he notes, making everyone much more able to explore the Web and find information than could possibly be done in the past, such as at a library. For Wozniak, “replacing” people with technology really means what he calls “companion computing,” where everyone has a personal guide or assistant that is not possible without the use of technology — a force for democratizing knowledge, services, and ability.

Technology isn’t the solution to the education system’s flaws
Ever since there was an Apple II, we’ve heard that computers in the classroom would give American students an edge, especially disadvantaged students. Apple, Dell, Hewlett-Packard, IBM, and many others have or have had strong education computing sales businesses as a result. Today, the cry is to put iPads or other tablets in the classroom.

Despite 30 years of having computers in the classroom, “I don’t see any change in how people come out [of the education system] — they’re not smarter,” Wozniak says. “We put the technology into a system that damages creative thinking — the kids give up, and at a very early age.” Wozniak believes the mass-production system of education is the key problem, because students must follow a regimen dictated on a weekly basis, rather than “get a goal for the year and a reading list they can explore at their own pace to get to that result.”

The education system forgets that “if you love something, you go really far into it on your own,” and Wozniak believes that’s how schools need to think about education. “We need one good teacher per student” to allow each student to follow their own course, at their own pace, through the learning needed — under a teacher’s guidance. Of course, there are nowhere near enough teachers, nor budget to pay for them.

But maybe one day — 20 years or more from now — computers can be those one-on-one teachers, or at least teacher’s assistants, Wozniak says. That’s the notion of “companion computing” applied to education. “Computers can’t do it yet,” but some of the pieces are in place today.

The more human that computing gets, the more possible that vision will be — and not just for education.