Galen Gruman
Executive Editor for Global Content

The fallacy of collaboration technology

analysis
Jun 1, 2012 · 9 mins

Videoconferencing, unified communications, and shared editing don't work the way people do

Sometimes, the future won’t go away, even though it also doesn’t actually transpire. Case in point: I’ve been a technology writer and editor for nearly 30 years and have yet to see the promised utopia of collaborative computing. This is the same future that envisioned flying cars and undersea cities, mind you. Just as those don’t exist, neither does the virtual collaboration vision in which we’re all videoconferencing from anywhere while working on the same documents and projects together in real time.

Many of the technologies needed to support that vision exist, so why isn’t it a reality? In fact, we’ve seen the failure of a high-profile version of this future in the ignoble death of the Cisco Cius videoconferencing tablet. Additionally, despite the universality of cameras in laptops, their rare use in business for videoconferencing shows it’s not just exorbitant infrastructure costs that doomed the Avaya and Cisco tablets.

Likewise, the adoption of unified communications technology — pitched for better business collaboration — has been negligible, despite more than a decade’s worth of promotion by Microsoft, Cisco, Hewlett-Packard, IBM, and many others. At the same time, personal communication has exploded through instant messaging, email, social networks, and other new techniques.

There are several reasons, all relating to a fallacy in understanding what collaboration is and should be. As often happens in technology, the futurists start with the technology and impose it on people’s behavior, rather than looking for where the technology can leverage innate behavior.

I’ve described how social networking in business suffers from a similar fallacy. People love to talk and chat and gossip, and social networking lets that happen at a scale never before possible. But that doesn’t mean such behavior makes sense at work. The dynamic is similar for collaboration, as 30 years of knowledge management and its failed progeny have amply demonstrated.

Still, collaboration is good, and I believe we have more of it than ever — but not in the way the major vendors are pushing.

There are three components to the vision of business collaboration. Let’s take each one in turn.

Videoconferencing: An awkward, pricey phone call

The sexy aspect of collaboration, if you’re a nerd, is videoconferencing. Technologists have long loved the notion of immersive video experiences, where people can get together without getting together — a weirdly alienated camaraderie. Of course, the business pitch was to save on travel costs and, after 9/11, the uncertainties of air travel. I’m confident it can be useful, but not that often and not for most of us.

Think about it: Throughout human history, collaboration was either hyperlocal (that is, in person) or hugely distant (in both time and space). People would conduct business via letters that took months or years to deliver — even just two centuries ago. Or they would send agents who would act independently but establish a local presence. In other words, for thousands of years, we didn’t need groupthink for most actions and decisions, and instead worked together asynchronously.

The Internet and the communications technologies that ride on it have closed much of the time gap — it takes seconds, not months, for a message to be transmitted. Though we can collaborate faster, it’s still very much an asynchronous activity, and that’s a good thing. In business, it lets you pause, research, and otherwise think before you speak or act. I believe that’s why we’ve taken so quickly to email — an asynchronous mechanism that lets you store, organize, and search your communications history — and more recently to the various forms of instant messaging.

When we need to get together live but aren’t in nearby locations, the next best thing should be a videoconference. But it isn’t. Instead, it’s the phone, a device we’ve used for nearly a century. A group call can be awkward, as there’s no way to signal who wants to speak or prevent simultaneous conversations, but we’ve learned to manage. Phone conferences work even when you don’t know all the participants; you may lack some body language, but you still get voice cues.

A videoconference is just as awkward as a phone conference, and in some ways more so. Unless you have high-priced room-sized screens and the dedicated networks that go with them, you’re dealing with small images of the people in the conference, which greatly degrades the body language signals. Plus, it’s hard to know where to focus your attention. What you get is an overhead-heavy version of a phone conference, which is why I believe few people do group videoconferences despite all the tools available.

Maybe Google+ Hangouts and the like will change that, but I suspect not in business. Sure, video calls via Skype, FaceTime, and AIM are popular for personal communications, just as Facebook and Pinterest are for personal socializing. But the point is the personal connection, not so-called collaboration.

Unified communications: Too much work

Wouldn’t it be great if all your communications came through just one channel, so you could see your voicemails and your instant messages in your email? Apparently not, as this promise has been pitched for a decade with little uptake. Originally premised on VoIP, in a phone-centric approach, unified communications has tried to morph into a messaging-centric model. That makes sense, as people clearly talk less and message more.

Yet you see little actual unification, despite the dozens of clients for PCs, Macs, iPads, and smartphones available. My theory: Maybe unification is too hard for the brain. As we’ve moved to larger and larger screens, it’s become more valuable to have multiple communications streams in parallel, each in its own window. You can switch attention as needed.

But switching attention necessarily means losing the context of the other streams, and if someone is speaking or showing a video or slideshow, you need to stay focused to get the whole context. These systems don’t work like TiVo, where you can pause the stream, go elsewhere, and resume where you left off or even jump back 30 seconds to recall the preceding context. When you start using an iPad or a smartphone, the smaller screen makes such switching even more difficult — there’s room for only one visual or interaction “channel” at a time. You end up at most with two simultaneous modes: what you hear via the audio and what you see on the screen (video, website, presentation, or app).

It’s true that many of us have become skilled at half-listening to a phone call while checking our email, skimming PowerPoints, or using the Web — I know I have. With the iPad and smartphones, we can now do so in live meetings — and we do. But that’s not collaboration or unified communications. That’s monitoring one stream intermittently while focusing on another (politely and surreptitiously, of course).

Real collaboration means paying attention, and the more streams in play, the harder it is. That’s the fallacy of unified communications in the context of collaboration. It makes more sense as a repository for multiple forms of related information — but email does that decently enough with attached presentations and links. An Internet-connected computer or tablet does that at an even higher level, so the need for a dedicated unifier tool is questionable.

Shared editing: A free-for-all

The final form of collaboration technology is group editing, where everyone can work on the same document live. The metaphor is of whiteboarding in a brainstorming session, where people sketch out their thoughts on a surface everyone can see, and argue verbally and on the whiteboard over their points of view.

That’s great for brainstorming, but not much else. If you’ve used a tool like Google Docs for group editing, you know how implausible it is to work that way. Not only do you have competing changes, but you lose all sense of organization as everyone goes off in separate, uncoordinated, and essentially stovepiped directions. That’s not how you get things done. Yet Microsoft, Google, and others have been trying to sell that vision for years.

In the traditional model, you assign someone to create the draft. That person gathers input before writing, circulates the draft for comments and thoughts, assesses all that feedback, and delivers a final version. The same person needs to formulate a view of the whole document, weigh the feedback and other context, and drive the final document to a cohesive whole that meets the original goal. That’s something a single human brain does. There’s collaboration, but it’s managed and filtered. That’s why revisions tracking and commenting tools are so widely used, but not ones connected to live shared edits.

I suspect the push for group editing technologies comes from a chip-on-shoulder desire to be heard (in that alienated manner common to many techies), so it’s a way of imposing naive equality on a process that requires an expert. People have different skills and levels of ability, and pretending otherwise is bad for business. There are very few human group activities where it’s a free-for-all, which is what the shared-editing approach really is.

Collaboration is better than ever

Where does this leave collaboration? In a better place. Because it’s so much easier to communicate with individuals and groups through so many means — blogs, email, video and photo sharing sites, instant messaging, social networks, phone calls, and so on — there’s much more opportunity to collaborate. But most of these are asynchronous forms of collaboration that let each party think (if desired) about their contributions, and that allow the people ultimately responsible for the results to take on and deliver on that responsibility.

The problem with the much-hyped collaboration technologies is that they destroy the benefit of asynchronicity or overburden the participants with so much mental and technology overhead that more is lost than gained. Yes, there are justifiable uses for videoconferencing, unified communications, and shared editing — but they are the exceptions, not the rule, as you can see in the workplace.

This article, “The fallacy of collaboration technology,” was originally published at InfoWorld.com. Read more of Galen Gruman’s Smart User blog at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.