Jessica Groopman
Contributor

Shifting from machine logic to intelligent interfaces

Opinion
Feb 28, 2018 | 4 mins

While machine learning introduces new approaches to software, AI’s more transformative impact will be in the way we interface with each other, businesses, and the world around us

[Image: futuristic user interface, heads-up display. Credit: Thinkstock]

The most profound advancements in technology are often less about the technology itself and more about new interfaces replacing old ones. The car offered an entirely different interface to mobility than a horse or a bike; the telephone did the same for communications; databases replaced filing cabinets; the internet and, later, mobile devices have drastically altered (if not replaced) print media.

Artificial intelligence is ushering in a broader shift than merely faster logic and data processing; it is enabling altogether new user interfaces and experiences. While AI takes myriad forms, virtually every application can be rolled into one of three buckets: machine capabilities involving vision, language, or analysis.

A growing range of technologies powers these capabilities, including machine learning, deep learning, natural language processing, and computer vision. But it is often combinations of the above that forge altogether new ways for information systems to mimic biological systems and human-like abilities.

Consider the following interfaces that are powered by various AI technologies:

  • Voice recognition
  • Facial recognition
  • Emotion recognition
  • Hand recognition
  • Gesture recognition
  • Iris or retina recognition
  • Gaze tracking
  • Gait (walking) recognition
  • Social robots
  • In-store robots or avatars
  • Autonomous vehicles
  • Drones
  • Hearables (with conversation-based agents)
  • Virtual agents
  • Augmented reality and mixed reality
  • Virtual reality
  • Computer vision and three-dimensional (3D) modeling
  • Virtual assistants
  • Tactile, texture, impact, and grip recognition
  • Language translation services


AI is driving a shift in what can be digitized

What constitutes a data-emitting event in the physical world is expanding. Sensors and networking technology expanded digital from personal computers to mobile to objects and infrastructure; AI is digitizing the modalities we use to interact.

The interfaces listed above all represent alternatives to current modes of interaction. They also represent diverse efforts aimed at scaling how we digitally interface with the physical world.

For example, biometric authentication (using facial, iris, or other anatomically based recognition) is widely considered an improvement over passwords and PINs, given the relative difficulty of replicating a biometric trait. Indeed, millions of smartphones are already outfitted with fingerprint recognition and, increasingly, facial recognition software. Consider how biometric authentication will impact real-world experiences such as payment, accessing assets like a home or car, going through airport security, health care and records access, and marketing and emotion recognition.

Voice and natural language understanding (NLU) represent another sea change in our expectations of experience. Beyond the convenience of Siri or Alexa, voice is quickly becoming a ubiquitous and seamless command, control, and information access modality in the enterprise, in industrial environments, in cars, for the disabled, and beyond.

Consider how augmented information overlay could impact social interactions; how it will expedite decision-making; how in enterprise environments, it is already being implemented to accelerate repair and maintenance. Powered by augmented and mixed reality, image and object recognition, and potentially simultaneous localization and mapping, the ability to augment our vision with real-time context will usher in a new type of reliance on technology. 

From intelligent interfaces to invisible interfaces

AI and its convergence with IoT and infrastructure technologies can also render the technological interface altogether invisible. Take Amazon Go, the frictionless grocery store that eliminates the checkout experience. While the store relies on a complex configuration of sensor data fusion, shelf weight sensors, cameras, computer vision, deep learning, and mobile and POS integration, the experience for shoppers feels largely tech-free. Shoppers walk in, pick their desired items from the shelf, and simply walk out.

In contrast to explicit user interface modalities such as gesture or voice recognition, AI also underlies numerous use cases in which overt interactions in one environment—in-home, in-store, with a robot, while driving—inform interactions elsewhere, such as automation, security, or advertising.

Diverse impacts and implications underscore each of these emerging interfaces. Never mind what some predict will eventually become embedded within us to augment our knowledge, recall memories, or cognitively control our environment. We may still be far from a commercially viable brain-machine interface, but we can be sure of one thing: AI represents more than machine logic; it represents the future of how we interface with each other, businesses, and the world around us.


Jessica Groopman is Industry Analyst & Founding Partner at Kaleido Insights, a research and advisory firm analyzing the impacts of technology disruption on humans, businesses, and ecosystems. Jessica leads the automation practice, conducts research on product, service, and process automation using the Internet of Things, artificial intelligence, and blockchain, and specializes in user experience and data integrity. Based in the San Francisco Bay Area, Jessica works with innovative companies in retail, smart home, health, technology, agriculture and media to develop research, content and digital strategies.

Groopman is a regular speaker, moderator, and panelist at IoT industry events. She is also a frequent contributor to numerous third-party blogs and news/media outlets. Jessica has been a principal analyst with Tractica, where she contributed to its automation and robotics practice. She has also served as a contributing member of the International IoT Council, the IEEE's Internet of Things Group, the IoT Guru Network, and FC Business Intelligence's IoT Nexus Advisory Board. Jessica was also included in Onalytica's list of the 100 Most Influential Thought Leaders in IoT.

Jessica has also served as research director and principal analyst with Harbor Research where she headed research and content strategy and assisted in leading Harbor’s Smart Systems Lab program. Prior, Jessica was an industry analyst with Altimeter Group, where she covered Internet of Things and contributed to research around other disruptive technological trends, such as real-time marketing, social media, and mobile commerce.

The opinions expressed in this blog are those of Jessica Groopman and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.
