You may not have heard the term “Zero UI,” but chances are, you've heard of some of its high-profile products. Maybe you even have one in your home. Do any of these ring a bell?
- Amazon Echo/Alexa
- Google Home
- Nest Thermostat
Zero UI (Zero User Interface) is the movement of interfaces away from screens and toward artificial intelligence. The term was first presented by Andy Goodman of Fjordnet, though the underlying technology has actually been around for a while. With Zero UI products, there's no touchscreen; devices respond to the user's environment instead.
Let’s backtrack a bit. Traditionally, we’ve communicated with technology through a graphical user interface (GUI). GUIs are native to screens, and we interact with them through any combination of clicks, scrolls, taps, and swipes. What we see on a screen determines how we proceed with our usage.
With Zero UI, rather than relying on visuals and screens, information comes from natural interactions: movement, ambient sensing, gestures, and voice recognition. This reduces the time one spends on a computer or phone. It also feeds machine learning, as the goal of many of these products is to become smarter and more integrated with user habits over repeated use.
Many screenless devices are also smart devices that use algorithms to record user data, then function in response. For example, the Nest Thermostat collects data on a person's temperature preferences, schedule, and even details like how drafty the home is. So, if you typically leave for work at the same time every day, it will automatically lower the temperature while you're out, in the hope of lowering your utility bill.
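As a toy illustration only (not Nest's actual algorithm), a schedule-learning thermostat could be sketched like this: observe when the user usually leaves, then switch to an energy-saving setback after that time. All the class names, temperatures, and logic here are hypothetical.

```python
class LearnedThermostat:
    """Toy sketch of a schedule-learning thermostat (not Nest's real algorithm)."""

    def __init__(self, comfort_temp=21.0, away_temp=17.0):
        self.comfort_temp = comfort_temp  # preferred temperature when home (°C)
        self.away_temp = away_temp        # energy-saving setback when away (°C)
        self.departures = []              # observed departure times (hour of day)

    def record_departure(self, hour):
        """Log the hour (0-23, fractional allowed) the user left for work."""
        self.departures.append(hour)

    def target_temp(self, hour):
        """Return the setpoint for the given hour of the day."""
        if not self.departures:
            return self.comfort_temp  # no habits learned yet
        usual_departure = sum(self.departures) / len(self.departures)
        # Once the usual departure time has passed, set back to save energy.
        if hour >= usual_departure:
            return self.away_temp
        return self.comfort_temp


thermostat = LearnedThermostat()
for h in (8.0, 8.25, 7.75):              # user leaves around 8am most days
    thermostat.record_departure(h)
print(thermostat.target_temp(7.0))       # before the usual departure: 21.0
print(thermostat.target_temp(9.0))       # after the usual departure: 17.0
```

The point for designers is that no screen is involved: the "interface" is the pattern of the user's own behavior, which the device quietly turns into decisions.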
The integration of environmental, gesture-based, and speech recognition software in some of these Zero UI products has also contributed to comprehensive insights about accessibility. For example, it's common for visually impaired and blind individuals to use screen readers when using a computer. Screen reading software announces written content through audio. Amazon Echo, a smart speaker, removes the screen from the equation. The user can ask Echo's Alexa assistant to read the news and handle other daily tasks, such as ordering groceries or requesting an Uber, in a friendlier and less convoluted manner.
For designers, producing invisible interfaces means that we will have to continue broadening our understanding of human behavior. Design has always incorporated information about perception and psychology; for example, what motivates someone to click a link, scroll down a page, or close a browser window? As we consider the move toward devices that decenter two-dimensional experiences, where information is transposed beyond clicks and taps, we will have to consider a more complex view of human behavior. Data analytics, physics, and audio design are a few fields that will increasingly inform us in orchestrating user scenarios.
Neither the term nor the trend is literal. Zero UI is not an abandonment of visual design, but rather an integration of our other sensory experiences to provide feedback, inform, and assist us. Overall, it seeks to save valuable time and make our lives easier.