Volume 24 Number 10
Usability in Practice - Getting Inside Your Users’ Heads
Dr. Charles B. Kreitzberg
Have you ever gone to a movie and thought it was great, but your companion thought it was awful? Or have you listened to a politician you thought was spouting nonsense even while others cheered? People often see the same thing but interpret it differently. It can be hard to get into someone’s head and understand where that person is coming from. Sometimes you look at others and wonder what in the world they are thinking. But if you want to design software that users find intuitive, you need to answer that question, at least to some extent. In this month’s column, we’ll suggest a way of thinking about what’s going on inside your users’ heads as they interact with a software product.
Now, we all know that inside a user's head is really about a pound and a half of wet, messy neurons and assorted chemical messengers. But for purposes of design, knowing that is about as useful as knowing that your laptop is made up of three pounds of silicon, copper, and polyvinyl chloride. With both computers and the human brain, the most effective insights take place when we speak in metaphors. Just as it's more productive to think about programming constructs like objects and methods than about bits and buses, it's more productive to talk about cognitive constructs like concepts and scripts than about synapses and neurotransmitters.
So What’s a Useful Way to Think About What’s in a User’s Head?
Computers and the brain have a lot of interesting parallels. Like computers, brains involve both hardware and software. They deal with input, buffering, pattern recognition, and search and retrieval. Human memory seems to be a mash-up of knowledge bases and multimedia storage.
For usability and user experience, we need to decide what aspects of cognition are most relevant. We think of cognitive processes as organized into three interacting subsystems: the executive, the perceptual system, and the user’s mental model.
At the top is the executive function, which is the control center. You can think of the executive function as the place where the user sets goals, makes decisions, and solves problems. The executive monitors incoming data (things you see, hear, or touch) and internal data (thoughts and emotional feelings). One of the key responsibilities of the executive is the allocation of processing resources to the various data elements that are coming from the sensory and perceptual processors. It does this by determining how much attention to pay to each.
The second system is the perceptual system. Its function is to manage the constant stream of information that floods our senses and determine what information needs further processing and what can be safely ignored. Sensory information enters the perceptual system through our I/O devices, such as our eyes and ears. The perceptual system performs preprocessing to organize sensory streams into objects and compares the objects to items in memory to determine how important they are. If they are potentially important, they are passed on to the cognitive system (mental model) for more in-depth processing. For example, if you are driving and suddenly see an object in the road, the perceptual system will focus on it. Once it determines that it’s only an old paper cup, it discards the object without processing it further. But if it determines that the object is potentially important or a possible hazard, it alerts the executive to allocate additional processing to it. In terms of usability, an understanding of how the perceptual system works yields a set of design principles that can be applied to screen design. (For more information, see our previous column at msdn.microsoft.com/magazine/ee413547.aspx.)
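To make the filtering idea concrete, here is a toy sketch of the perceptual system as a preprocessing filter. Everything in it is an invented illustration: the salience scores, the threshold value, and the function names are assumptions made for this example, not part of any real cognitive model.

```python
# A toy sketch of the perceptual system as a filter: recognized objects are
# scored against "memory," and only sufficiently important ones are escalated
# to the executive. All names and numbers here are illustrative assumptions.

SALIENCE_THRESHOLD = 0.5  # hypothetical cutoff for "worth the executive's attention"

# A tiny "memory" mapping recognized objects to how important they usually are.
memory = {
    "paper cup": 0.1,   # harmless litter: recognize, then discard
    "deer": 0.9,        # possible hazard: escalate to the executive
    "banner ad": 0.2,   # recognized, then ignored
}

def perceive(sensory_stream):
    """Preprocess raw input, compare each object against memory, and pass
    along only the objects important enough for deeper processing."""
    escalated = []
    for obj in sensory_stream:
        # Unknown objects get high salience: better safe than sorry.
        salience = memory.get(obj, 0.8)
        if salience >= SALIENCE_THRESHOLD:
            escalated.append(obj)  # alert the executive to allocate attention
    return escalated

print(perceive(["paper cup", "deer", "banner ad"]))  # ['deer']
```

The point of the sketch is the shape of the process, not the numbers: cheap recognition happens for everything, but expensive processing is reserved for the few objects that matter.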
The third system is the user’s mental model, the way the user represents knowledge—think of it as a database of knowledge and experience. Understanding the user’s mental model is key to creating an intuitive design. Joel Spolsky suggests that the “cardinal rule of all user interface design” is that a user interface is well-designed when the program behaves exactly how the user thought it would (joelonsoftware.com/uibook/chapters/fog0000000057.html).
Why Is It Important to Understand How Cognitive Processing Works?
If you understand the user’s mental model, you can figure out how to design the UI and task flows so that they match the model. And in those cases where you need to change the user model (through training or other user support), you can design the training or help so that it is as effective and efficient as possible.
Figure 1 shows how this works. The executive controls speed and direction. It determines where to focus attention and how much mental effort to allocate to it. It employs the perceptual processor to scan input from the environment and do just enough preprocessing to determine what is important and what is not. The executive also calls on the mental model processor for deeper cognitive processing tasks.
Figure 1 How Cognitive Processing Works
What About the Knob?
The knob at the bottom of Figure 1, labeled “Processing Level,” points out one difference between silicon and carbon computers. An electronic computer always delivers consistent processing. The carbon computer throttles up and down depending on the state of the user. If the user is relaxed or fatigued, the processor throttles down, much like some power management systems do for the same reason—to conserve energy. When the user is actively solving problems, it throttles up. Also, stimulation in the environment causes the processor to throttle up. That’s why you can’t sleep at a rock concert.
So How Can We Use This?
The nice part of having a model to work with is that it suggests ways to solve design problems. Understanding how people think about problems and interact with the environment is the foundation of creating a user interface that works just as the user expects. In that way, Spolsky is spot on.
Some design principles, particularly those that come from our understanding of perception, apply in almost every situation. A large, bright object on the screen commands a user’s attention. It’s even more compelling if the object moves, at least for a few moments, until the executive decides it’s just a banner ad and not worth our attention. But many significant design decisions go deeper than perception and involve understanding the user’s mental model. Compared to perception, which is more or less the same for everyone, mental models are diverse and personal. The reason is that mental models are built from experience, and each of us has had different experiences. But even though everyone’s mental model is unique, we have enough shared experiences to work together. Usually.
Remember the book Men Are from Mars, Women Are from Venus? The author (John Gray) created a best-seller by pointing out the differences in mental models of men and women. Paul Glen published an insightful book called Leading Geeks that compared the mental models of developers (us) and business people (them). As you read through books like these, you come to understand that everyone’s reality is different, and this can lead to miscommunication and frustration.
If you understand the user’s mental model, you can not only design to it but also provide support to bridge the gaps. With an understanding of how people learn, you can design user experiences that help users acquire a deeper and more powerful mental model of how the software is organized and what they can do with it. When you do this, you can create power users.
Understanding something about how human cognitive processing works can also help you become aware of ways that your own mental models differ from those of the user. Looking at software from the perspective of technical understanding is quite different from looking at the same product without it. Developers often overestimate a user’s technical knowledge and vocabulary, and this leads to design decisions that users find confusing. Understanding the gaps can help you avoid making these kinds of errors in design.
What Are Mental Models?
Mental models are the internal representations of the way that people represent reality. They include both cognitive and emotional components and determine how we interpret and respond to the things we encounter. Because two people have different mental models, they can look at the same situation and interpret it very differently.
This clash of mental models is the source of one of the best-remembered lines in Woody Allen’s 1980 movie Stardust Memories (urbanwildlifesociety.org/pigeons/WoodyAllnRatWWings.html). In the scene, Sandy Bates (played by Woody Allen) and Dorrie (played by Charlotte Rampling) are talking when a pigeon lands on the window sill. They have very different reactions. Sandy sees the pigeon as dirty and dangerous and refers to pigeons as “rats with wings,” but Dorrie sees it as cute.
How Is a Mental Model Structured?
Mental models are very complex and subtle. After all, they embody the process of human thought and insight. But for purposes of design discussions, we can boil them down to three basic elements: concepts, patterns, and scripts. Of course, these don’t correspond to physical structures in the brain— they are metaphors, but useful ones.
Concepts are the building blocks of mental models. Concepts represent objects in the real world as well as more abstract ideas. For example, if you show someone a flash thumb drive, he or she can recognize it and make inferences about it—for example, it fits in a USB slot, it holds data, and so on.
How deeply someone understands the concept of a flash drive depends on how rich that person’s concept is. A nontechnical user might think of the drive simply as an easy way to store and carry photographs, while a more technical user would understand the subtleties of transfer rate and NAND flash memory.
Concepts are linked together so that you can get from one concept to a related concept. It’s like a personal semantic web. For example, the “flash drive” concept might be linked to other concepts, like “USB interface,” “USB 2.0,” and “file system.” You can follow the links from concept to concept and gain some sense of how they are organized. In fact, that’s how psychoanalysis works. Just lie down on the couch and free-associate, which means traversing conceptual links until you come across an odd one (“How come you think of a cigar every time you mention your sister?”), and that’s a clue that there is something to explore. But, back to UI design.
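The "personal semantic web" can be sketched as a small graph. This is purely an illustration of the linking idea: the graph contents, the `activate` function, and the one-hop activation rule are assumptions made for the example.

```python
# A minimal sketch of concepts as a linked semantic web. Activating one
# concept also activates its neighbors (here, out to max_hops links away).
# The graph and the hop limit are illustrative assumptions.

from collections import deque

concept_links = {
    "flash drive": ["USB interface", "file system"],
    "USB interface": ["USB 2.0"],
    "USB 2.0": [],
    "file system": [],
}

def activate(start, max_hops=1):
    """Breadth-first walk: the starting concept plus everything
    reachable within max_hops association links becomes active."""
    active = {start}
    frontier = deque([(start, 0)])
    while frontier:
        concept, hops = frontier.popleft()
        if hops == max_hops:
            continue  # stop spreading past the hop limit
        for neighbor in concept_links.get(concept, []):
            if neighbor not in active:
                active.add(neighbor)
                frontier.append((neighbor, hops + 1))
    return active

print(sorted(activate("flash drive")))  # ['USB interface', 'file system', 'flash drive']
```

Note that activating “flash drive” pulls in its direct associations but not “USB 2.0,” which is two links away. That mirrors the intuition that closely linked concepts come to mind together while distant ones need an extra nudge.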
Flash drives are, of course, concrete objects, but concepts can also be abstract. The concept of a programming object, for example, cannot be defined by showing examples from the real world. Abstract concepts are much more difficult to learn and discuss than concrete ones, and that is often a problem when you are designing the user experience. Since so much of programming is abstract and metaphorical, it’s very hard to communicate these concepts to people less technically inclined. If you doubt it, just try to explain refactoring code to a nontechnical friend.
Does the User Have the Concept?
Ultimately, concepts are what give things their meaning. If we say to you “Send me the bitmap as a compressed file,” you need to have concepts for bitmap, compression, and send before you can understand what we want you to do. So determine if your users have the required concepts in the first place. If not, you need to design around this or teach the missing concepts. Teaching is the harder of the choices, but it’s the one that software developers have usually embraced (“Oh, just give the user some training”). From the developer’s point of view, this is, of course, the easiest solution. But the reality is that educating the user is a tough job—just ask any teacher. To the extent that you can design the software to fit the user’s mental model, you will get a better result.
Is the Concept Active?
Recently, at a local hardware store, Charlie encountered a salesperson who looked familiar but whom he couldn’t place. Later he realized that it was Bob, a clerk at a local deli where Charlie buys his morning coffee. Bob was earning extra money working in the hardware store, but because Charlie saw him in a different context, he failed to make the connection.
This is a common occurrence (for Charlie, at least) that illustrates a peculiarity of concepts—the user has to recognize the relevance of a concept before it can do its job. What this means to the designer is that you need to activate a concept to make certain that the user can make use of it. How do you do this? There are three ways:
- Use the name of the concept. Saying “Edit the XML file” rather than “Edit the parms” ensures that the user connects the action with his or her understanding of XML.
- Use a visual cue or picture.
- Activate a nearby concept. When you activate a specific concept, you also activate the concepts that are associated with it.
Activating a concept through context can also minimize ambiguity. If someone tells you “Thrashing is bad,” you would have a different mental picture depending on whether the previous sentence was “I want to talk about paging” or “I want to talk about corporal punishment.”
You know the feeling you get (“Oh heck, here we go again”) just before you get into a hassle with someone for the fifteenth time. That feeling is the recognition of a pattern. Patterns are tough to define because they don’t refer to a single thing but to a class of similar things. It’s the similarity we perceive that makes something a pattern. In software development, patterns have been formalized in catalogs like the Gang of Four’s and Fowler’s software design patterns, and interaction design patterns are becoming quite popular among user experience designers.
A lot of different kinds of patterns exist. Here are some examples of different types of patterns:
- Static representations
- Dynamic behaviors
- Interpersonal and social behaviors
- Multiple perspectives
- Changes over time
Patterns are frameworks for understanding and problem-solving. Just as patterns can help you solve design problems, they can also be used to organize information in a way that is meaningful to users, helping them discover and use the patterns in your own design. If you can help users recognize familiar patterns, they will see how the current situation is “like” a pattern they already understand and be better able to act on it.
In the context of mental models, scripts are cognitive structures that include time and process. Scripts are also referred to as “procedural memory.” Scripts represent flow, involving steps and conditions, much like imperative code or a workflow technology like Windows Workflow Foundation. For example, going to a restaurant might have these steps and conditions:
- Enter and be shown to a table (if there is a wait, leave name)
- Read the menu
- Select the food
- Be served
Scripts generally do not include all the details in a situation, so those details are addressed in real time. For example, a script for a job interview might include greetings and introductions, answering questions, and wrapping up the discussion. But how each of these steps actually plays out cannot be determined until the interview starts.
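The restaurant script above can be sketched as imperative flow, much as the column suggests. The step names, the function, and the single wait condition are invented placeholders; a real script would carry far more conditions than this.

```python
# A rough sketch of the restaurant script as a flow with steps and a
# condition. The step names and the wait condition are illustrative
# placeholders, not a formal model.

def restaurant_script(there_is_a_wait):
    """Scripts encode order and conditions; the details of each step
    (which table, which food) get filled in at run time."""
    steps = []
    if there_is_a_wait:
        steps.append("leave name")     # conditional step
    steps += [
        "be shown to a table",
        "read the menu",
        "select the food",
        "be served",
    ]
    return steps

print(restaurant_script(there_is_a_wait=True))
```

The analogy to imperative code or a workflow engine is the point: a script fixes the sequence and branch points in advance while leaving the specifics open until execution.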
How Can We Come to Understand What Our Users’ Mental Models Are?
It’s one thing to know about mental models and how they are structured, but you need to be able to put that knowledge into practice. The first step is to understand the models of perception and cognition so that you can apply the principles that derive from them.
But in most cases you need to probe more deeply and understand the mental models of your intended audience. And that raises the question of how you can get a reasonable idea of what actually is in your users’ heads. That is the purpose of user research.
As a starting point, it’s important to remember that everyone’s mental model is different. Remember how the characters in Woody Allen’s film had different concepts for a pigeon? But people also have shared elements in their mental models. For this reason, a key element of user research is segmenting the audience. Audience segments are typically represented as personas.
Your user research plan can be crafted according to your needs, but some common approaches are surveys, interviews, focus groups, and what is broadly known as “contextual inquiry.” During design, you can use collaborative design techniques like mood maps and card sorting to gain more insight. Then, once you have something designed, you can use usability testing to validate your design against real users’ mental models and see whether you got it right (or at least got close).
Surveys can be useful for questions that lend themselves to factual answers and quantitative responses. Surveys are useful for understanding the demographics of your audience, and they are also effective in helping you decide about appropriate technologies, form factors, preferred terminology, and similar objective elements. While you can also use surveys to obtain more qualitative answers, you have to be more cautious in interpreting qualitative results and, when possible, validate them against other data.
If you’re working on a new version of an existing application or site, check to see what tracking usage metrics and data (such as search terms) are available. These can also help you gain insight into the minds of your users.
Interviews and focus groups are really great ways to get a better understanding of people. Phone interviews are good, but in-person interviews enable you to observe nonverbal communication. If you use interviews in any form you’ll be amazed at how much your perception and understanding of your users can change when you just get out there and talk to them.
“Contextual inquiry” is a broad term used to encapsulate different forms of going to the places (the “contexts”) where people are using the products you make. It can be something as simple as observing call center reps for a few hours or something as complex as actually living with people. You probably won’t want or need to go quite that far, but if you are interested, explore some of the techniques that anthropologists use, and especially look at the techniques called “design anthropology.” The goal of contextual inquiry is to understand how the user functions in context and what it is like to be in their shoes.
Other low-cost (and usually less effective) ways of doing user research are to go to conferences and seminars, read relevant publications, participate in online communities, and find as many opportunities as possible to learn about your audience. Generally, anything you can do to get a better understanding of people will help you form a more accurate understanding of their mental models.
Combining some sampling of these techniques with collaborative design and usability testing—along with a generous dose of empathy—can go a long way toward helping you get inside your users’ heads and design better solutions that match their mental models more closely.
How Far Should You Go?
All of this may seem like a lot to learn, especially if this stuff isn’t part of your full-time job. But the benefits in overall project success outweigh the effort: you get your designs right with less fuss and hassle, fewer last-minute changes and tweaks (less rework), and more goodwill from your customers and end users.
How far should you go? Some developers are satisfied to have a basic understanding of mental models. That’s okay if it helps you make sense out of user experience (UX) design, work with UX designers, and understand how to integrate UX design into your development process. Other developers might want to delve more deeply into understanding cognitive processes. A deeper level of understanding can help you craft much better solutions by providing you with a framework that supports more effective design decisions.
Dr. Charles Kreitzberg is CEO of Cognetics Corp. (cognetics.com), which offers usability consulting and user experience design services. His passion is creating intuitive interfaces that engage and delight users while supporting the product’s business goals. Charles lives in Central New Jersey, where he moonlights as a performing musician.
Ambrose Little lives with his wife and four children in central New Jersey. He's been designing and developing software for more than 10 years and is honored to be an INETA speaker and Microsoft MVP. Lately, he's shifted from technical design to designing for people and is now a user experience designer for Infragistics.