# User Friendly - Cliff Kuang, Robert Fabricant

Synced: [[2023_11_30]] 6:03 AM
Last Highlighted: [[2023_08_29]]

![rw-book-cover](https://readwise-assets.s3.amazonaws.com/static/images/default-book-icon-2.dae1dc4d332b.png)

## Highlights

[[2023_07_26]] (Page 15)

> Crowder proposed that a computer program be gauged not just on how well it solved a problem but on how easy it made the lives of the people trying to solve it.

[[2023_07_28]] (Page 45)

> That mental model can be deep or shallow—it might vary from just a sense that this button does that, to a picture in your head about how your hybrid car charges its battery. But those mental models are knowingly crafted by the designers who put interfaces in front of you.

[[2023_08_24]] (Page 141)

> We take for granted how the internet arrived for us in the West. We take for granted all the metaphors involved. But once seen, these rise up like the first skyscrapers in a skyline. The “World Wide Web” evoked the image of a literal spiderweb, spanning the globe. What connects the web? Hyperlinks, like links in a chain piecing together all the places you want to go. If you can’t find the right link, then you have a search engine, a machine that gleans information as it crawls the web. These metaphors fall short of an instruction manual, but they nonetheless foster some basic sense of the internet’s logic: how to navigate it, using a browser—two more metaphors, borrowed from sailing and libraries, which bring forth ideas about coordinates and filing systems. These metaphors helped explain not only what the internet was, but what it could become.

[[2023_08_24]] (Page 141)

> In the early 1990s, in the West, the news was filled with explainers about the “information superhighway.” We don’t much remember them or the metaphors that they contained. That’s simply because it was explained to us all so slowly, over time. We learned what the web was by using it. Eventually, we didn’t need the metaphors at all.
> (As the design theorist Klaus Krippendorff writes, “Metaphors die in repeated use but leave behind the reality that they had languaged into being.”)5

[[2023_08_24]] (Page 143)

> you have the metaphor “time is money,” then you’re not just comparing time and money. You’re assuming rules about how time should behave: If time is like money, then, just like money, it can be saved or invested wisely; it can be wasted or stolen or borrowed.9 The right metaphor is like an instruction manual but better, because it teaches you how something should work without you ever having to be told.

[[2023_08_24]] (Page 144)

> That power is what allows metaphors to transfer ideas from a specialized domain—say, the inner workings of a bunch of networked computers, known only to their engineers—to a new cohort. Metaphors strip away what’s specialized and complex, focusing our attention on just the few things we need to make sense of something, the ideas we share. Saying the internet is like a web of information, connected by links, tells you what the web is for: joining up spheres of knowledge. It implies what you might do, even what you might invent. Metaphors become so embedded in our experience that they seem second nature: time is money; life is a journey; the body is a machine. But often, the metaphors we live with have been designed.

[[2023_08_24]] (Page 157)

> That’s how metaphors work: Once their underlying logic becomes manifest, we forget that they were ever there. No one remembers that before the steering wheel in a car, there were tillers, and that tillers made for a natural comparison when no one drove cars and far more people had piloted a boat.24 The metaphor disappeared once driving cars became common. In digesting new technologies, we climb a ladder of metaphors, and each rung helps us step up to the next. Our prior assumptions lend us confidence about how a new technology works.
[[2023_08_24]] (Page 168)

> Similarly, the Dyson vacuum, with its exposed piping and carefully outlined motor casings, was meant to tell a story about the company’s zeal for engineering. The transparent dust canister, a first in the history of vacuum cleaners, was likewise meant to show you what all that machinery had done. Seeing the dust you’d just gathered created a feedback loop that hadn’t existed before. If you own a Dyson, then you know the satisfaction of being surprised by the sheer volume of all that dust you’ve collected, and how it just makes you want to vacuum more. None of that would exist but for the beautiful rigor of the self-consciously high-tech design.

[[2023_08_24]] (Page 169)

> Metaphor is no less important in how we make things beautiful. In the user-friendly world, beauty is a tool that transforms something that’s easy to use into something we want to use.

[[2023_08_24]] (Page 169)

> “Beauty” is the word we use when a designer’s vision overlaps with our own.

[[2023_08_26]] (Page 193)

> In addition to creating a culture in which the entire staff became students of human behavior, there were two more ingredients in IDEO’s way of working: putting prototypes, no matter how primitive, in front of users as quickly as possible, and the idea that the design process didn’t lie with any one “designer.” Both tenets sprang from the environment that had nourished the young company. Helped by the self-organizing hacker ethos that had spawned Silicon Valley, both Moggridge and Kelley assumed that their office would be radically egalitarian and nonhierarchical.

[[2023_08_26]] (Page 194)

> Today, Fulton Suri’s insistence on rooting innovation in the nuance of individual experience has become the maxim that if you design for everyone, you design for no one.

[[2023_08_26]] (Page 210) [[favorite]]

> It’s common to hear technologists articulate that same dream of making technology so useful that it’s invisible. But how will it become so?
> Simply by weaving itself into the social fabric that preceded it; by becoming more humane. The teleology of technology’s march is that it should mirror us better—that it should travel an arc of increasing humaneness.

[[2023_08_26]] (Page 218)

> The famous Aeron chair, which has become synonymous with infinitely adjustable office comfort, didn’t begin life as an investigation into the sitting habits of worker bees. It started as a research project to create a breathable mesh sitting structure that wouldn’t cause the elderly to develop bedsores.

[[2023_08_26]] (Page 221)

> Today, we are drowning in interactions with smartphones and smart devices, such as our cars and homes—all of which suddenly want to talk to our phones as well. We live in a world of countless transitions. Instead of there being one device, there is actually an infinite number of handoffs between devices. There needs to be a new kind of design process to manage those seams. “The assumptions about computing are that our devices are one-on-one with visual interactions. The design discipline is built around those assumptions,” Holmes pointed out. “They assume that we’re one person all the time.”

[[2023_08_26]] (Page 241)

> Disney wasn’t experiencing something unique. Rather, it was experiencing something that has become common in this user-friendly era, when entire organizations have to work together to create one simple thing that every one of their customers will touch. How do you get one thousand people to agree on a single detail in an app, or one tiny piece of the MagicBand system, if they don’t share a vision? The modern corporation wasn’t designed to serve up a coherent experience. It was designed for the division of labor, to expend its energies on the efficiency of the parts rather than the shape of the whole.
> Those seams are obvious once you start to look at them: how Amazon’s website has started to seem not like Amazon but like a photo negative of Amazon’s organizational structure, with entire rabbit holes of navigation dedicated to video, groceries, audiobooks, music, even a weird section of the website telling you all the things you can do on Alexa—which is its own weird universe that mysteriously connects to all that other stuff.

[[2023_08_26]] (Page 262)

> Any man whose errors take ten years to correct is quite a man.
> —Robert Oppenheimer

[[2023_08_26]] (Page 269)

> What was a slot machine—or any other game of chance—if not a Skinner box? You pulled a lever, and you never knew what you would get. It was the prospect of winning big that reeled you in. It was the fact that you almost never did that kept you pawing at the lever.

[[2023_08_26]] (Page 271)

> “Once you know how to push people’s buttons, you can play them like a piano,” wrote the designer Tristan Harris. “Tech companies often claim that ‘we’re just making it easier for users to see the video they want to watch’ when they are actually serving their business interests. And you can’t blame them, because increasing ‘time spent’ is the currency they compete for.”

[[2023_08_26]] (Page 273)

> This naive enthusiasm was incubated at Stanford by B. J. Fogg, who had been a star student of Clifford Nass, the Stanford professor whom we met in chapter 4 who studied the politeness that people applied to computers. Following the work of his mentor, Fogg analyzed the ways computers shape our behavior. Yet he was about to see an experimental outlet Nass could never have dreamed of, in the form of Facebook.
>
> By the end of 2006, just two years after launch, Facebook had amassed 12 million active users and showed no sign of slowing. Seeking to spur even more growth, Facebook by then had opened up its platform so that outside developers could build games upon it.
> Fogg was keen enough to recognize Facebook as a virgin mine of psychological data—and not just a mine, but a place to put psychological theories to work. So in September 2007, for an undergraduate computer-science course titled Apps for Facebook,21 Fogg asked his students to build their own Facebook games, and to target their users with a variety of psychological principles. These included a form of online dodgeball, which asked players to goad their friends into joining, and a virtual hug exchange, which capitalized on the human need to return kindness. Together, the seventy-five students managed to garner $1 million in revenue and 16 million users within ten weeks. The final class presentation was attended by five hundred people, including hungry investors.22 Watching that explosive growth, Fogg wondered: What made some of those games so irresistibly sticky? He codified the principles in just three elements: motivation, trigger, and ability. Create a motivation, no matter how silly or trivial. Provide a trigger that lets a user sate that motivation. Then make it easy to act upon it.

[[2023_08_26]] (Page 274)

> (Indeed, one of Fogg’s disciples, Nir Eyal, rocketed to guru status in Silicon Valley by popularizing Fogg’s insights in a book titled Hooked.) Boiled down, Fogg’s model is simply that we form new habits when triggers in our environment allow us to act upon our motivations—pleasure and pain, hope and fear, belonging and rejection. Goading a user into action is merely about having triggers arrive at the perfect time, and letting us act upon them with maximal ease. And what’s the best way to reward those actions? Uncertain rewards that tickle our dopamine centers, of course. To be sure, even as he was articulating and developing these theories, Fogg tried valiantly to ward off their potential misuse.

[[2023_08_29]] (Page 282)

> With just a few dozen likes, Kosinski’s model could guess with 95 percent accuracy a person’s race.
> Sexual orientation and political party were almost as close, at 88 percent and 85 percent. Marital status, religiosity, cigarette smoking, drug use, and even having separated parents were also within the model’s predictive reach. Then things got eerie. Seventy likes were enough to predict a person’s responses on a personality quiz even better than their friends could. Just 150 likes would be enough to outdo the person’s parents. At 300 or more likes, you could predict nuances of preference and personality unknown even to a person’s partner.

[[2023_08_29]] (Page 283)

> Kosinski had shown that if you knew a person’s Facebook likes, you knew their personality. And if you knew their personality, then you could readily tailor messages to them—based on what made them angry or scared or motivated or lonely.

[[2023_08_29]] (Page 287)

> The most optimistic thinkers in Silicon Valley believe that the answer is for all of us to be able to code. That’s why today there are so many beautifully designed products aimed at teaching kids the basics. But why should coding remain a barrier to remaking our digital world? Why isn’t it easier for all of us to peer under the hood of an algorithm, much as in a previous era we might have tinkered with our cars?

[[2023_08_29]] (Page 292)

> For example, it’s astounding how little Facebook makes per user—somewhere between two and four dollars per month. How far-fetched is it that we might finally account for its costs and opt into something else with our money? Americans will happily pay 50 percent more for organic goods. How much more would we pay for products that give us peace of mind, let alone the ability to be better to ourselves?

[[2023_08_29]] (Page 310)

> A growing body of research shows that it’s fear of missing out—FOMO—that drives the unhappiness that seems to spring from social networking.8 But interestingly, that unhappiness seems also limited to the generations that didn’t have social networking from their very earliest years.
> Somehow, kids who grew up with social networking found a way to inoculate themselves from the danger of overconnection. Researchers detected in them a self-knowledge about how much was too much. They knew how to stay away when they needed to. I don’t think it’s a fool’s hope that one of those kids will go on to make something that embodies that reflexive self-control. After all, there probably isn’t any way to design the FOMO out of Facebook. Facebook is FOMO. A better Facebook means something that is nothing like Facebook, but which can fulfill the same need for connection. For now, we can only imagine what a product meant to make our world both smaller and more manageable might look like.

[[2023_08_29]] (Page 313)

> The next phase in user experience will be to change our founding metaphors so that we can express our higher needs, not just our immediate preferences. This will require users to resolve tensions that may seem impossible to resolve: how to connect people to more things while making their world easier to understand; to offer fewer, better choices in a world constantly filling up with more of them. It starts with remaking the assumptions that hide in plain sight.