
Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Buying groceries, tracking our health, finding a date: whatever we want to do, odds are that we can now do it online. But few of us ask why all these digital products are designed the way they are. It’s time we change that. Many of the services we rely on are full of oversights, biases, and downright ethical nightmares: Chatbots that harass women. Signup forms that fail anyone who’s not straight. Social media sites that send peppy messages about dead relatives. Algorithms that put more black people behind bars.

Sara Wachter-Boettcher takes an unflinching look at the values, processes, and assumptions that lead to these and other problems. Technically Wrong demystifies the tech industry, leaving those of us on the other side of the screen better prepared to make informed choices about the services we use—and demand more from the companies behind them.

240 pages, Hardcover

First published October 10, 2017

276 people are currently reading
8,176 people want to read

About the author

Sara Wachter-Boettcher

3 books · 94 followers
Sara Wachter-Boettcher is a web consultant based in Philadelphia, and the author of the forthcoming Technically Wrong, from W.W. Norton, as well as two books for web professionals: Design for Real Life, with Eric Meyer, and Content Everywhere.

She helps organizations make sense of their digital content, and speaks at conferences worldwide.

Ratings & Reviews


Community Reviews

5 stars: 838 (33%)
4 stars: 1,139 (45%)
3 stars: 432 (17%)
2 stars: 69 (2%)
1 star: 26 (1%)
Displaying 1 - 30 of 347 reviews
Profile Image for John.
458 reviews · 411 followers
January 2, 2021
Well . . . This is another one of those funny books that is sort of a “5” and sort of a “3.” The book broadly claims that the tech industry builds interfaces and products that are (not necessarily intentionally) biased. The book says that the main driver is the homogeneity of tech company investors and employees.

There is no doubt in my mind that this is true, and on that basis, I’d recommend this to anyone in or outside of tech. We product builders and designers are doing a crap job of acknowledging the incredibly broad types of people and styles of interaction out there. Because of tech’s homogeneity, there’s so much stuff that just isn’t thought about critically (e.g., image analysis software not being able to analyze non-white faces). But as I’ll get into in a moment, I would very strongly recommend this to historians of technology as a little guide to problems that deserve significantly more research. The author’s a web consultant, but I think we need to bring out the scholars. There’s good stuff here about geography, the 2010s -- more reports of personal experiences would make the story even more valuable. (I keep thinking back to another book I reviewed: Turco’s The Conversational Firm, which shows how far we can get with ethnographical strategies.)

There are some arguments here that are very dear to my heart. For example, on p. 137 and chapter 3, the author notes how engineers and product designers will focus on the main experience flow, and minimize the importance of “edge cases.” For example, say 80% of the users are young, and only 20% are old (perhaps needing bigger fonts). Well, the company is going to focus on where the money is: So font-changing features may be downplayed. The author rightly stresses harm and consequences: Even though the 20% might not be where the money is, the negative consequences of not helping them out with a useful UI can cause a lot of damage. One area I have been concerned about is privacy and security in healthcare. Say a login code is sent to an email: But that email might go to a shared account. For the most part, this is probably not troubling: The user “opted in,” supplying that email. But should we work harder to ensure that only the individual can access that account? What if it’s a shared account and medical details about domestic violence make it to that address? Again, say the patient has signed a consent to allow that message to go via email to a particular address. Should that minority example make us very concerned to protect the “minority” user pattern? I think so. The book does a good job walking the reader through this.

But I have some concerns:

* Geography: Time and time again, the examples lean towards west coast companies: Uber, Facebook, Twitter, etc. There are some exceptions. But I’d like to know: If the California tech culture is so bad, are there other places that are better?

* Timespan: Is this a particularly bad moment? Wachter-Boettcher provides the appalling facts around the decline of women computer science majors (37% in 1984, 18% in 2014). “I can’t pretend to know the precise reason for this shift” (p. 182). Me neither. But this book is so anchored in the present, it raises the question of how we would assess, say, the tech culture of the 80s. I bet it was better. But was it? Just as an example, back in the day, Ann Wollrath was the lead author on the original RMI article. Big stuff. What was the culture? It would mean a lot if Wollrath told us that it was the same back then. Then we might understand the core problem as a broader ill.

* Intentions: There are some good anecdotes here about how female voices are used for Siri, Alexa, and Google Maps (etc.) (pp. 36-38). Right. But what conclusion should we draw? “Women are expected to be more helpful than men . . . The more we rely on digital tools in everyday life, the more we bolster the message that women are society’s ‘helpers’” (p. 38). I get this. But then the author says: “Did the designers intend this? Probably not.” I protest! Go out and interview the designers! What were their reasons? Apple, in particular, thinks hard about this stuff. What were the factors going into a female Siri, and how did they outweigh providing other Siris (male; accented; whatever)? I want to know. The book makes an insinuation, but I think there’s a real research task to be performed. Bring out the ethnographers.

In large part, the book is driven by articles in the tech media. The next step is to get out there and start quoting people on their individual experiences, in order to test some claims (e.g., is the problem peculiar to California tech in the 2010s? Or is it men in tech generally -- more geography would help? Or even a side effect of the investment structure and capitalism, as seems implicit in chapter 9?) -- and, in particular, figure out where people are doing it right, and why. [The one positive example given in the book is Slack, but I’m not going to give much quarter there: Slack was produced by advertising and by exploiting the corporate customer’s desire to control discourse in the company, not by inclusiveness.]
Profile Image for Rachel.
25 reviews · 8 followers
November 7, 2017
I want to qualify my rating of this book: If you haven’t previously thought about sexism, racism, or other forms of discrimination in the tech industry, this is a five-star recommendation. However, as someone who regularly reads about this topic and pays attention to tech news, I encountered very little new information in this book. It was also a bit disappointing to see so much focus on recent big news stories (e.g. the Google Photos categorization fail, Uber sexism and spying, Facebook year in review) rather than a wider range of companies and more in-depth looks at what went wrong, how it happened, and how companies are or could be doing things differently. So I wasn’t blown away by the book, but it holds valuable information for some folks and I just might be the wrong audience.
Profile Image for ☘Misericordia☘ ⚡ϟ⚡⛈⚡☁ ❇️❤❣.
2,519 reviews · 19.2k followers
September 13, 2020
Some interesting concepts:
- Normalizing
- Edge vs Stress
- The 'select one' being the problematic idea
- The default settings of our lives (and everything else)
- Q: metrics are only as good as the goals and intentions that underlie them (c)
- Q: inappropriate, trying-too-hard, chatty tech products. (c)
- “marketing negging”
- The unlikely delights of Q:'1-800-Flowers purchase particularly relevant to a Scorpio' ©
- DAUs/MAUs/CAUs
Quite a lot of problematic issues. Precisely the ones that lead diversity intentions to ruin.

Some ludicrous ideas: like advice on how to connect with a made-up persona that was made up specifically to be connected with. I'm calling this one a BS job!

Also, a lot of genuinely good material. Here go handpicked examples of both:
Q:
There, there, dear. Don’t worry about what we’re doing with your account. Have a balloon. (c)
Q:
Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store. (c) Now I'm tempted to use Siri. Attabotgirl.
Q:
far too many people in tech have started to believe that they’re truly saving the world. Even when they’re just making another ride-hailing app or restaurant algorithm. (c) I'm pretty sure that goes way beyond that. And even beyond the tech.
Q:
I’m writing this in the wake of the 2016 presidential election—an election that gave us an American president who is infamous for allegations of sexual assault, racism, conflicts of interest, collusion, and angry Tweetstorms, and who rode to power on a wave of misinformation. (c) The problem was that there were 2 very problematic candidates, not just one. Hah. Another problem is that people actually expect Facebook or Twitter or some other shit to tell them how to vote. Problem numero trece is that people don't seem to realize that the flow of trash called 'news' on FB isn't actually 'news'. So… How very comfy to blame the tech.
Q:
You don’t need a computer science degree or a venture capital fund. You don’t need to be able to program an algorithm. All you need to do is slough away the layers of self-aggrandizement and jargon, and get at the heart of how people in technology work—and why their decisions so often don’t serve you. (c) That's actually not true. Self-aggrandizement and jargon - all of that is just perception which might be skewed or not. Understanding is the key.
Q:
we’ll take a closer look at how the tech industry operates, and see how its hiring practices and work culture create teams that don’t represent most of us—no matter how many “diversity” events these companies put on. (c) Why should they hire someone who represents anything instead of someone who's able to do the job? Diversity is about not refusing to hire a capable young mother or someone of another race. Hiring representatives is a totally different opera.
Q:
Designers and technologists don’t head into the office planning to launch a racist photo filter or build a sexist assumption into a database. (c) LOL
Q:
… she spent the next hour listening to older men tell her about the “female market,” …
The men in the room insisted that most women really care about leisure-time activities. (c) Now, this must have been fun :)
Q:
Even though the company had forty-odd employees and had been in business more than a decade, no staff member had ever been pregnant. … “We have three other women of childbearing age on our team, and we don’t want to set a precedent,” the owner told her, as if pregnancy were some sort of new trend. (c) Wowser. These guys must have grown on trees. Some rotten fruits.
Q:
… the two teams with lots of women on staff were sent an email by a board member asking them to “put together some kind of dance routine to perform at the company presentation.”

The heads of each department, all men, stood up and talked about their successes over the course of the year. The only women who graced the stage were a group of her peers in crop tops and hot pants. The men in the audience wolf-whistled while the women danced. (c) That's some company.
Q:
Amélie Lamont, whose manager once claimed she hadn’t seen her in a meeting. “You’re so black, you blend into the chair,” she told her. (c) Damn. I've actually once had a very similar discussion. I've never before or after wanted so much to suggest that that reviewer should buy the effing glasses and spare me the bullshit!
Q:
Tech is also known for its obsession with youth—an obsession so absurd that I now regularly hear rumors about early-thirties male startup founders getting cosmetic surgery so that investors will think they’re still in their twenties. (c) Yep, that's a fact.
Q:
Other companies start their workdays with all-staff meetings held while everyone does planks—the fitness activity where you get on the ground, prop yourself up by your feet and elbows, and hold the position until your abs can’t handle it anymore. If you’re physically able to plank, that is. And you’re not wearing a dress. Or feeling modest. Or embarrassed. Or uncomfortable getting on your hands and knees at work. (c) Ridiculous… Riddiculus!
Q:
I’m not interested in ping-pong, beer, or whatever other gimmick used to attract new grads. The fact that I don’t like those things shouldn’t mean I’m not a “culture fit.” I don’t want to work in tech to fool around, I want to create amazing things and learn from other smart people. That is the culture fit you should be looking for. (c) Golden words!
Q:
The good news is there’s actually no magic to tech. As opaque as it might seem from the outside, it’s just a skill set—one that all kinds of people can, and do, learn. There’s no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics. Except that we’ve never demanded they do better. (c) And except that many of us don't really bother learning how stuff works. Had these companies disclosed all their proprietary code today not many of us would know how to make head or tail of it.
Q:
Are you a “Kelly,” the thirty-seven-year-old minivan mom from the Minneapolis suburbs? Or do you see yourself as a “Matt,” the millennial urban dweller who loves CrossFit and cold-brew coffee? Maybe you’re more of a “Maria,” the low-income community college student striving to stay in school while supporting her parents.
No? Well, this is how many companies think about you. (c) Now, that's a great point.
Q:

she test-drove some menstrual cycle apps, looking for one that would help her get the information she needed.
What she found wasn’t so rosy.
Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern—rather than, you know, asking her. (c) LOL. It wasn't rosy: it was pink and florid.
Q:
Glow works well for women who are trying to get pregnant with a partner. But for everyone else, both services stop making sense—and can be so alienating that would-be users feel frustrated and delete them. (c) Well, frankly, I don't think the right problem is being highlighted here. Glow might be cheesy. It also actually was initially rolled out for women trying to get pregnant. So, IMO, women who don't, might do better choosing some other app. No shit, Sherlock. Every single app doesn't have to be a multitool capable of Python coding, getting one pregnant and building space ships.
The problem more likely is that the market either doesn't clearly specify the alternative needs and apps applicable to other cases or does have voids in some respects. That's actually both a problem and a business opportunity.
Q:
What happens when those someones are the people we met in Chapter 2: designers and developers who’ve been told that they’re rock stars, gurus, and geniuses, and that the world is made for people like them? (c) The Big Flip Flop?
Q:
But when default settings present one group as standard and another as “special”—such as men portrayed as more normal than women, or white people as more normal than people of color—the people who are already marginalized end up having the most difficult time finding technology that works for them. (c) Amen.
Q:
If you’ve designed a cockpit to fit the average pilot, you’ve actually designed it to fit no one. …
So, what did the air force do? Instead of designing for the middle, it demanded that airplane manufacturers design for the extremes instead—mandating planes that fit both those at the smallest and the largest sizes along each dimension. Pretty soon, engineers found solutions to designing for these ranges, including adjustable seats, foot pedals, and helmet straps—the kinds of inexpensive features we now take for granted. (c)
Q:
When designers call someone an edge case, they imply that they’re not important enough to care about—that they’re outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down. (c) Edge vs Stress gives interesting dichotomy.
Q:
I saw race and ethnicity menus that couldn’t accommodate people of multiple races. I saw simple sign-up forms that demanded to know users’ gender, and then offered only male and female options. I saw college application forms that assumed an applicant’s parents lived together at a single address. (c) Fucked up design. And not just design.
Q:
Take Shane Creepingbear, a member of the Kiowa tribe of Oklahoma. In 2014 he tried to log into Facebook. But rather than being greeted by his friends’ posts like usual, he was locked out of his account and shown this message:
Your Name Wasn’t Approved. …
Adding to the insult, the site gave him only one option: a button that said “Try Again.” There was nowhere to click for “This is my real name” or “I need help.”

Facebook also rejected the names of a number of other Native Americans: Robin Kills the Enemy, Dana Lone Hill, Lance Brown Eyes. (In fact, even after Brown Eyes sent in a copy of his identification, Facebook changed his name to Lance Brown.) (c) Oh, this is top.
Q:
� there’s still the fact that Facebook has placed itself in the position of deciding what’s authentic and what isn’t—of determining whose identity deserves an exception and whose does not. (c) Which is quite obviously bonkers.
Q:
People who identify as more than one race end up having to select “multiracial.� As a result, people who are multiracial end up flattened: either they get lumped into a generic category, stripped of meaning, or they have to pick one racial identity to prioritize and effectively hide any others. They can’t identify the way they would in real life, and the result is just one more example of the ways people who are already marginalized feel even more invisible or unwelcome. (c)
Q:
When you remember how few people change the default settings in the software they use, Facebook’s motivations become a lot clearer: Facebook needs advertisers. Advertisers want to target by gender. Most users will never go back to futz with custom settings. So, Facebook effectively designs its onboarding process to gather the data it wants, in the format advertisers expect. Then it creates its customizable settings and ensures it gets glowing reviews from the tech press, appeasing groups that feel marginalized—all the while knowing that very few people, statistically, will actually bother to adjust anything. Thus, it gets a feel-good story about inclusivity, while maintaining as large an audience as possible for advertisers. It’s a win-win . . . if you’re Facebook or an advertiser, that is. (c)
Q:
It was cute, unless you wanted to react to a serious post and all you had was a sad Frankenstein (c) Quite the company.
Q:
“Hi Tyler,” one man’s video starts, using title cards. “Here are your friends.” He’s then shown five copies of the same photo. The result is equal parts funny and sad—like he has just that one friend. It only gets better (or worse, depending on your sense of humor) from there. Another title card comes up: “You’ve done a lot together,” followed by a series of photos of wrecked vehicles, culminating in a photo of an injured man giving the thumbs up from a hospital bed. I suppose Facebook isn’t wrong, exactly: getting in a car accident is one definition of “doing a lot together.” (c) This is both hilarious and horrifying.
Q:
You can probably guess what went wrong: in one, Facebook created a montage of a man’s near-fatal car crash, set to an acoustic-jazz ditty. Just imagine your photos of a totaled car and scraped-up arms, taken on a day you thought you might die, set to a soft scat vocal track. Doo-be-doo-duh-duh, indeed. (c)
Q:
… Tumblr: “Beep beep! #neo-nazis is here!” it read. …
a Tumblr employee told Rooney that it was probably a “what you missed” notification. Rooney had previously read posts about the rise in fascism, and the notification system had used her past behavior to predict that she might be interested in more neo-Nazi content. …
… another Tumblr user shared a version of the notification he received: “Beep beep! #mental-illness is here!”) (c) Well, this is what happens when people are being treated as kids by apps.
Q:
Maybe I’m the only one who’s just not interested in snotty comebacks from my phone, though I doubt it.

Why would anyone want their credit card offers to be dependent on the weather?
What, precisely, would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio?

“How the hell did I end up here?” (c)
Q:
Delight is a concept that’s been tossed around endlessly in the tech industry these past few years, and I’ve always hated it. (c)
Q:
… What Facebook Thinks You Like. The extension trawls Facebook’s ad-serving settings, and spits out a list of keywords the site thinks you’re interested in, and why. There’s the expected stuff... Then there’s a host of just plain bizarre connections: “Neighbors (1981 Film),” a film I’ve never seen and don’t know anything about. A host of no-context nouns: “Girlfriend,” “Brand,” “Wall,” “Extended essay,” “Eternity.” I have no idea where any of this comes from—or what sort of advertising it would make me a target for. Then it gets creepy: “returned from trip 1 week ago,” “frequent international travelers.” I rarely post anything to Facebook, but it knows where I go, and how often. (c)
Q:
1,500 individual tidbits of information about you, all stored in a database somewhere and handed out to whoever will pay the price (c)
Q:
The technology is based on deep neural networks: massive systems of information that enable machines to “see,” much in the same way the human brain does. (c) That's not precisely correct.
Q:

… a future where Facebook AI listens in on conversations to identify potential terrorists, where elected officials hold meetings on Facebook, and where a “global safety infrastructure” responds to emergencies ranging from disease outbreaks to natural disasters to refugee crises. (c) Welcome to the fish bowl!
Profile Image for Vish Wam.
46 reviews · 12 followers
December 2, 2018
Why do apps and profile info pages mostly come with only two gender options - male and female? What if someone doesn't wish to be identified as either? Why is there still a vast underrepresentation of women and minorities in the tech sector? Why hasn't there been a massive MeToo rising in the tech industry across the world? If tech companies are largely run by white or Asian men, do the products they release also reflect the bias and stereotypes they believe in?

From Uber's severely regressive history in handling sexual harassment complaints, to why Google Photos inadvertently turned out to be racist, the book is full of anecdotes on where the tech industry is messing up.

If you are one who has felt that the tech sector needs to diversify, this book can help you better understand why, through stories from experts in the space.

If you thought the tech industry's idea of 'Meritocracy' to recruit talent is fair, then this is a must read for you too.
Profile Image for Manzoor Elahi.
34 reviews · 47 followers
November 2, 2018
Most tech products are full of blind spots, biases, and outright ethical blunders. Take the spring of 2015, when Louise Selby, a pediatrician in Cambridge, England, joined PureGym, a British gym chain. Every time she tried to swipe her membership card to access the women’s locker room, she was denied: the system simply wouldn’t authorize her. Finally, PureGym got to the bottom of things: the third-party software it used to manage its membership data—software used at all ninety locations across England—was relying on members’ titles to determine which locker room they could access. And the title “Doctor” was coded as male.
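
The failure described above is, at bottom, a one-line data-modeling mistake: a sensitive attribute (which locker room someone may use) gets inferred from an unrelated field (their honorific). A minimal sketch of that kind of logic, with purely hypothetical names and mappings rather than PureGym's actual code, might look like this:

```python
# Hypothetical sketch of the pattern described above, not PureGym's real code:
# locker-room access is derived from the member's title instead of being asked for.
TITLE_TO_GENDER = {
    "Mr": "male",
    "Mrs": "female",
    "Miss": "female",
    "Ms": "female",
    "Dr": "male",  # the buggy assumption: "Doctor" coded as male
}

def locker_room_for(title: str) -> str:
    gender = TITLE_TO_GENDER.get(title, "male")  # a silent default that hides the gap
    return "women's locker room" if gender == "female" else "men's locker room"

print(locker_room_for("Dr"))  # -> "men's locker room", so a female doctor is locked out
```

The fix is as unglamorous as the bug: don't infer a sensitive attribute from a field that was never meant to encode it; ask for the information you actually need.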

In March of 2016, JAMA Internal Medicine released a study showing that the artificial intelligence built into smartphones from Apple, Samsung, Google, and Microsoft isn’t programmed to help during a crisis. The phones’ personal assistants didn’t understand words like “rape,” or “my husband is hitting me.” In fact, instead of doing even a simple web search, Siri—Apple’s product—cracked jokes and mocked users.

Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store. After getting bad press, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something that Siri identified as suicidal. But five years later, no one had looked beyond that one fix. Apple had no problem investing in building jokes and clever comebacks into the interface from the start. But investing in crisis or safety? Just not a priority.

Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases. In the researchers’ experiments, the three programs’ error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned—to more than 20 percent in one case and more than 34 percent in the other two.

The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. For instance, according to the paper, researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white.
How I'm fighting bias in algorithms | Joy Buolamwini
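
A rough worked example (the numbers here are illustrative, not taken from the study) shows how a test set skewed toward one group lets a system report an impressive overall accuracy while a much higher error rate on a smaller group goes unnoticed:

```python
# Illustrative arithmetic only: overall accuracy on a skewed benchmark can mask
# a large error rate on an underrepresented group.
groups = {
    "majority": {"share": 0.80, "accuracy": 0.995},  # near-perfect on 80% of the test set
    "minority": {"share": 0.20, "accuracy": 0.80},   # 20% error rate on the remaining 20%
}

overall = sum(g["share"] * g["accuracy"] for g in groups.values())
print(f"overall accuracy: {overall:.1%}")   # 95.6% -- sounds great as a headline number
print(f"minority error: {1 - groups['minority']['accuracy']:.0%}")  # 20% -- hidden by the average
```

This is why the audit described above reports error rates per skin type and gender rather than a single headline figure.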

Algorithms used by the police are better at identifying some racial groups than others. The few studies that have been done on facial recognition software suggest a persistently lower accuracy rate for African-American faces—usually about 5 to 10 percent lower than for white faces, and sometimes even worse. This raises concerns, given that African-Americans are already overscrutinized by law enforcement. It suggests that facial recognition technology is likely to be “overused on the segment of the population on which it underperforms.”

The inaccuracies are troubling but nothing new. Many readers might recall that, back in 2010, consumer-grade facial recognition software was famously failing to detect that Asian users had their eyes open, or that black users were in the frame at all. Facial recognition software used in web services like Flickr and Google has tagged African-Americans as primates.

The algorithmic bias has been described as “the coded gaze” by Joy Buolamwini, an MIT Media Lab graduate student, in a nod to the literary and sociological term “the white gaze,” which describes seeing the world from the perspective of a white person and assuming, always, that your audience is white.


From massive businesses like Walmart and Apple to fledgling startups launching new apps, organizations of all types use tools called personas—fictional representations of people who fit their target audiences—when designing their products, apps, websites, and marketing campaigns. The idea is that, ideally, team members think about these personas regularly and internalize their needs and preferences. That’s great in theory, but when personas are created by a homogenous team that hasn’t taken the time to understand the nuances of its audience, the team often ends up designing products that alienate audiences, rather than making them feel at home.

We can see a common example in the story of Fatima, a Middle-Eastern American design strategist based in the Bay Area. As the project kicked off, Fatima sat down with the teams from both companies—and was literally the only woman at the table. Pretty soon, someone started a video meant to show the product’s positioning. It was all flash: yacht parties, private jets, $2,000 shoes. Fatima cringed. The smartwatch they were designing was meant to target the midrange market.

She spent the next hour listening to older men tell her about the “female market,” using tales of their wives’ shopping habits as proof. The team wanted to target women who are fashionable, tech-savvy, or both. About two-thirds of respondents were identified as the former, and half as the latter. Except, the men refused to believe Fatima. As soon as she started presenting her data, they wrote her off: “Oh, 51 percent of the women can’t be tech-savvy,” they said.

“I felt like I was in an episode of Mad Men. That’s a specific project, a physical piece of technology, that would exist in the world or not based on whether these men in the room accepted what I had to say or not,” she said. “They just weren’t willing to accept the research and use it as a foundation.” The project got shelved, and the brand partnered with a celebrity to design a smartwatch instead. It flopped. “It wasn’t based on needs; it was based on stereotypes,” Fatima said.

This mind-set—where someone assumes they have all the answers about a product, and leaves out anyone with a different perspective—isn’t rare. Scratch the surface at all kinds of companies—from Silicon Valley’s “unicorns” (startups with valuations of more than a billion dollars) to tech firms in cities around the world—and you’ll find a culture that routinely excludes anyone who’s not young, white, and male.

Biased algorithms. Alienating online forms. Harassment-friendly platforms. All kinds of problems plague digital products, from tiny design details to massively flawed features. But they share a common foundation: a tech culture that’s built on white, male values—while insisting it’s brilliant enough to serve all of us. Or, as they call it in Silicon Valley, “meritocracy.” The tech industry clings to meritocracy like a tattered baby blanket. David Sacks, an early executive at PayPal, claimed that “if meritocracy exists anywhere on earth, it is in Silicon Valley.”

The meritocracy myth is particularly pernicious in tech, because it encourages the belief that the industry doesn’t need to listen to outside voices—because the smartest people are always already in the room. This presumption quickly breeds a sort of techno-paternalism: when a group of mostly white guys from mostly the same places believes it deserves to be at the top, it’s also quick to assume that it has all the perspective it needs in order to make decisions for everyone else.

Tied up in this meritocracy myth is also the assumption that technical skills are the most difficult to learn—and that if people study something else, it’s because they couldn’t hack programming. As a result, the system prizes technical abilities—and systematically devalues the people who bring the very skills to the table that could strengthen products, both ethically and commercially: people with the humanities and social science training needed to consider historical and cultural context, identify unconscious bias, and be more empathetic to the needs of users.

Originally, programming was often categorized as “women’s work,” lumped in with administrative skills like typing and dictation (in fact, during World War II, the word “computers” was often applied not to machines, but to the women who used them to compute data). As more colleges started offering computer science degrees, in the 1960s, women flocked to the programs: 11 percent of computer science majors in 1967 were women. By 1984, that number had grown to 37 percent. Starting in 1985, that percentage fell every single year—until, in 2007, it leveled out at the 18 percent figure we saw through 2014.

That shift coincides perfectly with the rise of the personal computer—which was marketed almost exclusively to men and boys. We heard endless stories about Steve Jobs, Bill Gates, Paul Allen—garage tinkerers, boy geniuses, geeks. Software companies, and soon after, internet companies, all showcased men at the helm, backed by a sea of techies who looked just like them. And along the way, women stopped studying computer science, even as more of them were attending college than ever before.

You might assume that much of the attrition comes from women leaving to start or care for a family. Nope. Only about 20 percent of those who quit SET leave the workforce. The rest either take their technical skills to another industry (working for a nonprofit or in education, say), or move to a nontechnical position. People call this the “leaky bucket”: when women and underrepresented groups leave because they’re fed up with biased cultures where they can’t get ahead.

If the tech industry has acknowledged this problem and says it wants to fix it, why are the stats so slow to change? If you ask tech companies, they’ll all point to the same culprit: the pipeline. The term “pipeline” refers to the number of people who are entering the job market prepared to join the tech industry: those who are learning to code in high school and graduating from computer science or similar programs. If the pipeline doesn’t include enough women and people of color (though, honestly, many companies never get beyond talking about gender here), then tech companies simply can’t hire them. Or so the story goes.

In a 2014 analysis, USA Today concluded that “top universities turn out black and Hispanic computer science and computer engineering graduates at twice the rate that leading technology companies hire them.” Adding to the problem, potential employers spend their time looking for a “culture fit”—someone who neatly matches the employees already in the company—which ends up reinforcing the status quo, rather than changing it.

In a 2014 report for Scientific American, Columbia professor Katherine W. Phillips examined a broad cross section of research related to diversity and organizational performance. And over and over, she found that the simple act of interacting in a diverse group improves performance, because it “forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort.”

In one study that Phillips cited, published in the Journal of Personality and Social Psychology, researchers asked participants to serve on a mock jury for a black defendant. Some participants were assigned to diverse juries, some to homogenous ones. Across the board, diverse groups were more careful with details than were homogenous groups, and more open to conversation. When white participants were in diverse groups rather than homogenous ones, they were more likely to cite facts (rather than opinions), and they made fewer errors, the study found.

In another study, led by Phillips and researchers from Stanford and the University of Illinois at Urbana-Champaign, undergraduate students from the University of Illinois were asked to participate in a murder-mystery exercise. Each student was assigned to a group of three, with some groups composed of two white students and one nonwhite student, and some composed of three white students. Each group member was given both a common set of information and a set of unique clues that the other members did not have. Group members needed to share all the information they collectively possessed in order to solve the puzzle. But students in all-white groups were significantly less likely to do so, and therefore performed significantly worse in the exercise. The reason is that when we work only with those similar to us, we often “think we all hold the same information and share the same perspective,” Phillips writes. “This perspective, which stopped the all-white groups from effectively processing the information, is what hinders creativity and innovation.”

Uber may be an extreme example, but it can help us understand tech’s insular culture much more clearly: if tech wants to be seen as special—and therefore able to operate outside the rules—then it helps to position the people working inside tech companies as special too. And the best way to ensure that happens is to build a monoculture, where insiders bond over a shared belief in their own brilliance. That’s also why you see so many ridiculous job titles floating around Silicon Valley and places like it: “rock-star” designers, “ninja” JavaScript developers, user-experience “unicorns” (yes, these are all real). Fantastical labels like these reinforce the idea that tech and design are magical: skill sets that those on the outside wouldn’t understand, and could never learn.

The reality is a lot more mundane: design and programming are just professions—sets of skills and practices, just like any other field. Admitting that truth would make tech positions feel a lot more welcoming to diverse employees, but tech can’t tell that story to the masses. If it did, then the industry would seem normal, understandable, and accessible—and that would make everyday people more comfortable pushing back when its ideas are intrusive or unethical. So, tech has to maintain its insider-y, more-brilliant-than-thou feel—which affects who decides to enter that legendary “pipeline,� and whether they’ll stick around once they’ve arrived.

Not every tech company looks at the world like Uber does (thank god). Just look at messaging app Slack, a darling of the startup world with an office motto that’s refreshingly healthy: “Work hard and go home.” Slack is often described as a delight to use—but it’s a delight borne of nuance and detail, not shoved-down-your-throat cuteness. And the company got there by doing what so few tech companies seem to bother with: considering their users as real, whole people.

One of the first things CEO Stewart Butterfield wants to know when interviewing candidates for a position isn’t which programming languages they know or where their computer science degree is from. It’s whether they believe luck played a role in getting them where they are—whether they think their success is a product not just of merit and talent, but of good circumstances. His goal is simple: to build a team where people don’t assume they’re special. No rock stars, no gurus, no ninjas—just people who bring a combination of expertise, humility, and empathy. Slack doesn’t rely on believing that programmers are the chosen ones (in fact, Butterfield, who has a master’s degree in philosophy, is known for extolling the values of the liberal arts to anyone in tech who’ll listen).

Lo and behold, that culture also leads to a more diverse staff: women held more than 40 percent of Slack’s management positions in 2016, and more than a fourth of engineering roles too. Black people accounted for nearly 8 percent of engineers. Slack’s disarming honesty and disinterest in chest thumping are antithetical to the way most of tech talks about itself. And it’s working: Slack is the fastest-growing business app ever.
Profile Image for Kelly.
Author · 6 books · 1,214 followers
Read
September 30, 2019
Nothing surprising here, but infuriating and important nonetheless (if you at all work in tech as a woman or person of color, you'll recognize all of this). Well researched and written. The sexism in algorithms is something I've not thought about, but damn was that interesting.
Profile Image for Kathy Reid.
22 reviews · 4 followers
November 1, 2017
A must read for anyone who designs digital experiences, and doesn't want to be an inadvertent dude-bro.

Against a backdrop of increasingly ubiquitous technology, with every online interaction forcing us to expose parts of ourselves, Sara Wachter-Boettcher weaves a challenging narrative with ease. With ease, but not easily. Many of the topics covered are confronting, holding a lens to our internalised "blind spots, biases and outright ethical blunders".

As Wachter-Boettcher is at pains to highlight, all of this is not intentional - but the result of a lack of critical evaluation, thought and reflection on the consequences of seemingly minor technical design and development decisions. Over time, these compound to create systemic barriers to technology use and employment - feelings of dissonance for ethnic and gender minorities, increased frustration for those whose characteristics don't fit the personas the product was designed for, the invisibility of role models of diverse races and genders - and reinforcement that technology is the domain of rich, white, young men.

The examples that frame the narrative are disarming in their simplicity. The high school graduand whose Latino/Caucasian hyphenated surname doesn't fit into the form field. The person of mixed racial heritage who can't understand which one box to check on a form. The person who's gender non-conforming and who doesn't fit into the binary polarisation of 'Male' or 'Female'. Beware, these are not edge cases! The most powerful take-away for me personally from this text is that in design practice, edge cases are not the minority. They exist to make us recognise the diversity of the user base that we design for.

Think "stress cases" not "edge cases". If your design doesn't cater for stress cases, it's not a good design.

While we may have technical coding standards, and best practices that help our technical outputs be of high quality, as an industry and as a professional discipline, we have a long way to go in doing the same for user experience outputs. There are a finite number of ways to write a syntactically correct PHP function. Give me 100 form designers, and I will give you 100 different forms that provide 100 user experiences. And at least some of those 100 users will be left without "delight" - a nebulous buzzword for rating the success (or otherwise) of digital experiences.

Wachter-Boettcher takes precise aim at another seemingly innocuous technical detail - application defaults - exposing their (at best) benign, and, at times, malignant utilisation to manipulate users into freely submitting their personal data. It is designing not for delight, but for deception.

"Default settings can be helpful or deceptive, thoughtful or frustrating. But they're never neutral."

Here the clarion call for action is not aimed at technology developers themselves, but at users, urging us to be more careful, more critical, and more vocal about how applications interact with us.

Artificial intelligence and big data do not escape scrutiny. Wachter-Boettcher illustrates how algorithms can be inequitable - targeting or ignoring whole cohorts of people, depending on the (unquestioned) assumptions built into machine learning models. Big data is retrospective, but not necessarily predictive. Just because a dataset showed a pattern in the past does not mean that that pattern will hold true in the future. Yet governments, corporations and other large institutions are basing major policies and practice areas on algorithms that remain opaque. And while responsibility for decision making might be able to be delegated to machines, accountability for how those decisions are made cannot be.

The parting thought of this book is that good intentions aren't enough. The implications and cascading consequences of seemingly minor design and development decisions need to be thought through, critically evaluated, and handled with grace, dignity and maturity. That will be delightful!
Profile Image for Tam.
431 reviews · 218 followers
January 9, 2018
A good and short read. Plenty of examples, but mostly the famous ones from the internet - the author's alignment with the truly marginalized is limited, mostly with female/gay/transgender/nonwhite but still educated people, unlike O'Neil, who places her heart with the poor, the abused, whose stories may not be heard at all, buried deep, powerless. The problems aren't less worthy to discuss, though. The sexist and racist culture is so embedded, the privileges so taken for granted, the arrogance and the belief that tech people are coolest and smartest and above everyone else so fierce. That needs to change.
Profile Image for Jay French.
2,151 reviews · 85 followers
June 13, 2020
Given the title of this book, I assumed it would focus exclusively on the problems of bias in software and machine learning. This has been in the news for quite a while, and on top of the news recently. While most of the book provides stories about bias as I expected, a large part of the book was about various other behaviors, sexist, racist, illegal, and just bad. (Think hiring at Uber.) If you have kept up with these kinds of issues in Wired/Fast Company magazines and their ilk, you get many more examples here, but not much by way of solutions. Despite that mild disappointment, I found the writing kept my interest, at least up until the end, when it felt like the author was reaching for things to write about. Good for helping an ITer, data scientist, or a tech company exec think through how these issues may touch on their own company, products and practices.
Profile Image for Tony Y.
11 reviews
January 15, 2021
A quick read that exposes the reader to a variety of issues in tech, with anecdotes and some helpful statistics. It's good for people who haven't really thought about the problems in tech, but I finished it wanting a bit more depth -- each chapter could be its own book, with more detailed case studies.

Overall, this book helped me reflect a bit deeper on the team that I work on, especially the ways in which it can perpetuate bias and be insensitive. It also made me reflect on how conscious my team can be of some of these issues, and the infrastructure around addressing these issues.
Profile Image for Yulia Kryval.
121 reviews · 10 followers
January 29, 2023
This book is a good reminder for product teams, designers, marketers, business owners and, really, everyone involved in building products or services to check their concepts, flows and long-released projects for biases that may have crept in somewhere (around race, gender, ethnicity, identity and so on), for stereotyped, untested hypotheses, or for an idealized "norm" at the core of your creation:
- who you had in mind during ideation or hypothesis validation
- who developed your algorithms, and on what data
- how your ML models were trained
- whether your brand voice overdoes unhealthy or inappropriate cheerfulness
- whether your content comes across as tone deaf, and so on.

Because very often companies that never set out to build biased, intolerant or even dangerous products still ended up there, through a narrow perspective at the concept or interaction design stage (the author's main thesis is that many companies are built around what their white, privileged male founders/entrepreneurs/hustlers imagine to be the norm).

Still, at times the book felt a bit dramatic, and the examples are drawn mostly from the Silicon Valley tech giants - sometimes you want more global representation (the world does not revolve around Facebook, Google and Uber). And sometimes, after an example of how not to do things, I missed some guidance or a reference on how to do them right.

Other than that: it reads easily, has lots of interesting examples, and a heap of references for digging deeper into the cases on your own (if you feel like it).
Profile Image for Ane Vorhaug.
6 reviews · 1 follower
April 14, 2020
An important book! I'm going to give this book for Christmas to everyone I know who works in IT.
Profile Image for Maya.
465 reviews · 50 followers
August 28, 2020
Some of this was already pretty familiar to me, because these stories have been in the news. But in reality they're spread out over time, and I think compiling them together into a book makes for a valuable read.

One of the cases here which I didn't know about (but am really not even slightly surprised,) was about how a woman with a gym membership couldn't use her badge on the women's locker room. The reason, of course, was because they had used title (Mr., Ms., Mrs., etc) as a way of selecting the correct locker room, and this woman happened to be a doctor.

Honestly a lot of these examples on their face are kind of face-palmy, leaving out segments of the population and causing more subtle harm. But they speak to a larger issue with how data is used, (or not collected and thus not used) which actively harms large swaths of the population.

If you're looking for more after reading this, especially with a lens toward how it affects Black people in America, I suggest reading . And if you haven't read , I really suggest you do.
Profile Image for Jill.
127 reviews · 2 followers
October 13, 2017
I won this book in a giveaway. I work in the tech sector and was interested in this book because I am leading a digital transformation effort at my job and wanted to make sure I didn't fall into any of these traps. The book was not what I thought it was, but boy, were my eyes opened. I have worked in tech for 35 years. I'm a woman and have experienced the discrimination the book describes early in my career developing software for a utility. While I was raising my kids, I taught computers in college part-time then returned to the workforce when they were driving. I thought my days of discrimination were behind me but just last year it happened again. I was being groomed for a position to take over for my boss, the IT Director, when he retired. When he announced his retirement date, I was expecting the promotion but I didn't get it. Even though my boss was progressive, the good ol' boy network of the company chose otherwise, and now I report to someone who not only has never managed IT but has never worked in it. So I am training my boss. Toxic!

I didn't realize that software meant for the general public had such a narrow view of "normal". This book opened my eyes tremendously. I am ashamed of my industry.

This should be required reading for anyone studying in the tech field in college. I have forwarded this title to the college at which I taught.
Profile Image for Kate Kaput.
Author · 2 books · 53 followers
January 2, 2019
Long review coming: This book was my first Feminist Book Club delivery, & it was brilliant & techie, but written in a digestible, accessible, & down-to-earth way for those of us who don't work in tech. I had no idea of all these problems - like Google Photos identifying black faces as "gorillas," mobile ads targeting people in low-income areas with ads for for-profit colleges, or a gym chain in Britain where a woman couldn't get into the locker rooms because the locker rooms were coded by title, like Mr. or Mrs. - & hers, "Doctor" was coded as male. This book tackles problems small & large, including how they occur & how they can be stopped. Read this.
Profile Image for Amy Rhoda  Brown.
212 reviews · 42 followers
March 14, 2019
This is a crystal clear description of how the monoculture of tech leads to terrible apps, toxic online behaviour, and the failure of the developers to take responsibility for what their decisions, based on their narrow worldview, have wrought. Easy to read, well laid-out and compelling.
Profile Image for Vovka.
1,004 reviews · 41 followers
October 7, 2020
A good overview of the issues, but lacks deeper insight.

This book is geared to the general nontechnical public. For example, it defines an algorithm and gives examples of an algorithm for adding two numbers. It skims across the landscape, without going deep on any one of the numerous issues that it covers. In the final chapter, it talks about Silicon Valley's troubled meritocracy, but readers who want a real discussion of the flaws of “meritocratic” Silicon Valley should probably direct their attention to "The Meritocracy Trap" by Daniel Markovits.

Repeatedly, this book falls into the same traps it preaches against. Two examples:

(1) An app is praised for using red and blue text in its UX design. This is cited as an example of good UX, but the author apparently forgot to include the diverse perspective of red-blue colorblind folks in making this critical judgment. About 19 years ago, as a software PM, I made the same mistake and learned from it. The mistake is particularly hard to swallow here because it occurs in a book that castigates others for failing to consider diverse user needs.

(2) The author talks about signing up for a Facebook account with the "obviously" ridiculous name of "Sara Nopenopenope." Apparently, the author fails to consider that a name she considers wrong might well be a legitimate surname in non-American cultures. A quick search just now shows me that "Nopnop" is a surname in Thailand, and "Nonono" is a surname in South Africa, Russia, and other countries. The author flames programming bros for coding myopic algorithms that prevented Shane Creepingbear (a real person with a real Native American name) from registering on Facebook, but then engages in the same kind of accidental discrimination herself. That’s a special kind of embarrassing to read.

Neither of these mistakes is particularly terrible, but they do prove how difficult it is to get this stuff right, and that difficulty isn't something the author deigns to acknowledge. That makes the entire book a bit less credible and recommend-worthy.

A bit of humility sprinkled throughout the polemic would've made this a better book.
Profile Image for Philipp.
677 reviews · 216 followers
December 21, 2017
Recommended reading on the current (very current) state of the tech industry. Overlaps a little bit with and cites , but focuses more on programmer and designer choices, assumptions and hidden biases instead of algorithms.
First I'd thought of recommending it only to programmers - there's a bunch of stuff on personas and other design techniques that are not of interest to 'regular' humans - but then it branches out and goes into the role of the tech industry in daily life, fake news, concerted online harassment, and all the other acrid smoke from the garbage fire that is the modern WWW.
Profile Image for Amy.
Author · 1 book · 45 followers
November 30, 2017
This was a very thoughtful exploration of how bias is built into the tech products we use every day, and how that bias subsequently shapes and reinforces behaviors offline. Wachter-Boettcher explores not just how technology is built, but also how the organizations that build it perpetuate particular cultural norms that just don't work for many of the people they supposedly serve. As someone who works in technology as a behavior change designer, I'll return to this book for reflection in the future. This is also a book I can see myself giving to others who either want or need to think about the many ways tech consumes, reiterates, and reinforces harmful biases.
Profile Image for Parker.
193 reviews · 31 followers
Read
January 23, 2018
This is a good solid introduction to a really important issue. Given the nature of the subject matter, a lot of the most striking anecdotes in here were covered by the tech press and so were widely circulated within the community of people observing this kind of thing closely. But even as somebody who pays a lot of attention to the problems described in this book, a few stories were new to me. Certainly, if this is not an area you are already pouring hours of each day into, there will be a lot of new and compelling stories for you.

In any case, this book is entertaining, readable, and persuasive.
144 reviews · 16 followers
September 30, 2018
This is a must-read for all UXers, business analysts and product owners. I read Wachter-Boettcher’s “Design for Real Life” and there are some similar references, but this goes beyond examining product design and illuminates the biases that can cause exclusion and even trauma in tech usage.
Profile Image for Marrije.
541 reviews · 22 followers
October 4, 2018
Read it. You’ll be angry, and inspired.
Profile Image for Noelle Pangilinan.
254 reviews · 2 followers
March 22, 2020
Great intro book for someone interested in tech biases, but as someone from the industry who has studied a lot of this stuff, I didn’t learn too much. I wanted to learn some actionable things to bring to work, but instead this felt like Wachter-Boettcher was just trying to expose issues in the tech industry.

Still enjoyed this overall though!
Profile Image for David.
337 reviews · 5 followers
July 16, 2022
Lots of observations on how data, algorithms, and product choices have unintended (or intended) consequences. Awareness is step 1...
Profile Image for Haley.
18 reviews
June 4, 2019
A fairly quick read that has some valuable sections. At times, it can get basic if you’re at all knowledgeable about the lack of diversity and its implications in tech/product development. But there was definitely enough detail in some areas that I felt like it was very worthwhile and I learned from this book. I liked the case studies of certain issues I’d never heard about, or maybe briefly heard about but didn’t know the whole story.
Profile Image for Thom.
1,757 reviews · 66 followers
June 23, 2021
A good breakdown of current user experience (UX) problems, with some examination into their likely causes. Eye-opening for those who haven't considered these issues.

This book is a snapshot of west-coast tech companies and current UX. I wanted to see it go further, with some history, or some coverage of other regions. It would have benefited from interviews with designers past and present, and a broader exposure. Instead, it can be described as a quick read :)

The best thing from this book are tips to change the conversation. Instead of thinking of changes for a minority of people as being "edge cases", referring to them as "stress cases" really makes the point. Another change was getting rid of some things (choosing Miss, Mr. or Mrs.) and just leaving an open text entry box - if you feel you really need one. This is the sort of UX thinking people *should* be doing.
Profile Image for Douglas Lord.
712 reviews · 31 followers
December 6, 2017
This scathing critique of the tech industry and its techniques is both informative and hair-raising. Wachter-Boettcher winningly posits that from top (industry giants like Facebook) to bottom (smaller, niche companies), services rely on finely crafted promises of ease, interconnectedness, and service to humanity. In reality, these are for-profit businesses. As these companies become more and more ubiquitous they act as quasi-public utilities—sans the governmental oversight and controls; Google’s summer 2017 anti-diversity uproar, Facebook’s September 2017 revelations about Russian ads, and the Equifax breach-and-coverup revealed in that same month (all of which occurred after this book was written) lend much credence to W-B’s well-written criticism. Indeed, W-B convincingly shows Silicon Valley bigwigs as a hegemony that “…routinely excludes anyone who’s not young, white, and male.” The inevitability of embedded tech, of it becoming “…more fundamental to the way we understand and interact with our communities and governments,” writes W-B, must be balanced with an absence of “…biased, alienating, or harmful” aspects in its creation. For every “there’s an app for that,” whether you are tracking your health, dating, or banking online, there’s a design flaw that humiliates, belittles, and undermines real human beings (e.g., a smart scale that scolds a toddler for gaining weight, a binary choice for sexual orientation, Native American names being deemed unacceptable on social media). VERDICT Provocative, passionate, impossible to ignore.

Find reviews of books for men at Books for Dudes, the online reader's advisory column for men from Library Journal. Copyright Library Journal.
Profile Image for Sean Lynn.
82 reviews · 2 followers
March 25, 2019
As a white dude who works in tech, this was a bit of an eye opener.

Technically Wrong by Sara Wachter-Boettcher argues that many of the products and services designed in Silicon Valley are inherently, though not necessarily intentionally, biased. As many of the programmers are Caucasian and male, the products they design do not always meet the desires and needs of the much more diverse market. Thus they accidentally exclude whole groups of people, who instead turn to more inclusively designed products, or are forced to go without.

The problems are compounded when someone other than a white guy is hired and tries to bring their input and perspective to the table. The workplace can become dismissive or even hostile to these criticisms, and too often those hired to solve the problems of narrow perspective end up leaving. The author provides many real world examples of this, and many other issues in the industry.

There is, however, hope. She also presents some companies who are trying to change this trend, as well as the policies they’re implementing in order to combat the perspective hegemony. Technically Wrong not only shines a light into this dilemma, but also helps light the way out.
