What does it mean to be human in a world that is rapidly changing with the development of artificial intelligence?
Through the voices of ordinary people in places far removed from Silicon Valley, Code Dependent explores the impact of a set of powerful, flawed, and often exploitative technologies on individuals, communities, and our wider society. Madhumita Murgia, AI Editor at the FT, exposes how AI can strip away our collective and individual sense of agency – and shatter our illusion of free will.
AI is already changing what it means to be human, in ways large and small. In this compelling work, Murgia reveals what could happen if we fail to reclaim our humanity.
Code Dependent is an investigation into the human side of AI: the ordinary, non-Silicon Valley people affected by and involved in artificial intelligence. Journalist Madhumita Murgia tells the stories of people and communities impacted by AI, from those labelling AI training data to those whose lives are changed by the decisions of AI systems or who have deepfake videos made of them. Not everything is negative: there are also healthcare benefits, if only these technologies can be made freely available in the places that most need them. And as the book moves towards its end, Murgia argues that these stories give us principles we should consider going forward to ensure AI works for ordinary people, not the other way around.
Notably, this book focuses on the human side of technology rather than the technological side, and foregrounds the experiences of people and the complexity of AI's role. Even in areas that are often discussed in other books, such as predictive policing, it offers examples I've not seen before and direct interviews with people affected, which not all technology books provide. At the same time, it gives an accessible description of many AI-related technologies; for example, it's the first time I've seen – as someone who reads a lot about AI – a simple explanation of what a 'transformer' is and why it has been so important for generative AI. This combination makes Code Dependent useful both for people who do read tech books but are interested in human stories rather than the same talking points, and for people who are newer to the topic and would like a way in that focuses on people.
Sometimes I found the framing or phrasing a bit simplistic or lacking nuance and complexity, but generally it was an accessible book about AI that tells stories rather than just facts, and takes areas we might have heard or read plenty about and shows specific people's lives in relation to them. The parting message about religions coming together to discuss AI was not where I expected the book to go, and I'm not quite sure how I feel about that being the conclusion (given that senior figures in a religion aren't necessarily 'ordinary people'), but I do appreciate that this was a book about AI with a lot of things I'd not read about before, or at least not in this form.
Given the current hype and fear around AI, Code Dependent is likely to become a much-talked-about book, offering people a different way in to reading and thinking about artificial intelligence and what it means for our lives.
"...when I set out to write this book, I wanted to find real-world encounters with AI that clearly showed the consequences of our dependence on automated systems."
Code Dependent examines the various ways in which AI is affecting people across the globe.
Murgia reports on everything from exiled activists in China and doctors in rural India, to food delivery drivers in the US and content moderators working for all the social media giants.
She explores how algorithms are used for crime prediction, medical treatment, job application filtering, children’s education, and more.
She also discusses digital colonialism, urging readers to think about who’s taking information from whom, whether it’s being taken consensually, and what’s being done with it.
One of the most compelling chapters in this book covers deepfake pornography and the people – mostly women – who are victims of it. Murgia speaks to a defense attorney who represents victims of this kind of assault. And she says it’s wrong to call it “digital assault”, because while the assault happens in a digital space, the victim is not digital. They are human.
The big takeaway from this book is that how algorithms are used is increasingly becoming a human rights issue, and it’s going to get more complicated from here. If you’re at all interested in our future with AI, Code Dependent is a must-read.
Footnote: This book would make for a stellar reading experience when paired with Supremacy by Parmy Olson.
--
My heartfelt thanks go out to the kind people who generously sent me a copy of this book.
No matter how exceptional a tool is, it only has utility when it preserves human dignity.
Does generative AI preserve human dignity?
In a word, no.
From its exploitative roots in the rare-mineral and metal mines of the DRC, to the workers helping train statistical analysis tools to detect digital items more accurately, to the driver delivering your goods, to you asking for the nearest Costco or whether x symptoms match y diagnosis, ChatGPT and other generative and analytical AI systems are built to extract and build upon data, funneling the profits to the highest rungs of the company.
Murgia notes that there are some valuable applications of AI, both generative and analytical, but these applications and the AI driving them are only as good as the humans behind the code – and our internal biases can be amplified into horrifying consequences.
This book was a stark but thoroughly researched (if quick to become outdated, given how fast the tech is progressing) overview of AI: the upsides and the many, many pitfalls, which overwhelmingly impact marginalized communities and have far-reaching consequences.
Nevertheless, despite the dark beginnings, AI has the potential to help us. Instead of usurping our autonomy, it could elevate us. We just have to ensure that no one is crushed while a minority are allowed to climb.
Thank you to NetGalley and the publisher for the eARC!
Full Rating: 4.75 stars rounded up
In "Code Dependent: Living in the Shadow of AI," Madhumita Murgia presents a meticulously researched and deeply unsettling exploration of how artificial intelligence is reshaping humanity and society. Murgia’s central question, “How is AI changing what it means to be human?� is both poignant and haunting. This book doesn’t just delve into the mechanics of AI but interrogates its broader implications, echoing the impact of past technological revolutions like industrialization and the advent of social media.
One of the most striking aspects of Murgia’s work is her ability to articulate complex concepts such as “surveillance capitalism” and “data colonialism” in an accessible and engaging manner. The idea that "if a product is free, you become the product" is a powerful reminder of how deeply entrenched we are in capitalist systems that commodify our very existence. The book's exploration of how AI perpetuates these systems is both eye-opening and alarming, drawing parallels to historical abuses and exploitations driven by profit motives.
Murgia’s exploration of the labor dynamics within the AI industry is particularly compelling. She vividly illustrates how the tech industry thrives on cheap labor, reinforcing capitalism’s relentless drive for cost minimization at the expense of human dignity and safety. The anecdotes about exploited laborers and the emotional toll on AI trainers—who suffer from nightmares and depression due to their work—are harrowing. These stories underscore the human cost of our technological advancements and the ethical implications of prioritizing profit over people.
The book’s examination of AI’s inherent biases is equally powerful. Murgia deftly exposes the fallacy of AI’s supposed objectivity, demonstrating how human prejudices are embedded within these systems. For instance, the discussion around facial recognition technology and its disproportionate impact on marginalized communities highlights the dangers of unchecked AI deployment. The chilling reality that AI could exacerbate existing inequalities and perpetuate systemic biases is a central theme throughout the book.
Throughout the book, Murgia does not shy away from the darker aspects of AI's impact on society. She examines how AI-driven surveillance affects culture and individuality, raising critical questions about privacy, autonomy, and the potential for a homogenized society. The surveillance of the Uyghur people, for example, serves as a grim illustration of how AI can be weaponized to control and oppress.
"Code Dependent" is a sobering yet essential read for anyone interested in understanding the profound and often troubling implications of AI. Murgia’s ability to weave together technical analysis, ethical concerns, and human stories makes this book a compelling call to action. As we hurtle towards an increasingly AI-driven future, Murgia reminds us that none of us are truly free until all of us are free, urging us to scrutinize and challenge the systems that shape our lives.
📖 Recommended For: Tech Enthusiasts Interested in Social Justice, Readers Who Appreciate Intersectional Activism, Fans of Investigative Journalism, Those Curious About the Intersection of Technology and Identity.
🔑 Key Themes: The Ethics of Artificial Intelligence, Intersectionality in Technology, Marginalized Experiences in Tech, Social Justice and Cyber Activism, Exploration of Identity in Digital Spaces, Surveillance Capitalism.
Firstly, I'd like to express my gratitude to Henry Holt & Co for providing me with the ARC. I had been searching for a copy since it was longlisted for the Women's Prize for Nonfiction and out of all the books, this topic resonated with me a lot.
I've read approximately 50%, up to Chapter 5, "Your Freedom," and have also read the "Epilogue," and I have now decided to DNF it.
The start was a little long-winded, with a broad narrative voice that irked me a little, but I didn't mind it that much and continued. However, all of the chapters seemed to lack depth and nuance. The first chapter, "Your Livelihood," was much better than the others in this respect, providing a balanced view with pros and cons. The second chapter, "Your Body," discussed the important topic of deepfake technology, how it's weaponized to promote non-consensual pornographic content, and the flimsiness of internet regulation. This chapter was interesting to read, especially because of the addition of victims' and activists' viewpoints, rather than focusing solely on commentary about the nature of internet regulation (which was discussed in quite minute detail). However, things went downhill in Chapter 3, where the author talked about the use of facial recognition. Not only did the author fail to frame the chapter from a critical standpoint, but her points seemed to echo "facial recognition is always bad" too many times. It didn't sit well with me that someone who's supposed to be an "expert" oversimplified things so much. This was compounded by the author cutting off reporting at crucial points and framing events ambiguously, such as her account of the 2021 Indian Republic Day protest, when farmers stormed the Red Fort, destroying public property and hoisting the union's flag and a Sikh religious flag. Chapters 4, "Your Health," and 5, "Your Freedom," were better than the earlier chapters but still echoed the same sentiments and repeated facts from other chapters, along with dabbling in long, emotional narratives meant to evoke empathy.
Then I peeked at the "Epilogue," where I had hoped the author would discuss some consolidated set of solutions that could be implemented to stop AI from being used to exploit people, and darn, it was quite the disappointment. It offered no real solution but propagated the same arrangement the author had always warned us against: wealthy big corporations and individuals dictating the lives of marginalized people, only this time in the form of even more corrupt religious institutions (which thrive on dogmatism, fearmongering, and economic corruption) signing a treaty on the regulation of AI. Not to mention, the treaty seemed toothless and ornamental. This portion of the book was a huge letdown. This would have been in the 3-4 star area if there weren't so many glaring faults. I am rating it 2 stars out of 5 because this book did give me some food for thought, albeit a very tiny amount, as well as important factual information that I should care more about. You can read this book if you want a discussion only of the worst aspects of being dependent on AI.
Interesting as an introduction to issues that arise from contemporary technology, but not really about AI. It's a journalistic, story-by-story account of people who've been screwed over by the lack of legal accountability in the tech industry. Very accessible and interesting if you're not aware of these things by now. But personally, I have a deep interest in the subject and found it too simplistic and already a little outdated for my taste. It's still good and I would still recommend it, BUT I would first recommend my favorite books on AI so far – Kate Crawford's Atlas of AI and Matteo Pasquinelli's The Eye of the Master – for more in-depth and technical analyses of AI.
A thoughtful and nuanced investigation into the impact of artificial intelligence on human lives.
Murgia highlights how the use of algorithms and artificial intelligence often profits from the work and lives of vulnerable and marginalised people. A very interesting read!
[audiobook] 3.5. The first chapter is definitely the weakest part of the book; it gets better afterwards. It describes non-obvious aspects of how AI-based technologies affect society, primarily through the lens of the risks. I'm not convinced by all of the author's theses, but I don't have to be ;) The book introduced me to several concepts I hadn't previously considered and that I'll keep in mind when thinking about the topic, so it was a worthwhile read.
There has been an explosion of books about the impact of current AI/ML on all facets of society in the last ~5 years. Murgia's 2024 contribution, Code Dependent, focuses on the interface between humans and AI/ML technology; she travels and reports extensively in the developing world and among immigrant communities in Western countries who try to eke out a living as part of the AI/ML workforce, i.e., the eponymous "code dependent." The first part of the book focuses on people in developing countries and refugees who work to train AI/ML through tasks like image recognition, which reminded me of how Dr. Fei-Fei Li built her ImageNet database by employing college students and then Amazon Mechanical Turk workers to label images. Later sections of the book focus heavily on the gig economy and people who try to make a living wage working as independent contractors for companies like Uber, UberEats, etc. (this topic has been covered pretty extensively in books like James Bloodworth's Hired, and is evocative of older, non-tech-focused books like Barbara Ehrenreich's Nickel and Dimed).
Further reading on hidden job sectors: books by Rachel Slade, Alden Wicker, Oliver Franklin-Wallis, Rose George, and Christopher Mims.
My statistics: Book 169 for 2024; Book 1,772 cumulatively.
Shortlisted for the @womensprize for Non-Fiction, Code Dependent by the AI Editor of the Financial Times Madhumita Murgia is a remarkable book charting the rise of generative AI and its impacts on humanity, primarily through the prism of some of the world’s poorest communities.
Murgia meets gig workers, doctors, mothers, tech workers, teenage girls, activists and many others, most of them living and working in marginalised communities around the world, from Bulgaria to Kenya, China to the Netherlands, and shines a light on the insidious effects AI algorithms have on people who are already disenfranchised and poor.
Murgia masterfully navigates complex concepts, making them accessible to the lay reader. Her insightful analysis sheds light on the profound impact of code on our daily lives, and will make you think twice about the apps you use, the companies you patronise, and the impact AI has on your life, your health and your children’s education.
With thought-provoking narratives and meticulous research, I found this a more profound and meaningful read than Doppelganger by Naomi Klein, another shortlisted book that, while also interesting, gets a little repetitive. There’s also nuance and balance here; while code is not neutral, it’s not all bad either.
I listened to Code Dependent on audiobook on Audible. There’s a bonus interview at the end with the author which is interesting; I found it much more optimistic about the future of humanity than I did the book itself! I’d be delighted to see this one scoop the big prize. 4.5/5⭐️
“While I remain actively optimistic about the social value of AI, I believe no matter how exceptional a tool is, it only has utility when it preserves human dignity.”
Code Dependent (2024) by Madhumita Murgia was the perfect book to read consecutively with Weapons of Math Destruction (2016) by Cathy O’Neil and Artificial Unintelligence (2018) by Meredith Broussard. Whereas the latter books taught me that artificial intelligence (AI) is not actually ‘intelligent’ and that the use of proxies is harmful, Murgia explores its social consequences. In ten chapters, she explains how the use of algorithms leads to a loss of agency – the capacity to act independently and make choices – and contributes to social injustice.
Exploitation of data workers. In the first chapter, the author examines the activities of big tech companies in developing countries. Their promise of self-learning machines is a commercial one; in reality, feeding AI with the right data is human work, often outsourced to low-wage countries. Unskilled workers are tasked with data annotation (such as labelling photos) and content moderation (e.g. removing pornographic or violent material). While working conditions are better than in sweatshops, Murgia argues that they are still unfair: complaints can result in sanctions, such as (temporary) removal from a platform, cutting workers off from their income.
Data colonialism. Following Nick Couldry and Ulises Mejias, the author uses the term ‘data colonialism’ to describe the exploitation and control of personal data by powerful entities like big tech companies, who use these data for their economic or political interests. There are some harrowing examples in developing countries, where companies, for instance, promise free healthcare to local communities. While these companies profit from vast amounts of data, the communities often don’t benefit from algorithms that were not designed for them. Other examples are closer to home, from deepfakes used to hyper-sexualise and intimidate women to shady algorithms mistreating Deliveroo or Uber couriers. Surveillance raises questions about discrimination and stigmas; even monitoring for ‘social’ purposes is dangerous, as the technology is usually not empathic but punitive. (On this point, I found O’Neil more convincing.)
Cast Away. In the final chapter, Murgia zooms in on ChatGPT and language proficiency. By training AI to use ‘human’ language, the industry has managed to simulate intentions and emotions, further contributing to the smokescreen that obscures the black box. From this book, I learned to compare ChatGPT to the volleyball Wilson from the movie Cast Away: in the end, humans project their own emotions onto objects.
I’d highly recommend this book to anyone interested in learning more about AI and ethics. Find out for yourself why Franz Kafka’s work is more relevant than ever.
Code Dependent explores the complex relationship between humans and the algorithms that increasingly govern our lives. The book breaks down how deeply technology has become intertwined with our daily routines, from social media feeds to job searches, and how our growing reliance on code shapes society in ways we may not fully understand.
The author's skill in explaining complex concepts in a clear, engaging manner is a standout feature of the book. This makes it accessible to all readers, regardless of their technical background. It’s both informative and eye-opening, revealing how algorithms influence everything from the news we see to the opportunities we’re presented with. The book also thoughtfully examines the ethical implications of our increasing reliance on these systems, particularly how bias can be embedded and disproportionately affect marginalized groups.
One of the book’s strengths is its ability to balance technical depth with real-world examples, helping readers grasp algorithms' power and potential dangers. The writing is sharp and direct, often prompting reflection on how much control we’ve surrendered to the invisible systems that shape our choices and behaviors.
Overall, Code Dependent is a must-read for anyone interested in the intersection of technology, society, and ethics. It’s a timely and thought-provoking exploration of how our reliance on algorithms is shaping the future, for better or worse, and is particularly relevant to current societal issues.
I received a copy of this book in exchange for an honest review.
Code Dependent sheds light on the socioeconomic and human rights issues amplified by AI. Murgia gives us good journalism and great storytelling, but overall the book doesn't go much beyond what might be considered admiring the problem. It takes the reader through the lives of digital-age gig workers and data sweatshops, highlighting how they're cheated by big tech. It also demonstrates how this work is used to target the most vulnerable groups in society, increasing inequality in all aspects of life. Its tone resonates with the other book I'm currently reading (Zuboff's Age of Surveillance Capitalism). It serves to create awareness around the tech and algorithms we live by (Algorithms to Live By is another good read on the topic). Brian Christian's Alignment Problem is on my TBR, and I hope that will be a better read. Code Dependent ends with simple recommendations; it is impressive in its stories from across the globe but a little patchy in its claims about and coverage of AI.
Damn, AI (artificial intelligence) is one of the most interesting topics to me, in both fiction and non-fiction. But this book isn't really about artificial intelligence; it's more about the people working on it. I'm not saying it wasn't interesting, but it was definitely not what I expected.
My local library has bought the entire Women’s Prize for Non-Fiction shortlist, and I’m gradually reading my way through it. Madhumita Murgia's Code Dependent: Living in the Shadow of AI is currently second in my rankings, behind A Flat Place but ahead of the actual winner, Doppelganger, which I felt would have been better as a long essay. As the title suggests, this one is a series of case studies of the impact of AI systems on people’s lives. At first, I found Code Dependent too journalistic and too familiar. The first three chapters showcase material I’d already seen in news reporting and on social media, dealing with ‘deepfakes’, face recognition apps, data-tagging jobs and the hideousness of getting workers in the Global South to filter out violent material from our social media feeds. All important issues, but I not only knew about them but felt they’d been addressed better in fiction, from Cory Doctorow’s prescient For The Win to Lisa Ko’s short story ‘The Contractors'. The last couple of chapters, on legal and societal frameworks, were also too broad-brush for me, and I was frustrated by a throwaway sentence that referenced a much more interesting story that Murgia wrote for the Financial Times, about a woman who challenged a new algorithm the NHS uses to allocate livers for transplant (I imagine the FT didn’t allow her to reproduce it here, but such a shame!).
But the middle of the book is much stronger, with great chapters on how the Uber app screws over riders, how AI-powered diagnosis tools can improve healthcare in rural India, and how data collected from families living in Salta, in north-west Argentina, was supposed to improve outcomes for teenage mothers but ended up surveilling families pointlessly. I was most struck by Murgia’s case study of the ProKid machine learning system used in Amsterdam, which collates a list of young people supposedly at risk of committing crimes but, unsurprisingly, both over-represents teenagers of colour and labels them in a way that makes things worse. Not only did young people on the list feel set up for failure, but the list was actually used by drug gangs to recruit ‘easy targets’. So much resonance here with the way young people were policed in England and Wales in the inter-war and post-war periods, when, even though there was no AI, their family circumstances and supposed vulnerabilities were used to determine what happened to them in the criminal justice system. So Code Dependent may be patchy, but it’s worth reading, and I was especially impressed by its global reach.
This is a fascinating book. Rather than being technical, it is about how the increasing use of imperfect technology impacts people's lives. Each chapter focuses on a different topic. We meet victims of deepfake revenge porn, Uber Eats drivers who are shorted money, and workers in developing countries who are paid a few dollars a day to be content moderators for social media platforms, often at the expense of their emotional well-being. The author makes a convincing case that, as it stands now, AI benefits the wealthy and discriminates against people of color. Essential and alarming.
An accessible, yet gutting insight into one of the buzziest topics that has taken over the cultural zeitgeist (and, indeed, all of our lives). CODE DEPENDENT brings together a plethora of aspects of our lives and culture that AI influences, and, indeed, often damages and endangers.
I thought this was a good book; I learned some things for sure. It wasn't a techy book: it focused on the personal stories of those working with AI, often in the lowest positions, such as content moderation and data collection. It showcased people working against the creeping influence of AI but also regular people who have no choice but to work with it (think Uber drivers).
The book ends with some recommendations for the reader to take into their life: questions to ask yourself when adopting technology (both at work and in your personal life), and useful ways to think critically about how a technology has been developed and who it impacts. Depending on how much you read about tech, these might not be new recommendations, but I appreciate a solution-based ending.
It has a great reference section and I've collected some further reading.
This is a useful and important read if you want to know more about how AI impacts humans, especially marginalized people. It's more valuable for the information it provides than for the read itself. Maybe it's just because I'm not the most seasoned nonfiction reader, but there were so many cases and individuals to keep track of, and it changed up in each chapter. I also was hoping for more analysis of the environmental cost of AI; it wasn't mentioned here at all. Good enough read if you're ignorant about AI, as I was.
A stunner of a book that goes into the dangers of AI and how AI has become a digital Frankenstein's monster. Madhumita Murgia divides "Code Dependent: Living in the Shadow of AI" into chapters that explore the effects of AI on underrepresented people: data miners, deepfake victims, shift workers, the disenfranchised, the poor, children, minority populations, etc. She methodically details how uses of AI have contributed to what she terms data colonialism – "human lives converted into continuous streams of data." Corporations will often use the talents of "low-wage workers" to mine data and do the work needed for these AI programs to prosper. However, this is another form of exploitation, as the companies hire vulnerable people (usually refugees or immigrants) who often have no other source of income. It’s maddening to read Murgia’s interviews with these workers, and how they are treated by the companies who profit so much from their work. As Murgia writes, “Data workers are the invaluable human links in the global AI supply chain.”
Murgia shows us how Uber drivers, doctors, researchers, teenagers, and mothers struggle with and are often harmed by AI and its uses. We read about women who are stalked and harassed by men who use their images in pornographic deepfakes. Delivery drivers who are cheated and underpaid. Young girls in an impoverished part of Argentina who are scanned into databases because they are allegedly at risk of teenage pregnancy (another faulty and cruel use of AI). We see young immigrant boys treated as criminals by programs that use AI. Again and again, Murgia highlights how prevalent AI has become in our society, and how it has become a tool for corporations. As Murgia writes in the book, corporations now have more power than governments.
If there's a tiny drawback to the book, it's that the epilogue is a bit too long, as it makes points Murgia already covered. By the end of the epilogue, Murgia has made her case and she leaves us with ten important questions. The book is a warning to all of us who don’t pay attention to the dangers because we think it won’t impact us. It has and it will. This book is essential reading for today’s world.
The author makes a false promise in the title that this book will discuss AI. There are two chapters that discuss actual AI in the way the term is generally understood today; otherwise the author uses "algorithm" and "AI" interchangeably. I appreciated the author providing counterarguments to her own arguments, but I found the counterarguments to be stronger than her own.
A lot of the technology discussed is far from cutting edge for 2024, such as Uber job matching and allocation.
Nitpicky, but I wish I had counted the number of times the word "Kafkaesque" was used 😂
I felt that this book takes advantage of the recent alarmist headlines that have garnered lots of attention. Worse, I felt the author might have been disingenuous in stating that she has an interest in *supporting* new technology rather than ringing alarms as a tech journalist (so therefore, she says, the fact that she is raising alarms is particularly alarming). On the contrary, a journalist will want to tell compelling stories, and if there were no concerns with the new technology, there wouldn't be a good story to tell.
Absolutely fascinating insight into the construction and implementation of AI from the viewpoint of the citizens building it, and those largely negatively affected by it (as opposed to the more common Silicon Valley reporting we see). It gave me a greater understanding of the components of AI I hadn’t considered previously, as well as the vast scale of data colonisation at play.
This is how non-fiction should be written: give us the facts but make it personal. Murgia, a UK journalist who focuses on the impact of AI and technology on our culture, provides us with a behind-the-scenes look at the thousands of people who feed the monolithic AI computers with data (and who will eventually become redundant), and at the perils of uncontrolled surveillance and digital deepfakes. The one flaw in the book is that it felt like Murgia was pounding us with a liberal agenda, and I consider myself a liberal. The stories she tells are generally of data colonialism: how the poor and marginalized are exploited by technology. It would have felt more balanced to show some of the benefits of AI, but obviously that was not the intent of this book. I found it overall quite engaging and terrifying. We all know how insidious AI is and how integrated it is into our lives. The Chinese example is a view of our possible future if we don't stop relinquishing our moral authority to machines and figure out a way to regulate technology. With the recent advent of ChatGPT and its problematic "hallucinations", we are in the midst of grappling with exponential change using technology we don't completely understand and whose expertise is concentrated in too few people. The other terrifying reality is that Elon Musk is the player pushing the keys behind our idiot President, and we have just begun to see those consequences.
Favorite quotes:
--The way in which algorithms have been introduced into society has caused an erosion of our individual feelings of autonomy, but also a diminishing of the power and agency of those we trust as experts – transfiguring our society.
--As the use of AI becomes widespread, its users must become wary of automation bias, a widely studied phenomenon in which people start to become over-reliant on automation to do their jobs.
--A 2019 UN report highlighted the datafication of government functions... Digital technologies, including AI, which determine who should get social protection and assistance, simply predict, identify, surveil, detect, target and punish the poor.
--The science fiction writer Ted Chiang says we should change the term Artificial Intelligence to Applied Statistics because AI is not intelligent. With AI it feels like there is someone at the other end. But there isn't.
--The Arab poet Abu Al-Fath al-Busti: "Man toils like the silkworm which spends its life weaving, only to perish, confused, inside its woven creation."
Once again, do the benefits of AI supersede the negatives? I felt this actually raised points I hadn’t read about AI before. AI still can’t learn independently; it still relies on human guidance and power. Data workers are key to AI functions in the global supply chain, yet they are exploited, often marginalised, and vulnerable to the shifting power dynamics surrounding data ownership. Do you own your own data, or is it now a form of rentier payment to monopolistic technology companies? This book argues we are now entering a form of data colonialism, where a lot of data is being collected from citizens, with little value added on the ground and maximum value extracted by tech corporations. The analogies between the tech moguls of today and the East India Company in their bid for quasi-statehood and power are on point. Finally, a full quote: “Achille Mbembe, the Cameroonian historian and philosopher, who studies the after-effects of colonialism. Mbembe invented the term ‘necropolitics’, which describes the power of political institutions to dictate which citizens are the most precarious in a society. These vulnerable citizens, according to Mbembe, live in so-called deathworlds – enclaves in which they no longer exercise control or retain autonomy in their lives.” Will AI bring us closer to a universalised deathworld?