Embracing the Dark Side: How Tech’s Tilt Towards Dystopia Is Captivating Minds


In the past year or so, since the public release of OpenAI’s ChatGPT, people have been making their peace with the idea that an omnipotent AI might be on the horizon. Sam Altman, the company’s CEO, “believes that people need time to reckon with the idea that we may soon share Earth with a powerful new intelligence, before it remakes everything from work to human relationships,” my colleague Ross Andersen reported after the two had several conversations. “ChatGPT was a way of serving notice.”

But OpenAI isn’t Altman’s only project, and it’s not even his only project with ambitions to change the world. He is also a co-founder of a company called Tools for Humanity, which has the lofty goal of protecting people from the economic devastation that may arise from AI taking human jobs. The company’s first major project is Worldcoin, which uses an evil-looking metallic orb—called the Orb—to take eyeball scans from people all over the world.

Those scans are converted into unique codes that confirm you are a real, individual human, not a bot. In the future, this will theoretically grant you access to a universal basic income parceled out through Worldcoin’s cryptocurrency, WLD. (You will want this because you will not be able to find work.) More than 2 million people in 35 countries have been scanned already, according to Tools for Humanity’s World ID app. Although it’s not yet available in the United States, the WLD token has been distributed elsewhere, and the company has also recruited users through cash incentives in countries such as Indonesia and Kenya.
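To make the general idea concrete, here is a minimal, purely illustrative Python sketch of proof-of-personhood by iris code: a biometric template is reduced to an opaque code, and enrollment succeeds only if that code has not been seen before. This is an assumption-laden simplification, not Worldcoin’s actual scheme; real iris matching is fuzzy rather than an exact hash, and every name below is hypothetical.

```python
import hashlib

# Illustrative sketch only: NOT Worldcoin's actual iris-hashing algorithm.
# A registry of codes already enrolled (in reality this would be a database).
registered_codes: set[str] = set()

def iris_to_code(iris_template: bytes) -> str:
    """Derive a fixed-length opaque code from a raw iris template."""
    return hashlib.sha256(iris_template).hexdigest()

def enroll(iris_template: bytes) -> bool:
    """Return True if this template appears to belong to a new, unique person."""
    code = iris_to_code(iris_template)
    if code in registered_codes:
        return False  # code already seen; treat as a duplicate enrollment
    registered_codes.add(code)
    return True

# Example: the same template enrolls once, then is rejected as a duplicate.
sample = b"example-iris-template"
print(enroll(sample))  # True
print(enroll(sample))  # False
```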

In its coverage of the Orb, The New York Times made a reference to the 2002 sci-fi thriller Minority Report, in which Tom Cruise must replace his eyeballs in order to evade a techno-police state he helped build. On social media, people have called the concept “scary,” “nightmare fuel,” and “blackmirror asf” (asf meaning “as fuck”). Even Vitalik Buterin, a co-creator of the Ethereum cryptocurrency and a supporter of the project, acknowledged in a blog post its “dystopian vibez.” These reactions aren’t anchored in the concept of a UBI supplied through cryptocurrency, or in the idea that iris verification might someday be necessary to differentiate bots from people (though plenty of legitimate criticism has been made of both those things). No: It’s because it’s an orb and it’s scanning your eyes, specifically to prepare you for a future of which many people are reasonably terrified.


Ordinarily, a solid idea for marketing something new would be to position it as the opposite of dystopian. This is what Apple did in its 1984 Super Bowl commercial, which was 1984-themed and directed by Ridley Scott. It portrayed IBM as Big Brother, a force for vaguely fascist conformity and dreariness. Meanwhile, the Macintosh was a symbol of vital energy and freedom. That ad is famous at least in part because it’s very edgy: It’s really a risk to feature lockstep-marching skinheads in a commercial at all, even if what you’re ultimately saying is that your product can be personified as their opposite (a sprinting female model in athletic shorts).

But recently, even Apple has acknowledged dark times ahead—“I’m Pretty Sure Apple Knows We’re All Going to Die Soon,” the reporter Katie Notopoulos summarized last year, after the company revealed its new satellite-emergency-calling feature and an Apple Watch that can withstand extreme weather. And more often, the tech companies lean in—they say, We are the dystopia. Nobody forced Tools for Humanity to go with the Orb. Nobody ever makes tech companies glaze their products and their marketing with upsetting science-fiction or fantasy references, but they do it all the time. They toy with us.

Peter Thiel’s highly secretive data-analytics company, Palantir, was named in reference to the all-seeing eyelike stones primarily used by the evil characters in the Lord of the Rings series. Mark Zuckerberg renamed his company Meta and went all-in on the metaverse, borrowing the term from the 1992 Neal Stephenson novel, Snow Crash, a book that is not exactly pro–virtual reality. (It’s about how awful it would be to live in an even more class-segregated society, made possible by technology that originally sounded egalitarian.) Even Google’s onetime “Don’t be evil” motto was a bit tongue-in-cheek, maybe. It suggested, at least, that the company had the capacity to do a lot of evil if it wanted to.

Probably the most famous instance of the dystopia-taunting phenomenon is the meal-replacement drink Soylent, which debuted in 2013 and was named after a product in the 1966 sci-fi novel Make Room! Make Room! The 1973 film adaptation of this book, Soylent Green, is better known. In the book, the soylent is soy and lentils; in the movie, the soylent is smushed-up people. The company openly winks at the gloomy connotations of knocking back a joyless blend of nutrients to stay alive while opting out of the time-consuming process of selecting and eating foods that one might actually enjoy. To announce a new mint-chocolate flavor, Soylent created ads promoting the hashtag #plantsnotpeople. “Clearly I’m wanting someone to investigate it a little deeper if I’m calling it Soylent,” a co-founder, Rob Rhinehart, told Ars Technica.

Buying a bottle of Soylent is a consumer choice. But for tech companies, inevitability is the point. They shape the world we live in, whether we want them to or not. The basic premise of Worldcoin is that everyone will need to be scanned, not that everyone will want to be. The Orb is not a playful nutrient slurry; it is not meant to be a wink.

I asked Tiago Sada, head of product at Tools for Humanity, about the device’s appearance. He told me it is meant to seem “friendly” and “familiar.” When you set it down, it looks upward at 23.5 degrees, the same angle as Earth’s axial tilt relative to its orbit around the sun. Other iris scanners are “super creepy,” Sada said. “You feel like you’re going to the doctor.” I asked him: Say that you hadn’t built the Orb and were just coming across it for the first time; what would it look like to you? A Christmas ornament, he decided. To other people, it looks like a disco ball, he said. They love it. When John Patroulis, the chief marketing officer for Tools for Humanity, brought an inactive Orb to The Atlantic’s office so that I could hold it, I also asked him if he thought there was anything scary about the Orb’s appearance. No. “I think it looks cool,” he said.

In fairness, the company’s designers are in a tight spot: What should an object look like if it’s scanning your eyes to help bring about a future in which people have lost their jobs to artificial intelligence and are being paid a universal basic income as a result? I wouldn’t want it to be cute. I wouldn’t want it to be scary. Probably I just wouldn’t want it. But now that it’s here, I’m fascinated by the Orb. So I downloaded an app and made an appointment to be scanned.

On a Friday morning, I walked over to the Meatpacking District and was buzzed in to a co-working space run by a venture-capital fund. The Orb was sitting on a stool in a corner of the room, near an open supply closet. Truthfully, it did look friendly. The upward tilt of its little face made it appear curious. (Anything can be anthropomorphized!) An Orb operator named Nick walked me through the process. In the World ID app, I checked a few boxes saying I understood what was happening. Then I checked a box saying it was okay for the company to store my iris photos and use them in its training data. I did this because there was a person standing next to me and I didn’t want to seem stingy. I’m an organ donor. I always tip. And I didn’t want to be rude to the machine.

Nick held the Orb…
