Who decides our tomorrow? Challenging Silicon Valley’s power
The numbers are staggering: Meta is offering AI researchers total compensation packages of up to $300 million over four years, with individual deals like former Apple executive Ruoming Pang's $200 million package making headlines across Silicon Valley. Meanwhile, OpenAI just raised $40 billion at a $300 billion valuation, reportedly the largest private tech funding round in history.
But beneath these eye-watering dollar figures lies a profound transformation: Silicon Valley’s elite have evolved from eager innovators into architects of a new world order, reshaping society with their unprecedented power. This shift is not just about money or technology; it marks a fundamental change in how power is conceived and exercised.
We often talk about technology as if it exists in a silo, separate from politics or culture. But those boundaries are rapidly dissolving. Technology is no longer just a sector or a set of tools; it is reshaping everything, weaving itself into the very fabric of society and power. The tech elite are no longer content with innovation alone; they are crafting a new social and political reality, wielding influence that extends far beyond the digital realm.
To break out of these siloed debates, at the end of June we convened a virtual conversation with four remarkable minds: Christopher Wylie (the Cambridge Analytica whistleblower and host of our Captured podcast), pioneering technologist Judy Estrin, filmmaker and digital rights advocate Justine Bateman, and philosopher Shannon Vallor. Our goal: to explore how Silicon Valley’s culture of innovation has morphed into a belief system, one that’s migrated from the tech fringe to the center of our collective imagination, reimagining what it means to be human.
The conversation began with a story from Chris Wylie that perfectly captured the mood of our times. While recording the Captured podcast, he found himself stranded in flooded Dubai, missing a journalism conference in Italy. Instead, he ended up at a party thrown by tech billionaires, a gathering that, as he described in a voice note he sent us from the bathroom, felt like a dispatch from the new center of power:
“People here are talking about longevity, how to live forever. But also prepping—how to prepare for when society gets completely undermined.”
At that party, tech billionaires weren’t debating how to fix democracy or save society. They were plotting how to survive its unraveling. That fleeting moment captured the new reality: while some still debate how to repair the systems we have, others are already plotting their escape, imagining futures where technology is not just a tool, but a lifeboat for the privileged few. It was a reminder that the stakes are no longer abstract or distant: they are unfolding, right now, in rooms most of us will never enter.
Our discussion didn’t linger on the spectacle of that Dubai party for long. Instead, it became a springboard to interrogate the broader shift underway: how Silicon Valley’s narratives, once quirky, fringe, utopian, have become the new center of gravity for global power. What was once the domain of science fiction is now the quiet logic guiding boardrooms, investment strategies, and even military recruitment.
As Wylie put it, “When you start to think about Silicon Valley not simply as a technology industry or a political institution, but one that also emits spiritual ideologies and prophecies about the nature and purpose of humanity, a lot of the weirdness starts to make a lot more sense.”
Judy Estrin, widely known in tech circles as the "mother of the cloud" for her pioneering role in building the foundational infrastructure of the internet, has witnessed this evolution firsthand. Estrin played a crucial part in developing the TCP/IP protocols that underpin digital communication, and later served as CTO of Cisco during the internet’s explosive growth. She’s seen the shift from Steve Jobs’ vision of technology as "a bicycle for the mind" to Marc Andreessen’s declaration that "software is eating the world."
Now, Estrin sounds the alarm: the tech landscape has moved from collaborative innovation to a relentless pursuit of control and dominance. Today’s tech leaders are no longer just innovators; they are crafting a new social architecture that redefines how we live, think, and connect.
What makes this transformation of power particularly insidious is the sense of inevitability that surrounds it. The tech industry has succeeded in creating a narrative where its vision of the future appears unstoppable, leaving the rest of us as passive observers rather than active participants in the shaping of our technological destiny.
Peter Thiel, the billionaire investor and PayPal co-founder, embodies this mindset. In a recent interview, Thiel was asked point-blank whether he wanted the human race to endure. He hesitated before answering, “Uh, yes,” then added: “I also would like us to radically solve these problems…” Thiel’s ambivalence towards other human beings and his appetite for radical transformation capture the mood of a class of tech leaders who see the present as something to be escaped, not improved—a mindset that feeds the sense of inevitability and detachment Estrin warns about.
Estrin argues that this is a new form of authoritarianism, where power is reinforced not through force but through what she calls "silence and compliance." The speed and scale of today's AI integration, she says, requires us "to be standing up and paying more attention."
Shannon Vallor, philosopher and ethicist, widened the lens. She cautioned that the quasi-religious narratives emerging from Silicon Valley—casting AI as either savior or demon—are not simply elite fantasies. Rather, the real risk lies in elevating a technology that, at its core, is designed to mimic us. Large language models, she explained, are “merely broken reflections of ourselves… arranged to create the illusion of presence, of consciousness, of being understood.”
The true danger, Vallor argued, is that these illusions are seeping into the minds of the vulnerable, not just the powerful. She described receiving daily messages from people convinced they are in relationships with sentient AI gods—proof that the mythology surrounding these technologies is already warping reality for those least equipped to resist it.
She underscored that the harms of AI are not distributed equally: “The benefits of technological innovation have gone to the people who are already powerful and well-resourced, while the risks have been pushed onto those that are already suffering from forms of political disempowerment and economic inequality.”
Vallor’s call was clear: to reclaim agency, we must demystify technology, recognize who is making the choices, and insist that the future of AI is not something that happens to us, but something that we shape together.
As the discussion unfolded, the panelists agreed: the real threat isn’t just technological overreach, but the surrender of human agency. The challenge is not only to question where technology is taking us, but to insist on our right to shape its direction, before the future is decided without us.
Justine Bateman, best known for her iconic roles in Hollywood and her outspoken activism for artists’ rights, entered the conversation with the perspective of someone who has navigated both the entertainment and technology industries. Bateman, who holds a computer science degree from UCLA, has become a prominent critic of how AI and tech culture threaten human creativity and agency.
During the discussion, Bateman and Estrin found themselves at odds over how best to respond to the growing influence of AI. Bateman argued that the real threat isn’t AI itself becoming all-powerful, but rather the way society risks passively accepting and even revering technology, allowing it to become a “sacred cow” beyond criticism. She called for open ridicule of exaggerated tech promises, insisting, “No matter what they do about trying to live forever, or try to make their own god stuff, it doesn’t matter. You’re not going to make a god that replaces God. You are not going to live forever. It’s not going to happen.” Bateman also urged people to use their own minds and not “be lazy” by simply accepting the narratives being sold by tech elites.
Estrin pushed back, arguing that telling people to use their minds and not be lazy risks alienating those who might otherwise be open to conversation. Instead, she advocated for nuance, urging that the debate focus on human agency, choice, and the real risks and trade-offs of new technologies, rather than falling into extremes or prescribing a single “right” way to respond.
“If we have a hope of getting people to really listen… we need to figure out how to talk about this in terms of human agency, choice, risks, and trade-offs,” she said. “Because when we go into the ‘you’re either for it or against it,’ people tune out, and we’re gonna lose that battle.”
At this point, Christopher Wylie offered a strikingly different perspective, responding directly to Bateman’s insistence that tech was “not going to make a god that replaces God.”
“I’m actually a practicing Buddhist, so I don’t necessarily come to religion from a Judeo-Christian perspective,” he said, recounting a conversation with a Buddhist monk about whether uploading a mind to a machine could ever count as reincarnation. Wylie pointed out that humanity has always invested meaning in things that cannot speak back: rocks, stars, and now, perhaps, algorithms. “There are actually valid and deeper, spiritual and religious conversations that we can have about what consciousness actually is if we do end up tapping into it truly,” he said.
Rather than drawing hard lines between human and machine, sacred and profane, Wylie invited the group to consider the complexity, uncertainty, and humility required as we confront the unknown. He then pivoted to a crucial obstacle in confronting the AI takeover:
“We lack a common vocabulary to even describe what the problems are,” Wylie argued, likening the current moment to the early days of climate change activism, when terms like “greenhouse gases” and “global warming” had to be invented before a movement could take shape. “Without the words to name the crisis, you can’t have a movement around those problems.”
The danger, he suggested, isn’t just technological; it’s linguistic and cultural. If we can’t articulate what’s being lost, we risk losing it by default.
Finally, Wylie reframed privacy as something far more profound than hiding: “Privacy is your ability to decide how to shape yourself in different situations on your own terms, which is, like, really, really core to your ability to be an individual in society.”
When we give up that power, we don’t just become more visible to corporations or governments; we surrender the very possibility of self-determination. The conversation, he insisted, must move beyond technical fixes and toward a broader fight for human agency.
As we wrapped up, what lingered was not a sense of closure, but a recognition that the future remains radically open—shaped not by the inevitability of technology, but by the choices we make, questions we ask, and movements we are willing to build. Judy Estrin’s call echoed in the final moments: “We need a movement for what we’re for, which is human agency.”
This movement, however, should not be against technology itself. As Wylie argued in the closing minutes, “To criticize Silicon Valley, in my view, is to be pro-tech. Because what you're criticizing is exploitation, a power takeover of oligarchs that ultimately will inhibit what technology is there for, which is to help people.”
The real challenge is not to declare victory or defeat, but to reclaim the language, the imagination, and the collective will to shape humanity's next chapter.
A version of this story was published in last week’s Sunday Read newsletter.
This story is part of “Captured”, our special issue in which we ask whether AI, as it becomes integrated into every part of our lives, is now a belief system. Who are the prophets? What are the commandments? Is there an ethical code? How do the AI evangelists imagine the future? And what does that future mean for the rest of us? You can listen to the Captured audio series on Audible now.
The post Who decides our tomorrow? Challenging Silicon Valley’s power appeared first on Coda Story.