We build a lot of technology and push it out into the world. When things go well, we rush to take credit for what we did. But when things go wrong, we hide our heads in the sand. This isn’t just about ignoring negative outcomes — it’s about maintaining the status quo.
Whenever I write a critical piece about technology and its impact on society, a certain kind of troll surfaces. I like to call them the “techno-whataboutist.” Their argument is always the same: “[some person] had the same concerns about [some established technology — the book, the printing press, TV, newspapers, radio, video games, cars] a long time ago, and things turned out just fine, so stop worrying.”
And it’s not just no-name, trollish commenters who run down this path. Nir Eyal pulled the same shenanigans in his piece about screens and their impact on kids. And Slate ran an entire piece on the history of “media technology scares” that, according to the author, never panned out. Both pulled out one of the techno-whataboutist’s favorite examples:
The Swiss scientist Conrad Gessner worried about handheld information devices causing ‘confusing and harmful’ consequences in 1565. The devices he was talking about were books.
On the surface, it’s easy to laugh at Gessner, but our relationship with technology and the way it impacts our world is complicated. Nothing is black and white. It’s all gray. If we ever hope to have a healthy, sustainable relationship with the things we create, we have to be willing to dive into those gray areas. The techno-whataboutist’s goal is to avoid all that.
Traditional whataboutism is the deployment of a logical fallacy designed “to discredit an opponent’s position by charging them with hypocrisy without directly refuting or disproving their argument.” For example, a traditional whataboutist might try to dismiss climate activism by calling out that Greta Thunberg still rides in cars (hypocrisy!). This kind of tactic was a favorite propaganda tool of the Soviet Union during the Cold War. And while techno-whataboutism doesn’t allege hypocrisy, it represents the same kind of rhetorical diversion, one designed to act as a cudgel to beat back questions about the complex nature of our relationship to technology.
The first big problem with techno-whataboutism is that it presupposes that the place we have ended up, as a society, is a good one. There is no power in Gessner’s book example unless you believe everything is fine.
To even be able to make a statement like, “people worried before, but everything is fine now,” takes a significant level of privilege. Perhaps that’s why, in my experience, the vast majority of the people who present this argument are white men.
Sure, for many of us white guys, things are pretty good. But this is not the case for everyone. The positive outcomes associated with the advance of technology are unevenly distributed and there are often significant winners and losers in the systems we architect and the things we produce.
Let’s continue with books as an example. The invention of the book made vast amounts of knowledge both available and easily transferable. It’s hard to argue against the net positive impact of that change. But if we just stop there, we willfully turn a blind eye to the full picture.
The two most distributed books in history, the Bible and the Quran, while providing spiritual support for many people, have also helped spark a staggering amount of death, destruction, oppression, violence, and human suffering, often focused on marginalized groups and those who don’t subscribe to the beliefs these books contain. Mein Kampf helped catalyze the rise of the Nazis and ultimately the Holocaust. Mao Zedong’s Little Red Book, the third-most distributed book in history, arguably helped catalyze the Cultural Revolution, resulting in the deaths of millions of people.
The capabilities that made books a transformative, positive technology also made them weapons for propaganda and abuse on an unprecedented scale. So was Gessner wrong to worry about the impact of books? I don’t know about you, but I’d put indoctrination on a shortlist of “confusing and harmful” effects.
I’m not suggesting that we undo the invention of books or that the positives of technology should be discounted. But the idea that the only way to think about technology is in a positive light ignores the complexity inherent in technological progress. By doing so we lose a depth of conversation and consideration that leaves us open to repeating past mistakes and reinforcing existing power structures. For example, TV, radio, and now social media have mirrored many of the positive AND negative impacts of books on an exponentially accelerating scale, not to mention that each new technology has piled on its own unique set of issues.
The techno-whataboutists practice a special brand of what I like to think of as “technological nationalism,” where they assert that all innovation is “progress,” regardless of the full outcome. This thinking keeps us locked into an endless loop where our technology changes but the political and economic status quo remains the same. The people who benefit continue to benefit and the people who don’t, don’t. We fix nothing and we disrupt everything, except the things that actually need disruption.
This brings me to the second problem with techno-whataboutism: The past is not a proxy for the future. Comparing a book to a smartphone is like comparing a car to a skateboard. Sure, they both have wheels and can get you from point A to point B, but that’s about as far as the similarities go. Books deliver information, as do smartphones, but the context and capabilities are on an entirely different scale. This kind of lazy logic blocks us from considering the specific nuances of new technology.
Context changes. The power, scale, and interconnectedness of our systems grow. We move from linear impacts to exponential impacts. The world is not as it was. The question becomes, when does it matter?
The consequences of our creations fall unevenly on society, but so far, as a whole, we’ve been able to push through and ignore much of the fallout. But when do the contexts and capabilities of our technology reach a point where the consequences can no longer be ignored?
In his 1969 book Operating Manual for Spaceship Earth, the architect and futurist Buckminster Fuller argued that while the resiliency of nature has created a “safety factor” that has allowed us to make myriad mistakes in the past without destroying ourselves, this buffer would only last for so long:
This cushion-for-error of humanity’s survival and growth up to now was apparently provided just as a bird inside of the egg is provided with liquid nutriment to develop it to a certain point…
My own picture of humanity today finds us just about to step out from amongst the pieces of our just one-second-ago broken eggshell. Our innocent, trial-and-error-sustaining nutriment is exhausted. We are faced with an entirely new relationship to the universe. We are going to have to spread our wings of intellect and fly, or perish; that is, we must dare immediately to fly by the generalized principles governing the universe and not by the ground rules of yesterday’s superstitious and erroneously conditioned reflexes.
Nature’s buffer acts as a mask, hiding the true impact of our actions and lulling us into a sense of overconfidence and a disregard for the consequences of our decisions. It’s easy to ignore all of our trash when the landfill keeps it out of sight, but at some point, the landfill overflows.
Fuller was a technological optimist, but he was also realistic about the complexity of change and innovation. From his vantage point in 1969, he was able to see that we were moving toward an inflection point in our relationship with the world we inhabit. As he saw it, our safety factor was all used up, and our ability to “spread our wings” depended on a change of approach if we were to come out the other side and truly cement our place in the cosmos.
The techno-whataboutist doesn’t want to change the approach. Instead, they want you to embrace their reductionist technological nationalism, where all innovation is good, outcomes be damned. Change is impossible under this type of thinking.
We’re already starting to see the aggregate impact of our choices on the natural world, and it’s becoming harder to hide from those consequences. But nature isn’t the only thing impacted by technology. The fabric of our society, the way we live and interact, is also tightly tied to the tools we have at our disposal. I believe that, like nature, our societal structures have an inherent resiliency that has created a safety factor, one that has similarly allowed us to ignore the way our behavior, our habits, and our interactions have changed over time. But at what point does that landfill overflow? Or has it already?
Innovation is critical to our overall progress, and we have to accept that there are inherent risks in that messy and unpredictable process. But our need to invent doesn’t absolve us from being accountable for the results. The tech world loves to talk about “failing fast,” but no one ever talks about what happens after that. Who cleans up the mess we leave for society when we fail?
We take full credit for our successes. We stand on big stages and make a big show about the amazing benefits of our newest creations, but we sneak out of the party when shit goes bad. We don’t get to have our cake and eat it too.
It is possible to hold a positive view of technology while still acknowledging its downsides. And while we can’t be afraid to push the edges of what’s possible, we have to be willing to admit when things go wrong and invest in the work to fix them. Our safety factor won’t protect us forever. This is when it matters.
—
“The Problem with the Techno-Whataboutists” was originally published on Medium on January 8, 2020.