Seeing Like a Data Structure
Technology was once simply a tool—and a small one at that—used to amplify human intent and capacity. That was the story of the industrial revolution: we could control nature and build large, complex human societies, and the more we employed and mastered technology, the better things got. We don’t live in that world anymore. Not only has technology become entangled with the structure of society, but we also can no longer see the world around us without it. The separation is gone, and the control we thought we once had has revealed itself as a mirage. We’re in a transitional period of history right now.
We tell ourselves stories about technology and society every day. Those stories shape how we use and develop new technologies, as well as the new stories and uses that will come with them. They determine who’s in charge, who benefits, who’s to blame, and what it all means.
Some people are excited about the emerging technologies poised to remake society. Others are hoping for us to see this as folly and adopt simpler, less tech-centric ways of living. And many feel that they have little understanding of what is happening and even less say in the matter.
But we never had total control of technology in the first place, nor is there a pretechnological golden age to which we can return. The truth is that our data-centric way of seeing the world isn’t serving us well. We need to tease out a third option. To do so, we first need to understand how we got here.
Abstraction
When we describe something as being abstract, we mean it is removed from reality: conceptual and not material, distant and not close-up. What happens when we live in a world built entirely of the abstract? A world in which we no longer care for the messy, contingent, nebulous, raw, and ambiguous reality that has defined humanity for most of our species’ existence? We are about to find out, as we begin to see the world through the lens of data structures.
In his 1998 book Seeing Like a State, anthropologist James C. Scott explored what happens when governments, or those with authority, attempt and fail to “improve the human condition.” Scott found that to understand societies and ecosystems, government functionaries and their private sector equivalents reduced messy reality to idealized, abstracted, and quantified simplifications that made the mess more “legible” to them. With this legibility came the ability to assess and then impose new social, economic, and ecological arrangements from the top down: communities of people became taxable citizens, a tangled and primeval forest became a monoculture timber operation, and a convoluted premodern town became a regimented industrial city.
This kind of abstraction was seemingly necessary to create the world around us today. It is difficult to manage a large organization, let alone an interconnected global society of eight billion people, without some sort of structure and means to abstract away details. Facility with abstraction, and abstract reasoning, has enabled all sorts of advancements in science, technology, engineering, and math—the very fields we are constantly being told are in highest demand.
The map is not the territory, and no amount of intellectualization will make it so. Creating abstract representations by necessity leaves out important detail and context. Inevitably, as Scott cataloged, the use of large-scale abstractions fails, leaving leadership bewildered at the failure and ordinary people worse off. But our desire to abstract never went away, and technology, as always, serves to amplify intent and capacity. Now, we manifest this abstraction with software. Computing supercharges the creative and practical use of abstraction. This is what life is like when we see the world the way a data structure sees the world. These are the same tricks Scott documented. What has changed is their speed and their ubiquity.
Each year, more students flock to computer science, a field with some of the highest-paying, most sought-after jobs. Nearly every university’s curriculum immediately introduces these students to data structures. A data structure enables a programmer to organize data—about anything—in a way that is easy to understand and act upon in software: to sort, search, structure, organize, or combine that data. A course in data structures is exercise after exercise in building and manipulating abstractions, ones that are typically entirely divorced from the messy, context-laden, real-world data that those data structures will be used to store.
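To make that flattening concrete, here is a minimal sketch in Python—every field name, class, and value invented for illustration—of how a neighborhood cafe might be reduced to a record that software can sort and filter:

```python
# A minimal sketch (every field and value invented) of how software
# flattens a messy real-world place into a tidy, comparable record.
from dataclasses import dataclass

@dataclass
class Cafe:
    name: str        # the sign out front, nothing more
    rating: float    # years of lived experience, averaged into one number
    price_tier: int  # 1-4; "cheap date spot" and "splurge" both disappear
    cuisine: str     # a single label for a kitchen that defies labels

cafes = [
    Cafe("Corner Perk", 4.6, 2, "coffee"),
    Cafe("Le Petit Jardin", 4.8, 3, "french"),
    Cafe("Thai Food Near Me", 4.1, 1, "thai"),
]

# The classic data-structures toolkit: sort, search, filter.
best_first = sorted(cafes, key=lambda c: c.rating, reverse=True)
cheap = [c.name for c in cafes if c.price_tier == 1]
print(best_first[0].name, cheap)
```

Everything the record omits—the regulars, the noise, the smell of the espresso machine—is exactly the context a data structure discards.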
As students graduate, most join companies that demand these technical skills—universally seen as essential to computer science work—and that see themselves as “changing the world,” often with even grander ambitions than the prosaic aims of the state functionaries cataloged by Scott.
Engineers are transforming data about the world around us into data structures, at massive scale. They then employ another computer science trick: indirection. This is the ability to break apart some sociotechnical process—to “disrupt”—and replace each of the now-broken pieces with abstractions that can interface with each other. These data structures and abstractions are then combined in software to take action on this view of reality, action that increasingly has a human and societal dimension.
Here’s an example. When the pandemic started and delivery orders skyrocketed, technologists saw an opportunity: ghost kitchens. No longer did the restaurant a customer was ordering from actually have to exist. All that mattered was that the online menu catered to customer desires. Once ordered, the food had to somehow get sourced, cooked, and packaged, sight unseen, and be delivered to the customer’s doorstep. Now, lots of places we order food from are subject to this abstraction and indirection, more like Amazon’s supply chain than a local diner of yore.
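Here is a hedged sketch of that indirection, with all class and function names invented: the ordering app programs against an abstract interface, so any kitchen that satisfies it can be swapped in, sight unseen.

```python
# A sketch of indirection (all names invented): the ordering app talks to
# an abstract Kitchen interface, so any backend that satisfies it can be
# swapped in without the customer ever knowing.
from typing import Protocol

class Kitchen(Protocol):
    def prepare(self, dish: str) -> str: ...

class LocalDiner:
    def prepare(self, dish: str) -> str:
        return f"{dish}, cooked by the owner you know"

class GhostKitchen:
    def prepare(self, dish: str) -> str:
        return f"{dish}, assembled in an anonymous commissary"

def place_order(dish: str, kitchen: Kitchen) -> str:
    # The app neither knows nor cares which kitchen is behind the menu.
    return kitchen.prepare(dish)

print(place_order("pad thai", LocalDiner()))
print(place_order("pad thai", GhostKitchen()))  # same interface, different reality
```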
Facebook sees its users like a data structure when it classifies us into ever more precise interest categories, so as to better sell our attention to advertisers. Spotify sees us like a data structure when it tries to play music it thinks we will like based on the likes of people who like some of the same music we like. TikTok users often exclaim and complain that its recommendations seem to uncannily tap into latent desires and interests, leading many to perform psychological self-diagnosis using their “For You” page.
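Spotify’s actual recommender is vastly more elaborate, but the underlying move—“likes of people who like what you like”—can be sketched in a few lines of toy user-to-user collaborative filtering, with all data invented:

```python
# A toy user-to-user collaborative filter (all data invented): recommend
# songs you haven't heard, weighted by how much each fan's taste
# overlaps with yours.
likes = {
    "you":   {"song_a", "song_b"},
    "user1": {"song_a", "song_b", "song_c"},  # strong overlap with you
    "user2": {"song_a", "song_d"},            # weak overlap
    "user3": {"song_e"},                      # no overlap
}

def recommend(target, all_likes):
    mine = all_likes[target]
    scores = {}
    for user, theirs in all_likes.items():
        if user == target:
            continue
        overlap = len(mine & theirs)  # shared likes = crude similarity
        if overlap == 0:
            continue
        for song in theirs - mine:    # candidates you haven't liked yet
            scores[song] = scores.get(song, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you", likes))  # ['song_c', 'song_d']
```

Note what the sketch takes for granted: that a person is a set of likes, and that taste is whatever overlaps.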
Data structures dominate our world and are a byproduct of the rational, modern era, but they are ushering in an age of chaos. We need to embrace and tame, but not extinguish, this chaos for a better world.
Machines
Historian of technology Lewis Mumford once wrote that clocks enabled the division of time, which in turn enabled the regimentation of society that made the industrial revolution possible. This transformation, once fully underway around the world in the 20th century, fundamentally changed the story of society. It shifted us away from a society centered around interpersonal dynamics and communal interactions to one that was systematic and institutional.
We used to take the world in and interpret it through human eyes. The world before the industrial revolution wasn’t one in which ordinary people interacted with large-scale institutions or sociotechnical systems. It wasn’t possible for someone to be a “company man” before there was a corporate way of doing things that in theory depended only on rules, laws, methods, and principles, not on the vicissitudes of human behavior.
Since the beginning of the industrial revolution, workers and the natural world have been subject to abstraction: abstract reason took precedence over social preferences. Knowledge about the world was no longer in our heads but out in the world. So we got newspapers, instruction manuals, bylaws, and academic journals. And we should be clear: this was largely an improvement. The era of systems—of modernity—was an improvement on what came before. It’s better for society to have laws rather than rulers, better for us to lean on science than superstition. We can’t and shouldn’t go back.
The tools of reason enabled the “high modernists,” as Scott calls them, to envision a world shaped entirely by reason. But such reason was and is never free of personal biases. It always neglects the messiness of reality and the tacit and contextual knowledge and skill that is needed to cope with that mess—and this is where trouble began to arise.
Workers were and are treated as cogs in the industrial machine, filling a narrow role on an assembly line or performing a service job within narrow parameters. Nature is treated as a resource for human use, a near-infinite storehouse of materials and dumping ground for wastes. Even something as essential and grounding as farming is seen as mechanistic—“a farm is a factory in a remote area,” as one John Deere executive put it—where plants are machines that take in nitrogen, phosphorus, and potassium and produce barely edible dent corn. There’s even a popular myth that eminent business theorist W. Edwards Deming said, “If you can’t measure it, you can’t manage it”—lending credence to the measurement-and-optimization mindset.
The abstractions nearly write themselves. Yet, leaving nothing to chance, entrepreneurs and their funders have flocked to translating these precomputing abstractions for the age of data structures. This is happening in both seen and unseen ways. Uber and Lyft turned people into driving robots that follow algorithmic guidance from one place to another. Amazon made warehouse workers perform precisely defined tasks in concert with literal robots. Agtech companies turn farms into data structures to then optimize the application of fertilizer, irrigation water, pesticides, and herbicides.
Beyond simply dividing time, computation has enabled the division of information. This is embodied at the lowest levels—bits and packets of data flowing through the Internet—all the way up to the highest levels, where many jobs can be described as a set of information-processing tasks performed by one worker only to be passed along to another. But this sort of computing—really just worn-out optimization techniques dating back to last century’s Taylorism—didn’t move us into the unstable world we’re in today. It was a different sort of computation that did that.
Computation
Today we’re in an era where computing not only abstracts our world but also defines our inner worlds: the very thoughts we have and the ways we communicate.
It is this abstracted reality that is presented to us when we open a map on our phones, search the Internet, or “engage” on social media. It is this constructed reality that shapes the decisions businesses make every day, governs financial markets, influences geopolitical strategy, and increasingly controls more of how global society functions. It is this synthesized reality we consume when the answers we seek about the world are the entire writings of humanity put into a blender and strained out by a large language model.
The first wave of this crested a decade ago only to crash down on us. Back then, search engines represented de facto reality, and “just Google it” became a saying: whatever the search engine said was right. In some sense, though, that was a holdover from the previous “modern” era, with a large data structure—the search engine’s vast database—replacing some classic source of truth such as the news media or the government. We all had a hope that with enough data, and algorithms to sift through it all, we could have a simple technological abstraction over the messiness of reality, with a coherent answer no matter what the question was.
As we move toward the future promised by some technologists, our human-based view of the world and that of the data structures embedded in our computing devices will converge. Why bother to make a product at all when you can just algorithmically generate thousands of “ghost products” in the hope that someone will buy them?
Scott’s critiques of datafication remain. We are becoming increasingly aware that things are continuous spectra, not discrete categories. Writing about the failure of contact tracing apps, activist Cory Doctorow said, “We can’t add, subtract, multiply or divide qualitative elements, so we just incinerate them, sweep up the dubious quantitative residue that remains, do math on that, and simply assert that nothing important was lost in the process.”
A pair of augmented-reality glasses may no longer let us see the world unfiltered by data structures; instead, it will dissect and categorize every experience. A person on the street is no longer an individual but a member of a subcategory of “person” as determined by an AI classifier. A street is no longer the place you grew up but an abstraction from a map. And a local cafe is no longer a community hangout but a data structure containing a menu, a list of reservation options, and a hundred 5-star ratings.
Whether as glasses we look through or simply as screens on our devices, reality will be augmented by the data structures that categorize the world around us. Just as search engines caused the rise of SEO, where writers tweak their writing to attract search engines rather than human readers, this augmented reality will result in its own optimizations. We may be seeing the first signs of this with “Thai Food Near Me” as the literal name of businesses that are trying to satisfy the search function of mapping apps. Soon, even the physical form of things in the world may be determined in a coevolution with technology, where the form of things in the real world, even a dish at a restaurant, is chosen by what will look best when seen through our technological filters. It’s a data layer on top of reality. And the problems get worse when the relative importance of the data and reality flip. Is it more important to make a restaurant’s food taste better, or just more Instagrammable?
People are already working to exploit the data structures and algorithms that govern our world. Amazon drivers hang smartphones in trees to trick the system. Songwriters put their catchy choruses near the beginning to exploit Spotify’s algorithms. And podcasters deliberately mispronounce words because people comment with corrections and those comments count as “engagement” to the algorithms.
These hacks are fundamentally about the breakdown of “the system.” (We’re not suggesting that there’s a single system that governs society but rather a mess of systems that interact and overlap in our lives and are more or less relevant in particular contexts.) Systems work according to rules, either ones made consciously by people or, increasingly, automatically determined by data structures and algorithms. But systems of rules are, by their nature, trying to create a map for a messy territory, and rules will always have loopholes that can be taken advantage of.
The challenge with previous generations of tech—and the engineers who built them—is that they got stuck in the rigidity of systems. That’s what the company man was all about: the processes of the company, of Taylorism, of the McKinsey Way, of Scrum software development, of effective altruism, and of so many more. These all promised certainty, control, optimality, correctness, and sometimes even virtue: all just manifestations of a rigid and “rational” way of thinking and solving problems. Making systems work in this way at a societal level has failed. This is what Scott was saying in his seminal book. It was always doomed to fail.
Fissures
Seeing like a state was all about “legibility.” But the world is too difficult to make legible today. That’s where data structures, algorithms, and AI come in: humans no longer need to manually create legibility. Nor do humans even need to consume what is made legible. Raw data about the world can be fed into new AI tools to create a semblance of legibility. We can then have yet more automated tools act upon this supposed representation of the world, soon with real-life consequences. We’re now delegating the process of creating legibility to technology. Along the way, we’ve made it approximate: legible to someone or something else, but not to the person who is actually in charge.
Right now, we’re living through the last attempts at making those systems work, with a perhaps naive hope and a newfound belief in AI and the data science that fuels it. The hope is that, because we have better algorithms that can help us make sense of even more data, we can somehow succeed at making systems work where past societies have failed. But it’s not going to work because it’s the mode of thought that doesn’t work.
The power to see like a state was intoxicating for government planners, corporate efficiency experts, and adherents to high modernism in general. But modern technology lets us all see like a state. And with the advent of AI, we all have the power to act on that seeing.
AI is made up of data structures that enable a mapping from the messy multidimensional reality that we inhabit to categories and patterns that are useful in some way. Spotify may organize songs into clever new musical genres invented by its AI, but it’s still an effort to create legibility out of thin air. We’re sending verbose emails with AI tools that will just be summarized by another AI. These are all just concepts, whether they’re created by a human mind or by a data structure or AI tool. And while concepts help us understand reality, they aren’t reality itself.
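That conjuring of genres can be illustrated—purely as a sketch, not any company’s actual method—with k-means clustering: continuous feature vectors go in, k discrete “genre” labels come out, and the boundaries between them are an artifact of the algorithm. All feature values below are invented.

```python
# A toy illustration: k-means turns continuous song features into k
# discrete clusters--"genres" conjured from a continuum.
import random

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])),
            )
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster.
        centers = [
            tuple(sum(vals) / len(c) for vals in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Invented (tempo, energy) features for a handful of songs.
songs = [(62, 0.20), (66, 0.25), (118, 0.80), (124, 0.85), (90, 0.50)]
random.seed(0)
centers, clusters = kmeans(songs, k=2)
for center, members in zip(centers, clusters):
    print(f"genre centered at {center}: {members}")
```

The song at (90, 0.50) belongs fully to neither cluster, yet it gets a label anyway: that is legibility imposed on nebulosity.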
The problem we face is at once simple to explain and fiendishly difficult to do something about. It’s the interplay of nebulosity and pattern, as scholar David Chapman puts it: reality is nebulous (messy), but to get on with our lives, we see patterns (make sense of it in context-dependent ways). Generally, we as people don’t have strict rules for how to make breakfast, and we don’t need the task explained to us when a friend asks us for a cup of coffee. But that’s not the case for a computer, or a robot, or even a corporate food service, which can’t navigate the intricacies and uncertainties of the real world with the flexibility we expect of a person. And at an even larger scale, our societal systems, whether we’re talking about laws and governments or just the ways our employers expect us to get our jobs done, don’t have that flexibility built into them. We’ve seen repeatedly how breaking corporate or government operations into thousands of disparate, rigid contracts ends in failure.
Decades ago, the cracks in these rational systems were visible only to a few, left for debate in the halls of universities, boardrooms, and militaries. Now, nebulosity, complexity, and the breakdown of these systems are all around for everyone to see. When teenagers are training themselves to see the world the way social-media ranking algorithms do, and can notice a change in real time, that’s how we know that the cracks are pervasive.
The complexity of society today, and the failure of rigid systems to cope, is scary to many. Nobody’s in charge of, or could possibly even understand, all these complex technological systems that now run our global society. As scholar Brian Klaas puts it, “the cognitive shortcuts we use to survive are mismatched with the complex reality we now navigate.” For some, this threat demands dramatic action, such as replacing some big system we have—say, capitalism—with an alternative means of organizing society. For others, it demands throwing out all of modernity to go back to a mythical, simpler golden age: one with more human-scale systems of order and authority, which they imagine was somehow better. And yet others see the cracks in the system but hope that with more data and more tweaks, it can be repaired and our problems will be definitively solved.
However, it’s not this particular system that failed but rather the mode of society that depends on rigid systems to function. Replacing one rigid system with another won’t work. There’s certainly no golden age to return to. And simpler forms of society aren’t options for us at the scale of humanity today. So where does that leave us?
Tension
The ability to see like a data structure afforded us the technology we have today. But it was built for and within a set of societal systems—and stories—that can’t cope with nebulosity. Worse still is the transitional era we’ve entered, in which overwhelming complexity leads more and more people to believe in nothing. That way lies madness. Seeing is a choice, and we need to reclaim that choice. However, we need to see things and do things differently, and build sociotechnical systems that embody this difference.
This is best seen through a small example. In our jobs, many of us deal with interpersonal dynamics that sometimes overwhelm the rules. The rules are still there—those that the company operates by and the laws that it follows—meaning there are limits to how those interpersonal dynamics can play out. But those rules are rigid and bureaucratic, and most of the time they are irrelevant to what you’re dealing with. People learn to work with and around the rules rather than follow them to the letter. Some of these workarounds might be deliberate hacks, ones that are known and passed down by an organization’s workers. A work-to-rule strike, or quiet quitting for that matter, is effective at slowing a company to a halt because work is never as routine as schedules, processes, leadership principles, or any other codified rules might allow management to believe.
The tension we face is that on an everyday basis, we want things to be simple and certain. But that means ignoring the messiness of reality. And when we delegate that simplicity and certainty to systems—either to institutions or increasingly to software—they feel impersonal and oppressive. People used to say that they felt like large institutions were treating them like a number. For decades, we have literally been numbers in government and corporate data structures.
Breakdown
As historian Jill Lepore wrote, we used to be in a world of mystery. Then we began to understand those mysteries and use science to turn them into facts. And then we quantified and operationalized those facts through numbers. We’re currently in a world of data—overwhelming, human-incomprehensible amounts of data—that we use to make predictions even though that data isn’t enough to fully grapple with the complexity of reality.
How do we move past this era of breakdown? It’s not by eschewing technology. We need our complex sociotechnical systems. We need mental models to make sense of the complexities of our world. But we also need to understand and accept their inherent imperfections. We need to make sure we’re avoiding static and biased patterns—of the sort that a state functionary or a rigid algorithm might produce—while leaving room for the messiness inherent in human interactions. Chapman calls this balance “fluidity,” where society (and really, the tech we use every day) gives us the disparate things we need to be happy while also enabling the complex global society we have today.
These things can be at odds. As social animals, we need the feeling of belonging, like being part of a small tribe. However, at the same time, we have to “belong” in a technological, scientific, and institutional world of eight billion interconnected people. To feel connected to those around us, we need access to cultural creativity, whether it be art, music, literature, or forms of entertainment and engagement that have yet to be invented. But we also need to avoid being fragmented into nanogenres where we can’t share that creativity and cultural appreciation with others. We must be able to be who we are and choose who we associate with on an ever-changing basis while being able to play our parts to make society function and feel a sense of responsibility and accomplishment in doing so. And perhaps most importantly, we need the ability to make sense of the torrent of information that we encounter every day while accepting that it will never be fully coherent, nor does it need to be.
This isn’t meant to be idealistic or something for the distant future. It’s something we need now. How well civilization functions in the coming years depends upon making this a reality. On our present course, we face the nihilism that comes with information overload, careening from a world that a decade ago felt more or less orderly to one in which nothing has any clear meaning or trustworthiness. It’s in an environment like this that polarization, conspiracies, and misinformation thrive. This leads to a loss of societal trust. Our institutions and economic systems are based upon trust. We’ve seen what societies look like when trust disappears: ordinary social systems fail, and when they do work, they are more expensive, capricious, violent, and unfair.
The challenge for us is to figure out how to create new ways of being and thinking that enable everyone—not just a few of us—first to cope, and then to thrive, in the world we’re in.
Fluidity
There’s no single solution. It’ll be a million little things, but they’ll all share an overall theme: resilience in the form of fluidity. Technology’s role in this is vital, helping us make tentative, contextual, partial sense of the complex world around us. When we take a snapshot of a bird—or listen to its song—with an app that identifies the species, it is helping us gain some limited understanding. When we use our phones to find a park, local restaurant, or even a gas station in an unfamiliar city, it is helping us make our way in a new environment. On vacation in France, one of us used our phone’s real-time translation feature to understand what our tour guide was saying. Think of how we use weather apps, fitness apps, or self-guided museum tour apps to improve our lives. We need more tools like these, in every context, to help us understand nuance and context beyond the level we have time for in our busy lives.
It’s not enough to have software, AI or otherwise, interpret the world for us. What we need is the ability to seamlessly navigate all the different contexts in our lives. Take, for instance, the problem of understanding whether something seen online is true. This was already tricky, and it is now fiendishly difficult, with the Internet, social media, and now generative AI all laden with plausible untruths. But what does “true” mean, anyway? It’s as wrong to believe in a universal, singular, objective truth in all situations as it is to not know what to believe and to hold everything equally false (or true). Both of these options give propagandists a leg up.
Instead, we need fluidity: in Chapman’s terms, to be able to always ask, “In what sense?” Let’s say you see a video online of something that doesn’t seem physically possible and ask, “Is this real?” A useful technology would help you ask, “In what sense?” Maybe it’s something done physically, with no trickery involved, and it’s just surprising. Maybe it’s a magic trick. Maybe it’s “real” in the sense that it was created for a TV show promotion, but it isn’t something that happened in the physical world. Maybe it was created by a movie special-effects team. Maybe it’s propaganda created by a nation-state. Sorting through contexts like this can be tedious, and while we intuitively do it all the time, in a technologically complex world we could use some help. It’s important to enable people to continue to communicate and interact in ways that make us feel comfortable, not completely driven either by past social custom or by algorithms that optimize for engagement. Think WhatsApp groups where people just talk, not Facebook groups that are mediated and controlled by Meta.
Belonging is important, and its absence creates uncertainty and erodes trust. There are lessons we can learn from nontechnological examples. For example, Switzerland has a remarkable number of “associations”—for everything from business groups to bird-watching clubs—and a huge number of Swiss residents take part. This sort of thing was once part of American culture but declined dramatically over the 20th century, as documented in Robert Putnam’s classic book Bowling Alone. Technology can enable dynamic new ways for people to associate as the online and offline worlds fuse—think of the Internet’s ability to help people find each other—though it must avoid the old mindset of optimization at all costs.
We all struggle with life in our postmodern society, that unplanned experiment of speed, scale, scope, and complexity never before seen in human history. Technology can help by bridging what our minds expect with how systems work. What if every large institution, whether a government or a corporation, were to enable us to interact with it not on its terms—in its bureaucratic language, with all the complexity that large systems entail—but with computational tools that use natural language, understand context and nuance, and yet can still interface with the data structures that make its large systems tick? There are some promising early prototypes, such as tools that simplify the process of filling out tedious paperwork. That might feel small, almost trivial. But refined, and in aggregate, this could represent a sea change in how we interact with large systems: they would come to feel not like impersonal and imposing bureaucracies but like enablers of a functioning and flourishing society.
And it’s not all about large scale either. Scale isn’t always desirable; as Bill McKibben wrote in Eaarth, we’d probably be better off with the Fortune 500,000 than the Fortune 500. Scale brings with it the ills of Seeing Like a State; the authoritarian high-modernist mindset takes over at large scale. And while large organizations can exist, they can’t be the only ones with access to, or the ability to afford, new technologies. Enabling the dynamic creation and destruction of new organizations and new types of organization—along with legal and technical mechanisms to prevent lock-in and the enclosure of public commons—will be essential to keeping this new fluid era thriving. We can create new “federated” networks of organizations and social groups, like those we’re seeing in the open social web of Mastodon and similar technologies, where local groups can have local rules that differ from, but do not conflict with, their participation in the wider whole.
This shift is not just about how society will work but also how we see ourselves. We’re all getting a bit more used to the idea of having multiple identities, and some of us have gotten used to having a “portfolio career” that is not defined by a single hat that we wear. While today there is often economic precarity involved with this way of living, there need not be, and the more we can all do the things that are the best expressions of ourselves, the better off society will be.
Ahead
As Mumford wrote in his classic history of technology, “The essential distinction between a machine and a tool lies in the degree of independence in the operation from the skill and motive power of the operator.” A tool is controlled by a human user, whereas a machine does what its designer wanted. As technologists, we can build tools, rather than machines, that flexibly allow people to make partial, contextual sense of the online and physical world around them. As citizens, we can create meaningful organizations that span our communities but without the permanence (and thus overhead) of old-school organizations.
Seeing like a data structure has been both a blessing and a curse. Increasingly, it feels like an avalanche, an out-of-control force that will reshape everything in its path. But it’s also a choice, and there is a different path we can take. The job of enabling a new society, one that accepts the complexity and messiness of our current world without being overwhelmed by it, is one all of us can take part in. There is a different future we can build, together.
This essay was written with Barath Raghavan, and originally appeared on the Harvard Kennedy School Belfer Center’s website.
Tags: Internet and society