13.7: Cosmos And Culture
The Mind Is An Open Book
I think we're starting to see a conflict between law and values, on the one hand, and technology, on the other. The conflict is unfolding in connection with the law surrounding privacy and protection from undue government and corporate surveillance.
I'm not exactly sure why people are so surprised by Edward Snowden's illegal release of information about the scope of data collection by the United States government on its own citizens. It isn't news that the U.S. has sought to monitor all electronic communications. This was reported in USA Today back in 2006; last year Wired ran an article on the construction in Utah of what may be the largest building in America: a complex devoted to housing the computers needed to keep track of who is saying or sending what to whom. I mentioned the article here in a 13.7 post back in June 2012.
Surprising or not, it is hard to be reassured by the president's insistence that no one is actually listening in. Many people believe that information about who is calling whom, and when, is private data, not merely company metadata. Our phone records feel personal to us.
But there is a deeper reason why President Obama's words fail to reassure, or why they should fail to reassure. This is where technological change comes in. The distinction between data and data-about-data (metadata), while superficially straightforward, is actually very delicate. And in the age of Big Data — in the age of computational systems capable of discerning patterns that are invisible to naked human intelligence — the very distinction threatens to collapse. If a sufficiently powerful entity knows the metadata, it knows the data too, at least to a close approximation.
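To make the collapse concrete, here is a toy sketch — not any real surveillance system, and every name, number and timestamp in it is invented. It shows how call records containing no content at all (just who called whom, and when) can still imply a sensitive, content-level fact about a person:

```python
# Toy illustration: metadata only — caller, callee, timestamp. No content.
# All parties, labels and times below are hypothetical.
from datetime import datetime

call_log = [
    ("alice", "oncology_clinic", datetime(2013, 6, 3, 9, 15)),
    ("alice", "oncology_clinic", datetime(2013, 6, 5, 14, 0)),
    ("alice", "mom",             datetime(2013, 6, 5, 14, 40)),
    ("alice", "insurance_line",  datetime(2013, 6, 6, 10, 5)),
]

def infer_sensitive_pattern(log, caller):
    """Flag a caller whose metadata fits a 'medical news' pattern:
    repeated clinic calls, with a call to family shortly after one."""
    clinic_calls = [t for c, callee, t in log
                    if c == caller and "clinic" in callee]
    family_calls = [t for c, callee, t in log
                    if c == caller and callee in {"mom", "dad"}]
    # A family call within two hours of a clinic call is suggestive.
    return any(
        0 < (f - c).total_seconds() <= 2 * 3600
        for c in clinic_calls for f in family_calls
    )

print(infer_sensitive_pattern(call_log, "alice"))  # → True
```

No message was opened, yet the inference lands: the pattern in the metadata carries the data.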
Perhaps you read about the case a few years ago involving Target. The store began sending coupons for pregnancy-related products to a teenage girl. The girl's father was incensed. Were they trying to get her to have a kid in high school? Target's management was embarrassed and apologized profusely. It turned out the girl was pregnant. Target — no one, exactly; a computerized pattern detector — surmised this before she had told her father, based on information about her shopping behavior. It is possible, as this case makes clear, to know what someone knows before she has made the information public to anyone at all.
It is easy to imagine that with a little more information — gleaned from downloads, electronic swipes on public transportation, cell phone records, data about who she is in contact with and when by phone, email, social networking software, etc. — we could find out that a person is pregnant before she knows it herself. We could also find out that she doesn't know it yet, but that she ought to.
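A detector of the Target sort can be sketched in a few lines. Target's actual model is proprietary; the product categories, weights and threshold below are invented purely to show the shape of the inference:

```python
# Toy sketch of a "pregnancy score" computed from shopping behavior.
# The real model is proprietary; these categories and weights are invented.
weights = {
    "unscented_lotion": 0.3,
    "prenatal_vitamins": 0.9,
    "cotton_balls": 0.2,
    "large_handbag": 0.1,
}

def pregnancy_score(purchases):
    """Sum the evidence each purchase contributes; a shopper whose
    score clears a threshold gets the pregnancy-related coupons."""
    return sum(weights.get(item, 0.0) for item in purchases)

basket = ["unscented_lotion", "prenatal_vitamins", "cotton_balls"]
print(pregnancy_score(basket) > 1.0)  # → True: flagged
```

No single purchase gives the shopper away; it is the pattern across purchases that does.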
But this is just the beginning. If I have enough information about your personal history, and I know a lot about the environment in which you find yourself, then it's no mystery what you are thinking and feeling. In a way, this is just common sense. You can tell whether the person you are with is paying attention to you. A pedestrian can tell whether the car at the stop sign has registered his or her presence by seeing how the driver acts.
With enough information, minds are open.
And that's the thing. In the age of Big Data, there is — or there will be — enough information.
Actually, the point is more far-reaching still. What it is to be thinking this or that, what it is to intend this or that, is precisely for one to be integrated, in the right sort of way, in a complex causal or informational network. This is controversial, but it is remarkably well established.
Indeed, it is the very foundation of the theory of computation. Computers aren't smart because they have, inside them, clever thoughts. No. What makes the micro-electronic states of a computer intelligent, or just contentful — for example, what makes it the case that a computer is performing this or that task — is the way those internal states are hooked up, causally, and systematically, to the right kinds of inputs and outputs. Computers don't need to understand what's going on inside of them to solve problems. They are simply physical gizmos. It's the way they are hooked up to the world around them — to put it technically: the way the transitions in their internal states preserve isomorphisms with computationally significant states — that lets them perform cognitively significant tasks.
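The point can be seen in a minimal machine. The internal state below is just a bit; nothing about the bit itself is "about" anything. It comes to mean even-or-odd-so-far only because of how the transition pattern hooks it to inputs and outputs:

```python
# A two-state finite-state machine. The state is intrinsically just a
# number; its meaning ("even or odd count of 1s so far") consists
# entirely in its role in the input/output transition pattern.
def parity(bits):
    state = 0                 # in itself, just a number
    for b in bits:
        state = state ^ b     # transition: flip on every 1 seen
    return "odd" if state else "even"

print(parity([1, 0, 1, 1]))  # → "odd"
```

Swap the transition rule and the very same bit would mean something else, or nothing: the hookup, not the innards, does the semantic work.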
Let's give a super-simple case. The coin in your pocket. It means something. It's a coin. It's currency. It has a value. It's worth 25 cents, let's say. In what does this value consist? Not in the coin itself, thought of as a piece of metal, or thought of as an artifact. No, the value consists, roughly, in the way the coin gets used and in its place in a complex web of relationships, practices and institutions.
And here's the point: if you know everything there is to know about that web, then you know everything there is to know about the coin's value. And it doesn't matter what the coin is made of (wood, plastic, metal) or even whether the coin is virtual (as so much money is and always has been).
This is a general point. Meaning, value and the like are not intrinsic properties of things in the way that their mass or shape is. They are relational properties. Meaning is use, as Wittgenstein put it. Meaning is not intrinsic, as Dennett has put it.
This is true of us no less than it is true of coins. How could the stuff in your head carry meaning all on its own? How could brains make meaning? They can't, not any more than coins can. Our brains carry information and significance only thanks to the way they, and we, are embedded in complex causal networks.
We like to think that our thoughts are inside. We reveal them to others by making them external in the form of action, words, writings, messages and the like. That's all well and good for describing ordinary life. We can keep secrets. We can publicize our deepest yearnings.
But actually, there is no inside. Or rather, use any device you like — from the scalpel to the brain scan — and you won't find meaning, significance, value, in the head, just as you won't find value in the coin's material body. The very inside/outside distinction breaks down.
These conceptual points are at the heart of 20th century philosophy and cognitive science. They are at the heart of computer science.
But today, as we enter the era of Big Data collection and processing, they take on a new relevance for our political culture. We are on a collision course.
If you collect enough metadata, then you have collected me.
I'm not inside and my messages are not like sealed envelopes you need to open to read. The meaning isn't inside the message.
The meaning, rather, is displayed in the way the message hovers in place in the network of actions, communications, responses, needs, problems and situations that make it the message it is. And this — given the resources of Big Data — is open for your inspection.