Google, Facebook, and our privacy: We're all in denial

What does it mean to have a "right to privacy"? We have a right to vote, and too few of us use it. I once heard it explained this way: a human right is like a vegetable garden. You have to nurture it, take care of it, and harvest it. Otherwise, all you have is a plot of dirt.

The Internet is not like a vegetable garden. Perhaps that test is appropriate, then, for lawmakers worldwide considering whether the "right to Internet access" follows from the right to free speech -- there are places in the world where this is actively being considered. If a person is denied access to the Internet, the argument goes, her free speech rights are being violated, or at least abridged.

By that same logic, the extent to which one makes use of the Internet must therefore abridge that person's own right to privacy. At least, by that same logic.

"I think judgment matters..."

Quite a bit has been said about Google CEO Eric Schmidt's comment, in a recent interview with CNBC's Maria Bartiromo for a documentary, that if an Internet user truly expects privacy, then he should consider whether there's something about himself that he doesn't want the world to know.

Perhaps just as important as Schmidt's response was Bartiromo's question, which often gets cut out of the excerpt: "People are treating Google like their most trusted friend. Should they be?"

"Well, I think judgment matters," Schmidt responded. "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. But if you really need that kind of privacy, the reality is that search engines, including Google, do retain this information for some time. And it's important, for example, that we are all subject in the United States to the Patriot Act. It is possible that that information could be made available to the authorities."

Now, if I were a psychologist, or even if I played one on TV, I could write a field manual on this single paragraph. It's amazing what associations can tell you about the way one's mind works; and Google is, after all, in the business of making associations -- literally, creating contexts and selling ad space on them. Schmidt's second and third sentences imply that his company retains information on what people shouldn't be doing. And Schmidt's third and fourth sentences imply that Google is bound by law to open up that treasure trove to law enforcement.

The question was, should people trust you? And the answer was, hey lady, I could turn you over to the cops at any time.

Somewhere within the PR department of Google, a hundred heads simultaneously sank.

The sad truth is, we really don't know the extent to which Google or Bing or Facebook or Twitter retains information about its users, or for how long. But if there truly were some type of conspiracy to construct a data mine that plows into the individual privacy rights of billions, it would require a collective resource called intelligence, the abundance of which is contraindicated by the available evidence. It would require the type of concerted effort, thoughtful engineering, and unadulterated ingenuity with which the Internet itself has never been blessed.

The Internet, for lack of a more poetic description, happened. Most of its technology just fell into place, like the result of a classic game of "52-Pickup." The opportunities were there for every Web connection to be encrypted, for every transaction to be certified, for each individual to specify the extent to which she wants her personal information to be utilized -- or not -- and for servers to comply. The specifications were written, and the proposals were made. Google "HTTP-Next Generation" (HTTP-NG) sometime and see what you find.

So why didn't it get done?

The typical response is, because somebody -- often somebody specific -- failed to care enough to see the project through. When big problems culminate in colossal failures, step one in the process is typically to assign blame. Earlier this week, after the embarrassing revelation that sensitive information in TSA documents had not been properly redacted, the Dept. of Homeland Security immediately responded by saying it is tightening its noose around who's to blame. And that should make us all feel better. But in the back of our minds, we know that noose needs to be made wider, not narrower.

The most prominent failure of policy, whether in a government or a computer network, is its failure to exist. When there are no rules to be followed, what ends up happening is typically some well-meaning person's best attempt at a solution. That's probably the case with the TSA worker who, unable to delete the text in the PDF document because his old version of Acrobat wouldn't let him, simply blacked it out with a rectangle.

It's probably a very similar case with respect to the architecture of the Internet as a whole. The multitude of privacy violations that take place every day against thousands of individuals whose names aren't "Transportation Security Administration" are not the fault of one person upstream someplace who failed to click the right button. We -- as in the oversized first word in the US Constitution, "We" -- have failed to work this problem out. Instead, we're trusting someone else to do the job for us.

"People are treating Google like their most trusted friend," asked Maria Bartiromo so poignantly. "Should they be?"

"Get over it."

The Internet was not originally designed to be a communications system. Back in the late 1960s, ARPA's engineers were not looking to replace the telephone with something digital. What they designed was a network of dynamic, switchable routes connecting terminals in one place to databases in another. The Internet was, and is, a networked database.

Like any database, what it gives us is a function of what we give it. Data in, data out. And yes, I'm sounding like Scott McNealy, when he told a JavaOne conference in 1999, "You have no privacy. Get over it." But if McNealy was speaking on behalf of his database -- which, indeed, he was -- then it seems sensible to say that a database isn't going to give you something that you don't give it. Something intangible such as, say, privacy.

To make the database give someone else less than what you give it, there need to be policies. There needs to be a system in place that enables individuals to specify, first of all, the information that belongs to them. That system needs to enable individuals to specify who can use that information, and who cannot.
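To make the idea concrete, here is a minimal sketch of such a policy system: an individual declares which pieces of information belong to her, then grants or revokes other parties' permission to use each piece. All of the names here (`PrivacyPolicy`, `grant`, `can_use`, and so on) are hypothetical illustrations, not any real standard or API.

```python
class PrivacyPolicy:
    """A toy model of per-individual information ownership and consent."""

    def __init__(self, owner):
        self.owner = owner
        self.grants = {}  # maps each piece of information to the parties allowed to use it

    def declare(self, item):
        """Assert that 'item' belongs to the owner; no one else may use it yet."""
        self.grants.setdefault(item, set())

    def grant(self, item, party):
        """Allow 'party' to use 'item'."""
        self.declare(item)
        self.grants[item].add(party)

    def revoke(self, item, party):
        """Withdraw a previously granted permission."""
        self.grants.get(item, set()).discard(party)

    def can_use(self, item, party):
        """The owner can always use her own information; others need a grant."""
        if party == self.owner:
            return True
        return party in self.grants.get(item, set())


policy = PrivacyPolicy("alice")
policy.declare("search history")
policy.grant("search history", "ad_network")
policy.revoke("search history", "ad_network")
print(policy.can_use("search history", "ad_network"))  # False: permission was revoked
print(policy.can_use("search history", "alice"))       # True: the owner retains access
```

The point of the sketch is the default: information starts out usable by no one but its owner, and every other use requires an explicit, revocable grant -- the opposite of how the Internet actually grew up.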

But that system will not be given to us. It will not be bestowed upon us, like a human right.

Next: "Give us our privacy!"

