Lessons learned by IT in 2009 #1: 'Net neutrality' is a myth
Betanews begins its transition to the new decade with an examination of the critical issues that taught us valuable lessons in the past year. If you're old enough to remember 1999, you may remember the sense of wonder, possibility, and dreams yet to be fulfilled that was drummed up by what used to be called the "media," during the much-celebrated rollover of the odometer. The first decade of the new millennium hit us squarely between the eyes, awakening us to the colder, more tangible reality that before we start cultivating new problems for our descendants to solve, we have to resolve all the old ones we've been sweeping under the rug.
Time Magazine thought the past decade sucked. We were all saddened to hear how disappointed the editors at Time were by the performance of the 2000s, especially when compared to brighter, livelier, more dramatic decades such as the 1940s, 1960s, and that harbinger of great times and cool tunes, the 1060s.
If, from a historical perspective, the 2000s can be characterized by a single phrase, I think it should be the Decade of Whining -- a decade most appropriately commemorated by the tone at Time. The great voices that defined our society sounded less like Thomas Jefferson, Archibald MacLeish, or Ronald Reagan and more like Andy Rooney, Howard Stern, and Rush Limbaugh -- people wasting time listing the things they hate, without so much as motivating themselves to stand up.
In the absence of anything motivational or inspirational or spiritual -- or, in many cases, factual -- whining rises to the top, becoming a powerful, exploitable force. It conjures just enough public discontent with the status quo to evoke the minimum response in one's favor, without provoking the whiner into undertaking such a revolutionary level of reaction that the cause of the whining itself becomes quashed. Exploiting whining doesn't require much effort -- for example, painting the word "Change" on a billion signs. The response can be compared to a cacophony of babies insisting on a change of diapers, the collective outcry of a populace unequipped to do the job for themselves.
All bits are not created equal
In 2009, the net neutrality debate became a textbook example of harnessing the latent power of whining. In some countries, the question of whether an Internet service provider should have the right to manage data traffic based on the applications for which that data was being used was elevated to nothing less than a human rights debate, on the order of a citizen's right to speak freely or to vote.
"Civil rights are fundamentally about protecting fairness, equality and freedom for all people. Net Neutrality is about protecting fairness, equality and freedom for all online data," writes SaveTheInternet.com online activist Garlin Gilchrist II. "From a values perspective, these two concepts are functionally equivalent."
The existence of net neutrality as a cause came about as the result of legislation proposed in 2005 by the then-Republican-controlled Congress that would have created an alternative national licensing system for ISPs. In lieu of tax breaks as an incentive for ISPs to choose national over existing municipal licenses (the kind cable companies such as Comcast and Time Warner must contend with in every municipality in the US), the Senate considered relaxing regulations for national licensees -- for example, enabling them to build premium service tiers for Internet applications, which they could then resell at higher rates. The legislation was effectively defeated by mostly Democratic opposition, which successfully framed the creation of "fast lanes" for potential high-bandwidth customers such as Google as an issue of promoting unfairness and anti-competitive behavior.
The earliest tangible opposition to national licensing came in the form of legislation designed more to solidify the legal concept of net neutrality than to be passed by a floor vote. The latest evolution of that legislation, however, is now actively being considered in Congress for passage as law: HR 3458, the Internet Freedom Preservation Act.
Though the bill's provision prohibiting an ISP (such as Comcast) from blocking a user's access to a particular application (such as BitTorrent) receives the most attention, another prominent provision harks back to the root of the whole debate.
"Each Internet access service provider shall have the duty to...not provide or sell to any content, application, or service provider, including any affiliate provider or joint venture, any offering that prioritizes traffic over that of other such providers on an Internet access service," reads the bill's current draft.
The Web happened
The prohibition of any method of premium packet prioritization seems fair if one considers the Internet the way Google has most often characterized it: in short, as the Web by another name. Google proposed a formal regulatory definition for net neutrality to the Federal Communications Commission in June 2007. Citing the co-author of the original TCP/IP Reference Model -- its own Chief Internet Evangelist, Vint Cerf -- Google wrote, "The Internet's open, neutral architecture has provided an enormous engine for market innovation, economic growth, social discourse, and the free flow of ideas. The remarkable success of the Internet can be traced to a few simple network principles -- end-to-end design, layered architecture, and open standards -- which together give consumers choice and control over their online activities."
History reveals a very different story. In the 1996 edition of his famous reference guide, Computer Networks, Andrew S. Tanenbaum compared TCP/IP to the then-recently-defeated OSI model of data interchange:
"The TCP/IP model and protocols have their problems too," Prof. Tanenbaum wrote. "First, the model does not clearly distinguish the concepts of service, interface, and protocol. Good software engineering practice requires differentiating between the specification and the implementation, something that OSI does very carefully, and TCP/IP does not. Consequently, the TCP/IP model is not much of a guide for designing new networks using new technologies.
"Although the IP and TCP protocols were carefully thought out, and well implemented, many of the other protocols were ad hoc, generally produced by a couple of graduate students hacking away until they got tired," the Dutch professor continued later. "The protocol implementations were then distributed free, which resulted in their becoming widely used, deeply entrenched, and thus hard to replace. Some of them are a bit of an embarrassment now."
That was 14 years ago, of course. Tanenbaum also chronicled the 1992 creation of the Internet Society (ISOC), formed in part to support the development of applications then emerging on the Internet. That group would define the principal applications of TCP/IP as e-mail, news (NNTP), remote login (Telnet), and file transfer (FTP), though ISOC would refrain from serving as a governing body. The Web, however, developed despite that definition, as an application unto itself.
While there had been consideration of the mechanisms and methods that could be used to regulate and prioritize traffic -- for example, so that congestion due to FTP didn't drag down e-mail -- treating the Internet as a social entity is a much later invention. So while Google refers to the "overarching rationale" for the Internet's creation as "the revolutionary intention not to have an uninvited gatekeeper anywhere in the network, but instead to give ordinary end users ultimate control over what to do, where to go, and whom to communicate with," the truth is that the concept of the Internet as a communications medium between individuals, rather than -- as ISOC originally proposed it -- a network of bridges between research organizations and universities, is a relatively recent discovery.
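In fact, packet prioritization has had a hook in the protocol itself since IP's earliest specifications: the IP header's type-of-service byte, later redefined as the DSCP field (RFC 2474), exists precisely so that some traffic can be marked for preferential queuing. As a minimal illustrative sketch -- not anything described in this article -- a Python program on Linux can mark its outgoing packets for "Expedited Forwarding" treatment like so:

```python
import socket

# DSCP "Expedited Forwarding" (EF) is codepoint 46; the IP_TOS byte
# carries the DSCP value in its upper six bits, hence the two-bit shift.
DSCP_EF = 46 << 2  # 184, the on-the-wire TOS byte

# A UDP socket whose outgoing packets will carry the EF marking.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

# The kernel reports the marking back; routers along the path may --
# or may not -- honor it when queuing traffic.
assert sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS) == DSCP_EF
sock.close()
```

Whether those bits buy a packet anything is entirely up to each network operator's queuing policy -- which is exactly the discretion at stake in the debate over who may sell prioritized carriage, and to whom.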
Next: Gaining the most leverage...