Investigating the Exaflood
written by JOSH STERN
For a long time, Andrew Blakey, 21, a University of Waterloo student, and his family used the Internet as much as they wanted. A Rogers customer for 10 years and a subscriber to its unlimited Internet service package, Blakey knew he could browse the web, use Skype and download legal content as much as he liked.
But all that changed in December 2007. One day, seemingly out of the blue, when Blakey opened his browser, an official message from Rogers appeared on the screen saying he had exceeded 75 per cent of his monthly-allotted bandwidth.
No matter how many times he refreshed the screen or changed websites, the message wouldn’t disappear. The only way to make it go away was to check the box labelled “I acknowledge that I received this message,” which he eventually did.
On his next bill he discovered a small notation indicating that from then on he would have to monitor his Internet usage: he now had 95 gigabytes (GB) of bandwidth a month, and if he went over, he would be charged per GB, to a maximum of $25 on top of his service bill.
The changes upset him, but when he called Rogers to ask what was going on, he was told there was nothing they could do.
A November 2007 study entitled “The Internet Singularity, Delayed: Why Limits in Internet Capacity will Stifle Innovation on the Web,” conducted by the Nemertes Research Group, a research advisory firm based in Illinois, warned that we could be approaching the end of the Internet as we know it.
The study examined the current Internet infrastructure alongside existing and projected Internet traffic. It found that within the next five years, and possibly as early as 2010, the Internet’s current infrastructure would be inadequate to handle traffic demand, which could result in service brownouts.
At the heart of the study is the application of a version of Moore’s Law. In a paper written in 1965, Intel co-founder Gordon E. Moore posited that the number of transistors that could be cheaply put on an integrated circuit would increase exponentially, doubling roughly every two years.
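The shape of that rule, which the study extends from transistors to Internet traffic, can be sketched in a few lines; the function and numbers below are purely illustrative, not from the study itself.

```python
# Exponential growth with one doubling every `period` years: the shape of
# Moore's Law, which the Nemertes study applies to Internet traffic demand.
def doublings(start, years, period=2):
    return start * 2 ** (years / period)

# Ten years of doubling every two years is a 32-fold increase.
print(doublings(1, 10))  # → 32.0
```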
The study applies Moore’s Law to the growing innovations that have come out of the Internet and its applications. It estimates that in order to “bridge the gap between demand and capacity,” a global investment of $137 billion would be required, “roughly 60-70 percent above and beyond the $72 billion service providers are already planning to invest.”
“This groundbreaking analysis identifies a critical issue facing the Internet – that we must take the necessary steps to build out network capacity or potentially face Internet gridlock that could wreak havoc on Internet services,” said Larry Irving, co-chairman of the Internet Innovation Alliance, a non-profit association based in Washington, in the accompanying press release.
“It’s important to note that even if we make the investment necessary between now and 2010, we still might not be prepared for the next killer application or new internet-dependent business like Google or YouTube. The Nemertes study is evidence the Exaflood is coming.”
The term “Exaflood” was coined by Bret Swanson in a February 2007 Wall Street Journal op-ed piece. In the piece, Swanson, a Senior Fellow at the Progress and Freedom Foundation, a think tank based in Washington, DC, argues a case similar to that of the findings of the Nemertes study.
The Exaflood is “a surge of data traffic mostly created by video and other forms of new rich media flowing over the Internet. So the prefix, exa, refers to 10 to the 18th power,” explained Swanson from his home in Indiana.
“We’re talking about exabytes of data flowing through the internet . . . we thought [the term] was descriptive of the way video and new rich media would affect Internet traffic.”
According to a January 2008 paper written by Swanson and George Gilder for the Discovery Institute in Washington, DC, one exabyte holds the equivalent of a trillion books of about 400 pages each. The data currently traveling over the Internet amounts to the equivalent of more than 15 trillion books, which, stacked on top of each other, would reach twice as far as the sun.
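Those comparisons can be sanity-checked with a bit of arithmetic; the bytes-per-page and book-thickness figures below are my own assumptions, not numbers from the Swanson-Gilder paper.

```python
# Back-of-envelope check of the book comparisons; the per-page byte count
# and book thickness are assumed figures, not from the Swanson-Gilder paper.
BYTES_PER_PAGE = 2500      # ~2.5 KB of plain text per page (assumption)
PAGES_PER_BOOK = 400
BOOK_THICKNESS_M = 0.02    # ~2 cm per book (assumption)
EXABYTE = 10 ** 18

books_per_exabyte = EXABYTE / (BYTES_PER_PAGE * PAGES_PER_BOOK)
print(f"{books_per_exabyte:.0e} books per exabyte")  # → 1e+12, i.e. a trillion

# 15 trillion books stacked, versus the Earth-sun distance
stack_m = 15 * 10 ** 12 * BOOK_THICKNESS_M
EARTH_SUN_M = 1.496e11
print(f"stack reaches {stack_m / EARTH_SUN_M:.1f}x the distance to the sun")  # → 2.0x
```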
Swanson predicts “a huge increase over time in data traffic requiring constant investments and upgrades.” “Building new networks, upgrading existing networks and building broadband fiber-optic links to homes and businesses, more wireless capacity [and] more capacity in the networks [is essential],” he explained.
Swanson sees the history of the Internet as divided into three phases. The first began in 1969, when research projects in the U.S. and Canada used a “decentralized network” to communicate with each other, a primitive sort of email.
The second phase began in the early ’90s, when the World Wide Web and email were opened up to the average user; according to Swanson, from 1994 to 1996 alone, Internet traffic increased a hundredfold.
With rich-content sites such as YouTube, Swanson says we are now in the midst of the third phase. The Internet isn’t just for email and simple text anymore. It’s Facebook, YouTube, iTunes, voice over Internet protocol, P2P applications and more, all of which take up huge amounts of bandwidth and could cause the Exaflood to occur if the networks aren’t properly upgraded.
He stresses how important funding is to preventing future problems, and cautions that, despite the potential for trouble, people have been taking his findings out of context. “Some people have misconstrued or misrepresented our argument that the Internet is going to crash. We don’t think that. But it certainly requires lots of new investments,” he said.
Indeed even the authors of the Nemertes study wrote: “It’s important to stress that failing to make that investment will not cause the Internet to collapse. Instead, the primary impact of the lack of investment will be to throttle innovation – both the technical innovation that leads to newer and better applications, and the business innovation that relies on those technical innovations and applications to generate value. The next Google, YouTube or Amazon might not arise, not because of lack of demand, but due to an inability to fulfill that demand.”
Many news organizations, however, have been playing up the potentially catastrophic nature of the Nemertes study. CBC News went so far as to brand this influx of bandwidth usage “the end of the Internet as we know it.”
And that’s where the average consumer, such as Blakey, comes into the mix.
Rogers and Bell Canada, the two biggest Internet service providers (ISPs) in Canada, have been using bandwidth congestion as a reason to modify the way we use the Internet.
Under the guise of giving consumers an equal playing field online, these two ISPs have been engaging in a practice commonly known as traffic shaping: targeting “bandwidth hogs,” users who have exceeded what the ISP defines as their fair allotment of bandwidth, and slowing down their Internet speeds.
“What we discovered is this very small percentage of people, somewhere in the ballpark of 5 per cent, end up consuming upwards of 30 or even higher per cent of broadband, relegating the other 95 per cent to a lesser amount of actual broadband,” explained Jason Laszlo, the associate director of media relations at Bell.
“What we have done is implemented a network management policy which slows down the flow of peer-to-peer (P2P) traffic during peak times, from approximately 4:30 [p.m.] to 2 in the morning, to free up space for the majority of users.”
Laszlo failed to go into specifics, but Bell’s practices have already generated a large amount of negative feedback from users.
Rogers has equally come under fire. It used current bandwidth issues to usher in new service plans divided up by GBs of bandwidth usage, essentially forcing existing customers to switch to a more expensive plan in order to keep doing what they do online.
What Rogers hasn’t been entirely forthcoming about, however, is that even someone who signed up for its highest plan would still be subject to traffic shaping. The same goes for Bell.
As well, Rogers has begun targeting specific file types. Files transferred over P2P applications such as BitTorrent, a popular file-sharing program, carry recognizable signatures, which enables Rogers to identify and slow their transfer. To combat this, many P2P users began encrypting their traffic, which prompted Rogers to slow down all encrypted traffic, targeted or not. Rogers declined to comment for this article.
Both ISPs have continuously cited congestion problems as the reason for their actions, a claim consumers and the media have increasingly called into doubt. Meanwhile, the ISPs’ actions have affected everyone from home users to businesses that rely on high-speed Internet for video conferencing and other such services.
And despite the apparent issues, even Laszlo admits that Bell has already spent billions to fix whatever problems they did have.
Michael Geist, a columnist with the Toronto Star and a University of Ottawa professor of Internet and e-commerce law, feels these issues could be fixed simply by instituting true metered pricing, which would mean charging by the amount of bandwidth actually used.
“If we went to true metered pricing it would probably work out very well for the overwhelming majority of users. My concern is that many of the network providers want to have their cake and eat it too,” he said.
“And they do that in two ways: one, they charge you for X amount of bandwidth on a monthly basis, but yet make it very difficult for people to actually use that by throttling back the amount of available bandwidth for certain applications, so that you pay for 60 gigs, but you never can come close to that, because they actually limit the amount of available bandwidth.
“And the other is that if you were going to go to true metered pricing, of course what they would find is that many consumers would pay far less than they pay right now. They don’t really want true metered pricing, they want everybody paying a minimum, fairly high flat fee and then only targeting a bunch of others.”
But now, things have been getting serious. As of this writing, the Canadian Association of Internet Providers (CAIP) has been lobbying the CRTC to force Bell to stop traffic shaping.
In the CAIP’s application to the CRTC, they explain how Bell has been using technology that flags certain files, inspects their contents and then decides, based on what is in them, how much bandwidth to allot them.
“This aspect of Bell’s wholesale throttling activities gives rise to concerns that Bell’s actions violate the privacy of the communications of its wholesale customers (as well as that of their own end-user customers),” the CAIP wrote.
“It also gives rise to concerns that Bell has violated its duty under section 36 of the Act not to control the content or influence the meaning or purpose of telecommunications carried by it for the public.”
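The flag-inspect-allot mechanism the CAIP describes can be sketched roughly as follows. The rates and policy here are invented for illustration and are not Bell’s actual system; the only real detail used is that BitTorrent connections open with a recognizable handshake header.

```python
# Hypothetical sketch of "flag, inspect, allot": classify a packet by its
# contents, then assign it bandwidth. Rates and policy are invented for
# illustration; the handshake bytes are BitTorrent's real opening header.
P2P_SIGNATURE = b"\x13BitTorrent protocol"

def classify(payload: bytes) -> str:
    """Deep-packet-inspection-style classification by payload contents."""
    return "p2p" if payload.startswith(P2P_SIGNATURE) else "other"

def allotted_kbps(payload: bytes, peak_hours: bool) -> int:
    # Throttle flagged traffic during peak hours; leave the rest alone.
    if peak_hours and classify(payload) == "p2p":
        return 30       # throttled rate (illustrative)
    return 5000         # normal rate (illustrative)

print(allotted_kbps(b"\x13BitTorrent protocol" + b"\x00" * 8, peak_hours=True))  # → 30
print(allotted_kbps(b"GET / HTTP/1.1", peak_hours=True))                         # → 5000
```

Encrypting the payload defeats this kind of content inspection, which is why, as noted above, Rogers resorted to slowing all encrypted traffic indiscriminately.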
Furthermore, in a recently released legal brief, the CAIP pointed out the odd coincidence that Bell began “throttling” users’ Internet at the exact same time it began offering its new unlimited plans, which users would, of course, feel the need to switch to.
Bell submitted its own application to dismiss the claims on April 15th, but Laszlo refused to comment on the CAIP’s allegations directly.
For now it’s up to the CRTC to make a ruling. Until now it has been quiet on these issues, more broadly known as net neutrality.
The extent of the bandwidth congestion problem has been widely debated on message boards across the Internet. Given that much of the data suggests the Exaflood will arrive in some form in the coming years, how the CRTC reacts will largely shape how the major ISPs deal with it when the time comes.
Because when it does, how they react really will change the Internet forever.
Back in Waterloo, Blakey is annoyed at what happened with Rogers, since he had no choice but to upgrade to a higher bandwidth package.
As well, his family is trying to connect their home Internet with the Internet at his cottage, but doing so requires encrypted connections that can be misconstrued as P2P traffic, which makes the system difficult to set up.
Blakey has been adjusting, but he just wishes the change had never happened.
“It’s kind of sad that we pay 50 dollars a month for something and ... it doesn’t work for us. It doesn’t do what we need it to do.”
(With files from: Jef Catapang, Krista Cyr & Laurie Wilson)