A Member Rants

By David F. Carr  |  Posted 2007-01-16

Booming traffic demands put a constant stress on the social network's computing infrastructure. Here's how it copes.

On his MySpace profile page, Drew, a 17-year-old from Dallas, is bare-chested, in a photo he appears to have taken of himself with the camera held at arm's length. His "friends list" is weighted toward pretty girls and fast cars, and you can read that he runs on the school track team, plays guitar and drives a blue Ford Mustang.

But when he turns up in the forum where users vent their frustrations, he's annoyed. "FIX THE GOD DAMN INBOX!" he writes, "shouting" in all caps. Drew is upset because the private messaging system for MySpace members will let him send notes and see new ones coming in, but when he tries to open a message, the Web site displays what he calls "the typical sorry ... blah blah blah [error] message."

For MySpace, the good news is that Drew cares so much about access to this online meeting place, as do the owners of 140 million other MySpace accounts. That's what has made MySpace one of the world's most trafficked Web sites.

In November, MySpace surpassed even Yahoo for the first time in the number of Web pages viewed by U.S. Internet users, according to comScore Media Metrix, which recorded 38.7 billion page views for MySpace versus 38.05 billion for Yahoo.

The bad news is that MySpace reached this point so fast, just three years after its official launch in November 2003, that it has been forced to address problems of extreme scalability that only a few other organizations have had to tackle.

The result has been periodic overloads on MySpace's Web servers and database, with MySpace users frequently seeing a Web page headlined "Unexpected Error" and other pages that apologize for various functions of the Web site being offline for maintenance. And that's why Drew and other MySpace members who can't send or view messages, update their profiles or perform other routine tasks pepper MySpace forums with complaints.

These days, MySpace seems to be perpetually overloaded, according to Shawn White, director of outside operations for the Keynote Systems performance monitoring service. "It's not uncommon, on any particular day, to see 20% errors logging into the MySpace site, and we've seen it as high as 30% or even 40% from some locations," he says. "Compare that to what you would expect from Yahoo or Salesforce.com, or other sites that are used for commercial purposes, and it would be unacceptable." On an average day, he sees something more like a 1% error rate from other major Web sites.

In addition, MySpace suffered a 12-hour outage, starting the night of July 24, 2006, during which the only live Web page was an apology about problems at the main data center in Los Angeles, accompanied by a Flash-based Pac-Man game for users to play while they waited for service to be restored. (Interestingly, during the outage, traffic to the MySpace Web site went up, not down, says Bill Tancer, general manager of research for Web site tracking service Hitwise: "That's a measure of how addicted people are: all these people were banging on the domain, trying to get in.")

Jakob Nielsen, the former Sun Microsystems engineer who has become famous for his Web site critiques as a principal of the Nielsen Norman Group consultancy, says it's clear that MySpace wasn't created with the kind of systematic approach to computer engineering that went into Yahoo, eBay or Google. Like many other observers, he believes MySpace was surprised by its own growth. "I don't think that they have to reinvent all of computer science to do what they're doing, but it is a large-scale computer science problem," he says.

MySpace developers have repeatedly redesigned the Web site's software, database and storage systems to try to keep pace with exploding growth, but the job is never done. "It's kind of like painting the Golden Gate Bridge, where every time you finish, it's time to start over again," says Jim Benedetto, MySpace's vice president of technology.

So, why study MySpace's technology? Because it has, in fact, overcome multiple systems scalability challenges just to get to this point.

Benedetto says his team has had to learn many lessons the hard way, and is still learning them. Improvements they are currently working on include a more flexible data caching system and a geographically distributed architecture that will protect against the kind of outage MySpace experienced in July.

Most corporate Web sites will never have to bear more than a small fraction of the traffic MySpace handles, but anyone seeking to reach the mass market online can learn from its example.


