Part 2

By David F. Carr  |  Posted 2007-12-20

Will this be the year that the Web development architecture known as REST invades the enterprise, derailing or fundamentally altering the orthodox approach to web services and service-oriented architecture (SOA) that's been built up over the past several years?

When you use the web, your browser issues HTTP GET commands every time you enter a new address or click on a link and HTTP POST commands for data entry forms. REST suggests that these standard operations and a couple of others (such as PUT and DELETE) are also a natural way of designing machine-to-machine communications. For example, a program can explore the programming interface of a remote service with a series of GET commands that bring back links to other resources.
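The verb-to-resource mapping described above can be sketched in a few lines. This is a hypothetical illustration, not code from the article: an in-memory dict stands in for a web server's resource store, and a "customers" resource is invented for the example.

```python
# A minimal sketch (hypothetical, not from the article) of how the four
# standard HTTP verbs map onto operations against a "customers" resource.
# In a real service these handlers would sit behind a web server.

customers = {}   # resource store: id -> record
next_id = [1]    # mutable counter for assigning new resource IDs

def handle(method, path, body=None):
    """Dispatch an HTTP-style request to the matching resource operation."""
    parts = path.strip("/").split("/")
    if parts[0] != "customers":
        return 404, None
    if method == "POST" and len(parts) == 1:       # create a new resource
        cid = next_id[0]; next_id[0] += 1
        customers[cid] = body
        return 201, cid
    cid = int(parts[1])
    if method == "GET":                            # read a resource
        return (200, customers[cid]) if cid in customers else (404, None)
    if method == "PUT":                            # replace/update it
        customers[cid] = body
        return 200, cid
    if method == "DELETE":                         # remove it
        customers.pop(cid, None)
        return 204, None
    return 405, None
```

Under this style, `POST /customers` creates a record, `GET /customers/1` reads it back, and `PUT` and `DELETE` update and remove it, so the interface is the fixed set of HTTP verbs rather than an open-ended list of procedure names.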

Where SOAP is a specific XML protocol, REST is an architectural style – a set of rules for designing networked applications. Applications can be described as more or less "RESTful" depending on how closely they adhere to those principles, but there's no official standard to implement, other than the basic protocols of the Web. That's one of the attractions for web-centric businesses because it means REST applications can run on the same caching, load balancing, and security infrastructure as other web applications, without requiring additional middleware.
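The caching point can be made concrete with a toy sketch (an assumption for illustration, not from the article): because a RESTful GET is safe and repeatable, standard web infrastructure can key its responses by URL, while POST must always reach the origin server.

```python
# Toy illustration of why plain-HTTP (RESTful) services get web caching
# "for free": GET responses are cacheable by URL; other verbs pass through.

cache = {}
hits = {"count": 0}

def fetch(method, url, origin):
    """Return a response, serving repeated GETs from the cache."""
    if method == "GET":
        if url in cache:
            hits["count"] += 1
            return cache[url]          # cache hit: origin never contacted
        response = origin(method, url)
        cache[url] = response          # store for later identical GETs
        return response
    return origin(method, url)         # POST/PUT/DELETE bypass the cache
```

A SOAP service that tunnels every operation through POST to a single endpoint gives an intermediary cache nothing to key on, which is one reason the same infrastructure can't help it.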

Fielding finds it annoying that SOAP and related standards became synonymous with web services, given that their workings aren't particularly web-like. "I never considered them web services – at best, they're XML services," he says.

SOAP, which was originally known as the Simple Object Access Protocol, became less "simple" as the result of efforts to create sophisticated enterprise services on top of it. SOAP started out as a way of using XML to invoke the functions of remote software objects or components over the web's Hypertext Transfer Protocol (HTTP), following in the tradition of other types of remote procedure call, distributed object computing, and message-oriented middleware systems. The SOAP specification itself defines the format of an XML "envelope" that wraps around the actual message or "payload" and specifies its destination. Since it was introduced in the late 1990s, with the backing of both Microsoft and the vendors of Java-based middleware, SOAP has been at the center of the marketing of the concept of web services and, more recently, SOA.
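The envelope-and-payload structure the specification defines can be sketched as follows. This is a minimal illustration of a SOAP 1.1 envelope; the `GetCustomer` operation and its service namespace are hypothetical examples, not anything from the article.

```python
# Build a minimal SOAP 1.1 envelope: an <Envelope> wrapping a <Body>
# that carries the actual payload (here, a hypothetical GetCustomer call).
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(operation, params, ns="http://example.com/service"):
    """Wrap a payload element in a SOAP envelope and return it as XML text."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")   # the "payload"
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

xml_text = build_envelope("GetCustomer", {"customerId": 42})
```

The whole document is then typically POSTed to a single service endpoint, with the operation named inside the envelope rather than in the URL – the opposite of the REST approach sketched earlier.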

The promise of these technologies has always been that they would bring new levels of reuse, flexibility, and agility to enterprise systems. But even though there are many success stories to be told, Gartner's Gall says enterprise architects are starting to feel some jealousy for what's happening outside the corporate firewall. "These guys are looking over their shoulders at the true web – at what's going on with Google and Amazon and mashups – and saying, 'Hey, wait a minute, how come we're not getting that level of flexibility out of this stuff you sold us called 'web services,' " he says.

Clients who are disappointed by the payoff from their SOA efforts have often created too much unnecessary complexity, Gall suggests. One of the reasons that web-oriented architectures like REST have an advantage is that they're simpler and therefore easier to reuse and mash up, he says.

Even if it's not practical for an enterprise to make a wholesale shift from SOAP to REST, the organization can use the contrasting example of public web services as an impetus for simplification. For example, SOAP services are typically described with WSDL (Web Services Description Language) files. By asking how many WSDL files a client has created, Gall says, he often discovers the enterprise has produced hundreds of "one-off" WSDL files. One reason the underlying services aren't particularly reusable is that they lack common data definitions. One way of changing that is to combine SOA efforts with master data management, making sure services use, for example, only master lookup keys to identify customers.
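The master-key point can be illustrated with a small sketch (the services, records, and key format here are hypothetical): two services become easy to combine when both identify a customer by the same master key instead of their own local IDs.

```python
# Hypothetical illustration of master data management: a billing service
# and a shipping service both keyed by one shared master customer key,
# so their views of a customer can be joined without ID translation.

MASTER_CUSTOMERS = {"CUST-001": "Ada Lovelace"}    # master data hub

billing = {"CUST-001": {"balance": 120.0}}         # billing service records
shipping = {"CUST-001": {"address": "1 Main St"}}  # shipping service records

def customer_profile(master_key):
    """Mash up both services' views of one customer via the shared key."""
    if master_key not in MASTER_CUSTOMERS:
        raise KeyError(master_key)
    return {
        "name": MASTER_CUSTOMERS[master_key],
        "billing": billing.get(master_key),
        "shipping": shipping.get(master_key),
    }
```

If each service kept its own customer numbering, the join above would first require a mapping table between ID schemes – exactly the kind of one-off glue that makes services hard to reuse.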


David F. Carr is the Technology Editor for Baseline Magazine, a Ziff Davis publication focused on information technology and its management, with an emphasis on measurable, bottom-line results. He wrote two of Baseline's cover stories focused on the role of technology in disaster recovery, one focused on the response to the tsunami in Indonesia and another on the City of New Orleans after Hurricane Katrina.

David has been the author or co-author of many Baseline Case Dissections on corporate technology successes and failures (such as the role of Kmart's inept supply chain implementation in its decline versus Wal-Mart, or the successful use of technology to create new market opportunities for office furniture maker Herman Miller). He has also written about the FAA's halting attempts to modernize air traffic control, and in 2003 he traveled to Sierra Leone and Liberia to report on the role of technology in United Nations peacekeeping.

David joined Baseline prior to the launch of the magazine in 2001 and helped define popular elements of the magazine such as Gotcha!, which offers cautionary tales about technology pitfalls and how to avoid them.
