Leigh Dodds, Mark Nottingham, and Ryan Tomayko have all suggested that, while client-side HTTP toolkits may contribute to HTTP abuse, server-side frameworks are more fundamentally to blame. Sam Ruby made a similar point in his post-ETech critique of SAJAX, a server-side framework for delivering AJAX-style code to the client.
Have I been looking at this through the wrong end of the telescope? Perhaps so. Ryan Tomayko sums it up nicely:
Let's face it, if you want to do something outside of exposing well-known static representation types from disk for GET, or process application/x-www-form-urlencoded data via POST, you're off the radar for most web frameworks.
...
We need a good implementation of HTTP/1.1 that provides a real framework for building standards based web applications. We then need to advocate and illustrate the correct use of HTTP/URIs/XML as a killer technology that has been hiding right under our noses by showing the benefits of using the system correctly. Until we get this stuff straightened out, expecting people to use GET properly is unrealistic. [Ryan Tomayko: On HTTP abuse]
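To make that complaint concrete, here is a minimal sketch, using Python's standard-library wsgiref, of the kind of method-aware dispatch Tomayko is asking for. The resource and the payload handling are illustrative assumptions, not any particular framework's API: GET stays safe, POST takes the state changes, and anything else gets an honest 405 with an Allow header.

```python
# A hedged sketch of method-aware HTTP dispatch, using only the standard
# library. The resource itself is a placeholder for illustration.
from wsgiref.simple_server import make_server

ALLOWED = ("GET", "HEAD", "POST")

def app(environ, start_response):
    method = environ["REQUEST_METHOD"]

    if method in ("GET", "HEAD"):
        # GET is safe: produce a representation, never change state.
        body = b"a representation of the resource\n"
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Content-Length", str(len(body)))])
        return [b"" if method == "HEAD" else body]

    if method == "POST":
        # POST is where state changes belong -- not hidden inside GET.
        length = int(environ.get("CONTENT_LENGTH") or 0)
        payload = environ["wsgi.input"].read(length)
        start_response("201 Created", [("Content-Type", "text/plain")])
        return [b"accepted %d bytes\n" % len(payload)]

    # Everything else gets an explicit 405 plus an Allow header,
    # instead of a silent 200 or a confused 500.
    start_response("405 Method Not Allowed",
                   [("Allow", ", ".join(ALLOWED)),
                    ("Content-Type", "text/plain")])
    return [b"method not allowed\n"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

Run it and `curl -X DELETE http://localhost:8000/` comes back with a 405 and an Allow header, which is the unglamorous correctness most frameworks make hard.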
I'm inclined to agree that the burden of making it easy to do the right thing lies more heavily on server-side toolkits than on client-side ones. Wherever it falls, consider this reminiscence by John Montgomery:
I can remember sitting in a room nearly seven years ago with Andrew Layman, Gopal Kakivaya, and Don Box (the primary authors of SOAP) arguing about whether and how SOAP 1.0 should use Get and Post. Seven years and people are still having to ask these questions. It kind of makes you wonder how long it takes developers to really understand a technology. [A View From Elsewhere: Get and Post: Once more unto the breach]

If HTTP's measly handful of verbs hasn't been sorted out after more than a decade, can there be any hope for WS-*? I'm pretty sure there can, but we're struggling with an apparent contradiction:
abuse and emergence: When I first began using Perl I found that I was immediately productive, long before I really had a clue about how to use the language. I thought that was a good thing, and I still do, and I think the same principle applies to HTML and HTTP. If doing stuff in these environments had required deep understanding and/or specialized tools and frameworks, then all sorts of important stuff wouldn't have gotten done. From this perspective you could argue that resiliency in the face of widespread abuse is a key quality of technologies that give us the emergent effects we want, that REST has this quality, and that Web services technologies need more of it.
respect and emergence: I recently mentioned Coral, an open content distribution network. It's a good example of Geoffrey Moore's paradigm shift from "Internet-enabled client-server architecture" to "a bus." HTTP-abusing applications are not going to play nicely with things like Coral, as the sketch below suggests. From this perspective you could argue that a next generation of emergent effects will be stillborn until best practices are widely respected, that specialized tools and frameworks are the only way to get there, and that in this regard REST and Web services are in exactly the same boat.
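Here's what's at stake, in the same minimal wsgiref style as above. A GET handler that publishes cache validators lets an intermediary like Coral do its job; a GET that quietly mutates state gets replayed from cache with no one the wiser. This is a hedged sketch of the general caching contract, not Coral's actual behavior:

```python
# A minimal sketch of conditional GET: the validators that let shared
# caches cooperate. The representation here is a placeholder.
import hashlib

def conditional_get(environ, start_response):
    body = b"a cacheable representation\n"
    etag = '"%s"' % hashlib.sha1(body).hexdigest()

    if environ.get("HTTP_IF_NONE_MATCH") == etag:
        # The client (or an intermediary cache) already holds this
        # representation; send no body.
        start_response("304 Not Modified", [("ETag", etag)])
        return [b""]

    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("ETag", etag),
                              ("Cache-Control", "public, max-age=3600")])
    return [body]
```

A second request carrying If-None-Match gets a 304 and no body, which is exactly the behavior a shared cache depends on, and exactly what a side-effecting GET betrays.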
Yet another Necker cube. The trick is probably to use both perspectives appropriately.
Former URL: http://weblog.infoworld.com/udell/2005/04/25.html#a1221