The top item today in my experimental del.icio.us affinity feed is del.icio.us direc.tor, an alternate interface to del.icio.us from Johnvey Hwang, the guy who created the Gmail agent API last summer. In this case, the automated recommendation was redundant; I'd already tuned into the conversation about this new development. What the del.icio.us recommender really needs to do, I suspect, is find people and links that aren't so directly related to my interests, and that I wouldn't otherwise find. I've tried using lower thresholds of personal and link affinity, but so far the results are inconclusive.
Recommendation strategies aside, Johnvey Hwang's effort is yet another stunning remix. It loads your del.icio.us bookmarks into the browser and creates powerful new modes of navigation and search. "These days," wrote Deane at Gadgetopia, "if you don't like how a site works, just change it."
Indeed. There's a whole lot of remixing going on lately! Dan Phiffer's Wikipedia Animate, which I illustrated on Friday in a screencast, was only one of four implementations submitted to Andy Baio's Wikipedia history animation contest. The others are: Corey Johnson's WikiDiff, John Resig's AniWiki, and Colin Hill's Better History.
As I looked through the code for all of these, I thought about the dialogue among Stephen O'Grady, Bill de hÓra, and Dare Obasanjo on the fragility of screenscraping, and I arrived at this question:
How do you design a remixable Web application?
We've focused a huge amount of attention on RESTful versus WS-* APIs, but that distinction isn't central to the current mashup craze. In fact, in many cases neither kind of API plays a central role. Paul Rademacher's HousingMaps, for example, relies on nothing like a conventional API from either Google Maps or craigslist. The craigslist data is acquired by screenscraping (I presume) and the mapping behavior arises from the interpolation of that data into the Google Maps interface. It's the same with the Wikipedia animations. The history data is acquired by screenscraping; the animation interpolates that data into extensions of the existing interface.
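The fragility those three were debating is easy to see in miniature. Here's a sketch in Python, with entirely made-up markup (not real craigslist HTML), of what a screenscraping remix is implicitly committing to:

```python
import re

# A hypothetical snippet of listing HTML (not real craigslist markup).
html = '<p class="row"><a href="/apt/123.html">2BR near park</a> $1200</p>'

# Screenscraping: pull the link, title, and price out of the markup
# with a regex. This works only as long as the page keeps exactly
# this shape; any redesign of the markup silently breaks the pattern.
pattern = re.compile(r'<a href="([^"]+)">([^<]+)</a>\s+\$(\d+)')
match = pattern.search(html)
listing = {
    "url": match.group(1),
    "title": match.group(2),
    "price": int(match.group(3)),
}
print(listing)
```

The remix works, but the contract it depends on is accidental: nothing obliges the origin site to keep its markup stable, which is exactly the fragility under discussion.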
It's remarkable that these effects are achievable with no explicit support from the origin sites. But what if those sites did want to offer explicit support? What would that look like?
I think we've known the answer for a long time. A website that wants to be remixable will deliver content as XML and behavior as script. These aspects can be, and will be, combined server-side for Web 1.0 clients, but Web 2.0 clients will increasingly be able to do this processing for themselves. So there will be two ways to remix: by intercepting the server-side combination of XML content and scripted behavior, or by recombining on the client.
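The division of labor might look like this. Sketched in Python for brevity, with a made-up XML payload: the server delivers pure content, and any script, the site's own or a remixer's, can reinterpret it on the client.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML content a remix-friendly site might serve,
# kept separate from any presentation.
xml_content = """
<bookmarks>
  <bookmark href="http://example.com/a" tag="ajax">Item A</bookmark>
  <bookmark href="http://example.com/b" tag="rest">Item B</bookmark>
  <bookmark href="http://example.com/c" tag="ajax">Item C</bookmark>
</bookmarks>
"""

root = ET.fromstring(xml_content)

# The site's own "behavior as script" might render everything;
# a remix can reinterpret the same content -- here, filtering by tag.
def by_tag(root, tag):
    return [b.text for b in root.findall("bookmark") if b.get("tag") == tag]

print(by_tag(root, "ajax"))
```

Because the remix consumes structured content rather than rendered markup, it survives any redesign of the site's presentation layer; only a change to the XML itself would break it.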
What's unclear, at least to me, is whether -- or to what degree -- the content and code will be augmented by external descriptions that can be used by automated tools, and can insulate derivative works from change. Could a lightweight schema for the content, and a lightweight service description for the code, make these remixes more resilient without imposing too much overhead? It's a great experiment for someone to try.
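One way to imagine that insulation, sketched here in Python with invented names: the site publishes a small machine-readable description mapping logical field names onto its data, and the remix binds to the description rather than to any particular layout.

```python
# A hypothetical "lightweight schema": a published description maps
# logical field names to positions in the site's data. When the site
# reorders its output, it updates the description, and derivative
# works that read the description keep working unchanged.
description_v1 = {"title": 0, "url": 1}   # field order in version 1
description_v2 = {"url": 0, "title": 1}   # the site later reorders

record_v1 = ["Wikipedia Animate", "http://phiffer.org/"]
record_v2 = ["http://phiffer.org/", "Wikipedia Animate"]

def extract(record, description):
    # The remix asks for fields by name; the description absorbs change.
    return {field: record[index] for field, index in description.items()}

# Both versions yield the same logical record to the remix.
print(extract(record_v1, description_v1) == extract(record_v2, description_v2))
```

The overhead is one extra document to publish and one extra fetch for the remixer, which seems like the kind of lightweight bargain the experiment would need to test.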
Former URL: http://weblog.infoworld.com/udell/2005/06/27.html#a1258