Results tagged “ajax”
August 7, 2009
Google Wave is an impressive set of technologies, the kind of stunningly slick application that literally makes developers stand up and cheer. I've played with the Google Wave test sandbox a bit, and while it's definitely too complex to live up to the "this will replace email!" hype that greeted its launch, it certainly has some cool features. So the big question is whether Wave will succeed in becoming a popular standard for communications on the web, especially since Google has made an admirable investment in documenting the underlying platform and making it open enough for others to build on and extend. I think the answer is no, and the reason is that the Wave way is not compatible with the Web way.
What do I mean by "the Web way"? Well, if we look at the history of new technologies being adopted to extend web sites and enhance communications, we see a few trends emerge:
- Upgrades to the web are incremental. Instead of requiring a complete overhaul of your technical infrastructure, or radical changes to existing behaviors, the web tech that wins is usually the sort of thing that can be adopted piecemeal, integrated as needed or as a normal part of updating one's websites or applications.
- Understanding new tech needs to be a weekend-sized problem. For a lot of web developers, long before they start integrating a new protocol or platform into their work, they hack together a rough demo over a long weekend to make sure they truly grasp how it works. And a weekend-scale implementation on a personal site usually translates into roughly a 90-day implementation cycle in a business context, which is a reasonably approachable project size; in tech, three days of personal effort often becomes three months of corporate effort.
- There has to be value before everybody has upgraded. This is basically a corollary to Metcalfe's Law. While we know networks increase in value as they add more nodes, the nature of web tech is that, in order to be worthwhile, it has to provide value even if the people on the other end haven't upgraded their software or web browsers or clients or servers. Otherwise you're shouting into an empty room.
- You have to be able to understand and explain it. Duh.
Now, if we take a look at some examples of what has worked, we can see how various successful technologies have displayed these traits. One great example is feeds. When RSS feeds were new, it was easy to understand their potential immediately, and since I was working at a newspaper at the time, I just spent an afternoon understanding the format and hacking together a quick feed of headlines that anybody could subscribe to. If nobody had adopted feedreaders yet, that was no problem, since there was no cost to just having the feed sit there with no subscribers — the "nobody's upgraded" problem would only result in me having wasted a few hours.
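That afternoon-sized feed hack can be sketched in a few lines. This is a minimal illustration, not the actual script I wrote back then; the function name and sample headlines are invented for the example.

```javascript
// Build a minimal RSS 2.0 document from a list of headlines.
// Field names and sample data are hypothetical.
function buildRssFeed(title, link, items) {
  const entries = items
    .map(
      (item) =>
        "    <item>\n" +
        "      <title>" + item.title + "</title>\n" +
        "      <link>" + item.link + "</link>\n" +
        "    </item>"
    )
    .join("\n");
  return (
    '<?xml version="1.0"?>\n' +
    '<rss version="2.0">\n' +
    "  <channel>\n" +
    "    <title>" + title + "</title>\n" +
    "    <link>" + link + "</link>\n" +
    entries +
    "\n  </channel>\n" +
    "</rss>"
  );
}

const feed = buildRssFeed("Example Paper Headlines", "https://news.example.com/", [
  { title: "Local Team Wins", link: "https://news.example.com/1" },
]);
console.log(feed);
```

The point is the scale: a static document on a server, sitting there at zero cost whether it has a thousand subscribers or none.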
Ajax had a similar adoption pattern. It took a little bit more time to comprehend, but not much more than an afternoon, and the development effort required for adding Ajax enhancements to an application started as a weekend-scale project and has only gone down over time. Following the principles of progressive enhancement, well-designed implementations performed just fine on older browsers or systems that couldn't handle the new features. And most sites that have added Ajax features have done so by adding the requirements as a checklist item in the course of normal ongoing updates, not as standalone efforts to migrate to a new technology.
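The progressive-enhancement pattern described above can be sketched on the server side: one endpoint serves a full HTML page to plain requests and a small JSON fragment to Ajax requests. The `X-Requested-With` header check is a common convention, not a standard, and the function and field names here are invented for illustration.

```javascript
// One endpoint, two renderings: Ajax clients get data to splice into the
// page in place; older browsers get a complete page and the feature still
// works, just via a full-page refresh.
function renderComments(comments, headers) {
  const isAjax = headers["x-requested-with"] === "XMLHttpRequest";
  if (isAjax) {
    return { type: "application/json", body: JSON.stringify(comments) };
  }
  const list = comments.map((c) => "<li>" + c + "</li>").join("");
  return {
    type: "text/html",
    body: "<html><body><ul>" + list + "</ul></body></html>",
  };
}

const ajaxResponse = renderComments(["Nice post!"], {
  "x-requested-with": "XMLHttpRequest",
});
const plainResponse = renderComments(["Nice post!"], {});
console.log(ajaxResponse.type, plainResponse.type);
```

Because the fallback path is the ordinary page, the enhancement can ship as one more checklist item in a routine update rather than as a migration.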
This brings us to Wave. Wave offers excellent opportunities to extend its core features and to add richness to its "wavelets", and I have no criticisms of its utility as a developer platform that third parties can build upon. But the fundamental Wave protocols are, I fear, a bit too complex to ever be fully and correctly implemented by anyone other than Google. Interoperability is likely to be a challenge that plagues the platform for its entire existence. In short: It's likely that nobody will ever build a fully-compatible clone of Wave that competes with Google's own implementation.
Why is that true? Let's look at what's built in to Wave:
- Powerful realtime collaboration features
- Unlimited versioning of content
- Built around robust XMPP protocol
- Combines chat, document editing, and message threading — wikis + blogs + comments + IM
- Delivered as a very polished rich user interface
Each of these is a very compelling experience. But for many developers, the reaction to seeing them was not just "I can't wait to use that!" but also "I want to add that one feature to my own existing application!". And that's where it gets tough. Let's take a look at Joe Gregorio's list of the protocols that power Wave. (Joe works at Google, but made this list before he was working on Wave. I appreciate his research and openness on this topic, and presenting his work here is a tribute to what makes Wave great, not a criticism of his effort.)
- Federation (XMPP)
- The robot protocol (JSONRPC)
- The gadget API (OpenSocial)
- The client-server protocol (As defined by GWT)
That's a lotta stuff! XMPP alone is a bear to implement, let alone to deploy at large scale. (I can't think of anyone outside of Google, Earthlink and LiveJournal who has deployed XMPP to millions of users.) But if you wanted to make another application that truly interoperates with all that Wave can do, combining all of these pieces would just be the starting point.
And people aren't looking for a replacement for email, or instant messaging, or blogs, or wikis. Those tools all work great for their intended purposes, and whatever technology augments them will likely offer a different combination of persistence and immediacy than those systems. Right now, Wave evokes all of them without being its own distinctive thing. Which means it's most useful in providing reference implementations of particular new features.
If a developer wants one of the compelling individual features of Wave, like near-realtime collaboration, they're more likely to use something like (wait for it...) Pushbutton technologies. The infrastructure afforded by the components of the Pushbutton Platform comes nowhere near the richness and polish displayed by Google Wave. Pushbutton isn't even designed to offer the benefits demonstrated by Wave. But to its credit, Pushbutton displays nowhere near the complexity of Wave in its interoperability requirements. More importantly, integrating Pushbutton features into a website or application isn't a monolithic process of building dozens of cutting-edge features, but rather can be deployed incrementally by even non-expert webmasters.
In this context, it might help to think of Pushbutton tech as a "micro-Wave". As Gina Trapani said in mentioning Google Reader's support for PubSubHubbub:
Huh-wha? you ask. Yeah, I know. It's no Google Wave. But that's what makes this exciting. This kind of small Pushbutton implementation is how real web pages will easily use existing technology to notify one another of new updates. The Google Reader/FriendFeed integration is just the first tiny step in what will be a broad deployment of realtime-enabled sites. These sites and services will let one another know when they have new data to share without the sucky inefficiencies of polling. Check out how fast FriendFeed updates when you share an item in Google Reader in the video above.
In short, it's almost zero latency.
Why is this clearly "inferior" technology going to win? Well, as just one example, XMPP is way too complicated for any normal human to deploy, whereas if you're reading this, you probably already have access to a regular HTTP web server that could talk to a Pushbutton hub. In fact, the only two hackers I know who have worked extensively with XMPP are Brad Fitzpatrick and Artur Bergman, who co-created DJabberd. And they are both excited about PubSubHubbub. Realistically, someone like Yahoo might try to do all of this, and inevitably one or two open source projects will try to lash together open implementations of each of these pieces to make a kind of FrankenWave application. There are probably already one or two teams working on the inevitable "Enterprise Wave Server" platforms as well, though I haven't heard about them myself. These efforts may succeed, but that doesn't mean they'll ever be robust enough that people will trust them for communicating on the web.
More to the point, I'm a regular blogger who knows a little bit about scripting on a normal web server. I can poke around the documentation and add a few tweaks to my RSS feed (or, in my case, do nothing and have Feedburner automatically handle it for me), and all of a sudden my blog's feed is part of the Pushbutton web, ready for others to build on. I literally wouldn't even know where to start with the Wave developer documentation if I wanted to integrate it with my site or any of the little apps I like to hack on during a long weekend. Which seems more realistic: that someone will figure out a way to incrementally build on top of realtime feeds to enable Wave-like experiences, or that all this talk of waves, wavelets and blips will suddenly become easy to understand?
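Those "few tweaks" amount to exactly two things: declaring a hub inside the feed, and pinging that hub whenever something new is published. A rough sketch, with hypothetical URLs (the `hub.mode` and `hub.url` parameter names come from the PubSubHubbub spec):

```javascript
// 1. Declare a hub inside the feed, so subscribers know where to listen:
const hubLink = '<atom:link rel="hub" href="https://hub.example.com/" />';

// 2. When a new post goes up, ping the hub with a form-encoded POST body:
function buildPublishPing(feedUrl) {
  return "hub.mode=publish&hub.url=" + encodeURIComponent(feedUrl);
}

const pingBody = buildPublishPing("https://dashes.example.com/feed.xml");
console.log(pingBody);
// The hub then fetches the feed and pushes the new entries out to
// subscribers, so nobody has to poll.
```

That's the whole publisher-side integration: one line added to an existing feed, one HTTP POST on publish.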
In short, web-way tech like feeds, Ajax and Pushbutton win because people who make good sites and applications have a place to start with it. Does this mean we get fancy realtime simultaneous editing right away, now that Pushbutton exists? Nope. In fact, Wave might even get the early jump on those kinds of features for web apps, simply because it's pioneered that part of the user experience. But Wave only runs to its full potential on the most cutting-edge web browsers. And there may only be a dozen companies in the world with the in-house expertise to clone the entire complement of technologies underlying Wave in order to make a full-fledged competitor. Worse, the monolithic nature of the Wave experience means it will even be a challenge to make a full-fledged open source competitor to the official Google service.
I hope that Wave succeeds, because I love to see ambition and innovation rewarded. But I think it's most likely that Wave's success will be in inspiring people to create similarly compelling experiences by adding incremental enhancements to their existing sites. That's how the web's always advanced in the past.
April 4, 2006
Last month, I wrote a bit about Copy and Paste, the history of technologies like rich content embedding, and how this stuff will evolve in the world of Ajax applications. The next day, Microsoft announced Live Clipboard, which was followed shortly by a draft spec. There was also some interesting feedback on my post from Digg.
Now it seems like there are some even stronger advances, which I'm super excited to see. The brilliant team at Zimbra has just blogged about ALE - Ajax Linking and Embedding. The key points:
AJAX Linking and Embedding (ALE) provides the ability to embed rich content into an editable document and to then interact with and edit that content in much the same way as it is done with traditional office suites and applications in a desktop environment. A key difference is that instead of embedding objects that are backed by installed desktop applications (e.g. a spreadsheet or drawing application), within the ALE world the embedded objects are AJAX components that are embedded into an editable HTML document. These components adhere to a set of design patterns specified by the ALE specification.
Sweet! Now, I realize it makes me a super-nerd to be excited about this stuff, but someone has to be. There's a demo spec up already; I'm curious to see if anybody else will implement this and test out the possibilities for interop.
Update: Jon Udell's screencast on Live Clipboard is a great resource to check out if you're interested in this stuff.
September 6, 2005
Curious about what technologies and techniques are going to be popular in the coming months and into the next year? Well, our crack team of editors here at dashes.com (that is to say, me) have assembled a list of up-and-coming trends that you should keep an eye on. Call it vocational education for people building Web 2.0.
Some of the overall areas of focus are integration (as always) and front-end technologies that have highly visible impacts on end user experience. A robust back-end infrastructure is table stakes; people won't pay for a service or rely on it without one. But what makes them happy to pay is a front-end that's attractive and at least seems usable.
Here, then, is a random assortment of new web development trends to be ready for in 2006.
These technologies go by a lot of names, but in general, dampening is the softening of a user interface through gradual transition instead of immediate state changes. The demand for dampening reflects the front-end focus that is being rediscovered in web applications, but it can require server-side changes in order to enable some effects. The best-marketed example of dampening is the yellow fade technique, but overall, user interface elements will be sliding and collapsing instead of simply disappearing.
Key influences on the user experience here are things like the iPod screen backlight fading out instead of merely shutting off, or soft-close doors on newer automobiles.
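The yellow fade flavor of dampening reduces to a simple interpolation: instead of a highlight vanishing instantly, the background eases from yellow to white over a series of steps. A minimal sketch (step count and colors are arbitrary):

```javascript
// Yellow is rgb(255, 255, 0); white is rgb(255, 255, 255).
// Only the blue channel changes, stepping from 0 up to 255.
function yellowFadeSteps(steps) {
  const colors = [];
  for (let i = 0; i <= steps; i++) {
    const blue = Math.round((255 * i) / steps);
    colors.push("rgb(255, 255, " + blue + ")");
  }
  return colors;
}

const fade = yellowFadeSteps(5);
console.log(fade[0], fade[fade.length - 1]);
```

In a browser, each color would be applied to `element.style.backgroundColor` on a timer, producing the gradual transition instead of an immediate state change.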
What's the quick synopsis? How 'bout building a form dynamically, in E4X (ECMAScript for XML), by doing this:
```javascript
var html = <html/>;
html.head.title = "Hello, World.";
html.body.form.@name = "hello";
html.body.form.@action = "test.php";
html.body.form.@method = "post";
html.body.form.@onclick = "return foo();";
html.body.form.input = "";
html.body.form.input.@name = "Submit";
```
April 9, 2002
The current world wide web consists almost entirely of pages that are either stories or tools. A few ambitious sites combine these two types of web pages in varying ratios, with results that range from unsatisfying to disastrous. But I asserted a few days ago that the next stage of the web is going to come from the native form that evolves from, and incorporates elements of, these two existing structures. Even after this form emerges, however, the web will still be populated with plenty of stories and tools, of course, just as television retained the idiom of an anchor at a desk authoritatively reading us the news, even after the invention of the situation comedy and the game show.
If you take a look at the pages we have today, one thing becomes clear: Stories on the web just plain work. The obvious, and so far ultimate, display of this is The Fray, of course, which sets out in its very mission to tell stories. It's the definitive example. But less obvious examples are abundant and instructive. Every news item proffered on whatever portal or provider you prefer is presenting a story. The content presented in web interfaces to Usenet and email are largely story-oriented. In a medium originally designed to present structured documents, the natural divisions and regular formatting of stories was destined to be a good fit, even if they technically fell outside the precise realm envisioned by the web's creator.
This brings us to the other kind of web page, the kind that just plain doesn't fit into how the web was envisioned: tools. Web pages that aren't stories are tools that you use to perform a task. You've probably seen these. Amazon is one. Your bank's online payment system is another. You probably use a web email tool like Hotmail. This site's been using Blogger for a few years now. Hell, Yahoo, MSN, Netscape, and most other common start pages are more tools than story already. And none of them work right.
That's not surprising; they're not supposed to.
Think of Hotmail. It's designed to give you a place to write emails, read them, and move them into folders. These kinds of functions in a desktop program like the Mac Finder or the Windows Explorer are automatic. You just drag and drop. But to enable that kind of ability in a web page, programmers have to jump through hoops, trying to make a story act like a tool.
And notice who has to do that? Programmers. But HTML isn't a programming language, and it's designed to be written by authors, not programmers. There are tags to describe the parts of a structured story. There's in fact a formal Document Type Definition for hierarchically structured documents; that's what XHTML is. Curiously absent, unfortunately, is a description for a Web Tool Document.
Look at Oddpost, as it's been making the rounds lately. It solves most of the problem. It's beautiful, useful, powerful. And, much to the chagrin of Mac partisans and Unix enthusiasts, it only works on recent Microsoft browsers on Windows. That's not the bad part about Oddpost, though, that's just smart use of limited development resources. What's truly bad is that Oddpost's HTML doesn't make any sense outside of a very limited context, and it's incredibly hard to debug or reuse or make useful on a PDA or a non-standard web platform.
It was this opening that Flash MX stepped into. With a cry of anguish from standards and open-source advocates, and amidst shouts of glee from newly-empowered Flash developers, Macromedia recognized the enormous opportunity to make tools easy to build. And if it just so happens that most of the money in the web development business comes from helping businesses build tools, not from helping businesses tell stories, well that's just plain old good luck for Macromedia. And it was an inevitable opportunity because the web was always, and only, designed to be stories. All else is kludge.
A Way Out?
What was needed was a formal DTD for describing elements of a web-based application: common GUI widgets like a tree control, a select or combo list, a spinner control, some drop-down menu controls, toolbar buttons, scrollbars, and all the other usual trappings of a modern GUI application, plus some basic logic for loops and form processing. This Application Markup Language would just be a specific XML implementation, with the unique part being that all of the controls would be rendered as GUI-native, state-aware real widgets. But these HTML applications would still have all the benefits of web applications, since benefits like CSS styling, device-neutral rendering, and simple data sourcing would be preserved.
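To make that concrete, here's a purely illustrative sketch of what such a markup language might look like. The element names are invented for this example; they are not taken from the real AppML vocabulary.

```xml
<!-- Hypothetical application markup: each element would be rendered
     as a native, state-aware GUI widget rather than styled HTML. -->
<application>
  <toolbar>
    <button label="New" action="create"/>
    <button label="Delete" action="remove"/>
  </toolbar>
  <tree source="folders.xml"/>
  <form action="save">
    <combolist name="priority" source="priorities.xml"/>
    <spinner name="count" min="0" max="99"/>
  </form>
</application>
```

The author writes structure, just as with a story; the rendering engine supplies the widget behavior.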
Thus was born AppML. It's beautiful, really. But nobody's using it. And it requires server-side logic to execute. And it only exists in theory; I couldn't build Oddpost with it in any reasonable amount of time or under any reasonable budget.
Does that mean we have to give up? The choice is shitty apps or an extension of Microsoft's hegemony? (For the record, I don't think that idea's nearly as evil as an extension of the tyranny of desktop applications and humongous computer form factors.) Well, no. I think there's a third way.
A widely-distributed, standards-compliant, browser and platform-independent library of functions that would perform the basic user interface functions for a web-based tool, relying on the server side only for the logic and data sourcing. Well, whadaya know? We've got one. Yeah, it's still a work in progress, and it doesn't support nearly enough platforms yet. But DOMAPI is spectacular already. And it'll evolve, and it'll be good enough a year from now to be the basis of a large, stable array of applications, guaranteeing its future development and viability.
So what does all this mean? It means that we've finally got something that works for the two main uses of today's web. We can stop fretting and take these two essential pillars of the future web for granted. Stories we've got beat. Applications we've got a little distance to travel yet. But now... now, we're ready to figure out what the web will actually be used for.
Have we really almost finished writing the first chapter of the web's history?