Web 2.0 – my experiences in 2008

I have been reflecting recently on this year and the various projects I have worked on. What is interesting is that in the last 18 months of working squarely in the “Web 2.0” space, the aspects of Web 2.0 that have drawn the most interest are the architectures that underpin modern web applications rather than the purely social side of it. In particular the following themes seem to resonate:

Aggregation of data in the form of feeds on the client

Sounds like plain old common sense, but in the “Web 1.0” days the mingling of presentation and data channelled many an application down into a dead-end silo, despite the best intentions of things like Struts. The flexibility to bind applications together in a way that isn’t necessarily intrusive to the application server, but doesn’t require embracing the complexities of WS-*, is something people see value in because it means that future change will potentially cost them less. Furthermore, aggregating feeds on the client means they are not necessarily forced to migrate a whole load of applications to different application server technology just to integrate with a new vendor. Again, a cost consideration.
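To make the idea concrete, here is a minimal sketch of that client-side aggregation — the `FeedEntry` shape and function names are illustrative inventions, not from any particular feed library:

```typescript
// Hypothetical, simplified feed entry — real Atom/RSS entries carry
// far more metadata (ids, authors, content, namespaces, ...).
interface FeedEntry {
  source: string; // which feed the entry came from
  title: string;
  updated: Date;
}

// Merge entries from several feeds into one reverse-chronological
// list on the client. Each feed stays a plain URL on its own server;
// no server-side integration (and no WS-*) is needed.
function aggregateFeeds(feeds: FeedEntry[][]): FeedEntry[] {
  return feeds
    .flat()
    .sort((a, b) => b.updated.getTime() - a.updated.getTime());
}

// Two toy feeds from unrelated sources, aggregated in the browser:
const news = [
  { source: "news", title: "Chrome ships", updated: new Date("2008-09-02") },
];
const blog = [
  { source: "blog", title: "Web 2.0 notes", updated: new Date("2008-12-15") },
];
const merged = aggregateFeeds([news, blog]);
```

Swapping in a feed from a new vendor is just another array in that call — the aggregation logic never changes.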

Well-defined widgets and loosely-coupled browser component models

Again, flexibility facilitates change and therefore potentially reduces cost. Frameworks like Dojo and the OpenAjax Hub mean that a modern web GUI can be constructed of well-defined components with loose linkages between them — we design with change in mind at the outset, rather than considering change to be some other project in the future. All of these things are resonating.
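As a sketch of what that loose linkage looks like, here is a toy publish/subscribe hub in the spirit of the OpenAjax Hub — the real Hub API differs, and the topic names below are invented:

```typescript
// Toy publish/subscribe hub. The actual OpenAjax Hub API is richer;
// this only illustrates the loose coupling between widgets.
type Handler = (topic: string, data: unknown) => void;

class Hub {
  private subs = new Map<string, Handler[]>();

  subscribe(topic: string, handler: Handler): void {
    const list = this.subs.get(topic) ?? [];
    list.push(handler);
    this.subs.set(topic, list);
  }

  publish(topic: string, data: unknown): void {
    for (const handler of this.subs.get(topic) ?? []) {
      handler(topic, data);
    }
  }
}

// Two widgets that never reference each other directly — replacing
// either one later touches only the topic contract, not the other widget.
const hub = new Hub();
const received: unknown[] = [];
hub.subscribe("customer.selected", (_topic, data) => received.push(data));
hub.publish("customer.selected", { id: 42 });
```

The hub is the only shared dependency, which is precisely what makes swapping a widget a local change rather than a rewrite.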

The browser as a platform and software-as-a-service

Sounds obvious but it’s worth stating — people want, where possible, to commoditise the desktop (or indeed mobile) platform so they aren’t locked into one vendor’s view of the world, and the hardware is increasingly disposable. Any modern operating system comes with a browser supplied, and web applications can be centrally managed — no IT skills are required to deploy the application; it just arrives on demand via a URL. Google clearly recognise this with Chrome, which reveals a fairly clear strategy of creating a robust and performant browser that increasingly offers some of the qualities of service we’d associate with a desktop OS (witness its process-per-tab isolation for starters). For many organisations, managing multiple native installers and the various things that can go wrong is a pain they are keen to lose.

“Mashups” and composite applications

People and organisations I have spoken to are starting to see the value that mashups can bring, particularly in terms of making use of (note I avoided the word “leveraging”) data they already have in the enterprise, without the need for deep IT skills to help them realise that aim. This drives a slightly different thought process when discussing the merits of an architecture based on separation of concerns between data (i.e. feeds) and rendering widgets, since adopting such an architecture facilitates a move towards mashups in the future.
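A small sketch of that separation of concerns, with invented names: the rendering widget below depends only on a generic entry shape, so pointing it at a different feed remixes the page without changing the widget itself.

```typescript
// Generic entry shape the widget depends on — it deliberately knows
// nothing about where the data was fetched from.
interface Entry {
  title: string;
  link: string;
}

// A rendering widget reduced to a pure function for illustration;
// a real widget would attach the markup to the DOM.
function renderListWidget(entries: Entry[]): string {
  const items = entries
    .map((e) => `<li><a href="${e.link}">${e.title}</a></li>`)
    .join("");
  return `<ul>${items}</ul>`;
}

// Any data source that can be mapped to Entry[] plugs straight in.
const html = renderListWidget([
  { title: "Quarterly report", link: "/reports/q4" },
]);
```

Because the widget never sees the feed URL, swapping an internal feed for an external one is a data-mapping exercise, not a UI rewrite — which is the mashup promise in miniature.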

I’ve included the term “composite applications” under the mashup banner. Many organisations already have at least some legacy line-of-business applications that they cannot replace quickly, given the sheer cost and risk of “rip and replace” and the skills and investment they have already made. This is where products like Lotus Expeditor become compelling, because they allow you to aggregate browser-delivered content in the Composite Applications Environment (CAE) whilst maintaining a centrally-managed model for the client platform. Effectively what is created is a richer client “mashup” environment, where heterogeneous desktop technologies can be wired together with browser content and Java/native programming logic. This also provides a pathway to transition over time to a thinner client solution if required, as the rich client components are migrated onto a browser-delivered model. Sounds a bit boring, I know, but the reality is that it’s a “brown-field” world out there.

Anyway, those are the things that I personally have seen and heard working in this space. Part of the reason for my recording them for posterity here is to review this next year and see how things have changed, especially as budgets are inevitably pressurised even further in the current economic climate.

