New Delhi: Spending on the so-called Web 2.0 technologies—a term used to describe tools, social networks and Internet applications designed to increase communication and collaboration—is estimated to grow at an annual pace of 43% in the next five years, according to a report by market research firm Forrester Research Inc.
How do you see collaborative technologies developing?
I think we are in a fairly early age of collaboration. We are still looking for electronic versions of physical collaboration channels. Email is essentially the electronic form of fax or regular mail; we use cellphones for conversations when two people are not in the same place; and video conferencing attempts to mimic a physical meeting. I think the second generation of collaboration tools, such as wikis and Twitter, is not fundamentally about the electronification of a well-known channel. It is fundamentally about how you can increase the reach of a person as well as his or her awareness.
Twitter (www.twitter.com) is a good example of this. There is no analogy for Twitter in the physical world...what Twitter enables a group of people to do is to stay constantly aware of each other, or of a set of subjects, through very short messages at frequent intervals.
In the same way, with wikis, collaboration across multiple people happens in the open. People who have things to contribute jump in, and then other tools such as Twitter are aimed at increasing awareness. And as these two things begin to converge—short, constant, stream-of-thought communication in a group, and openness in collaboration—that will result in something very different from what we perceive as collaboration.
A report by Forrester in April predicted enterprise spending on Web 2.0 technologies would reach “$4.6 billion globally by 2013, with social networking, mashups, and RSS (Really Simple Syndication—a standardized internet format used to publish continually updated sites like blogs) capturing the greatest share”. Do you see that happening?
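To make the RSS definition above concrete, here is a minimal sketch of how a program reads such a feed. The feed content and titles are invented for illustration; only Python's standard library is used.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 feed of the kind a blog might publish.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def item_titles(feed_xml):
    """Return the titles of the items in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(FEED))  # → ['First post', 'Second post']
```

Because the format is standardized, any reader or aggregator can consume the same feed without knowing anything about the site that published it.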
I am familiar with the Forrester report, and I’ll come right out and say—they have no idea what they’re talking about. There’s nothing called Web 2.0. It is a big myth created by O’Reilly (Tim O’Reilly, founder of O’Reilly Media, who is credited with coining the term) for marketing purposes.
Web 2.0 is not a technology, but a state of the Web—that I would agree with. Now, within Web 2.0, there are three important technological components, which, while they may enable collaboration, are not fundamentally collaboration technologies.
One is the rich Internet applications, such as Adobe’s Air or Microsoft’s Silverlight, that provide a much richer user experience on the Web.
The second suite of technologies that generally gets lumped under this is mash-ups. (A mash-up, such as Wikimapia, combines data from different tools and sources to create a single, hybrid Web application.) Mash-ups enable you to get data from both internal and external sources and do some simple data manipulation. The mash-ups market is maturing to the point where it could make a very big dent in enterprise intellectual property.
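The mash-up idea described above can be sketched in a few lines: take records from one source, enrich them with data from another, and present the hybrid result. All names and figures here are invented for illustration.

```python
# Hypothetical mash-up: join internal store records with coordinates
# that, in a real mash-up, would come from an external mapping service.
stores = [
    {"id": 1, "name": "Connaught Place outlet", "city": "New Delhi"},
    {"id": 2, "name": "MG Road outlet", "city": "Bengaluru"},
]
coordinates = {
    "New Delhi": (28.61, 77.21),
    "Bengaluru": (12.97, 77.59),
}

def mash_up(stores, coordinates):
    """Combine the two sources into map-ready records."""
    return [
        {**s, "lat": coordinates[s["city"]][0], "lon": coordinates[s["city"]][1]}
        for s in stores
        if s["city"] in coordinates
    ]

for record in mash_up(stores, coordinates):
    print(record["name"], record["lat"], record["lon"])
```

The point is that neither source was designed with the other in mind; the mash-up adds value purely by combining them, which is what services such as Wikimapia do with maps and user-contributed data.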
And in the third category are things such as widgets (a chunk of code, performing a specific task, that can be embedded when required in any Web page), Web services and so forth—essentially open standards by which people can publish data that other people can access and use in creating these mash-ups, applications and so forth.
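The open-standards publishing described above can be illustrated with JSON, one such open format. The producer and consumer below share no code, only the format; the data itself is invented.

```python
import json

# A hypothetical service publishes its data in JSON, an open format.
published = json.dumps({"city": "New Delhi", "temp_c": 31})

# A consumer (a widget, a mash-up) that has never seen the producer's
# code can still parse and reuse the data, because both sides agree
# only on the open standard, not on any implementation.
data = json.loads(published)
print(f"{data['city']}: {data['temp_c']} C")
```

This decoupling is what lets independently built widgets and mash-ups draw on one another's data.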
These three are real technological components that are going to be big markets. But when it comes to things such as social networks or tagging, I'm not yet convinced that enterprises, even large ones, have enough of a critical mass within themselves to make it work internally.
My speculation would be that sites such as Facebook and Myspace will begin to create enterprise editions, so that enterprises, whether it's GM or Accenture, rather than building their own internal social networks, would rent a corporate version as software as a service. I think that's the direction corporations are likely to go.
With interoperability and open standards, what will that mean for proprietary software?
Open standards do not necessarily mean open source. Open standards essentially mean you can still have proprietary technologies, but they will be interoperable. Obviously, tech companies do not want their products to be commodified, in a sense, by openness—and different companies approach standards in different ways.
Some companies approach open standards by adhering completely to the standard, remaining interoperable, but providing a superset of capabilities over the standard, with the hope and assumption that customers will use that superset more and more. So even though the product is interoperable, as more people get addicted to its specific superset, the product locks in the customer.
Tech talk: Accenture Ltd’s chief scientist Kishore Swaminathan says that the mash-ups market is maturing to the point where it could make a very big dent in enterprise intellectual property.
That’s one approach to deal with openness and open standards, while still retaining competitive advantage and keeping consumers locked in.
The other approach is to put themselves in the driver’s seat and define the standard, so that even if they deviate from the standard, everybody else has to follow suit.
Do you see non-gaming virtual worlds such as Second Life becoming sites of business or places for collaboration?
I don’t think non-gaming virtual worlds have much to offer. There are specific components or ideas within virtual worlds that are interesting. But for enterprises, investing or setting up shop in a virtual world is a huge business risk, especially in worlds with proprietary currency, such as Second Life’s Linden Dollars. They add very little value except maybe in building design, or architecture, where it might potentially be useful to have a 3-D avatar walking through a 3-D world.
I think virtual worlds will only remain an interesting curiosity, and within a few years, they will all most likely be gone, but some of the ideas they generated may survive.