3. What is Web 2.0?
According to O’Reilly (2005), the concept of Web 2.0 “began with a conference brainstorming session between O'Reilly and MediaLive International”. The idea emerged after the dot-com collapse of 2001 and set out to transform the way we look at the Web as a medium. To put it simply, in the words of MacManus (2005), “Web 2.0 is a vision of the Web in which information is broken up into “microcontent” units that can be distributed over dozens of domains. The Web of documents has morphed into a Web of data. We are no longer just looking to the same old sources for information. Now we’re looking to a new set of tools to aggregate and remix microcontent in new and useful ways.” O’Reilly (2005) highlights the core principles of Web 2.0 as follows:
- The Web as Platform
- User Generated Content
- Software above the Level of a Single Device
- Data is the Next Intel Inside
- Lightweight Programming Models
- Harnessing Collective Intelligence
- Rich User Experiences
According to Gehtland (2005), Web 2.0 represents the maturation of Internet standards into a viable application development platform. The combination of stable standards, better understanding and a unifying vision amounts to a whole that is far greater than the sum of its parts.
Graham (2005) argues that until recently he thought the term Web 2.0 did not mean anything; at the time it was supposed to mean using "the web as a platform”. Web-based applications appear to be the main component of Web 2.0, and one of the most successful of them is Google Maps (http://maps.google.co.uk/maps), built in 2005 using Ajax. It seems that Web 2.0 has now acquired a meaning.
So is that it? Does Web 2.0 simply mean Ajax? Not really. Take Wikipedia (http://en.wikipedia.org/wiki/Main_Page), for example. It shows that amateurs can overtake professionals as long as they have the right tools at their disposal. A main factor in Wikipedia's success is that it is free, so people use it, whereas professional opinion on the Web usually has to be paid for. Web 2.0 means using the Web as it was meant to be used: Wikipedia does it, Google does it, iTunes does it, Amazon does it, and Microsoft is struggling not to be left behind.
Probably the most up-to-date use of Web 2.0 is Windows Live (http://www.live.com/), which includes Microsoft Virtual Earth, a service offering detailed 3D imagery. US users with Vista-ready Windows computers and IE 6 or 7 can navigate an aerial view of cities with enough detail to discern the texture of buildings and read clickable billboards from the likes of Fox, Nissan and John L. Scott Real Estate. Virtual Earth 3D is expected to expand to cover up to 100 cities around the world by the end of next summer. Unlike Google Earth, Microsoft’s Virtual Earth is experienced directly inside IE as part of search results. The imagery was captured from planes and processed with proprietary algorithms (see Kirkpatrick, 2006).
Web 2.0 also means trying to understand what is happening: taking the pulse of the market, finding out which technologies are under development, and working out how to use them effectively to make a profit. That is not all. Web 2.0 means keeping your eyes open for the future, anticipating what will happen, being there when it does, and using the opportunity to turn the tide your way. As Graham (2005) puts it, “that's the way to approach technology-- and as business includes an ever larger technological component, the right way to do business.” Seen another way, Web 2.0 is a means of using the Internet to engage consumers more efficiently and to channel that engagement into cheaper and faster profit.
There is no hard boundary for Web 2.0, but rather a gravitational core (O'Reilly 2005). Web 2.0 can be visualised as a set of principles and practices that tie together sites displaying some or all of those principles. The following sub-sections discuss these principles in detail.
3.1. The Web as Platform
The Internet is currently undergoing major surgery. The standard browser technologies used to display documents and content have been pushed beyond their limits by increasingly sophisticated web-based services. Smarter and richer Internet technologies are being introduced, or are under development, to take the place of the old and inadequate web browser model, and most of them can deliver this richness using nothing more than an ordinary modern local machine.
The ultimate goal of the Internet is for all computers in the world to connect to each other and create a giant networked resource. One of the main goals of Web 2.0 is to turn this giant web into a platform where local and remote computing become interchangeable, so that users are not aware of whether they are working on a local or a remote machine.
To accomplish this grand goal, business website owners have to let go of control: release information on how to use their data, share their code and publish their APIs. As LaMonicka (2005) argues, this process gives outside individuals the tools needed to pull data from websites and combine it with other information sources to create something new. In effect, this places a great deal of power in the hands of outsiders and transforms websites into programmable machines. Businesses such as Amazon, Google and eBay are already doing so by engaging outside developers and programmers, and the practice appears to be not only working but also profitable.
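As a simple illustration of the kind of mashup described above, the sketch below pulls product data from one published source and combines it with review data from another to produce a view that neither site offers on its own. It is only a sketch, written in TypeScript: the endpoints, response shapes and field names are all hypothetical assumptions, not real APIs.

```typescript
// Hypothetical endpoints and data shapes - not real, documented APIs.
const PRODUCT_API = "https://api.example-store.com/products?query=camera";
const REVIEW_API = "https://api.example-reviews.com/reviews?product=";

interface Product { id: string; name: string; price: number; }
interface Review { productId: string; rating: number; text: string; }

// Pull data from one site and combine ("mash up") it with another source.
async function buildMashup(): Promise<void> {
  const products: Product[] = await (await fetch(PRODUCT_API)).json();

  for (const product of products) {
    const reviews: Review[] = await (await fetch(REVIEW_API + product.id)).json();
    const average =
      reviews.reduce((sum, r) => sum + r.rating, 0) / Math.max(reviews.length, 1);

    // The "something new": a combined view built by an outside developer.
    console.log(`${product.name} (${product.price}) - average rating ${average.toFixed(1)}`);
  }
}

buildMashup().catch(console.error);
```

The value lies in the combination: once both sites publish their data and document how to use it, an outside developer can build this joined-up view without either site's involvement.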
A further point to consider is that almost everything we do in Windows, Mac and other desktop applications could instead be done through a web browser with eBay, Amazon, Skype or Google Mail. The Web is not that mature yet, but this is the vision and the goal, and it will almost certainly happen. As the Web moves towards this goal, more and more users will need to be engaged in the process. This creates an exhilarating opportunity for everyone to help the Web evolve into a ground-breaking, open platform on which rich applications can be built and run. The more people participate in this process, the better it becomes; and the more people use these applications, the better they will get.
3.2. User Generated Content
User Generated Content (UGC) is a term introduced during 2005. It refers to content generated by the users of a website rather than by companies or through traditional methods. It began with the realisation that information is no longer the monopoly of big companies, and the development of many smart, accessible Web technologies such as blogging, video blogging, podcasting and wikis speeded up the process. MySpace (http://www.myspace.com/), YouTube (http://www.youtube.com/), Wikipedia (http://www.wikipedia.org/) and FourDocs (http://www.channel4.com/fourdocs/about/index.html/) are examples of popular websites based on User Generated Content. As a result of this democratisation of the Web, most of the users and authors of these websites are ordinary people who now have the chance to publish their work in prominent places. The trend is changing the balance of power: much of the activity is happening outside the sphere of media production companies. People are discovering new ways of creating all forms of media, sharing those files on the Internet, and spending more time online rather than sitting idly on the sofa watching television.
Declining TV viewing probably prompted the BBC, for example, to set up a User Generated Content Hub for ordinary citizens. The BBC (2006) reports that “The massive explosion of the oil depot near Hemel Hempstead at six o'clock on Sunday morning underlined the obvious, if painful, fact that news stories don't always break between 9-5, Monday to Friday. That story resulted in around 15,000 images being sent to the BBC, the first arriving at 0616, just 13 minutes after the initial explosion. We also received 20,000 emails from people who had seen or heard the explosion, from Folkestone to Nottingham”.
Not everyone is happy with the trend. Some argue (Chin, 2006) that despite the benefits of UGC, such as breaking down boundaries and restrictions and opening up new areas of knowledge, there is a downside: content overlap, questionable provenance and a lack of moderation may end up producing a chaotic, jumbled Web.
In an essay on his weblog, “User-generated content vs. reader-created context”, Udell (2006) writes that everything about this buzzphrase annoys him. He points out that calling people “users” dehumanises them and should be stricken from the IT vocabulary. In his opinion, the word “content” is more reminiscent of sausage than of storytelling, and writers and editors do not “generate”: they tell stories that inform, educate and entertain. He ends by proposing an alternative to the phrase: “reader-created context”.
User Generated Content websites are among the fastest-growing brands on the Internet. Writing in The New York Times, Zeller (2005) noted that "according to the Pew survey, 57 percent of all teenagers between 12 and 17 who are active online - about 12 million - create digital content, from building Web pages to sharing original artwork, photos and stories to remixing content found elsewhere on the Web. Some 20 percent publish their own Web logs." The article continues: “Most teenagers online take their role as content creators as a given. Twenty-two percent report keeping their own personal Web page, and about one in five say they remix content they find online into their own artistic creations, whether as composite photos, edited video productions or, most commonly, remixed song files.”
A press release published on Yahoo Finance (2006) on 10 August stated that Nielsen//NetRatings, “a global leader in Internet media and market research, announced today that user-generated content sites, platforms for photo sharing, video sharing and blogging, comprised five out of the top 10 fastest growing Web brands in July 2006”. The press release continues: “Among the top 10 Web brands overall, MySpace was the No. 1 fastest growing, increasing 183 percent, from 16.2 million unique visitors in July 2005 to 46.0 million in July 2006. Google ranked No. 2, growing 23 percent, from a unique audience of 76.2 million to 94.0 million. Ebay rounded out the top three, increasing 13 percent, from 51.1 million to 57.8 million unique visitors”.
Table 3.1. Top 10 Brands on the Web, re-ranked by Year-Over-Year Growth, July 2006 (U.S.)
Brand | Unique Audience Jul '05 (000s) | Unique Audience Jul '06 (000s) | Year-over-Year Growth |
MySpace | 16,239 | 46,025 | 183% |
Google | 76,198 | 94,031 | 23% |
eBay | 51,122 | 57,759 | 13% |
MapQuest | 39,269 | 43,585 | 11% |
Yahoo! | 98,485 | 106,224 | 8% |
MSN/Windows Live | 91,049 | 95,593 | 5% |
Amazon | 35,891 | 37,595 | 5% |
Real Network | 35,707 | 36,685 | 3% |
AOL | 74,095 | 74,507 | 1% |
Microsoft | 92,457 | 88,042 | -5% |
Source: Nielsen//NetRatings, August 2006 (http://www.nielsen-netratings.com/pr/PR_060810.PDF)
Whether the experts agree or not, and whether we like it or not, UGC is here and it is growing fast. It is changing the Internet and our perception of it. Millions are involved in this trend, and they participate on UGC websites not to get famous or rich but because they want to express their ideas and points of view. This is democracy in practice. The vision of a world full of freelance writers, film makers, journalists and software developers is no longer a dream; it is around the corner, however threatened or annoyed the experts may be.
3.3. Software above the Level of a Single Device
This is another feature of Web 2.0: the Web is no longer limited to a single platform such as the PC. Applications confined to a single device are less desirable than those that span several. Consumer devices such as PDAs, mobile phones, cameras, mobile digital TV players and games consoles are increasingly being connected together, and this is transforming the way we access the Internet and get our information. As O’Reilly (2005) notes, “one of the defining characteristics of internet era software is that it is delivered as a service, not as a product”. This reality forces the big software developers to change fundamentally: they have to update and extend their applications constantly to accommodate new devices as they appear.
Consumers are increasingly going wireless, which means less reliance on a single device: the Internet can now be accessed on the move. It is very hard to envisage what will happen in the next five or ten years, but one thing is clear: engaging users as testers and finding out how they use new features is a sensible approach to software development.
Perhaps the best example of software above the level of a single device is iTunes. This application seamlessly connects a PC or Mac, a handheld device (the iPod) and a dynamic server; it works not only on a PC but also on a mobile device. The current pace of open-source development suggests it will not be long before the Web is accessible from virtually any consumer device, with a universal browser perhaps the only thing those devices have in common.
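To make the idea concrete, the sketch below shows one way "software above the level of a single device" might be modelled: a single library service publishes the same catalogue to several device front-ends through a common interface, so the application is defined by the service rather than by any one machine. This is an illustrative TypeScript sketch under assumed names; it is not Apple's actual iTunes design.

```typescript
// Illustrative only: a shared service consumed by different device front-ends.
interface Track { id: string; title: string; artist: string; }

interface MediaDevice {
  name: string;
  sync(tracks: Track[]): void; // each device decides how it stores or plays the tracks
}

class DesktopClient implements MediaDevice {
  name = "PC/Mac client";
  sync(tracks: Track[]): void {
    console.log(`${this.name}: storing ${tracks.length} tracks in the local library`);
  }
}

class HandheldPlayer implements MediaDevice {
  name = "handheld player";
  sync(tracks: Track[]): void {
    console.log(`${this.name}: copying ${tracks.length} tracks to portable storage`);
  }
}

// The "dynamic server" side: one catalogue pushed to every registered device.
class LibraryService {
  private devices: MediaDevice[] = [];
  register(device: MediaDevice): void { this.devices.push(device); }
  publish(tracks: Track[]): void {
    this.devices.forEach((device) => device.sync(tracks));
  }
}

const service = new LibraryService();
service.register(new DesktopClient());
service.register(new HandheldPlayer());
service.publish([{ id: "1", title: "Example Song", artist: "Example Artist" }]);
```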
3.4. Data is the Next Intel Inside
O’Reilly believes that data is as important to the next generation of the Web as the CPU is to a PC. Every major Internet-based business is built on a database: iTunes’ database of songs, eBay’s database of products and users, PayPal’s database of members. Database management is therefore a central capability of Web 2.0.
If a company wants to improve its network application inexpensively, it should let users enhance the data with their own. In effect this means inviting users to get involved beyond the simple design-and-test process, and it means sharing data. Some companies, such as Amazon, do it the other way around: they capture data from users and add it to their own.
Web 2.0 services inevitably combine a large body of data with software. eBay, for example, sells not just software functionality but the product and seller data itself. This is very different from packaged software such as Adobe Photoshop, where only the software is sold, not data.
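The following sketch illustrates the principle of letting users enhance a core dataset with their own contributions. It is a minimal TypeScript example with hypothetical types; it does not reflect Amazon's or eBay's real data models, but it shows how user-supplied reviews and tags can be layered on top of a catalogue that the company owns.

```typescript
// A catalogue whose value grows as users layer their own data on top of it.
interface CatalogueItem { isbn: string; title: string; }

interface UserContribution {
  isbn: string;
  review?: string;
  tags?: string[];
}

class Catalogue {
  private items = new Map<string, CatalogueItem>();
  private contributions = new Map<string, UserContribution[]>();

  add(item: CatalogueItem): void {
    this.items.set(item.isbn, item);
  }

  // Users enhance the core data with their own: reviews, tags and so on.
  contribute(c: UserContribution): void {
    const existing = this.contributions.get(c.isbn) ?? [];
    this.contributions.set(c.isbn, [...existing, c]);
  }

  // The combined record is richer than anything the owner could compile alone.
  describe(isbn: string): string {
    const item = this.items.get(isbn);
    const extras = this.contributions.get(isbn) ?? [];
    return `${item?.title ?? "unknown"}: ${extras.length} user contribution(s)`;
  }
}

const catalogue = new Catalogue();
catalogue.add({ isbn: "0-000-00000-0", title: "Example Book" });
catalogue.contribute({ isbn: "0-000-00000-0", review: "Very readable", tags: ["web"] });
console.log(catalogue.describe("0-000-00000-0"));
```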
3.5. Lightweight Programming Models
Contrary to common understanding, much of the Web's success is owed to simple, lightweight programming models. Lightweight programming is all about removing the redundant complexity and constraints associated with traditional heavyweight corporate programming models; it also shortens development, debugging and testing time.
Involving users in the development and enhancement process is much more practical with lightweight programming than with heavyweight corporate programming. Lightweight models can also exploit the operating system and the browser more fully. Heavyweight, corporate-sponsored programming models are designed for a small number of people, whereas lightweight programming models are designed to reach as many people as possible.
Open Gardens (2005) argues that simpler technologies such as RSS and Ajax, rather than the full-fledged web services stack built on mechanisms like SOAP, are the driving force behind Web 2.0 services. These technologies are designed to syndicate rather than coordinate, which makes them the opposite of the traditional corporate attitude of controlling access to data. They are also designed for reuse, in the sense of reusing the service rather than the data.
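A small sketch may help show why RSS counts as lightweight: a feed can be fetched over plain HTTP and read with the browser's standard XML parser, with no SOAP envelope, WSDL contract or generated client code in between. The example is TypeScript, and the feed URL is a placeholder rather than a real endpoint.

```typescript
// Fetch an RSS feed over plain HTTP and read it with the browser's XML parser.
async function readFeed(url: string): Promise<void> {
  const xml = await (await fetch(url)).text();
  const doc = new DOMParser().parseFromString(xml, "application/xml");

  // RSS 2.0 keeps its entries in <channel><item> elements.
  doc.querySelectorAll("item").forEach((item) => {
    const title = item.querySelector("title")?.textContent ?? "(untitled)";
    const link = item.querySelector("link")?.textContent ?? "";
    console.log(`${title} - ${link}`);
  });
}

readFeed("https://example.com/feed.rss").catch(console.error); // placeholder URL
```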
Tate (2005) summarises the core principles and philosophy of lightweight programming models as follows: they incorporate process, technology and philosophy; they favour simpler technologies; they are built on a solid, lightweight foundation; they strive for the best possible transparency; and they draw on ideas, such as AOP (aspect-oriented programming), that give users and developers greater influence and leverage.
3.6. Harnessing Collective Intelligence
In a recent article on harnessing collective intelligence, O’Reilly (2006) dismisses the notion that User Generated Content and harnessing collective intelligence are the same concept. Wikipedia, for instance, exhibits superintelligent performance in being more wide-ranging and more current than Encyclopaedia Britannica. Britannica has the brand name, but Wikipedia has the intelligence on board: with very minimal software, it brings together millions of minds to craft a new and superior kind of encyclopaedia. That is not just user-generated content; it is a cognitive community exhibiting super-intellectual performance.
YouTube (www.youtube.com), rated the best invention of 2006 by Time, is another example of harnessing collective intelligence. According to Time (2006), YouTube started with a video of a trip to the zoo in April of the previous year; it now serves 100 million videos, and users add a further 70,000 videos to its database every day. YouTube is a portal for millions of people around the globe who want to become celebrities overnight. This in itself is an Internet-era revolution that was unforeseen a year ago.
3.7. Rich User Experiences
As the Internet matures, the demand for a richer and more interactive experience on the Web grows with it. It will not be long before we say goodbye to the conventional way of browsing the Web.
New, rich applications are being developed to make this happen. In Google Maps, for instance, the way you can drag selected parts of the map into view gives the impression that all of the maps are stored locally on your computer, ready to hand. Imagine how unpopular the application would be if, every time you tried to drag the map, the page disappeared while you waited for the browser to refresh; it would be so slow that no one would use it (see Perry, 2006).
Figure 3.2. Google Maps: you can drag or zoom into a part of the map without waiting for the browser to refresh.
The richer experience does not end with Google Maps. Yahoo has also changed its search form: you can now switch search categories by hitting the tab key, just as in Windows applications, without reloading anything.
What makes this magic happen? Which technologies are involved, and who develops them? One of the buzzwords on the lips of today’s pundits is Ajax. So what is Ajax?
Ajax stands for Asynchronous JavaScript and XML. Ajax is not a technology in itself; rather, it brings together several well-established web technologies and uses them in new and appealing ways (see McLaughlin, 2006).
Whenever we use a desktop application, we expect the result of our work to appear straight away, without having to wait for the whole screen to be redrawn by the program. When using Microsoft PowerPoint, for example, if we change the text size, font, slide appearance or animation, we expect to see the result of those changes immediately. This kind of interactivity was rarely available to users of web-based applications before Ajax was introduced. Ajax offers a way out of this dilemma: by working as an extra layer between the user's browser and the web server, it handles server exchanges in the background, submitting requests and processing the returned data. The results can then be incorporated seamlessly into the page being viewed, with no need for the whole page to be reloaded (Ballard, 2006).
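A minimal sketch of this pattern is shown below, written in TypeScript against the standard XMLHttpRequest browser API. The URL, element IDs and response format are hypothetical assumptions; the point is simply that the request runs in the background and only one part of the page changes.

```typescript
// Ask the server for data in the background and splice the result into the page.
function loadSuggestions(query: string): void {
  const request = new XMLHttpRequest();
  // The third argument (true) makes the request asynchronous.
  request.open("GET", "/search-suggestions?q=" + encodeURIComponent(query), true);

  request.onreadystatechange = () => {
    // readyState 4 = response complete; status 200 = OK.
    if (request.readyState === 4 && request.status === 200) {
      const target = document.getElementById("results"); // hypothetical element id
      if (target) {
        // Only this element changes; the rest of the page is never reloaded.
        target.innerHTML = request.responseText;
      }
    }
  };

  request.send();
}

// Wire the background exchange to user input so results appear as the visitor types.
document.getElementById("search")?.addEventListener("input", (event) => {
  loadSuggestions((event.target as HTMLInputElement).value);
});
```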