Stephen Colbert was the inspiration for my title. He coined the phrase comedically, referring to the kind of phrasing used in certain self-help and success books.
My view of Web 2.0 sides with the critics cited in the Wikipedia article on Web 2.0, who see it as just a term to make the web look fresh: “Other critics labeled Web 2.0 ‘a second bubble’ (referring to the Dot-com bubble of circa 1995 to 2001), suggesting that too many Web 2.0 companies attempt to develop the same product with a lack of business models. For example, The Economist has dubbed the mid- to late-2000s focus on Web companies ‘Bubble 2.0’.”
While I don’t disagree that the technological leaps merit a new phrase to go along with them, I think that Web 2.0 is just Golden Grahams cereal in a different box. It psychologically tastes different, but it’s still made by the same makers of regular Golden Grahams. So, if the web is just an attempt by businesses to grasp at a new strategy, then Web 2.0 is just camouflage.
I also disagree that it really is that technologically advanced. Windows 95 and 98 might have been technological advances at the time, but they were still driven by DOS. That being said, maybe what will come out of this is a Windows 2000, XP, or 7 of the web. The new web interface?
Take a look at Peppermint OS (http://lifehacker.com/5594027/peppermint-ice-is-a-webapp-based-linux-distro-without-excluding-native-apps).
Web-app-based desktops are something that Linux has already pushed toward. Windows 8 was Microsoft catching up with what Linux had already pioneered. However, as the reviewer of Peppermint OS remarks, web applications are only great if you have a good internet connection (http://mylinuxexplore.blogspot.com/2013/06/peppermint-os-four-review-linux-mint-of.html).
Web 2.0 is based on the current technological system of distribution: server, network, client. Essentially, it turns a desktop into a thin client (http://www.infoworld.com/t/computer-hardware/debunked-5-myths-of-thin-client-computing-227483), that is, a computer without local processing. You essentially log onto a remote server, but it looks like you are connected to a desktop. At a company I worked for, I had users who didn’t know any better and acted as if the box in front of them held all their information. No, no: it’s like your TV. You change channels, but everything originates from somewhere else.
So, it is interesting to read the Wikipedia article on Web 3.0 alongside this article about silicon being overextended and the development of processing technologies that behave like the human brain (http://www.techradar.com/us/news/computing/beyond-silicon-the-processor-tech-of-2035-1039114). Web 2.0 is potentially already child’s play, bordering on obsolete. Yet if that is the case, and if, as O’Reilly said, “It’s a network in permanent beta,” then it actually approaches human consciousness. Yay, humanity. Because humanity is always in beta, if he means that humanity is constantly striving to make itself better and to update itself dynamically. How organic, O’Reilly. I really do approve of Web 2.0 in that way.
Yet I remain cynical about who or what is driving Web 2.0 architecture, and about what is slowing this transition beyond silicon. I candidly argue that Web 2.0 is based on thin-client concepts and is ripe both for business rebranding and for worrying extended control over our daily decision-making. However, it looks so good because a thin client is cheaper than high-priced electronics. See, the businesses saved money, and so can you. Cloud computing! Compute from anywhere. (Don’t call it Web 2.0; that confuses non-technophiles. Call it something cute, like cloud computing.) But hey, it’s easier to deploy support on thin clients, because support staff can shadow your screen and make sure you’re clicking your mouse the right way in a program. And honestly, what do you do on the web anyway? Do you just go to Facebook? Why do you even need a word processor on your computer? You can just compose your entries right on Facebook.
However, the issue is still double-sided when considering the Wikipedia article’s claim that Web 2.0 allows more social interaction, rather than one-way communication. I disagree, solely because I asked the question: on whose sole authority is this based? The question at the heart of the matter, formulated in a similar way in Bruce Sterling’s article, is: does humanity even have the authority to designate what is better than itself? The Web 2.0 standard would place the authority with the people, or crowdsourcing. I really don’t think so, again because the personal computer was personal, but thin-client-based architecture is, well, not so personal.

Another example is last year’s Volkswagen commercial. Essentially, the commercial was criticized as racist. However, the Jamaican government came back and expressed approval of it. So, not racist? Essentially, the question went back to an authority figure, the Jamaican government. It is our authority as content creators or promoters that changes web architecture and technology one way or the other. The Wikipedia article hints at a viewpoint that Web 3.0 will just hearken back to the authority figure. And with Net Neutrality back in the forefront of recent events, I hope this organic transition isn’t stifled. I hope that if this is an organic human invention, some open organization patents it so that no single person can claim it for their own private interests, the way private companies tried to patent the human genome. Literally, that’s what scares me. But with the advancement of artificial intelligence, the evolving web seems to point, according to that article, at this: will we next be asking about non-human intelligence rights?
However, as Bruce says, “I really think it’s the original sin of geekdom, a kind of geek thought-crime, to think that just because you yourself can think algorithmically, and impose some of that on a machine, that this is intelligence. That is not intelligence. That is rules-based machine behavior. It’s code being executed. It’s a powerful thing, it’s a beautiful thing, but to call that intelligence is dehumanizing. You should stop that. It does not make you look high-tech, advanced, and cool. It makes you look delusionary.” A software and database programmer I know agrees with him, and because of what Bruce says, I finally understand why. Artificial intelligence comes back to a waste of time, my programming acquaintance said.
So, what is computing, then, if it isn’t trying to enhance the daily lives of people? Because this is what it makes me think of. Do we go back to the days of human calculators? Do we give everyone jobs in the factory doing menial tasks, or do we try to evolve past Web 3.0 and make ourselves an information- and culture-driven world, like futuristic utopias say we should?
That being said, I’ve used the web to send out my creative writing and post it to art sites such as DeviantArt. I am a strict user of Web 2.0. The art community sites are the updated form of an artist’s portal. However, I took the advice of http://skinnyartist.com/ and made a regular web presence for myself. My site is WordPress-based and allows comments directly on its blog. I have plugins that cross-post to Tumblr, Twitter, Facebook, and Google+ right as I post to the blog. I see my online experience as participatory. I used to hover and take in information, but since my junior year of high school I’ve learned about basic web design. From 2007 onward, I became much more actively engaged.
The downside is that if I wanted to cross-link my DeviantArt to my website and vice versa, I would have to pay. The art sites do not allow much syndication, so I find that I have to post to DeviantArt as well as to my website. I recently found that search engines do not like this: duplicate content is frowned upon and considered a way to cheat search engine rankings.
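One standard way to soften the duplicate-content penalty, assuming you control the page’s `<head>` (WordPress themes and common SEO plugins generally expose this, though DeviantArt may not), is a canonical link tag that tells search engines which copy of the post is the original. A minimal sketch, with a hypothetical example URL:

```html
<!-- Placed in the <head> of the duplicate copy of a post.
     Search engines treat the href as the authoritative version,
     so the duplicate is not counted as ranking manipulation.
     The URL below is a hypothetical example. -->
<link rel="canonical" href="https://example.com/blog/original-post/" />
```

This only helps on pages where you can edit the markup; for third-party sites that don’t allow it, posting an excerpt with a link back to the original is the usual fallback.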
Otherwise, I attempt to contribute to the community by reading as much as I post.