AJAX – A Search Engine Killer
"Search engines don't run JavaScript. Oh no, not ever, no way José."

 "Search engines don't run JavaScript. Oh no, not ever, no way José." So once you have AJAXified your Web site, huge areas of it are now hidden to search engines, never to be spidered, indexed, or found. AJAX, in short, has its limitations. Ashok Sudani discusses...

Web 2.0 is a strange thing in that it doesn't really exist. You can't buy Web 2.0; you can't buy a Web 2.0 programming language; and you can't buy Web 2.0 hardware. In many ways, "Web 2.0" is a marketing phrase like "paradigm shift" or "the big picture". The reason for this vagueness is that Web 2.0 has no precise definition. What the phrase tries to express is that modern websites are so much better than early websites that they deserve a different name. So it comes down to marketing.

Web developers need to demonstrate that although they use the same Internet, the same web browsers, and the same web servers as their competitors, their websites are in fact an order of magnitude better. "Our competitors only do websites. We do Web 2.0 websites!"

The client is, of course, hugely impressed that his new website will be a Web 2.0 website. But what should he expect to see for his money? What is the client's view of what Web 2.0 should offer? Is it all smelling of roses or are there some thorny issues too?

I would contend that there are in fact three facets to a Web 2.0 website:

1. AJAX

2. Social Networking (Building Communities)

3. Broadband

AJAX is technical and can only be done by a technically skilled developer; social networking is vague and woolly, based more on marketing models than on web skills; and broadband has been popular for a long time. Stranger still, AJAX has been available to developers for at least five years, and social networking has been around even longer. It is simply the rebranding of these things that is driving the rise in popularity of these old but current "buzzword" technologies.

AJAX is a mash-up of technologies. We've had asynchronous JavaScript and XML for many years, but until Jesse James Garrett said "I name this mash-up AJAX" it remained out of the mainstream. The same goes for social networking. Forums, blogs, and community-based websites have been around for many years, but giving them a title like "social networking", combined with the success of websites such as www.YouTube.com and www.LinkedIn.com, makes them mainstream and popular. And to cap it all, the new names invented to rebrand existing technologies are combined into the all-encompassing name of "Web 2.0." Web 2.0 is simply rebranding the rebranded.

In summary, we've had the ability to create Web 2.0 websites for years. It is not new technology; it is simply the renaming and repackaging of something we already have and enjoy. Marketing has made buzzwords of what we already knew and the public and developers are lapping it up.

The third facet of Web 2.0 is broadband, or as I prefer to call it, broadband abuse. Many developers believe that Web 2.0 is defined by how long a website takes to download or by the size of the broadband connection required to view the site comfortably. They believe that the bigger the connection required, or the longer the website takes to download, the more Web 2.0ish the website must be. In my opinion, however, adding vast images, video footage, badly implemented rounded corners, and streaming music does not make a Web 2.0 website. It simply makes a regular website that is bloated and annoying.

Presuming that you understand what makes a Web 2.0 website and you are keen to build one, there is an important area you should consider before you start: Search Engine Optimization.

So what about search engines? Do Web 2.0 websites perform well on search engines? Do search engines need to change to keep pace with development? If we ignore the broadband abusers and look at the two key facets of Web 2.0, AJAX and social networking, we get two very different answers.

Working somewhat in reverse here, the conclusion is that AJAX is a search engine killer. Adding AJAX functionality to your website is like pulling the plug on your search engine strategy. Social networking sites, on the other hand, typically perform exceptionally well on search engines due to their vast amount of visitor-provided content.

The reason AJAX is a search engine killer is pretty obvious once you know how the technology works, and at the risk of offending all the people who know this already, I'll recap in a brief paragraph.

Simply put, AJAX removes the need to refresh a page in a browser. Say, for example, you are on the product-finding page of a website: you type in a search phrase for the product you want to find and press the submit button. Without refreshing the page, the asynchronous JavaScript runs off, grabs the results of the search, and inserts the details of the found products into the very same page as you sit and look at it.
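
To make the mechanism concrete, here is a minimal sketch of the kind of script that might sit behind such a product search. It is only an illustration: the /product-search URL, the findProducts name, and the "results" element id are assumptions of this example, not anything mandated by AJAX itself.

    function findProducts(phrase) {
        var xhr = new XMLHttpRequest();
        // Asynchronous GET; the visible page is never reloaded
        xhr.open("GET", "/product-search?q=" + encodeURIComponent(phrase), true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                // Insert the returned product details into the page the visitor is already viewing
                document.getElementById("results").innerHTML = xhr.responseText;
            }
        };
        xhr.send();
    }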

For the website user this addition of AJAX to the website feels fantastic. No page reloads, no browser flicker, no click noise, but sheer joy. And so the rush for AJAX websites begins, because the visitors will love it.

But what about the search engines? What will they make of web pages that use AJAX to find content? Importantly, search engines don't run JavaScript. Oh no, not ever, no way José. So the search engine will never run your AJAX. To the search engine, huge areas of your website content are now hidden, never to be spidered, indexed, or found. This really limits the usefulness of AJAX in many applications.
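
Seen from the spider's side, all it ever downloads is the static HTML shell. Continuing the hypothetical product-search sketch above, the only markup a non-JavaScript crawler gets to index is something like this:

    <div id="results">
        <!-- Empty until findProducts() runs in a browser. A spider that never
             executes JavaScript indexes this page with no product details at all. -->
    </div>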

An ideal application of AJAX is Google Maps, where as you drag the map around the browser window, the newly exposed areas of the map are retrieved and shown on the page without a page refresh—smooth, seamless, and very impressive. Does Google care if the single map page gets found by searching? Certainly not!

A very poor application of AJAX is the product portfolio where you can find and view product details for hundreds of products without ever refreshing the page. Nice to use? Yes. Navigation friendly? No—try hitting the back button when the browser ignores your last 20 clicks because you have remained on the same page! Search engine friendly? Forget it. You are invisible.

So what is the solution to the AJAX invisibility cloak that Master Harry Potter himself would be proud of? There are 5 options:

1. Build two websites: one using AJAX that is lovely for visitors, and another using more traditional techniques for search engine spiders to find. If you can find a client to finance both, you have found a client with too much money!

2. Drop AJAX. Let the visitors suffer the page refresh.

3. Run with AJAX anyway and just put up with the fact that your perfectly formed website will receive no search engine visitors.

4. Lobby the major search engines to rebuild their spidering algorithms to take into account AJAX pages and to run JavaScript on the pages they index. This option might take some time :-)

5. Increase your Google AdWords spending and ramp up traditional advertising to counteract the missing website traffic from the search engines.

And so a bleak picture is painted of AJAX, and by implication of Web 2.0 as well. The good applications of AJAX and Web 2.0 are few and far between, but when you do find them they are fantastic. Do you remember that feeling when you first used Google Maps? Do you find that all other mapping websites now feel old fashioned? I would go as far as to say that it was Google Maps that single-handedly brought the technology of AJAX to the masses.

The second most impressive application of AJAX is another Google idea: as you type in the search field on the Google website, AJAX is used to find results before you have even finished the words. It is incredibly quick to use, fantastic for the website visitor, and really demonstrates the technology in a great light.

Isn't it hugely ironic, then, that the one website that so well demonstrates the very technology that, if used on our own websites, will force us to spend more on Google AdWords is in fact Google?

This Viewpoint article appeared originally at http://ashko.blogspot.com/2006/11/web-20-and-ajax.html and is republished with the kind permission of the author.

About Ashok Sudani
Ashok Sudani blogs at http://ashko.blogspot.com.

Reader Feedback

Andrey Sivtsov (BoonEx) wrote: AJAX IS search-engine-friendly if used right. I blogged this at http://www.boonex.com/news/archive/AJAX_is_NOT_A_Search_Engine_Killer_An...

We made a forum (Orca), which is 100% AJAX-based but is totally search-engine/user friendly.

Contradiction wrote: Yet I see elsewhere on the site today that there is a company called BoonEx that claims its AJAX-based software is "search engine friendly" - Ashok, how do you explain that?

Here is their URL: http://www.pr.com/press-release/23685

quezztion wrote: So what's the conclusion: AJAX good or AJAX bad? Or does it all depend on WHERE the AJAX is being implemented?

ThiagoRS wrote: That's a good point. But the problem is that Google and other search engines may see this as cloaking, a method of cheating search engines by delivering different content to humans and crawlers, and you can be dropped from the result pages.

I think another way to get your AJAX'd pages spidered is to put the corresponding link in the href and not follow it, like this:

<a href="myContent.htm" onclick="myAjaxMethod(); return false;">My text</a>

The 'return false' will make the browser not follow myContent.htm but call myAjaxMethod instead. Search engine spiders WILL follow myContent.htm and index its contents.

This works and you won't be penalized by search engines.

PS: Sorry for my bad, bad English :)

mschwarz wrote: Crawlers will only follow HREF, SRC, or similar link attributes; I have never seen a web crawler that looks for AJAX JavaScript proxies. I think web developers only see the benefit of AJAX in the speed/performance advantage for web sites. If you want to build web sites that will be reachable through search engines, you have to write two sites: one using AJAX for the human visitor, and one for web crawlers that needs the complete HTML/text of the page.

Backbase.com is one of the web sites that uses two versions:

AJAX version: http://backbase.com/index.php?loc=content/home/company/news/008_ohra_backbase.xml#[0]
search engines: http://backbase.com/go/home/company/news/008_ohra_backbase.php

We have to think about this problem when we are designing web sites, not web applications. For real web applications the need to be search-engine compatible is not as high as it is for news and product-information web sites. Or will Google change their search engine?

