Ajax Fundamentals – Part 3

3. Security Issues

The XMLHttpRequest object runs within the browser’s security sandbox: any resource it requests must reside in the same domain from which the calling script was originally served. The W3C states that a future version of The XMLHttpRequest Object specification will define a way of making cross-site requests.
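
A minimal sketch illustrates the restriction; the path /data.xml is an assumption:

    // Requesting a resource on the same domain the script was served from.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/data.xml', true);      // same domain: allowed
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            alert(xhr.responseText);
        }
    };
    xhr.send(null);
    // By contrast, xhr.open('GET', 'http://other-domain.example/data.xml', true)
    // would be rejected by the browser's security sandbox.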

There is an abundance of online documentation stating that Ajax introduces multiple security threats, including forged requests, denial of service, cross-site scripting (XSS), over-reliance on client-side security, and more. Jeremiah Grossman’s article, Myth-Busting AJAX (In)security, maintains that these security issues existed before Ajax and that the recommended security best practices remain unchanged. A basic rule of internet security is to distrust the client. Ajax is a client-side technology, and its requests need to be treated with the same caution as all other calls.
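
To see why the client cannot be trusted, note that nothing stops a user from hand-crafting the very same request a page sends, bypassing any client-side validation along the way. A hedged sketch, with a hypothetical URL and parameters:

    // This request never went through the page's JavaScript validation,
    // so the server must re-validate everything it receives.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/transfer', true);
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.send('amount=-5000&account=attacker');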

4. Disadvantages

4.1. Usability Problems

Web users have become familiar with classic web pages and accustomed to browser features such as the back and forward buttons. Ajax calls are not added to the browser’s navigation history, so the back button will not undo the last ‘Ajax operation’. Developers need to cater for undo explicitly; one common workaround is sketched below.
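
The sketch records each Ajax state in the URL fragment, which creates a history entry, and polls the fragment so that pressing back re-renders the matching state. The function loadItem is an assumed Ajax helper, not part of any standard API:

    function showItem(id) {
        window.location.hash = '#item-' + id;    // adds a history entry
        loadItem(id);                            // hypothetical Ajax call
    }

    // Poll the hash: pressing back only changes the fragment, so we
    // watch for that change and reload the corresponding state.
    var lastHash = window.location.hash;
    setInterval(function () {
        if (window.location.hash !== lastHash) {
            lastHash = window.location.hash;
            loadItem(lastHash.replace('#item-', ''));
        }
    }, 100);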

Ajax-enabled pages have a notion of state that changes as the user navigates through the site. The browser is unaware of this state, and it is not reflected in the address bar, so users are not always able to bookmark a particular page state. Developers need to consider this when deciding where to use Ajax.
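
The same fragment trick makes states bookmarkable: on page load, restore whatever state the bookmarked URL encodes. Again, loadItem is an assumed helper:

    window.onload = function () {
        var hash = window.location.hash;
        if (hash.indexOf('#item-') === 0) {
            loadItem(hash.replace('#item-', ''));
        }
    };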

The asynchronous nature of Ajax can make page updates difficult for the user to notice, so the developer needs a way of drawing the user’s attention to the modified section of the page. The ‘yellow-fade technique’ has become common practice: the changed element is highlighted with a yellow background, and the yellow gradually fades away. These and other techniques are becoming familiar to web users.
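
A minimal sketch of the yellow fade, interpolating the element’s background from yellow to white over about a second:

    function yellowFade(el) {
        var step = 0, steps = 20;
        var timer = setInterval(function () {
            // Raise the blue channel from 0 (yellow) to 255 (white).
            var blue = Math.round(255 * step / steps);
            el.style.backgroundColor = 'rgb(255, 255, ' + blue + ')';
            if (++step > steps) clearInterval(timer);
        }, 50);
    }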

4.2. JavaScript Disabled

If a user has disabled JavaScript, no JavaScript will be executed and certainly no Ajax requests will be sent. A site that relies too heavily on JavaScript is crippled for such users; Gucci is an example. The site used to show a blank white page when browsed with JavaScript disabled; now it explains that JavaScript must be enabled to use the site.
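
At a minimum, a site can tell such users what is wrong, for example with a noscript block (the wording here is an assumption):

    <noscript>
        <p>This site requires JavaScript. Please enable it in your
        browser settings to continue.</p>
    </noscript>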

A site that relies on JavaScript for core functionality restricts its audience to users who have JavaScript enabled. Having said that, we should look at how many users allow JavaScript to run in their browsers. The following statistics from The Counter show JavaScript usage:

Wed Mar 1 00:01:01 2000 – Fri Mar 17 23:59:01 2000
JavaScript 1.2+: 260,153,365 (79%)
JavaScript <1.2: 7,790,575 (2%)
JavaScript false: 59,884,983 (18%)

Thu Mar 1 00:01:01 2007 – Thu Mar 22 13:58:01 2007
JavaScript 1.2+: 56,607,094 (95%)
JavaScript <1.2: 193,622 (0%)
JavaScript false: 2,524,217 (4%)

These statistics show that the vast majority of users have JavaScript enabled, so many stakeholders would approve of the use of Ajax on their sites, although there are cases where limiting access is unacceptable.

4.3. Search Engine Indexing

Web crawlers navigate the web automatically by following links, e.g. href and src attributes. Current web crawlers do not execute client-side scripts, so search engines skip the JavaScript code that performs Ajax calls, and content loaded by those calls will not be indexed.
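
The difference is easy to see in markup. A crawler will follow the first link below but ignore the second, which only works when JavaScript runs (loadPage is a hypothetical helper):

    <!-- followed by crawlers -->
    <a href="news.html">News</a>

    <!-- invisible to crawlers: no real href to follow -->
    <a href="#" onclick="loadPage('news.html'); return false;">News</a>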

Multiple solutions have been proposed for this problem. Most are workarounds that do not solve the underlying issue, and some approaches are considered cloaking and can get a site blacklisted by Google. Google have stated that they will improve their searches, but at present they do not follow Ajax calls. To make sure that specific content is found by search engines, the site should provide standard HTML links that web crawlers can follow, as in the sketch below.
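
A common progressive-enhancement sketch: give the link a real href for crawlers and for users without JavaScript, and let the script hijack the click for Ajax. Again, loadPage is an assumed helper:

    <a href="news.html" onclick="loadPage('news.html'); return false;">News</a>

With JavaScript enabled, the click is handled by loadPage and the return false prevents normal navigation; without it, browsers and crawlers simply follow news.html.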
