

Other domains don't necessarily have cooties
April 7, 2010 8:29 AM

Why does the cross-domain restriction exist on the XMLHttpRequest object?

Plenty of sites explain that the XMLHttpRequest object cannot fetch data from a different domain for security reasons. This makes plenty of sense.

However, there is no such restriction on the script tag/element. It's trivial to write out a script tag via the DOM and fetch any old script you need. In fact, from what I understand, that's how JSONP works (with the support of a cooperating remote server) to circumvent the cross-domain restriction.
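For readers unfamiliar with the pattern, here is a minimal sketch of JSONP; the endpoint URL, the `callback` parameter name, and the `handleData` function are all hypothetical placeholders (real APIs document their own):

```javascript
// Minimal JSONP sketch. The endpoint URL and parameter names are
// hypothetical; real JSONP APIs document their own callback parameter.

// Build the script URL: the callback name is passed along so the server
// can wrap its JSON payload in a call to that function.
function makeJsonpUrl(endpoint, callbackName) {
  return endpoint + (endpoint.includes('?') ? '&' : '?') +
         'callback=' + encodeURIComponent(callbackName);
}

// What a cooperating server sends back: not bare JSON, but a script
// that invokes the requested callback with the data.
function wrapAsJsonp(callbackName, data) {
  return callbackName + '(' + JSON.stringify(data) + ');';
}

// Client side (browser only): define the callback globally, then inject
// a <script> tag pointing at the cross-domain endpoint.
if (typeof document !== 'undefined') {
  window.handleData = function (data) {
    console.log('got', data);
  };
  var s = document.createElement('script');
  s.src = makeJsonpUrl('https://api.example.com/items', 'handleData');
  document.body.appendChild(s);
}
```

The point being that the data only crosses domains because the server opted in by serving it as executable script.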

If the cross-domain restriction is easily circumvented, isn't it simply a roadblock on the path of legitimate developers?

Thanks!
posted by burnfirewalls to Computers & Internet (10 answers total) 2 users marked this as a favorite
 
If the cross-domain restriction is easily circumvented, isn't it simply a roadblock on the path of legitimate developers?

First, you don't have access to the response to a script request — the browser executes it, but other code can't read it. If you control the server to which you're making a request, you can do things like JSONP, but if you don't, it's more difficult to, say, take advantage of the user's session to scrape their Gmail account this way — something that would be trivial with XMLHttpRequest. (Although at one point Gmail was providing your contact list as a JSON array literal, and some clever person figured out you could redefine the Array constructor, which array literals invoked in the engines of the day, to read the array as it was parsed by the browser — so, yes, it's not bulletproof.)
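A heavily hedged sketch of that constructor-hijack trick: modern engines no longer invoke the Array constructor when evaluating an array literal, so the victim's response is simulated below with an explicit `new Array(...)` call, and the email addresses are made up.

```javascript
// Sketch of the Array-constructor hijack. In the vulnerable engines of
// the time, evaluating an array literal invoked the Array constructor,
// so an attacker's page that redefined it could observe the contents of
// a cross-domain JSON-array response loaded via <script>. Modern engines
// don't call the constructor for literals, so the victim's response is
// simulated here with an explicit `new Array(...)` call.
const captured = [];

(function () {
  // The attacker's page defines its own Array before the victim's
  // script-tag response is evaluated.
  function Array(...items) {
    captured.push(...items); // record the private values as they parse
  }

  // The victim's response, e.g. ["alice@example.com", "bob@example.com"],
  // behaved in those engines like this constructor call:
  new Array('alice@example.com', 'bob@example.com');
})();

console.log(captured); // the attacker's page now holds the private data
```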

Also, there are a number of kinds of requests you can't do with the script element that you can do with XMLHttpRequest — non-GET HTTP methods, request payloads, etc.
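For instance, a POST with a request body and a custom header, which a script tag can't express; the endpoint and field names here are hypothetical, and the helper is just a plain form encoder:

```javascript
// A request that a <script> tag cannot make: a POST with a payload and
// a custom header. The /api/update endpoint and fields are hypothetical.

// URL-encode a flat object as application/x-www-form-urlencoded.
function encodeForm(params) {
  return Object.keys(params)
    .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
    .join('&');
}

if (typeof XMLHttpRequest !== 'undefined') {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/api/update'); // same-origin endpoint
  xhr.setRequestHeader('Content-Type',
                       'application/x-www-form-urlencoded');
  xhr.onload = function () {
    // Unlike a <script> load, the raw response text is readable here.
    console.log(xhr.status, xhr.responseText);
  };
  xhr.send(encodeForm({ title: 'hello world', draft: true }));
}
```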
posted by enn at 8:39 AM on April 7, 2010 [1 favorite]


Well, webpages are already able to issue arbitrary cross-domain GET requests (via images, etc.). The web's security model is a bit murky, but in theory it should be safe to issue arbitrary GETs; XmlHttpRequest can also do POSTs and whatnot, so it's more dangerous. It's also more capable of interpreting arbitrary result data than IMG or SCRIPT or IFRAME are.

Of course, being able to issue arbitrary GETs still opens you up to cross-site request forgeries and cookie-hijacking... but I think the above is the rationale for the restriction on XmlHttpRequest.
posted by hattifattener at 8:42 AM on April 7, 2010


Without the cross-site restriction, XMLHttpRequest would let you read user data from other sites. For example, suppose a user was logged into MetaFilter and visits your site. If you made a cross-site XMLHttpRequest to MetaFilter and could read the result, you could read any private data displayed by MetaFilter for that user. This could be used to read bank data, Facebook data or data from any site that the user is logged-in to.

Image tags are less of an issue because pixel data cannot be read from a third-party site. Script tags are an exception, presumably because developers assume that private data will exist in JavaScript files less often (external JavaScript files are not often dynamically generated).
posted by null terminated at 8:56 AM on April 7, 2010


null terminated has it. The same origin policy (SOP) is not there to protect the user's computer. It's there to protect remote servers.

Specifically, say EvilSite.com had some Javascript that did an XmlHttpRequest for metafilter.com/email. If the SOP didn't exist, then EvilSite's code could get a copy of your email from Metafilter, then upload it back to EvilSite. (Note that EvilSite's XHR is sent with the cookies for the metafilter.com domain, because unfortunately that's how browsers work. I don't understand why they didn't make the SOP just a restriction on the cookies sent instead of on all requests; maybe there are other attacks?)

Remote scripts, including JSONP, are considered "safe" because the server has specifically chosen to serve the data as executable Javascript. There's an assumption that nothing private will be in scripts on Metafilter that EvilSite could exploit. This is a bad assumption these days in the era of JSONP, and no doubt there are servers whose data can be stolen because of it. But it's how we work.

The browser security model is remarkably complex because of all the private data it's managing and trying to sandbox between different sites. There's a whole world of cross-domain security exploits. The SOP closes one really simple obvious attack, but IMHO the underlying security model is broken.
posted by Nelson at 9:05 AM on April 7, 2010


Thanks for the answers! So in other words, it's not that the restriction prevents all cross-domain exploits; rather, it blocks a large class of dead-simple cross-domain exploits.

enn: I wanted to clarify something for future readers of this post, because at first glance I was confused by what you said. It is possible to access the response of a script request, at least the Javascript that would normally be accessible. I'm not sure about the response headers, but as for actually reading the Javascript loaded from a dynamically-written script tag that calls a remote JS file, other scripts on the page do have access.
posted by burnfirewalls at 10:10 AM on April 7, 2010


burnfirewalls: Can you explain how you would access the JS response to a script tag? My understanding was that response is executed, and of course if that code sets any global state that state is accessible to other code running on that page, but the actual text of the response is not accessible.
posted by enn at 10:31 AM on April 7, 2010


It is not possible to access the result of a script response directly. The browser takes the script response (which is JavaScript) and runs that code. You can learn about this code, but only through JavaScript. For example, you might be able to call x.toSource() to read the source of a function in Firefox if you know that a certain script includes var x = function(){...}.

This distinction is the basis for one particular (hacky) protection, which is to add either garbage or an infinite loop to the top of a JSON response. For example, at the top of many JSON results you will see things like "for(;;);" or "while(1);". This prevents loading via a script tag and is how Gmail initially protected against this attack (and still might).
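A sketch of how such a prefix works on both ends; the payload is made up. Loaded via a script tag, the response hangs in the infinite loop before exposing anything, while a same-origin XMLHttpRequest client just strips the known prefix before parsing:

```javascript
// Sketch of the anti-script-inclusion prefix defense. The server
// prepends a poison string to its JSON; a <script> load spins forever
// in the loop, but a same-origin client that reads the raw response
// text can strip the prefix and parse the remainder.
const PREFIX = 'for(;;);';

// Server side: poison the JSON before sending it.
function protect(data) {
  return PREFIX + JSON.stringify(data);
}

// Client side (same origin): strip the known prefix, then parse.
function parseProtected(body) {
  if (!body.startsWith(PREFIX)) {
    throw new Error('unexpected response format');
  }
  return JSON.parse(body.slice(PREFIX.length));
}

const wire = protect({ contacts: ['alice', 'bob'] });
const data = parseProtected(wire); // round-trips back to the object
```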
posted by null terminated at 10:32 AM on April 7, 2010


On a more practical level, there is a backwards compatibility issue. <script> was first mentioned in passing in HTML 3.2 (Jan 1997) and fully fleshed out in HTML 4.0 (Dec 1997) and XMLHttpRequest first came to life in IE5 (Mar 1999) but didn't actually come into its own until much later. Since <script> came first and allowed remotely hosted scripts, there would have been a ton of broken sites if browser implementers changed the <script> policy when implementing XMLHttpRequest.
posted by Rhomboid at 12:09 PM on April 7, 2010


enn & null terminated: if the remote script is loading information into the global namespace, then isn't the information accessible to other scripts on the page?
posted by burnfirewalls at 1:53 PM on April 8, 2010


Yes, of course, but there is a stark difference between "you can see the side-effects of executing a piece of code, if there happen to be any" and "you can see the piece of code", particularly when the "piece of code" isn't actually code at all, i.e. the attacks where you want to scrape the HTML response of an authenticated cross-site request.
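A small illustration of that distinction, with `eval` standing in for the browser's execution of a script-tag response (the `sharedConfig` name is made up):

```javascript
// Executing a script exposes its side effects, not its text. The
// "remote script" is simulated here as a string we evaluate; a real
// page never even has the string, only whatever globals it happens
// to set. A response with no side effects (e.g. plain HTML, which is
// just a syntax error as script) exposes nothing at all.
const remoteScript = 'globalThis.sharedConfig = { theme: "dark" };';

// The browser would fetch and run this via a <script> tag; eval
// stands in for that step.
eval(remoteScript);

// The side effect is visible to other code on the page...
console.log(globalThis.sharedConfig.theme); // "dark"

// ...but no API hands back the text of remoteScript itself; only the
// effects the server chose to produce are observable.
```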
posted by Rhomboid at 2:02 PM on April 8, 2010

