Trust comes first in XML security

by Mark O'Neill
December 13th, 2004

"Trust then verify" was Ronald Reagan's maxim for international arms control. A similar maxim can be applied to security architectures in cross-firewall XML projects: "Trust then validate".

 

Mark O'Neill is CTO of XML security vendor Vordel, and author of the book Web Services Security, published by McGraw-Hill. Vordel has deployed its products for many organizations to enforce security over XML. In all cases, both trust and validation were required.



Most XML projects that extend beyond the firewall involve closed user groups of trusted partners exchanging XML with each other. A CIO whose ERP system has a web services interface would never consider exposing this interface to the entire world. But she may consider linking an order processing module to the systems of certain suppliers, to avoid re-keying errors and to streamline business. Trust is vital to enable this type of integration. The nightmare scenario is an untrusted entity sending data into the ERP system. The other threat to counter is invalid data finding its way into the ERP system.

If you are implementing a cross-firewall XML project, you must ensure that for any incoming XML document, the sender is trusted and the XML it contains is valid. "Valid but not trusted" is no good, and neither is "trusted but not valid". These two concepts have special meanings for XML and security:

  • Trust means having confidence in the reliability and validity of an identity. Identity is established either directly, via a credential such as a password, or indirectly, where a third party vouches for the identity by issuing a digital certificate or a SAML assertion.

  • Validate has a special meaning where XML is concerned. An XML document is considered valid if it conforms to a particular XML schema or DTD. For security, though, it is necessary to take validation a step further. For a start, DTDs should not be used, since they introduce security vulnerabilities (it is partly for this reason that the SOAP 1.2 specification recommends against using DTDs). Another issue is malicious content. An XML document can be valid against a schema that only enforces structural rules, yet still contain threatening content such as an attempted SQL injection. From a security point of view, therefore, a schema must control not only structure but content as well. Restriction elements are added to schemas to ensure that the content of documents is as it should be and does not contain threats, as sketched below.
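As a rough illustration (the element names and limits here are invented for the example, not taken from any particular schema), restriction facets can bound both the length and the character set of a field, so that well-formed but malicious content fails validation:

    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

      <!-- A customer name limited in length and restricted to letters,
           spaces, apostrophes and hyphens, so a payload such as
           '; DROP TABLE orders is rejected by the schema itself -->
      <xs:simpleType name="CustomerName">
        <xs:restriction base="xs:string">
          <xs:maxLength value="50"/>
          <xs:pattern value="[A-Za-z][A-Za-z '\-]*"/>
        </xs:restriction>
      </xs:simpleType>

      <!-- A quantity constrained to a sane upper bound -->
      <xs:simpleType name="OrderQuantity">
        <xs:restriction base="xs:positiveInteger">
          <xs:maxInclusive value="10000"/>
        </xs:restriction>
      </xs:simpleType>

    </xs:schema>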

Wrong way round
Why is "Validate, then trust" wrong? Imagine you are the CIO of a company who must implement an XML gateway to talk with a group of partners. The requirement is to ensure that only the partners can connect (trust) and that the XML they send does not contain threats (validate). It is tempting to think that it will suffice to have an XML firewall performing XML validation at the edge of the network, which passes messages through to the internal network for authentication. This is "Validate then trust." However this approach is flawed.

Why? Processing power is wasted validating XML documents before determining whether the sender is trusted. Why bother looking through an XML document for a SQL injection attack if it has come from an untrusted source? In that case the sender has succeeded in wasting your processing resources, and by sending a large enough volume of XML they can mount a denial-of-service attack.

Another concern is that an attacker sending a valid XML document will get that document through the perimeter and into the internal network. The XML firewall will have wasted cycles validating it and, worse, will let it through, because the XML is valid and the firewall is not checking whether the sender is trusted. The authentication layer will block the document, but only after cycles and network bandwidth have already been wasted.

Setting best practice
A company needs to question the motives of a vendor who recommends a solution where XML validation is performed at the perimeter, on untrusted XML documents. Is the vendor really selling XML acceleration technology, and therefore keen to ensure that the perimeter XML firewall is kept heavily loaded with documents to validate? Wouldn't it be preferable to establish trust first, by performing authentication, and then to validate only trusted documents?

Best practice should use the following principles:

Establish trust at the perimeter. Anyone not trusted should not be admitted to any further processing layers. This means that if an untrusted sender sends a malicious XML message, the attacker will not be authenticated and their XML will be rejected outright, without cycles being wasted trying to validate it.

Authentication should be done in a safe manner — with no credentials stored at the perimeter on an untrusted network segment.

Following authentication, a security token (e.g. a SAML Authentication Assertion) should be inserted into the incoming XML message, to indicate that the message has been authenticated as coming from a trusted source.
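As a sketch of what that might look like (the exact token format depends on the gateway product and the WS-Security and SAML profiles in use; the issuer, subject and timestamps below are invented), the perimeter could add a WS-Security header carrying a SAML 1.1 authentication assertion:

    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Header>
        <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
          <!-- Inserted by the perimeter gateway after it has authenticated the sender -->
          <saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
                          MajorVersion="1" MinorVersion="1"
                          AssertionID="example-assertion-001"
                          Issuer="perimeter-gateway.example.com"
                          IssueInstant="2004-12-13T10:00:00Z">
            <saml:AuthenticationStatement
                AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password"
                AuthenticationInstant="2004-12-13T10:00:00Z">
              <saml:Subject>
                <saml:NameIdentifier>uid=partner-a</saml:NameIdentifier>
              </saml:Subject>
            </saml:AuthenticationStatement>
          </saml:Assertion>
        </wsse:Security>
      </soap:Header>
      <soap:Body>
        <!-- The partner's original XML message, now marked as authenticated -->
      </soap:Body>
    </soap:Envelope>

In practice the assertion would normally be signed by the gateway, so that the service endpoint can verify the token was not forged by something inside the network.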

Do everything else at the service. Authorization and XML validation should be done on a per-service level, enforced at the service endpoint itself. By enforcing content validation only on authenticated senders, the only validation failures you should then see come from trusted senders who have mistakenly sent malformed XML.

A further advantage of performing authorization at the service endpoint is that it ensures an attacker who bypasses the perimeter is detected and blocked. Because a security token was injected into the XML at the perimeter, a message that arrives at the endpoint without one must have bypassed the perimeter and can be rejected. 'Deperimeterization' attacks are a growing trend, due to wireless LANs, VPNs, and consultants who work on-site.

Only a "Trust, then validate" architecture ensures that untrusted XML is blocked as early as possible. Beware of architectures where XML validation is performed before you know the sender is trusted. Such architectures have their security priorities the wrong way round.

