Last September, Microsoft dropped a bombshell when it announced the end of development for the Threat Management Gateway (TMG) product, along with its decision to cease production of on-premises anti-virus products. The problem for the Exchange community was that TMG had become the de facto choice as a reverse proxy deployed alongside Internet-facing Client Access servers to handle inbound client traffic.
Since the original announcement, Microsoft has done its best to reassure customers and explain that TMG support remains in place until April 2015. In a nutshell, although no more TMG licenses can be bought, you can continue to run TMG alongside Exchange 2007, 2010, and 2013 until support expires.
But thinking about the situation after a thought-provoking discussion with Greg Taylor of Microsoft, I wonder whether the function served by TMG and its immediate predecessor, ISA Server, addresses the needs of the past rather than the present. Go back to the time when Outlook Anywhere started to popularize HTTPS connectivity as an alternative to running MAPI RPCs over a VPN: the target infrastructure was Windows 2003 servers and Exchange 2003 SP2. External threats abounded as hackers attempted to penetrate corporate firewalls to attack unhardened internal systems, including Exchange.
So it was logical to deploy multiple levels of protection, starting at the firewall and going through servers to perform tasks such as packet inspection before any traffic was allowed to go to an internal server. The approach worked and has served IT well as long as IT exerted strict control over networks, devices, and servers.
The same conditions do not exist today. On the plus side, the latest versions of Windows and application servers like Exchange are more secure than they were in the past, thanks to customer pressure to drive improvement and to changes in Microsoft’s engineering practices that enforce “secure by design”. On the downside, infrastructures have to cope with connections coming in from a multitude of device types, not all of which are “approved”, because of the popularity of BYOD.
The latest versions of Exchange demand nothing more than TCP port 443 to be open on corporate firewalls before clients can connect. The question then is what additional processing needs to happen before a sanitized traffic stream from the firewall hits an Exchange server. As it turns out, the answer is “not much”, largely because Windows and Exchange can protect themselves against suspect packets, and because the latest generation of firewall-cum-load-balancer products can do much more than simply block inbound traffic. If this assertion is true, what value does a product like TMG or UAG deliver? And is such an additional product even required to maintain a secure environment?
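To illustrate just how small that footprint is, the only external reachability a client needs is a successful TCP connection on port 443. Here is a minimal sketch in Python of such a connectivity check; the namespace `mail.example.com` is a hypothetical placeholder you would replace with your own Client Access namespace.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connection; return True if the port accepts it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical Exchange namespace -- substitute your own:
# port_open("mail.example.com", 443)
```

Anything beyond this (TLS termination, pre-authentication, packet inspection) is a design choice layered on top, not a connectivity requirement.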
Strong opinions will no doubt be voiced on this topic. Security professionals take their job very seriously and abhor anything that might expose a company to risk. But in defence of advocating the heresy of passing traffic direct from firewall to Exchange, I point out that some in the security community have argued for many years that erecting strong barriers and depending on them for protection against network threats is a fool’s errand. The Jericho Forum, part of the Open Group, has led the charge to encourage the development of systems that can function without risk as part of the Internet, free of the kind of traditional barriers that have been erected to date. To get an insight into their work, you could do worse than reviewing a presentation called “The business case for removing your perimeter” given at the RSA conference in April 2008. It makes interesting reading.
I was responsible for HP’s security strategy from 2004 to 2007, and in that role I had the chance to debate the changing nature of security with members of the Jericho Forum. I always thought that they had interesting but perhaps impractical ideas. Now it seems that their thinking might have been a little ahead of its time. Perhaps it is now appropriate to ask whether the traditional approach should still be applied to protecting modern versions of Exchange and other Windows applications that are built to consume and filter HTTP traffic.
Security traditionalists and those who worry about protecting infrastructures against penetration will probably still argue that strong barriers have to be maintained. Their concerns should be taken into account when any security strategy is constructed, as threats evolve and flex all the time – and drive an entire industry dedicated to protection against malware, trojan horses, viruses, and the like. At the end of the day, the decision as to how to deploy and protect servers depends on the security requirements and profile of individual companies, but I think it’s worth considering how the attack surface of modern Windows servers differs from that of their predecessors, and whether this influences your protection strategy.
Follow Tony @12Knocksinna