Do PSTs contain anything of value?


In musing about the news of the PST Capture tool that Microsoft plans to release soon, I started to consider the question of whether tools like this can actually find and recover any useful information. The worry of executives and lawyers is that PSTs contain all manner of corporate secrets and other information that they should control. They fondly imagine that if they could only recover this information from users, they would have much better oversight of corporate assets such as intellectual property, contracts, product information, and so on. The reality is sadly different.

Traditionally, PSTs have acted as relief valves for restricted mailbox quotas. As users approach or exceed their quotas, they move items from their online mailbox into a PST. Life can continue, mail can flow, and the user doesn’t have to care too much about their PST until the next quota emergency approaches or they need to find something that they moved. Most users are “pilers” and aren’t very good at filing. They might make an effort early on, but soon the additional steps necessary to file messages into appropriate folders become too much trouble. The upshot is that their mailbox tends to be organized into the default folders (Inbox, Sent Items, etc.) and a small number of other favorites such as “Personal”. PSTs are probably not much better organized as they often mimic the filing structure used for the primary mailbox. Fortunately, the advent of good search mechanisms and improvements in both Exchange and Outlook have meant that the need for folders has largely disappeared and it’s OK to have mega-folders containing 50,000 items or more, assuming of course that you’re running Exchange 2010 and have been assigned a large quota.

Given that PSTs have acted as a pressure valve for mailboxes, is it likely that they really hold much valuable corporate data that can be harvested and secured with a tool such as the PST Capture proposed by Microsoft or similar software sold by Transvault, Symantec, and Sherpa Software? The answer is “it depends”.

The PSTs used by executives might well be interesting – but in most companies executives don’t have to worry about mailbox quotas and therefore don’t need to resort to PSTs. All of their valuable information is likely to be online in a well-organized mailbox run by their administrative assistant.

PSTs used by “knowledge workers”, people who generate intellectual property such as engineers, developers, and other professionals, might be a happier hunting ground in a search for valuable corporate information. There’s a fair chance that some items of interest will be lurking in these PSTs: drafts of patent applications or invention disclosures, descriptions of deals being developed with customers, proposals for corporate alliances, budgets and financial reports, and marketing plans are all examples of the kind of data that a PST capture tool might usefully recover and import into an online mailbox, archive mailbox, or other repository where the information can be indexed and exposed to corporate control.

Alas, the vast bulk of the PSTs in use probably don’t contain very much of interest and the danger therefore exists that any effort to discover and recover information from PSTs held on user PCs will end up in a massive import of absolute rubbish into Exchange or another repository. And once that rubbish gets online, it will act like cholesterol clogging up the arteries of Exchange.

If you doubt my premise you might consider examining the contents of some user PSTs, assuming that users will allow you to go through their folders. I’ll bet that you’ll find items such as:

  • Old calendar appointments and responses lovingly preserved to record the fact that the user was actually invited to attend such a gathering.
  • Delivery receipts and non-delivery notifications – no, I don’t know why people keep these things, but they do.
  • Examples of junk mail kept just in case the user feels the urge to purchase drugs, sell gold, or send money via Western Union to a correspondent in Nigeria.
  • Banal and inane interactions between users of the type that should be immediately deleted upon receipt. Chain letters are in this category as is anything to do with fluffy cats or lost dogs.
  • MP3 and MPEG files downloaded from doubtful web sites.
  • Some useful information that needs to be retained (small percentage).

Don’t get me wrong. I am sure that there is much useful data stored in PSTs. For example, Stephen Griffin’s blog explains the problem posed by one law firm that used a separate PST for every case. Apparently PSTs became the method used to transfer information relating to cases between lawyers, and users commonly opened several hundred PSTs at one time. Stephen, who’s probably Microsoft’s primary MAPI developer, goes on to explain how they overcame the challenges and that Outlook can open up to 300 PSTs but really runs out of steam after 100. The term he used was “noticeable performance issues”! I’m not actually sure how this company could move away from PSTs because they have clearly built a form of workflow around transferring files between individuals, nor am I clear about how they’d use online mailboxes for the same purpose. However, getting back to the point in hand, this company is the exception that proves the rule and, in general, the problem with the data held in PSTs is how to separate the wheat from the chaff.

I see two dangers ahead. The first is that some administrators will be so excited by the availability of the PST Capture tool that they’ll go ahead and import every PST they can find and thus cause Exchange to collapse under the strain of the CPU and I/O processing required to introduce a vast amount of junk into the Store, not to mention the increased disk and backup requirements. The second is that administrators will believe the PR that large mailbox quotas are good because Exchange 2010 does such a fantastic job of supporting mega-mailboxes so they’ll go and enable large quotas for all, leading eventually to the storage of even more of the crap described above – this time online.

The net learning from all of this is that PST find and import tools are good, but only when used properly to find and recover information that is valuable to the company. I hope that people remember this when Microsoft eventually releases their new tool to the community.

– Tony

Posted in Exchange 2010, Outlook | 3 Comments

Moving a database between DAGs


A recent email discussion between MVPs focused on a topic that I totally missed in my Microsoft Exchange Server 2010 Inside Out book (also available at Amazon.co.uk), so think of this as an update to chapter 7. The problem arises after you set up a Database Availability Group (DAG) with mailbox servers and multiple databases and copies and then discover that you need to move a database to another DAG. What’s the best way to proceed?

Database portability was a major focus for Exchange 2010 as it laid the foundation for the native high availability features centered on the DAG that now exist in the product. The unbreakable connection that tied a database to a specific server in all previous versions of Exchange no longer exists, and the active copy of a database can be mounted on any server within a DAG. But a limitation remains in that Exchange defines the boundary for a database’s “switchability” to be the DAG that “owns” the database (beneath the surface, the Active Directory object for the DAG maintains a list of all the databases in the DAG). This small but important detail means that an Exchange 2010 mailbox database is much more portable than before, but boundaries still exist in some circumstances. The notion of a totally portable database is not yet implemented.

Databases owned by a DAG

Every mailbox database is “owned” by a mailbox server or a DAG. When a mailbox database is created on a standalone server, its owner or “master” is set to be that server. If the mailbox server is then added to a DAG, the databases owned by the server are taken over by the DAG, which allows the DAG to move the active copy between its member servers once additional copies are created.
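
Incidentally, you can check the current master of a database directly from the Exchange Management Shell. A quick sketch (the database name is just an example):

    # Show whether a server or a DAG currently "owns" the database, and where its copies live
    Get-MailboxDatabase -Identity "DB1" | Format-List Name, MasterServerOrAvailabilityGroup, Servers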

Because a database cannot simply be reassigned from one DAG to another, the obvious solution is to use the move mailbox function to move all the mailboxes from the database to a database or databases in the second DAG. This is easy for a small number of mailboxes and requires some planning for databases that hold hundreds or even thousands of mailboxes, but it’s very feasible. In fact, Microsoft moves mailboxes all the time inside their Office 365 datacenters to balance mailboxes across the databases used for their Exchange Online service.

The steps involved are:

  • Identify the mailboxes to be moved. You can use the Get-Mailbox -Database command to create a list of the mailboxes and output it to a text file or simply sort the display of EMC by database.
  • Decide how and when to move the mailboxes. One approach is to move the mailboxes in batches and use the “suspend when ready to complete” flag (see the sketch after this list). This tells the Mailbox Replication Service (MRS) to work in the background to create the new mailbox in the target database, enumerate the data in the source mailbox and copy it to the new mailbox, and then pause to wait for you to release the move to completion. The copying of the data might be performed overnight or over a weekend when overall system and network demand is low. You can then check the move reports to ensure that all has gone well before resuming the moves. When the moves are resumed, MRS enumerates the source mailbox again to discover what changes have occurred since the initial copy, copies the delta to the new mailbox, and then switches pointers in Active Directory to complete the move.
  • Check whether any arbitration mailboxes or discovery search mailboxes need to be moved. Remember that discovery search mailboxes can hold very large amounts of information retrieved through searches so these mailboxes can take a long time to move.
  • After all the mailboxes have been moved, you can remove the mailbox database and its copies from the DAG and eventually delete it.
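
For those who prefer to drive the process from the Exchange Management Shell, here’s a minimal sketch of how the batched, suspended moves described above might be set up. The database, batch, and file names are purely illustrative.

    # Enumerate the mailboxes in the source database and save the list for the move plan
    Get-Mailbox -Database "DB-DAG1-01" -ResultSize Unlimited |
        Select-Object DisplayName, Alias, Database |
        Out-File -FilePath "C:\Temp\DB-DAG1-01-Mailboxes.txt"

    # Queue moves to a database in the second DAG, but pause them before the final switchover
    Get-Mailbox -Database "DB-DAG1-01" -ResultSize Unlimited |
        New-MoveRequest -TargetDatabase "DB-DAG2-01" -BatchName "DAG1-to-DAG2" -SuspendWhenReadyToComplete

    # Check progress, review the move reports, and release the moves when you're happy
    Get-MoveRequest -BatchName "DAG1-to-DAG2" | Get-MoveRequestStatistics
    Get-MoveRequest -BatchName "DAG1-to-DAG2" | Resume-MoveRequest

Remember to look for arbitration and discovery search mailboxes too (Get-Mailbox has an -Arbitration switch for exactly this reason) before you remove the source database.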

Moving mailboxes works well but can become a mundane exercise in shuttling data to and fro. A second idea was proposed by Tim McMichael of Microsoft. He thought that it would be possible to use a “swing server” to transport the database from one DAG to another. In this scenario, you would commission a specific mailbox server and introduce it into the first DAG. You then create a copy of the mailbox database to be moved on that server and make it active. Next, you remove all other copies of the mailbox database so that the only copy is now present on the swing server. You then remove the swing server from the first DAG and add it to the second DAG before completing the process by creating whatever number of additional copies of the database are necessary in the second DAG. Finally, you remove the copy of the database from the swing server and remove the swing server from the second DAG.
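
To make the sequence concrete, here’s a rough sketch of the cmdlets that the swing server approach would string together. The server, DAG, and database names are invented for illustration, and the sketch skips the error checking and health verification that a production script would obviously need.

    # Add the swing server to the first DAG and seed a copy of the database onto it
    Add-DatabaseAvailabilityGroupServer -Identity DAG1 -MailboxServer SWING1
    Add-MailboxDatabaseCopy -Identity DB1 -MailboxServer SWING1

    # Activate the copy on the swing server, then remove every other copy
    Move-ActiveMailboxDatabase -Identity DB1 -ActivateOnServer SWING1
    Remove-MailboxDatabaseCopy -Identity "DB1\MBX1" -Confirm:$false
    Remove-MailboxDatabaseCopy -Identity "DB1\MBX2" -Confirm:$false

    # Move the swing server (and the sole remaining copy) into the second DAG
    Remove-DatabaseAvailabilityGroupServer -Identity DAG1 -MailboxServer SWING1
    Add-DatabaseAvailabilityGroupServer -Identity DAG2 -MailboxServer SWING1

    # Recreate whatever copies are needed in the second DAG, then retire the swing server's copy
    Add-MailboxDatabaseCopy -Identity DB1 -MailboxServer MBX3
    Add-MailboxDatabaseCopy -Identity DB1 -MailboxServer MBX4
    Move-ActiveMailboxDatabase -Identity DB1 -ActivateOnServer MBX3
    Remove-MailboxDatabaseCopy -Identity "DB1\SWING1" -Confirm:$false
    Remove-DatabaseAvailabilityGroupServer -Identity DAG2 -MailboxServer SWING1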

There’s no doubt that the swing server technique will work and I see a great deal of value in it in a situation where either multiple databases have to be moved between DAGs or the volume of the data in the mailboxes to be relocated is so large that you cannot use the move mailbox technique to accomplish the goal in a reasonable timeframe. However, there are some downsides to the swing server technique. First, it obviously requires additional hardware. Second, there will be a time when the mailbox database (or databases) loses redundancy because you have to reduce it to just one copy before it’s possible to move between DAGs. The redundancy can be reintroduced as soon as the database joins the second DAG but, even so, it will take time for the copies to be seeded. Last, this is a moderately complex multi-step technique that I am loath to recommend within a production environment.

The complexity can be largely addressed by scripting the steps to move databases, remove copies, remove from the first DAG, join the second DAG, and recreate copies but that script does not exist today and it needs to be developed, tested, and debugged. By comparison, we have lots of experience of moving mailboxes and Exchange 2010 allows the moves to be performed offline in the background, paused before final completion, and batched in convenient sets for movement. I therefore prefer the move mailbox approach.

Now that I have poured cold water on Tim’s idea, let me conclude by telling you that he is an acknowledged expert in all matters to do with Windows Failover Clustering and DAGs. Therefore, you should consider coming to listen to Tim and many other experts (including Kevin Allison, General Manager of the Microsoft Exchange engineering group) speak at the Fall 2011 Exchange Connections conference in Las Vegas (October 31 to November 3). An early bird discount of $100 is available until August 1. If you add the code “SPKR” to your registration, you’ll get an additional $50 discount.

– Tony

Posted in Exchange 2010 | 23 Comments

On PST ingestion: Microsoft brings a new tool to the table but…


Today’s announcement that the Microsoft Exchange development group plans to release a tool to allow administrators to discover Personal Storage Table (PST) files on a network and then import the data into Exchange 2010 on-premises servers or Exchange Online (Office 365) left a couple of unanswered questions floating. Don’t get me wrong. I hate PST files as I think they are the work of the devil – unstable, unprotected, and out of control. It’s an area of the Exchange/Outlook product combination that has needed attention for years and it’s surprising that Microsoft has taken so long to make a move in this direction, some two years after the launch of Exchange 2010 and the introduction of online archive mailboxes.

The first question that comes to mind is exactly when Microsoft will release the tool. There’s no firm date in the announcement, just “later this year”. Maybe it will be a Christmas gift.

Second, what will be Microsoft’s added value over the proven PST ingestion tools that companies such as Transvault and Sherpa Software have developed over many years? The cynic in me says that the added value will come from the zero-dollar price point, which is fine as far as it goes and certainly better than the current situation. However, it’s also fair to say that third-party software vendors have invested a great deal of intellectual capital in their products and that they deserve your attention and some investigation before you assume that any software written by Microsoft to perform a particular task will automatically be the best. Some testing of the available PST ingestion tools in your own environment is therefore required to identify whether, for instance, one tool is more automated than another, or whether one allows greater flexibility in deciding what data should be ingested, ideally through easy-to-understand and easy-to-deploy policy settings that can truly unearth PSTs lurking on PCs scattered around the enterprise.

I’m sure that Microsoft will avoid the trap of providing a tool that makes it too easy to ingest vast quantities of PSTs without some control. There’s no point in ingesting tens of gigabytes of absolute crap from user PSTs into an online mailbox database just because you can…

Automation is a big thing for me when I look at tools that are designed to help administrators and is particularly valuable if the software saves time. Remember that time is money and a paid-for automated tool that saves an Exchange administrator a lot of time is so much better in the long run than something that’s free and does a basic job.

Of course, I’m writing this without any idea whatsoever of the functionality that Microsoft will bring to the table and it is entirely possible that they will deliver a fantastic automated all-singing and all-dancing PST ingestion tool that allows administrators to sit back, admire, and watch the gigabytes flowing into mailbox databases. If this is the case then I have another concern: why is Microsoft investing time in tools like this, which eliminate an area covered by third-party software vendors, when there are other areas where they could deliver real value, such as figuring out the long-term future for public folders?

Microsoft can rightly say that they are simply answering customer concerns and that’s a fair and valid statement. Microsoft can also say that competition spurs progress and that the third-party software vendors will increase the feature/functionality set in their products to stay ahead of the free tool. I hope this is true but I do worry when the elephant in the Exchange market threatens to squash some of the smaller players who have helped to make Exchange such a success.

Last, what does Microsoft mean when it refers to “pirate growl” when it mentions the “dreaded PST scourge” in the blog post? I think I understand the aforementioned scourge to mean the uncontrolled sprawl of potentially valuable information held in the worst-secured file format on the face of the planet, but I am baffled at the reference to pirate growl.

Lots to think about – and I look forward to investigating just how well Microsoft’s PST ingestion tool works when it is released to the public.

– Tony

Posted in Exchange 2010, Office 365 | 7 Comments

A day walking the Somme battlefields


Somme poppies

I’ve been interested in the battles that flowed across France in both World Wars for many years. My interest was partially awakened by spending time close to the D-Day beaches in Normandy but I also had the chance to live with a French family in Verdun in 1973 for a number of weeks. The purpose for my French trip was a student exchange where I’d learn to speak passable French, but more importantly I recall happy memories of eating vast quantities of strawberries grown in the garden of Monsieur Perinel, swimming in the Meuse river, and many opportunities to poke around the battlefield where German and French armies had bled themselves dry in 1916 as they fought over landmarks such as the Fort de Douaumont, Fort de Vaux, and Mort-Hommes. The Ossuaire made a huge impression on me as did the many museums that held relics of the battle and those who had fought and died around Verdun. If you want to read about the battle, I suggest you try Alistair Horne’s The Price of Glory: Verdun 1916, which provides a good overview of the strategic and tactical struggle that took place around the town.

Aside from Verdun, northern France is full of battle sites, none more emotive than the Somme. This year marks the 95th anniversary of the opening of the Battle of the Somme (1 July 1916), when the British and French armies attacked along a reasonably long stretch of the front in a sector that had been deemed “quiet”. The advance was preceded by a week’s artillery bombardment that was too light to do the necessary damage to the German dugouts or wire, and the net result was a depressing loss of life for little gain in territory. Nevertheless, the Somme remains an interesting and provocative area to explore that I have visited several times over the years.

On this occasion I wanted to walk some of the ground that the Ulster (36th) division had attacked on 1 July 1916 and visit some other interesting sites – the Somme is full of places to explore and I only had a day. My primary guide for exploration was Peter Barton’s amazing The Somme. There are tons of books that have been published about the Somme including my favorites Lyn MacDonald’s Somme and Martin Middlebrook’s The First Day on the Somme 1 July 1916, but Barton’s book contains many panoramas taken (with pretty basic photographic equipment) by the British Army from 1915 onward (when they took over the Somme sector from the French) in preparation for the advance that they hoped would win the war. The book also contains up-to-date pictures that broadly match the older panoramas and allow you to compare and contrast the landscape of today with the situation that faced the British tommies in 1916.

Another, out-of-print, book that is a valuable resource is “Somme: Then and Now” (John Giles – ISBN 0-900913-41-X). Originally published in 1986 and reprinted three times since, this book contains a lot of photos taken in the 1970s and 1980s that allow you to compare the battleground of today with its state thirty or forty years ago. In particular, you can see how trees and other growth have covered more of the battlefield in that time.

There are other books that are more specifically focused on providing details of walks that you can take through the battlefield sites. The best of these include Major and Mrs Holt’s Battlefield Guide to the Somme (nice expensive pocket book), the Middlebrook Guide to the Somme Battlefields (much bigger and more expensive), or Walking the Somme (a nice balance). If you don’t want to buy a specific guide book, there are other references available online, such as the BBC’s “Six stands in one day“.

I prepared by visiting the Museum of the Great War in Peronne. There are other museums and collections of military hardware dotted around the Somme, notably the Somme 1916 museum in Albert. However, I find that the Peronne museum provides an excellent overview of the war and the pressures that were exerted on those who fought.

Heading out of Peronne on July 1, my first stop was Delville Wood, which is close to Longueval village. Delville Wood is the site of the South African national monument to those killed in both World Wars, chosen in part because of the enormous sacrifice of the South Africans in July 1916. At 3:25am on July 14, the South African infantry brigade attacked Delville Wood with 121 officers and 3,032 other ranks. They were relieved during the evening of July 20 and numbered just 29 officers and 751 other ranks.

The South African memorial at Delville Wood

Delville Wood is only some 154 acres in size and is now preserved after being purchased by the South African government in 1920. It has regrown to become a thick tangle of trees, chiefly oak and beech and is intersected by grassy rides. However, if you attempt to venture off the grassed area into the wood, you quickly find that the ground underfoot is not easy due to the shell holes that are hidden by undergrowth. During the fighting in 1916, the South African and British troops in the wood were sometimes assaulted by frightening concentrations of artillery: one estimate for a bombardment on 18 July put the number of guns used at 116 field guns and upwards of 70 howitzers and heavy guns. The bombardment lasted for seven and a half hours…

Peace today in Delville Wood

There are some preserved trenches in Delville Wood that wind their way through the trees. You can also see a Hornbeam tree, a survivor from the fighting whose trunk contains some shell fragments.

Poppy crosses at the last remaining Hawthorn tree

From Delville Wood I headed through Longueval to High Wood to have a view of some of the other wood-dotted terrain that created natural blockages and obstacles for the Allied advance. In this part of France, woods resemble islands in a sea of crops and although there wasn’t much cultivation going on in 1916 the woods remained prominent features that were heavily defended and took much effort to overcome. The British started to attack High Wood on 14 July 1916 and only managed to overcome the stubborn German resistance two months later.

The lurking mass of High Wood

Next stop was the Thiepval Memorial, both to visit the memorial and to use the car park as a convenient departure point for my walk. The memorial was designed by Sir Edwin Lutyens to commemorate the 72,192 missing British and Commonwealth men who fought on the Somme battlefields and have no known grave – but their names are carved into the walls of the monument. The figure for the missing differs from authority to authority, so I cite the data from the Commonwealth War Graves Commission. The memorial is built roughly on the site of the old Thiepval Chateau that was pulverized during the battle.

A slight hitch developed when I ran into some commemorations at Thiepval Memorial which necessitated leaving the car on the side of the road. The commemorations are an annual event organized by the Royal British Legion to mark the anniversary of the first day of the Battle of the Somme. Lots of people of different nationalities had turned out to see the bands and observe the ceremonies. Afterwards, many leave their own tributes to the fallen at the memorial.

Poppy crosses placed on the Thiepval Memorial

It was interesting to run into some politicians at Thiepval, including Peter Robinson (First Minister of Northern Ireland) and Owen Paterson (Northern Ireland Secretary – UK Government). Lots of military representatives laid wreaths, including the Bundeswehr (German Army).

Owen Paterson and Peter Robinson at the Thiepval Memorial

So far I had been driving. Now it was time to walk down from Thiepval to the Ulster Tower, which is about a kilometer away. I started from beside the memorial to the 18th (Eastern) Division on the D151 road. Looking across the valley you can see Thiepval Wood (also known as Authuille Wood) to the left and the Ulster Tower to the middle right. Mill Road cemetery is to the right of the Ulster Tower and Connaught cemetery can just be seen at the edge of Thiepval Wood. In 1916, the British front line ran along the edge of Thiepval Wood. The Ulster Tower is positioned close to a point in the German front line known as the “Pope’s Nose”, and Mill Road cemetery is close to a German fortification called the Schwaben Redoubt.

18th Division Memorial at Thiepval: the Ulster Tower is in the middle left distance

On the way to the Ulster Tower I diverted to walk up to Mill Road cemetery. The view back to Thiepval shows how the German positions supported each other. The British could attack Thiepval and be fired upon from the flank from the Schwaben Redoubt, and the reverse is also true.

View from Mill Road cemetery back to Thiepval Memorial

The Commonwealth War Graves Commission (CWGC) does an excellent job of maintaining its graveyards, but what makes these places special is the unique stories that you run into time after time. Of course, every one of the gravestones tells its own story but there are some that are a little different. In the case of Mill Road, many of the tombstones are laid flat on the ground because of the large number of tunnels and dugouts from the old German trench system that are in the area.

I found the gravestone of 2nd Lieutenant E.W. Lee of the West Yorkshire Regiment, killed on 28 Sept 1916. The relatives of Lieutenant Lee had placed a nice photo of him at the base of the tombstone and wedged it down with a rock and a poppy cross. Lieutenant Lee was killed during one of the many assaults that the British launched on Thiepval and its surrounding fortifications after July 1. Although the 36th Division made an amazing attack on July 1 to seize much of the German front and second lines, including capturing the Schwaben Redoubt, reinforcements could not get to them due to the criss-cross of machine gun fire from all sides. The British did not finally capture the Schwaben Redoubt until 14 October 1916, when a combined attack by the 18th and 39th Divisions succeeded.

Headstone of 2nd Lieutenant Lee

Apart from the way that woods dot the landscape, another unmissable characteristic of the Somme battlefield that really strikes home when you walk is how the land rises and falls. While some pictures make the battlefield look reasonably flat, there is a mass of hollows and dips. The Germans held the vast majority of the high ground and, while there are no mountains or cliffs for attacking troops to scale, the distance of a couple of hundred yards between the front lines was often made a lot more challenging by the slope of the ground.

For example, heading down the hill from the Thiepval Memorial to the Ulster Tower there’s a reasonably small fall of 15m over roughly a kilometer. Think of this line as roughly equivalent to the German front line on July 1. But head down Mill Road from the Ulster Tower towards the Ancre Valley and the ground drops more steeply, from 130m to 70m in another kilometer, and this is the ground that the British troops advanced across. Attacking across shell-holed ground into wire-protected lines defended by machine guns was never going to be easy. Add a slope and the task gets a lot more physically demanding. It also made communication much harder for the runners who relayed news back to headquarters and created horizon lines that highlighted soldiers crossing from one side to the other, making them easier targets. After walking the area, my respect for the men of the 36th Division who attacked from Thiepval Wood across Mill Road on July 1 increased enormously.

Ceremony at the Ulster Tower

The Ulster Tower (built in 1921) hosted a ceremony to remember the fallen of the 36th Division. Although the men of the 36th were largely drawn from the Protestant community in Ireland, it was good to see that communities from all parts of Ireland were represented at the ceremony. The band of the Royal Irish Regiment (RIR) played as did a band from Belfast dressed in the khaki uniforms worn by the British Army in World War I.

Band of the Royal Irish Regiment plays at the Ulster Tower

Colour party for a 1914-replica marching band with Thiepval Memorial in background

The formal ceremony ended with laying of wreaths by politicians (U.K., Irish, and French), the military, and other organizations and was then brought to a close with the “Last Post” played by an RIR bugler and the affirmation by the crowd of the words by the First World War poet Laurence Binyon that  “We will remember them”.

RIR bugler plays the "Last Post" at the Ulster Tower

Because the day was so busy, the people who run the Ulster Tower weren’t offering tours of Thiepval Wood, where they have restored trenches and other fortifications to the state they were in during July 1916. The wood is now private property and can be dangerous due to unexploded munitions. Tours can be booked at the Ulster Tower and the entrance to the wood is by the side of Connaught cemetery.

Moving on, I walked back to my car and drove to the Newfoundland Memorial at Beaumont-Hamel. This 74-acre site was bought by the people of Newfoundland in 1921 to commemorate the sacrifice of the Newfoundland Regiment on July 1, 1916. The ground has largely been preserved since that time and includes the remnants of trenches (some protected by pathways laid in the trenches) and the shellholed ground that the Newfoundland Regiment attacked across into ferocious German machine-gun fire with predictable consequences.

Preserved trenches at the Newfoundland Memorial (Hunters Cemetery in the background)

It really is a sobering experience to look at the ground that the Newfoundlanders advanced over and realize that they must have looked to the Germans like so many sitting ducks. Most of the troops reached no further than the Danger Tree, a landmark in No-Man’s Land. All of the 22 officers and 656 of the 758 other ranks that advanced became casualties. Some say that their sacrifice assisted the attack of the 36th Division because, if the Newfoundlanders had not advanced, the Germans would have been able to fire on the 36th from the flank. Whatever the truth, the net result was a disaster.

The "Danger Tree" in No-Mans Land

The walk around the Newfoundland Memorial takes some time because there’s a fair amount of ground to cover if you want to look at the trenches, see the famous “Y-Ravine” that the Germans used both as a strongpoint and a communications route to their rear, and visit the various memorials to different divisions that are dotted around the area. The memorial to the 51st (Highland) Division contains a Scots Gaelic inscription. My knowledge of Irish Gaelic allowed me to understand some of the words, which I later found out to mean “Friends are good on the day of battle”.

Detail of memorial to the 51st (Highland) Division in Beaumont-Hamel

Time was running out. On the way back to Peronne I had time to pay quick visits to two other locations. The first was Blighty Valley cemetery, which lies off the road from Authuille to Albert. Most CWGC cemeteries on the Somme are positioned by the roadside but you have to walk in to some, like Mill Road and Blighty Valley. Blighty Valley lies south of the Leipzig Salient, part of the defences of Thiepval. Because it is surrounded by fields of crops and reasonably far from the noise of the road, it is a very peaceful location.

Blighty Valley cemetery

My last stop was at the Lochnagar crater near the village of La Boisselle, just off the main Albert-Bapaume road. The crater is the result of exploding a mine of some 60,000 pounds of ammonal in an attempt to demolish part of the German front line and create a natural fortress for the advancing British troops to defend. The crater is over 300 feet across and about 90 feet deep and is dangerous because of the erosion of the sides of the crater.  Two mines were detonated near La Boisselle but neither had the desired effect and the British only captured a small part of the village on July 1.

Lochnagar crater (note the large poppy positioned in the bottom of the crater)

I returned to the Hotel St-Claude in Peronne after a long but very interesting day. The Somme battlefields are full of places where you can wander and spend hours realizing the folly of men charging up hills into machine-gun fire and wonder at the sheer bravery of those who did it. A reading of any of the books that I have mentioned will convince you that there is plenty more to see and learn from, so I am already looking forward to my next visit.

– Tony

P.S. for those who are interested, I used a Nikon D700 with a 24-120 lens to take all the photos.

The links to the books cited above lead to Amazon.com. For your convenience, links to Amazon.co.uk are shown below:

The Price of Glory: Verdun 1916 (Alistair Horne)

The Somme (Peter Barton)

Somme (Lyn MacDonald)

The First Day on the Somme: 1 July 1916 (Martin Middlebrook)

Major and Mrs Holt’s Battlefield Guide to the Somme

The Middlebrook Guide to the Somme Battlefields: A Comprehensive Coverage from Crecy to the World Wars

Walking the Somme (Peter Reed)

Posted in Travel | 3 Comments

European Office 365 datacenters exposed to U.S. authorities?


After the razzmatazz of the corporate Office 365 launch in NYC faded, local Microsoft subsidiaries around the world did their own thing to brief journalists and introduce details of local offerings, such as link-ups with telcos to deliver Office 365 packaged with voice and other services. I doubt that any Microsoft briefing touched on the price difference between the list price for Office 365 plans in the U.S. and those put in place for other countries, but that would be too much to ask.

In any case, one of the really interesting comments made at a local Office 365 launch came in London when Gordon Frazer, managing director of Microsoft UK, confirmed that there is no way that U.S.-based companies like Microsoft can prevent U.S. national security and law enforcement agencies from accessing data held in Microsoft datacenters such as those in Dublin (Ireland) and Amsterdam (The Netherlands) used to service European Office 365 (and BPOS) customers.

Basically, if Microsoft is served in the U.S. with an injunction or other legal instrument that forces it to disclose information, it has to provide the data. Frazer said that customers would be told if their data was accessed, unless a gagging order was in effect. You can read the full article on the topic on ZDNet.

Ouch! European corporations are possibly less enthusiastic about cloud services than U.S. companies and this isn’t going to help. It’s just another worry for the corporate security teams to contemplate as they consider whether their company’s data will remain safe if it is moved from on-premises servers to cloud services. And the thing at the back of my mind is a small concern that maybe Microsoft has been served with discovery orders by U.S. authorities in the past that have involved retrieval of data from BPOS servers and no one knows… because one of those blessed gagging orders was in place. Makes you think…

– Tony

Posted in Cloud, Office 365 | 1 Comment

Ballmer launches Office 365


For the fun of it and because I had some spare time, I tuned into the Microsoft launch of Office 365 in NYC via the web.

Steve Ballmer launches Office 365 in NYC, June 28 2011

The content was pretty much as you’d expect from a Microsoft event:

  • Marketing person welcomes everyone
  • Steve Ballmer comes on and makes some introductory remarks, including how excited he is to be here and how great it is that Office 365 is now available in 40 geographies with more countries on the way…
  • Microsoft people come on to do a demo of the technology (maybe because Steve can’t be trusted?). In this case, Kirk Koenigsbauer (a corporate VP) and John (?) showed some nice examples of collaborative document authoring using Word 2010 and the Word web app, plus using a Windows Phone 7 device to capture a photo and insert it into a OneNote document. Best of all was using the Word-like design tools in SharePoint Online to get a nice-looking web site going, something that many small companies struggle with.
  • Steve comes back on to make some closing remarks before thanking everyone and walking off to the side. No questions were taken and the nice marketing person closed the web conference pretty quickly thereafter.

All in all, the event lasted less than 30 minutes so it had the redeeming feature of not taking up too much of anyone’s day.

I found the event moderately interesting because:

  • I thought that the focus was very much on the small to medium business (over 1.5 billion users). I think there’s good reason for this as obviously most small businesses would agree with the quote “We didn’t start the business with the dream of running IT and now we don’t have to” from ESL Industries in Wellington, NZ, one of the companies that have been testing Office 365. The attraction of being able to use the same technology that large enterprises enjoy, without the overhead of huge IT staffs, is something that a small to medium business has got to like. Small companies can move much faster than large companies when it’s time to adopt new technology, so that’s another reason why it’s a good tactic for Microsoft to emphasize this market segment.
  • One of the other companies mentioned said that they were going to junk a lot of servers as they moved to Office 365, which may be good for reducing local greenhouse gas production but bad for local sales representatives of server hardware.
  • The announcement was made that Microsoft is teaming up with 20 telcos to provide bundled services to businesses. The telcos provide Internet, phone services, devices, and services while Microsoft provides Office 365. Sounds like an excellent way of delivering a complete package to many businesses (and indeed, I have already heard UPC advertise such an offering for Ireland), but it didn’t quite gel with Steve’s assertion that Microsoft needs a strong partner base for Office 365, as many small IT consultancies will not like to see the telcos advance to serve customers who might previously have used a consultant to help with email, web sites, or sorting out PCs.
  • Steve said that Microsoft’s SLA for Office 365 was the “best in the business”… well, maybe, but only if you agree that SLAs should be measured at the boundary of Microsoft’s datacenters. No one can control the Internet and I think that there’s a fair chance that other hosting companies could deliver a better SLA than Microsoft if the SLA were measured where it matters – when the service is delivered to user desktops.

The Office 365 team has blogged about their launch and I am sure that there will be much additional comment over the next few days. In the interim, now that the excitement has passed, we can all get on with other more important things…

– Tony

Posted in Cloud, Office 365, SharePoint 2010 | 2 Comments

Interesting price differences for Office 365 accounts


Office 365 is due to be formally launched in NYC on Tuesday, June 28, and there’s no doubt that Microsoft has had enormous success with its beta program. Depending on whom you talk to, tens of millions of people signed up to use Office 365 to have the opportunity of trying out Exchange Online, SharePoint Online, and Lync Online. The day of reckoning for these folks is fast approaching as Microsoft has decreed that their accounts will be converted from beta to trial status after launch and they will then have 30 days to decide whether to transform the trial into a paid contract to continue the service. This seems pretty fair as it’s obvious that Microsoft has to begin the process of realizing some of the investment that it has made to engineer Office 365 and to deploy the necessary hardware to support it in datacenters around the world.

What will be interesting is just how many people decide to continue to use Office 365 and pay a monthly subscription – and how much that subscription will be. Any Office 365 contract implies an increase over the current zero cost of the service and has to be compared with the zero cost offered by competitors such as Google. A small business can easily use the free versions of Gmail, Google Docs, Google Calendar, and so on, whereas Microsoft won’t have a free version of Office 365. Of course, a company could use Hotmail but that’s not particularly satisfying after you’ve used Office 365. Mind you, exactly the same point could be made about moving back to Gmail from Exchange Online, especially in terms of the interaction between the mail server and Outlook client.

In trying to figure out exactly what I will get if I choose to continue with Office 365, I was unable to find a formal service description for Office 365 online. However, the Office 365 Answers site contains a lot of helpful information, as do the community sites that Microsoft runs for topics associated with Office 365, such as the forum that handles questions about billing and subscriptions. For example, this article explains quite a lot about Plan P1, which is the basic plan designed for small businesses and individual practitioners, including some of the limitations that exist to keep this plan’s monthly cost so low. From an Exchange perspective, Plan P1 only includes Outlook Web App, so you need to acquire an Office license if you want to connect Outlook. One of the interesting aspects of Office 365 plans is that the minimum commitment for a subscription is a single mailbox, which taken together with the low monthly charge makes Plan P1 a very attractive proposition for any small business.

The biggest downsides are the lower level of support (basically, don’t bother calling Microsoft, go direct to your closest blog) and a limitation of 50 user licenses, after which you’re looking at an E (enterprise) plan. I don’t think this is a major concern because, let’s face it, if a company is growing rapidly and needs to make the transition to a > 50 seat environment, it will have other more pressing and urgent issues to deal with before thinking about email. And anyway, the data migration will simply be a matter of OST to PST transfer before switching over to the new E plan.

Most of the other limitations are related to enterprise-style features, such as no single sign-on or support for directory synchronization, that the average individual practitioner couldn’t care less about. There are some DNS issues with .no and .dk (Norway and Denmark) domains that deserve your attention if you live in one of these countries. Apart from that, sign up and you should be good to go in a couple of minutes.

I guess Microsoft is polishing the final details of their service descriptions and that all will be revealed after the June 28 launch. At that point, I expect that Microsoft will no doubt contact me to ask whether I’m staying or going.

Poking around the web in my search for information, I was interested to discover that although Plan P1 delivers the same functionality throughout the world, Microsoft charges different prices for the same product in local markets. Plan P1 costs $6/month in the U.S. and EUR5.25 in Euro-land countries such as Ireland, France, Germany, and The Netherlands (roughly $7.45 at today’s exchange rate). I shouldn’t have been surprised because IT companies have been charging different prices for the same product for years. Indeed, way back in the late 1970s when Commodore launched the PET computer, they found that they could sell their computers for up to 50% more in the U.K., Germany, and France and promptly shipped as many PETs as they could from the U.S. The net result was a nice uptick in Commodore profits and a scarcity of PETs in the U.S. Commodore did the same trick with the first run of the C64. By the way, those of you who miss the C64, C128, and Amiga can check out the project to recreate a modern version of these classics based on modern hardware.

Getting back to our story, Microsoft can argue that the local uplift is justified because of higher local staff costs, social insurance, taxation (mostly higher in European countries), and the costs of building datacenters in different countries to satisfy the requirements of companies that might be willing to use cloud services but don’t want their data to leave national borders. For whatever reason, a surcharge of $1.45/month seems a lot to impose. Closer to home, Plan P1 costs US$6 in Mexico and CAN$7 in Canada.

Interestingly, the cost in the U.K. is STG4.00/month or roughly $6.38. Maybe the closeness in price between the U.S. and U.K. is because of the relative size of the markets open to Microsoft in those countries or perhaps it’s simply because Microsoft U.K. is the largest subsidiary and wields more weight when it discusses prices with the mothership in Redmond.

*** Update July 2 ***: Microsoft sent me an email to say that Office 365 was now available for purchase. The price in Ireland is EUR5.25 plus EUR1.10 for VAT (Value Added Tax). I therefore conclude that all of the prices cited here are before local taxes. Although this makes it easier to compare prices across different countries it does nothing to make the price differentiation that exists any more understandable.

The same price differences exist for the enterprise Office 365 plans. Taking the top E4 plan as the baseline, prices quoted for different countries are US$27, CAN$33, EUR25.50 (France), and STG17.75. The Euro price is equivalent to $36.21, an uplift of $9.21 over the U.S., while the U.K. price is again much closer to the U.S. at approximately $28.25. Over a year, that’s an extra $110.52 for an enterprise mailbox using Plan E4. If you had 2,500 employees, your additional charge for being based in Europe would be $276,300 (2,500 * 9.21 * 12). By comparison, a similar company operating in the U.K. would pay only an additional $37,500. Again, these figures don’t take account of local taxation regimes and the bottom line might well be different in some countries.
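
If you want to run the same back-of-the-envelope sums for your own headcount, the arithmetic is easy to script. A trivial sketch using the pre-tax figures quoted above:

    # Annual uplift for a European company compared with U.S. list pricing (Plan E4, pre-tax figures)
    $usMonthly = 27.00      # U.S. list price per mailbox per month
    $euroMonthly = 36.21    # EUR25.50 converted at the exchange rate used above
    $mailboxes = 2500
    $annualUplift = ($euroMonthly - $usMonthly) * 12 * $mailboxes
    "Additional annual cost: {0:N2}" -f $annualUplift    # 276,300.00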

Exchange rates do vary all the time so perhaps the Microsoft pricing czars expect the Euro to decline to close to 1.10/$ due to all of the furore around deficits, but it’s a bit of a reach to see such a decline in the next year or so, especially when many economists expect the dollar to remain weak in that time. Or maybe it’s just that Microsoft thinks that Europeans are willing (or stupid enough) to pay so much more for exactly the same service.

No doubt the logic and wisdom of Microsoft’s pricing policies will be revealed in due course… In the interim, I expect that the price differential in different parts of the world will cause some interesting discussions between companies considering the move into the cloud and Microsoft salespeople. For example, will a company based in Germany that has 500 employees in a U.S. subsidiary be expected to pay German or U.S. prices for those subscriptions?

– Tony

If you’re interested in reading more about the issues that surround deployment of Exchange in the cloud, you might like this post.

Posted in Cloud, Exchange, Office 365 | 2 Comments

Generating Exchange 2010 reports with PowerShell


From time to time, someone in the Exchange community outside the Exchange development group comes up with a contribution that adds an enormous amount to many different deployments. I think that the PowerShell script posted by Steve Goodman on his blog may just be in that category.

For years, I have “suggested” (some would say “ranted”) at the Exchange development group about their relative (or total) lack of system reporting. Indeed, at one gathering where an Exchange VP invited me to talk to the engineers about some of the issues that existed in the product, I noted that it was very hard to generate snapshots of system environments or server configurations that could be used to document deployments or provide information to support personnel.

This was in the Exchange 2003 timeframe and while Microsoft has done a good job of addressing some of the other issues that I raised at the time (for example, eliminating the blizzard of undocumented system hacks executed through registry settings and providing a method to automate common administration tasks), they haven’t done anything really to provide system administrators with a way to generate a simple report about an organization and its configuration. Sure, the Exchange 2010 version of EMC boasts the “Organization Configuration” option, but it’s hardly a nicely-formatted and elegant output.

You can make the argument that it’s difficult to generate a comprehensive report about an organization because a) every organization is different, b) every administrator has their own idea of what such a report might contain, and c) the report could take a long time to run if an organization is very distributed or is one of the mega-organizations that support many hundreds of thousands of mailboxes. All true, but that’s not a reason for ignoring user requests for such a feature that go back to the original Microsoft Exchange Conference (MEC) in 1996. Perhaps the Exchange community hasn’t been vocal enough in their demands on this point! More likely it’s that the engineers have had better, more interesting things to develop and add to the product. Administrative functions always seem to be the last item on any programmer’s to-do list.

In any case, Steve Goodman has done us all a service by generating a very nice Exchange 2010 organization report using PowerShell. Better still, you can download the code and tailor it to meet your own needs – or adopt Steve’s approach to have the report emailed to you automatically on a daily basis. And while the report isn’t formally supported by the Exchange development group, it is based entirely on out-of-the-box Exchange cmdlets and standard PowerShell code so the magic is in its implementation rather than some undocumented and previously unknown technique. In fact, the report is very similar in many respects to a report called “ExArt” that the Exchange development group use to capture information about customer test environments that are running beta versions of Exchange. Which in turn makes you wonder why that report never found its way into the product…
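
If you want a feel for the kind of out-of-the-box cmdlets that a report like this builds on, here’s a very cut-down sketch that collects a few organization basics and mails them to an administrator. It is nothing like as comprehensive as Steve’s script, and the SMTP server and addresses are placeholders that you’d obviously substitute with your own.

    # Collect some basic organization data using standard Exchange 2010 cmdlets
    $servers   = Get-ExchangeServer | Select-Object Name, ServerRole, AdminDisplayVersion
    $databases = Get-MailboxDatabase -Status | Select-Object Name, Server, DatabaseSize, Mounted
    $dags      = Get-DatabaseAvailabilityGroup | Select-Object Name, Servers

    # Build a simple HTML report from the collected objects
    $body = ($servers   | ConvertTo-Html -Fragment -PreContent "<h2>Servers</h2>") +
            ($databases | ConvertTo-Html -Fragment -PreContent "<h2>Databases</h2>") +
            ($dags      | ConvertTo-Html -Fragment -PreContent "<h2>DAGs</h2>")

    # Mail the report; schedule the script daily if you want it in your inbox every morning
    $mail = @{
        From       = "reports@contoso.com"      # placeholder addresses and server
        To         = "admin@contoso.com"
        Subject    = "Exchange organization snapshot"
        Body       = ($body | Out-String)
        SmtpServer = "smtp.contoso.com"
    }
    Send-MailMessage @mail -BodyAsHtml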

My recommendation is that you read Steve’s blog, examine the sample report that he includes, and download the code to try it out in your own environment. Of course, you’ll practice safe PowerShell by not running the script in your production environment before carefully vetting that it doesn’t do anything nasty… but it doesn’t… so go ahead and give it a try – and do let us all know if you amend the script to add something new and interesting to the report!

– Tony

Posted in Exchange, Exchange 2010 | 4 Comments

“The Register” runs ESEUTIL to rescue an Exchange 2003 server


It’s nice to see that “The Register” continues to support Exchange 2003 administrators with advice such as that offered in the article “Help my Exchange server just rebooted”, posted on June 18, 2011. I’m assuming that the article was about Exchange 2003 because the screen shot displayed system files dated from 2005, so we’re definitely not in Exchange 2007 or Exchange 2010 territory.

To me, the tale of woe outlined was a flashback to the bad old days of depending on an all-too-fragile single database, tape backups that perhaps were never verified, and multiple single points of failure. Even though the point wasn’t made, the article also outlined a compelling case for the advantages of migrating Exchange 2003 servers to either on-premises Exchange 2010 or into the cloud with Office 365. After all, if you elect to upgrade to on-premises Exchange 2010, you can deploy a Database Availability Group (DAG) and protect mailbox databases that way, or if you choose to move to Office 365, you pass the problem over to Microsoft, who will reassure you that Office 365 maintains copies of databases in “pods” split across two datacenters.

Reading the article transported me back to the days when ESEUTIL was regarded as an essential tool for Exchange administrators. Early versions of Exchange had databases that weren’t particularly efficient at reusing white space (empty pages) in the database and the only way to return space to the file system was to rebuild the database with ESEUTIL. This was deemed to be a good thing in the days when big disks were expensive and administrators fretted about available disk space.

Indeed, there were those who espoused the goodness of a regular database rebuild with the same vigor that colonic irrigation enthusiasts look forward to a good clean-out. Of course, a rebuilt database was more compact and this helped somewhat with tape-based backups because the databases took less time to copy to tape, but the compacted databases had some downsides as well: it wasn’t possible to replay old transaction logs into the new database; the recovered space was invariably swiftly reclaimed as the database grew to accommodate user traffic; and running ESEUTIL to rebuild databases required all users to log off and stay disconnected from Exchange until the rebuilt database was brought online.

ESEUTIL has never been a sparklingly fast program and it was common to achieve throughput in the 5-10GB/hour range – barely acceptable in an era when Exchange servers boasted mailbox databases sized less than 50GB (remember the speed of tape backups – it was a brave administrator who trusted Windows disk systems and the security of tape backups enough to allow databases to grow larger) and certainly unacceptable today when mailbox databases commonly run > 500GB.

The version of ESE included in Exchange 2003 is actually pretty good at internal database management and there was a lot less call for database rebuilds after Exchange 2003 appeared. However, as the article covers, ESEUTIL/R (recovery mode) could be needed to convince a database that it was OK to mount.
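
For anyone who has never had to do this, the commands involved look something like the following. The paths and log prefix are examples only (E00 is the default prefix for the first storage group), so check your own storage group settings, dismount the databases, and take a backup before running anything.

    # Check whether the database was shut down cleanly (look for "State: Clean Shutdown" in the output)
    eseutil /mh "D:\Exchsrvr\MDBDATA\priv1.edb"

    # Soft recovery: replay the outstanding transaction logs to bring the database to a clean state
    eseutil /r E00 /l "D:\Exchsrvr\MDBDATA" /d "D:\Exchsrvr\MDBDATA"

    # Offline defragmentation (rebuild): only if you really need to reclaim space, and only with a verified backup in hand
    eseutil /d "D:\Exchsrvr\MDBDATA\priv1.edb"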

In the normal course of events, you should never have to resort to ESEUTIL with Exchange 2010. If your company can afford to deploy at least two servers, you can run a DAG and have the mailbox databases replicated between the servers to ensure basic redundancy. The HP E5000 Messaging System remains the only appliance-type Exchange 2010 system on the market and it contains two multi-role servers configured in a DAG, so that’s a good choice for a low-end deployment. Two copies of a database is good, three is much better, and four is a warm blanket of security for any administrator, so if you can afford the extra servers and disk, the ideal situation is to build and deploy a DAG where each mailbox database has at least three copies. Microsoft’s latest mailbox role storage requirements calculator includes the ability to visualize the layout of databases and copies in a DAG and is worth investigating if you haven’t seen it before.
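
Setting up that kind of protection is pleasingly simple once the servers are in place. A minimal sketch, with the DAG, server, database, and witness names invented for illustration:

    # Create a DAG and add two (or more) mailbox servers to it
    New-DatabaseAvailabilityGroup -Name DAG1 -WitnessServer FS1 -WitnessDirectory "C:\DAG1"
    Add-DatabaseAvailabilityGroupServer -Identity DAG1 -MailboxServer MBX1
    Add-DatabaseAvailabilityGroupServer -Identity DAG1 -MailboxServer MBX2

    # Create a database copy so that the database exists on more than one server
    Add-MailboxDatabaseCopy -Identity DB1 -MailboxServer MBX2
    Get-MailboxDatabaseCopyStatus -Identity DB1    # check that the copy is healthy and seeded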

But what if you’re a small company whose situation feels rather similar to the tale of woe described in the article – just a single Exchange 2003 mailbox server, still running tape backups, and looking nervously towards the UPS in the hope that it can protect your hardware should a disaster strike? Well, you’re a prime candidate to stop running on-premises email and move quickly to the cloud by migrating to Office 365. Sure, you might have to upgrade some PCs because Office 365 doesn’t support older Outlook 2003 clients, and probably make some changes to your network infrastructure to make sure that your clients can all reliably connect to Office 365, but you’ll end up with a far more functional email capability that’s protected in a way that you simply cannot achieve in any cost-effective manner on your own.

Moving into the cloud won’t cure all problems and there have been some well-documented instances when all of the major cloud providers, including Microsoft and Google, have suffered outages that affect users. Although network glitches are always possible for any service that depends on the Internet, I think that the datacenters where services like Office 365 run are pretty safe from the effect of a truck hitting a pole and a resulting power spike. At least, that’s the theory… and anyway, if something does happen, you’ll never have to run ESEUTIL again and that’s something worth looking forward to.

– Tony

If you’re an Exchange 2003 administrator who’s trying to figure out how best to upgrade to Exchange 2010, you might like to read Microsoft Exchange Server 2010 Inside Out, also available at Amazon.co.uk. The book is also available in a Kindle edition. This post covers some of the topics that you’ll find in chapter 7 (ESEUTIL) and chapter 8 (the DAG), but there’s lots of other stuff to read…

Posted in Exchange, Office 365

Measuring Outlook 2010 network latency


One of the truths often overlooked by those who wish to rush into the cloud is that the characteristics of their network will change dramatically. Some of the traffic that is currently processed internally will have to be channeled out across the Internet to the selected cloud provider. For example, all of the client traffic from applications such as Outlook or OWA will now travel via HTTPS to the cloud provider instead of being directed to an internal Exchange server. And if we believe the marketing positioning, everyone will have massive increases in mailbox quotas with the resulting increase in OST size… and that data has to be synchronized!

The niggling questions that therefore occur are how much bandwidth is likely to be needed, where that bandwidth should be provided to best service large user communities, and what kind of latency needs to exist to ensure that users experience roughly the same kind of performance across the cloud that they get when Exchange is provided internally.

Of course, one of the great benefits of running Outlook in cached Exchange mode is that it hides many of the network glitches that can occur when you depend on the Internet to get from client to server. Background drizzle-mode synchronization keeps the replica folders updated on the client so that most network interruptions pass unnoticed. But while new messages are received in the background, Outlook uses a special thread to send new messages. Connecting to Exchange Online in Office 365 to send a new message, especially one with a reasonably large attachment (say 1MB), is when I sometimes see performance suffer a tad compared to connecting to an internal Exchange server.

I’m perfectly willing to accept that part of the problem is due to the characteristics of the Internet connection that goes into my house. I also accept that many other hops exist to make the connection from the local telephone exchange to the datacenter of my ISP to Microsoft. In fact, the complexity of the connection is frightening at times! On the other hand, you can argue that the flexibility of Internet connectivity is a huge strength.

In any case, if you’re in the process of figuring out how to move your company to the cloud, you might find that this blog post by Microsoft’s Neil Johnson contains much valuable and interesting information about how network link latency affects Outlook 2010. It’s certainly worth reading if only to throw additional light on to the tolerance Outlook has for different latencies.
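
If you just want a rough feel for the round-trip time between a client and the service before digging into Neil’s analysis, one crude approach is to time a few TCP connections to the HTTPS endpoint that Outlook connects to. The host name below is a placeholder, so substitute whatever endpoint your tenant actually uses.

    # Time a handful of TCP connections to port 443 and report the average in milliseconds
    $endpoint = "outlook.example.com"   # placeholder: use your own Exchange Online endpoint
    $samples = 1..10 | ForEach-Object {
        (Measure-Command {
            $tcp = New-Object System.Net.Sockets.TcpClient
            $tcp.Connect($endpoint, 443)
            $tcp.Close()
        }).TotalMilliseconds
    }
    "Average connect time: {0:N0} ms" -f ($samples | Measure-Object -Average).Average

This measures connection setup rather than true Outlook RPC-over-HTTPS latency, but it is a quick way to compare the view from different offices or home connections.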

Happy weekend!

– Tony

Posted in Exchange, Office 365, Outlook