The simple EWS Exchange Email Client


Fellow MVP Glen Scales has created a really nice example of the power of Exchange Web Services (EWS): an Exchange 2010 email client that you can run through a PowerShell script (for more details of the script, see Glen’s blog). I like this code a lot because it allows you to run an email client on a workstation for test purposes without having to install Outlook or run up OWA (yes, I know OWA is a great client, especially in Exchange 2010 SP1, but sometimes it’s just too slow to start on a virtual machine running on a laptop).

Glen’s client works with V1.0 of the EWS Managed API, which you can download from Microsoft. You need to install the Managed API and PowerShell on the workstation on which you want to run the client. You’ll also need access to the Exchange 2010 cmdlets, which you can gain either by running the Exchange Management Shell (in which case you have to install the Exchange 2010 management components) or by making a remote connection to an Exchange 2010 server using the code explained here.

To start, you put the script (I renamed it SimpleClient.ps1 to make its purpose obvious so that I’d remember it in the future) into whatever directory you use for PowerShell scripts and fire it up with:

C:> .\SimpleClient.ps1

The script loads a form (shown below) to collect connectivity data such as the email address of the mailbox that you want to use and credentials if you need to provide them. By default, the script uses the account and credentials that you’ve logged in with. The script also uses Autodiscover by default to find the EWS web service, but you can feed it a specific URL for testing (such as https://ExServer1.contoso.com/ews/exchange.asmx).

Collecting mailbox and credentials to connect the Simple Email client
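
If you’re curious about what happens behind the form, the core connection logic of an EWS Managed API script looks something like this minimal sketch (the DLL path, mailbox address, and server name are illustrative assumptions; adjust them for your own installation):

# Load the EWS Managed API assembly (default V1.0 install path - an assumption)
Add-Type -Path 'C:\Program Files\Microsoft\Exchange\Web Services\1.0\Microsoft.Exchange.WebServices.dll'
$Service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010)
# Use the credentials of the logged-on account, as the script does by default
$Service.UseDefaultCredentials = $true
# Locate the EWS endpoint with Autodiscover...
$Service.AutodiscoverUrl('user@contoso.com')
# ...or feed in a specific URL for testing
# $Service.Url = [System.Uri]'https://ExServer1.contoso.com/ews/exchange.asmx'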

Once connected, you’ll see the folder structure in the mailbox and can navigate within it to a specific folder. The current version of the client doesn’t support archive mailboxes, but I believe that Glen is working on an update using Exchange Web Services V1.1 that will support this feature.

Viewing the Inbox folder with the Simple Mail Client
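
For the curious, enumerating the folder hierarchy takes only a couple of lines of the EWS Managed API; here’s a sketch that assumes the $Service object from the snippet above:

# List the folders directly beneath the visible root of the mailbox
$FolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(100)
$Service.FindFolders([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot, $FolderView) | ForEach-Object { $_.DisplayName }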

You can click on a message to view its contents or its header, export it to a file, or send a test message from the mailbox. Behind the scenes, the script runs to process the commands sent from the form. You can expose EMS to see the commands that are executed and gain some insight into what’s going on.

PowerShell script executing behind the scenes
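
Under the covers, the “send a test message” option boils down to a handful of EWS Managed API calls along these lines (again a sketch that assumes the $Service object from above; the recipient address is made up):

# Create and send a test message, saving a copy to Sent Items
$Message = New-Object Microsoft.Exchange.WebServices.Data.EmailMessage($Service)
$Message.Subject = 'Test message from the Simple Client'
$Message.Body = 'This is a test'
$Message.ToRecipients.Add('someone@contoso.com') | Out-Null
$Message.SendAndSaveCopy()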

I won’t pretend that this is a client that everyone will be excited about, but it is something that I think is of value to many administrators or anyone who wants to dabble with Exchange Web Services. Remember, now that WebDAV has bitten the proverbial dust, EWS is the way forward if you want to access Exchange mailbox and message data, so anything that throws more light onto that subject is welcome.

– Tony

For more information about Exchange 2010 SP1, consider my Microsoft Exchange Server 2010 Inside Out book, also available at Amazon.co.uk.

Posted in Exchange, Exchange 2010

The myth surrounding the use of ESEUTIL to rebuild databases


Some people still believe that it is a good thing to run the ESEUTIL utility to defragment or rebuild an Exchange database. I can’t see why this myth persists, so here’s an attempt to drive a stake into its heart.

ESEUTIL is a blunt instrument that was badly needed in the early days of Exchange. By this, I mean Exchange 4.0 (1996), 5.0 (1997), and 5.5 (1998). Microsoft wasn’t too good in those days at reusing the white space that exists inside ESE databases, and disk space was expensive. ESE databases swelled over time and the only way to recover disk space to the system was to run ESEUTIL. This approach worked, and it had a nice side effect: any lurking corruption, such as a failed checksum, might be fixed as ESEUTIL examined and validated each page from the source database before writing it into the new database.

There are lots of articles to tell you of the wonders and benefits to be gained from a good database defragmentation (for example, this one on msexchange.org). Most of these articles were valuable in their time but are now out of date and should be read with caution, even if you’re running Exchange 2003 or Exchange 2007. Best practice evolved over time and everyone you met at a conference was sure that running ESEUTIL was a great thing to do; it became a form of colonic irrigation for Exchange, something that flushed all the bad stuff out and made Exchange feel better all over.

In this context, “bad stuff” means corrupt pages – but of course, if the database rebuild dropped corrupt pages, your brand new database has just lost some data; the dropped pages might be index pages or contain mailbox items. No one might notice, especially if the pages store old information that should really have been deleted a long time ago, but the simple fact that data loss can occur when you rebuild a database should raise a red flag for ESEUTIL.

A couple of things happened around 2003 to undermine this particular “best practice”. First, Microsoft got their collective act together and fixed the internal Store maintenance so that white space was used more efficiently. From Exchange 2003 onwards, I had no problems telling customers that they should not run ESEUTIL unless they were told to do so by Microsoft Support (and had received a very good reason for running the program – in other words, this couldn’t be a “Hail Mary” kind of last-gasp pass when the support professional had no other suggestions to offer and merely wanted to keep the customer busy while they figured out what to do next). The improvement continued through Exchange 2007 and Exchange 2010, and today Exchange 2010 can do intelligent things like single page patching when databases are deployed in a Database Availability Group (see this post for more detail).

The steadily dropping price of storage is the second reason why ESEUTIL became a lot less important. In the old days, when people counted storage in megabytes and fretted over users who had the temerity to send a message containing a 1MB attachment, running utilities to recover disk space seemed like an excellent system management activity. Today, when terabytes are cheap and we’re all discussing 25GB mailboxes, the prospect of recovering a few hundred gigabytes seems a lot less attractive. I know that smaller rebuilt databases are easier to back up, but that overlooks the salient point that a rebuilt database begins to swell back towards its original size as soon as you bring it back online.

Databases have a “natural” size of their own and the trick is to provide enough storage to allow databases to run without administrator interference for months at a time. In this respect, it’s interesting to point out that circular logging is no longer a bad thing for a production Exchange server (another best practice hits the dirt). If you run databases in a DAG and have at least three copies, you don’t need to disable circular logging as the databases are protected by having sufficient copies available to recover from most problem situations. (Public Health Warning: your mileage will vary – a database copy on a flaky server is worse than useless).
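
Turning circular logging on is a one-line job in EMS; a minimal sketch with a hypothetical database name:

# Enable circular logging for a database protected by multiple DAG copies
Set-MailboxDatabase -Identity 'DB01' -CircularLoggingEnabled $true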

Deploying multiple database copies within a DAG is not an excuse to avoid backups! Do not believe the propaganda on this point. Although Exchange 2010 includes some very good high availability features, it’s no reason to lose all sense and wisdom of good system management practices by discarding the safety net that a solid backup regime provides. If things ever go south, you’ll be happy to have backups available. And if someone tells you that they don’t use backups anymore because they rely on database copies, look at them in the same way that you’d regard an errant child who you’ve just caught before they inserted a wet finger into an electricity socket. They are either smoking some interesting material or they have the extreme luxury of having the Microsoft Windows and Exchange development groups on hand to sort out any problems if their servers or disks explode.

So can there be a good reason to run ESEUTIL to rebuild a database? Absolutely! It may be your only get-out-of-jail-free card to play to rescue a corrupt database, even if you end up losing some data. However, I would make sure that Microsoft support is happy for you to run ESEUTIL before you actually do it. Remember too that rebuilding a database renders all transaction logs null and void because they won’t match the new database, so be sure to take a full backup as soon as you have brought the new database online and made sure that everything works.

Another reason for running ESEUTIL to rebuild a database might arise after you move or delete a heap of mailboxes out of a database. This happens in educational establishments that remove student mailboxes every year, and I can see good reason to “reset” a database after deleting all those mailboxes.
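
If you do end up in one of these situations, the basic sequence is to check the white space, dismount the database, rebuild it, and remount it – something like this sketch for a standalone (non-replicated) database, with hypothetical names and paths, and remembering the point above about taking a full backup afterwards:

# Check how much white space the database contains before deciding to rebuild
Get-MailboxDatabase -Identity 'DB01' -Status | Format-Table Name, DatabaseSize, AvailableNewMailboxSpace

# Dismount, rebuild with the temporary database on a different spindle, and remount
Dismount-Database -Identity 'DB01' -Confirm:$false
eseutil /d D:\Databases\DB01\DB01.edb /tE:\Temp\Dfrg.edb
Mount-Database -Identity 'DB01'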

But apart from these instances, I can’t think why I would want to take a database offline for several hours – even at a weekend (or most definitely at a weekend, when there are usually much better things to do) – and degrade performance against SLAs for user access in an attempt to create a slightly smaller and possibly, perhaps, maybe more efficient database. It just doesn’t make sense. I’m sure that some others will propose good reasons for running ESEUTIL to rebuild a database and that’s just fine – as long as you have thought things through and concluded that this is absolutely the only way to accomplish your goal, then have at it… but don’t expect this utility to be the proverbial magic bullet that makes up for months or years of database neglect.

– Tony

The latest Microsoft support article on running ESEUTIL is KB328804. If you’re interested in more diatribes against bad Exchange management practices, come along to the Exchange 2010 maestro training sessions, or just get a copy of my Microsoft Exchange Server 2010 Inside Out book.

Posted in Exchange, Exchange 2010

Connecting to Exchange 2010 with PowerShell


One of Microsoft’s goals for Exchange 2010 is to provide administrators with the ability to manage servers from workstations without requiring the installation of the Exchange 2010 management components. Obviously some prerequisites exist: PowerShell 2.0 and Windows Remote Management must be installed on the workstation before you can even think about installing the Exchange 2010 management components, so that’s the first step to take care of. Assuming all the prerequisites are in place, you should be able to install the Exchange 2010 management components and then fire up the Exchange Management Shell (EMS) to connect to an Exchange 2010 server in your local site.

If the Exchange 2010 management components are not installed on a workstation, the EMS initialization script is not available, and you have to perform the tasks that the script normally handles: create the remote session, identify your account to Exchange with the necessary credentials, and import the set of cmdlets permitted for your role.

The first step in a do-it-yourself EMS session is to start PowerShell and use the Get-Credential cmdlet to input the username and password that we need to connect to the target Exchange organization. These credentials should be for a privileged account as otherwise you won’t be able to do very much.

$Credentials = Get-Credential

PowerShell displays a dialog to allow you to put in the username and password that we will use to connect. We then create a new remote PowerShell session and connect to the remote Exchange organization. Note that Kerberos is specified as the authentication method.

$ExSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://ExServer1.contoso.com/PowerShell/?SerializationLevel=Full' -Credential $Credentials -Authentication Kerberos

After we establish a session, we can import the set of Exchange cmdlets that our account is allowed to access. As shown in the screen shot below, EMS responds with an acknowledgement that it has imported the specified command set into the session.

Import-PSSession $ExSession

Connecting to Exchange 2010 with remote PowerShell

After the cmdlets are loaded into your session, you can work remotely in exactly the same manner as if you were logged onto the server. The Get-Command cmdlet lists the cmdlets loaded into the session, and the Get-Help cmdlet shows the help available for any of them. Unfortunately, even though the EMS startup screen indicates that you can use wildcards with the Get-Help cmdlet, the advent of remote PowerShell removed this ability (which exists in Exchange 2007) due to some issues with the operating system.

Once your session is established and you’re connected to Exchange 2010, all transactions flow across HTTP via IIS to be executed on the target server. When you are finished, you can terminate the session with the Remove-PSSession cmdlet.

Remove-PSSession $ExSession

Why would you create such a connection to Exchange? Well, you might want to use the PowerShell Integrated Scripting Environment (ISE) as your preferred tool to write and test scripts that automate aspects of your Exchange deployment. When you start ISE, it won’t connect to Exchange 2010 unless you instruct it to, so if you want to use any of the Exchange cmdlets in code, you have to connect to Exchange by running the commands described above (with the exception of retrieving your credentials, as ISE will use the credentials of your logged-on session). You could do this by running a script (perhaps one that defines variables that you find useful) or you could create a custom menu option for ISE that runs the commands to connect to Exchange. The easiest way is to edit your ISE profile to include the commands. See http://technet.microsoft.com/en-us/library/dd819492.aspx for details about how to create and edit ISE profiles.
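
For example, adding something like this to the ISE profile connects each new session automatically (the server name is an assumption; substitute one of your own CAS servers):

# Hypothetical ISE profile addition: connect to Exchange 2010 at startup
# using the credentials of the logged-on session
$ExSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://ExServer1.contoso.com/PowerShell/' -Authentication Kerberos
Import-PSSession $ExSession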

Most administrators probably do the simple thing and click on the EMS icon in the menu on a workstation where the Exchange 2010 management components are installed. Invoking EMS causes Windows to run this command:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -noexit -command ". 'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1'; Connect-ExchangeServer -auto"

The initialization script is RemoteExchange.ps1, which sets up the EMS environment. Interestingly, it is the Connect-ExchangeServer command run immediately afterwards that creates the remote PowerShell connection and connects EMS to an Exchange 2010 server in the site. Note that it uses the -Auto parameter, meaning that EMS will attempt to connect to the local system (if it is a server running Exchange 2010) followed by other servers running in the local site (CAS servers first, then mailbox servers).

You can run the Connect-ExchangeServer cmdlet in an EMS session to force EMS to connect to a specific server, such as one in another Active Directory site. For example:

Connect-ExchangeServer -Server exserver1.contoso.com

Switching the connected server with the Connect-ExchangeServer cmdlet

In the screen shot you see that EMS initializes and connects as normal – in this case to server ExServer1. We then run Connect-ExchangeServer and specify exserver2.contoso.com as the target server. EMS connects to this Exchange 2010 server and loads in the cmdlets permitted by the RBAC roles held by the user. As you can see, any cmdlet that is already available for the session is skipped. After connecting to server exserver2, all future cmdlets run in the session are executed on that server. This won’t matter if you are working with organization-wide configuration data but it does if you run cmdlets that do things like change OWA virtual directory settings and don’t specify a target server.
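
For example, when you change server-specific settings, it’s safest to name the target explicitly rather than rely on whichever server the session happens to be connected to; a sketch with a hypothetical server name and URL:

# Name the target server explicitly when changing server-specific settings
Set-OwaVirtualDirectory -Identity 'ExServer2\owa (Default Web Site)' -ExternalUrl 'https://mail.contoso.com/owa'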

It’s worth saying that remote PowerShell is one of the more fragile components of Exchange 2010 because it depends on so many moving parts working together before you can connect. IIS must be configured correctly with the right modules and settings; WinRM has to permit the HTTP requests to pass; WSMan must be able to communicate with the servers; and your account has to be authorized to run PowerShell (the default setting). If you run into problems, the post on Troubleshooting Exchange 2010 management tools startup issues provides a good starting point for where you need to look to make everything right.

Hope this helps!

– Tony

Follow Tony @12Knocksinna

This is material that isn’t included in my Microsoft Exchange Server 2010 Inside Out book, mostly because I had to cut pages to fit the book into the prescribed limit of 1,300 pages set by Microsoft Press. If this is an example of stuff that’s been cut, you can imagine the value of the material that’s been retained! The book is also available from Amazon.co.uk.

*** Update December 7, 2010: Microsoft has released the Exchange Management Troubleshooter (EMT), a utility designed to look for common problems that might cause Remote PowerShell (and by extension, all of the Exchange 2010 management tools) not to work properly on a server. You can read all about EMT on EHLO.

Posted in Exchange, Exchange 2010

The very useful ExFolders utility


Amongst the features deprecated (others would say “amputated”) from Exchange 2010 is support for the WebDAV (or just plain “DAV”) interface. Exchange 2000 was the first version to support WebDAV and at the time, it seemed that WebDAV would become the interface that programmers would use to access data in mailboxes, public folders, and the like. Alas, WebDAV has run its course and is no longer the flavour of the month, so you need to move to a new interface if you have programs or utilities that are based on WebDAV. Exchange Web Services (EWS) is the recommended replacement.

You might guess from its name that PFDAVAdmin is based on DAV. This useful program originally started as a utility called PFAdmin for administrators who wanted better management capabilities for public folders. PFDAVAdmin is replaced by the ExFolders program for Exchange 2010. There are two versions, one for the RTM release of Exchange 2010 and one for Exchange 2010 SP1. It’s important to use the correct version because the schema changed between the two releases: the RTM version doesn’t understand how to connect to an Exchange 2010 SP1 server and vice versa. Interestingly, once you run ExFolders from an Exchange 2010 server, it can connect to an Exchange 2007 server.

Administrators often resorted to PFDAVAdmin when they realized that the public folder administration console provided in Exchange (2007 or 2010) exhibited all the signs of an afterthought on Microsoft’s part – work that they really didn’t want to do because “who cares about public folders”… This criticism is less true in Exchange 2010 SP1, where Microsoft did some work to allow better control over public folder permissions through the console GUI, something I shall return to in a future post.

Editing public folder permissions with ExFolders

Running ExFolders is simple. Download the correct image from Microsoft, place the executable in the Microsoft Exchange binaries folder, and invoke it. You’ll be asked to provide some administrator credentials and then to identify whether you want to open the public folder hierarchy or a mailbox store. You also have to identify a domain controller that ExFolders can use to read the Exchange organization configuration.
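
In other words, something along these lines, assuming the default Exchange 2010 installation path:

# Copy ExFolders to the Exchange binaries folder and launch it from there
Copy-Item .\ExFolders.exe 'C:\Program Files\Microsoft\Exchange Server\V14\Bin'
& 'C:\Program Files\Microsoft\Exchange Server\V14\Bin\ExFolders.exe'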

Once connected, you can navigate down through the public folders and update permissions as shown above, or examine the items in a folder (below), but you can’t view the contents of the items; that has to be done with a client such as Outlook or OWA.

Viewing items in a public folder with ExFolders

Many interesting items can be discovered in mailboxes, such as the structure of the mailbox and the many hidden items that are stored in the mailbox root to be used for Exchange internal processing. You can see the two move reports that Exchange holds for a mailbox (see my post about mailbox move history for more information about this data). You can also see the dumpster folder structure (under Recoverable Items), the folders used for Reminders and Views, and the client-visible folder structure under “Top of Information Store”.

Using ExFolders to access mailbox data

Clearly there is lots more to explore in the ExFolders utility than can be covered in a short post. The best idea is to get a copy of the program and start to use it to find out just how ExFolders can add real value to your deployment. Recommended!

– Tony

If you’re interested in learning more about how to manage Exchange 2010 effectively, why not come along to the Exchange 2010 maestro training sessions, or just get a copy of my Microsoft Exchange Server 2010 Inside Out book.

Posted in Exchange, Exchange 2010

I miss the Microsoft Exchange Conference (MEC)!


Update March 6, 2012: Well, the lobbying has succeeded and Microsoft has announced that they are bringing back MEC. You can get more information from MECisBack.com or read my take on the new MEC on WindowsITPro.com.

———-

I first attended the late lamented Microsoft Exchange Conference (MEC) in Austin, TX in September 1996 and had the unique experience of being assailed by Elaine Sharp, the then product manager for Exchange 4.0, who loudly inquired “how dare I write about her product”… After a short, sharp exchange of views, we parted amicably and collaborated thereafter, including working together on my Exchange 5.0 book. Such was the magic of MEC.

Of course, in those days, the sessions were all about “Migrating from MS-Mail” and similar topics. By 1998, MEC in Boston had swollen to over 4,000 attendees (http://www.microsoft.com/presspass/features/1998/9-9exchange.mspx), the sessions covered topics such as the initial “Wolfpack” clustering for Exchange 5.5, and the parties were getting more and more interesting. I don’t think some of the hotels around the World Trade Center in Boston have yet recovered from some of the parties that the development group ran.

My most embarrassing experience came at MEC 2000 in Dallas, when I was scheduled to do a keynote in the main hall in front of several thousand people. There was a lot of excitement at MEC 2000 because of the recent introduction of Windows 2000, the Active Directory, and Exchange 2000. I can’t quite recall what the subject of the keynote was, but I ended up making a comment that it was great that the Exchange System Manager (ESM) console now supported context-sensitive menus (right-click). The older ADMIN console used by the first generation of Exchange (4.0 to 5.5) didn’t go for such UI frippery.

Speaking at my keynote at MEC 2000 (photo: David Lemson)

In any case, the prospect of context-sensitive menus in an administrative console obviously hit home with the crowd (thus proving the unique level of absolute geekness of the Exchange community) and resulted in generous applause. I responded with some words along the line that “I wasn’t responsible for the work, but I’d be delighted to pass on the clap to the engineers.” A horrible but short pause ensued followed by a wave of laughter as the audience realized what I had said. Of course, my face was rapidly changing from red to deeper red as the meaning of my words sank home.

Microsoft is a wonderful technology company and it came as no surprise that someone quickly captured the snippet in an MP3 file; I received many messages from Exchange engineers over the following weeks to express their delight at my words. Or maybe it wasn’t delight…

The last US-based MEC was held in Anaheim, CA in October 2002. It is worth noting that MEC ran successfully in other countries, most notably in EMEA where its normal location was the Acropolis in Nice, France.

Why did Microsoft drop MEC? There are many stories and the official line is that management wanted to run a single annual technology conference, which was TechEd. It might be that MEC was getting too big and that there was too much duplicated content presented at MEC and TechEd. The cut and paste disease was very obvious.

Other conferences have attempted to step in to take the place of MEC, notably the Microsoft Exchange Connections conference and The Experts Conference (TEC). The Connections conference started from a small base but now attracts a reasonable audience at its events. A good collection of Exchange MVPs hang out at the Connections event and come together to deliver sessions that are interesting and worthwhile.

I have been to a couple of TEC events and experienced good things at both, especially on the Active Directory side. Directory integration and synchronization were the foundation of TEC in the past, but the organizers have recently broadened the conference to offer more value in terms of Exchange and SharePoint.

Good as these conferences are, they aren’t MEC. They aren’t MEC because Microsoft doesn’t provide the same level of support to the conference in terms of the program managers and engineers who used to swarm around MEC and made the conference a unique melting pot for Microsoft, its customers, and the huge number of third-party companies that comprise the ecosystem that surrounds and supports Exchange. A huge amount of valuable interaction took place between the different parties at MEC, and the sessions were invariably at the 300 level or above, so the information gained there was unsurpassed – certainly there were few marketing sessions on the MEC agenda, so the information was useful, pertinent, and clear.

I’d love to see MEC make a reappearance. TechEd is too big and bland and attempts to cover too much technology in too little time. Exchange is big enough and the complete end-to-end ecosystem, including SharePoint, is interesting enough from a technical perspective to warrant a full week’s conference, providing that the right mix of speakers was created from Microsoft engineers, industry experts, third party software developers, and customers.

What do you think?

– Tony

Posted in Exchange

Granting write permission for calendar sharing with OWA 2010


The calendar sharing feature introduced in Outlook Web App 2010 (OWA) allows a user to grant access to their calendar to another user. To access the option, click on the Share option when in the Calendar and then on Share This Calendar. You’ll then be able to select the user(s) that you want to share your calendar with and define the level of information you want the recipient to be able to see in your calendar.

Creating a message to inform the recipient that you’d like to share your calendar

The recipients see a message like the one shown below. To access the calendar, they simply click on the Add This Calendar link. OWA adds the calendar to the list of available calendars, and the user can then open your calendar whenever they want by clicking on its entry.

The message notifying the recipient that they can access your calendar

So far so good. The user will be able to see your calendar, but they won’t be able to add anything to it or change an existing appointment. In short, they are restricted to “Reviewer” access. You can confirm this by clicking on the Change Sharing Permissions option in the Share menu, when you’ll see something like the screen shot shown below. In this case, just one other user has access to the calendar and all they have is Reviewer access, so it shouldn’t come as a surprise that they won’t be able to add or edit items in the calendar.

Viewing sharing permissions for a calendar

Maybe Reviewer access is all that’s needed. But there are instances where it’s good to be able to add or edit items in someone else’s calendar, and the frustrating thing is that OWA doesn’t offer any way to manipulate the permission granted on a calendar. However, this is possible through the Set-MailboxFolderPermission cmdlet, which is the underlying command that manipulates folder permissions. The command that we need to run is:

Set-MailboxFolderPermission -Identity alias:\Calendar -User UsertoGetRights -AccessRights Editor

For example, if my alias is “TRedmond” and I want to grant access to the user “Redmond, Eoin”, the command is:

Set-MailboxFolderPermission -Identity TRedmond:\Calendar -User 'Redmond, Eoin' -AccessRights Editor

Note that you can’t run the Set-MailboxFolderPermission cmdlet to alter a permission on a folder unless a permission has already been granted to the folder for the user. If you want to add Reviewer permission for someone who doesn’t already have access to a calendar, you have to run the Add-MailboxFolderPermission cmdlet with a command like this:

Add-MailboxFolderPermission -Identity 'TRedmond:\Calendar' -User 'Pelton, David' -AccessRights Reviewer
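
And should you need to revoke access later, the Remove-MailboxFolderPermission cmdlet reverses the grant; for example:

Remove-MailboxFolderPermission -Identity 'TRedmond:\Calendar' -User 'Pelton, David' -Confirm:$false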

To confirm that everything has gone to plan, we can use the Get-MailboxFolderPermission cmdlet to validate the permissions on the folder.
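
For example, using the same mailbox as before:

Get-MailboxFolderPermission -Identity 'TRedmond:\Calendar'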

Viewing mailbox permissions for the calendar folder

Here you can see that one user has Editor permission and another has the default Reviewer permission. If we checked using OWA’s Change Sharing Permissions option, we’d see something like the screen shot below. Note that you can’t use OWA to edit the permission anymore as the code doesn’t cope with Editor permission.

How OWA displays a user with Editor permission

Once a user has been granted Editor permission, they can edit or add items to a calendar. Note the “Notify” checkbox. If set, the user who owns the calendar will receive an “Appointment Created Notification” as a new message in their inbox to provide them with details of the new event.

Creating a new appointment in another user’s calendar

Calendar sharing is a nice feature of OWA 2010. It’s just a pity that the developers left out the ability to grant editor access to a calendar – but now you know how to do it behind the scenes!

– Tony

Follow Tony @12Knocksinna

Read more about the new features in Exchange 2010 and Outlook Web App 2010 in my Microsoft Exchange Server 2010 Inside Out book, also available from Amazon.co.uk.

Posted in Exchange, Exchange 2010

Cloud Computing Explained – A good guide


Cloud computing is all the rage these days, but people tend to think about their own piece of the IT spectrum when it comes to considering how cloud computing will affect the industry. I guess that’s natural. In any case, if you’re looking for a concise book that draws together the issues and influences across different aspects of cloud computing for the enterprise, you could pick up a copy of John Rhoton’s book Cloud Computing Explained: Implementation Handbook for Enterprises, also available from Amazon.co.uk.

Cloud Computing Explained

I’ve known John since 1986 and he has always impressed me as a technologist with the calm but effective way he goes about learning new technologies and then putting them into practice. He has written a series of books on everything from programming Internet messaging protocols (IMAP4, POP3, SMTP, and the like) to wireless. His book on the cloud is an interesting and worthwhile read that will help people come up to speed on what’s happening in this space.

Another interesting aspect of this book is that it is self-published. Believe it or not, it’s difficult to get a manuscript accepted by a publisher these days, even if you are an author with a good track record. John decided to do everything himself and dedicated some time to research the process before deciding what approach to take. It seems to be working out pretty well for him as he has gone ahead with a second edition of the cloud book.

John subsequently collaborated with Risto Haukioja in April 2011 to develop thoughts on how to architect solutions that need to use cloud technology in their book Cloud Computing Architected: Solution Design Handbook.

These books are definitely worth investigating if you’re interested in this area. If you go to Amazon.com using the link above, you can browse through some sample content.

– Tony

Posted in Technology, Writing

Eight essential aspects influencing the publication of good technical books


A recent remark in a LinkedIn.com forum stated “Please do not use Microsoft Press and “Technical Editing” in the same sentence. Have you ever seen the 1st editions of the Technical books?”

I believe that the general thrust of the statement was that the technical editing done for first editions of Microsoft Press technical books did not do the job in terms of eliminating errors in the text before publication. Further comment then ensued that books from other publishers were better able to explain Microsoft’s own products, offered better coverage of topics, and were more up to date with the technology. All of this may be true and it’s certain that the quality of books can suffer due to many factors. It is also true that different publishers and different authors can combine at times to generate the best possible book on a specific technology.

I responded that my experience with Microsoft Press has been good. In my view, Microsoft Press has certainly dedicated sufficient editing resources to my Microsoft Exchange Server 2010 Inside Out book to allow the team working on it to drive toward a high-quality result. Of course, I would say that, and we’ll see over time, but I hold to the view that Microsoft Press has assigned the right amount of resources in terms of technical editing, copy editing, indexing, and overall publication direction to do a good job. I hope that this first edition does not suffer from the problems reportedly seen in other books!

Thinking about the question a little more, I conclude that the overall quality of any technical book is influenced by many different factors, including:

  • The level of knowledge of the author (obviously!) and the technical editor about the topic covered in the book. Clearly any errors in the text are more likely to be caught if both the author and technical editor work together well and share a common understanding of the technology. Mutual respect is important here because a good technical editor will pose questions for the author based on what they find in the text and often these questions expose issues that deserve to be covered. An author who becomes exasperated by the feedback of their technical editor isn’t taking advantage of this most valuable resource.
  • The speed with which the book appears after the product it covers. Books that appear very soon after software is released are far more likely to contain errors than those that appear months afterwards, simply because it is the nature of software that bugs are fixed and functionality might be removed or added very late in the development cycle. For example, Exchange 2010 SP1 changed late on to remove a new feature aimed at making management of cross-site connections easier; it also changed to retain moved mailboxes in source databases instead of the RTM behaviour of deleting the source after successful moves. If an author had committed text to a book just before the software was released, they would have had zero chance of catching changes like this, and so their text is likely to be inaccurate and could be misleading. Is that the fault of the author, the technical editor, the copy editor, or the development group – or just the way that things work?
  • The access that the author enjoys to the product development group. If he or she has good access so that questions can be posed and answered by an authoritative source, the resulting text is usually better and more insightful than if the author has to rely on product documentation, development group blogs, and other resources – good though these might be.
  • The maturity of the software product being described. If software is in a V1.0 state, the books that describe the software are likely to be in a V1.0 state too because the author, the technical editor, and probably the development group don’t have the same level of understanding as exists when a product has been in the market for a while, has been tested and deployed by many installations, and has been well analyzed by IT professionals.
  • The knowledge that the copy editor possesses about the topic. A good copy editor is able to spot mistakes based on their knowledge and experience, and that’s very helpful to an author. It may just be a mistake where a word is missing, or it may be something critical like “I don’t understand this text and it doesn’t make sense when compared to what you wrote in the last chapter…” A bad copy editor misses stuff like this and concentrates on the pedantic side of the job, such as chastising authors for not using correct product names (how many times have I been told that it’s “Windows PowerShell” and not just “PowerShell”) or that I have to spell out acronyms ad nauseam (I think most people who read my Exchange books understand that EMC means the Exchange Management Console, but some copy editors like me to define this term in every chapter).
  • The ability of the publishing team to handle last minute changes. Personally speaking, I love to be able to make late changes in a book because it reflects reality. We live in an ever-changing world with so many sources of information that new and interesting items are bound to come to our attention all the time. It’s frustrating for an author when you’re told that a late change can’t be made even if it seems to you that it’s pretty important because it adds value to the text or addresses a concern raised in a chapter. Bad publishing teams stay focused on questions like page count, page layout, indexes, and all the important stuff that surrounds the publication of a book and forget that people buy and read books for the content.
  • Access to different computing environments. The best description of how a technology works in a particular environment comes from observation of how it actually works in that environment. It therefore follows that if an author has access to a variety of computing platforms and situations, they will better understand how a product works in those environments. I don’t have the resources to create a sixteen-member Database Availability Group that supports four hundred database copies and 64,000 real live mailboxes, so my knowledge on the daily operational concerns raised in such an environment is weak; my lifetime experience of large computing installations gives me some data to fall back on but nothing makes up for the real thing.
  • Time pressure to get the book out. A publisher can either let an author get on with the job of writing the best possible book they can create on a topic or they can hold to a schedule that is often arbitrary and sometimes impossible. The publishers that like schedules are often focused on selling seasons (do geeks buy technical books to give to each other for Christmas?) or on beating other publishers to the market with the first book on a product (in which case they run into the problem described in my second bullet). A publisher has to keep a certain amount of pressure on an author else the job would never get done, but the best publishers do it with subtlety and wisdom and protect authors from the tyranny of the calendar as best they can.

So there you are – the complete guide to understanding how a good technical book gets to be published in eight bullets.

Getting back to the topic in hand, I’m impressed at how Microsoft Press goes about its business. I know that the same level of investment does not exist in other publishing houses, and you can see it in the quality of their titles. The final judgment will be passed by readers… and you all have a vote.

– Tony

Posted in Exchange 2010, Writing

Odd seeding behavior for an Exchange 2010 database copy


I run a virtual Exchange 2010 environment on my HP Elitebook 8530w laptop (8GB memory, SSD for the VM files). Usually things run along without a hitch but sometimes the USB connection to the SSD experiences a mild panic attack that causes it to disconnect from Windows. At that point the VMs go away and I start swearing, which is what happened today.

After I reconnected the SSD and rebooted all of the servers, I noticed that one of the Windows 2008 R2 servers was a sick puppy that wasn’t willing to do much except go into recovery mode. No problem, I’ll revert to a snapshot and all will be well as the databases on this server are replicated within a DAG. VMware duly loaded the most recent snapshot and booted the server. Some minor messing then occurred to reconnect the server back into the domain because it had lost synchronization. Windows needed to be reactivated too as it thought that a hardware change had occurred.

After all that, Exchange 2010 came back online and I noticed that the copy of the database on the server couldn’t synchronize with the active copy, as Exchange had determined that there was too much divergence between the two copies for it to be able to fix things up. This was entirely logical because I use circular logging for these databases (to save space) and the transaction logs that would be necessary to correct the divergence had long since been recycled. Time to reseed the database copy, which I did with the Update-MailboxDatabaseCopy cmdlet (see screen shot).

Using Update-MailboxDatabaseCopy to reseed a database
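
The reseed itself needs only two commands; a sketch with a hypothetical database and server name:

# Suspend the failed copy, then reseed it from the active copy,
# deleting the now-useless files of the old copy first
Suspend-MailboxDatabaseCopy -Identity 'DB01\ExServer2' -Confirm:$false
Update-MailboxDatabaseCopy -Identity 'DB01\ExServer2' -DeleteExistingFiles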

No problems were recorded and the new database copy came online and was reported to be healthy. At least, it was according to EMS. On the other hand, EMC disdainfully refused to acknowledge the existence of any copy – active or passive – for the database, leaving a void where I expected to see some information.

Where have my database copies gone?
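
For the record, checking copy health from EMS is a one-liner (same hypothetical database name as above):

# Report the state of all copies of the database, including queue lengths
Get-MailboxDatabaseCopyStatus -Identity 'DB01' | Format-Table Name, Status, CopyQueueLength, ReplayQueueLength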

Interestingly, this only occurred for EMC running on the same server that I ran Update-MailboxDatabaseCopy. EMC was quite happy to list details of the database copies on all the other servers. I restarted EMC and the copies reappeared, so I concluded that this was a minor glitch in Active Manager on that server or the way that EMC fetches and displays information about copies. In either case, all is well and I’m up and running again as normal.

– Tony

You can learn much more about the Update-MailboxDatabaseCopy cmdlet and all of the other operations required to keep a DAG in good health in my Microsoft Exchange Server 2010 Inside Out book.

Posted in Exchange, Exchange 2010

Active Directory book available for iPad or iPhone


While I use a 3GS iPhone, I must admit that I have not gotten into the iPad. Maybe I’ll do so after Apple upgrades the device to V2.0, whenever that might be. In any case, I was intrigued to find that “Active Directory, 4th edition” from O’Reilly Books is now available as a book that you can read on an iPad or iPhone. You can find details here. The print version is available at Active Directory: Designing, Deploying, and Running Active Directory.


The author line-up for this book is very impressive. Brian Desmond and Joe Richards both worked in HP Services, where they enjoyed an excellent reputation as technical leaders in the Active Directory space. Brian has since left HP and I am working with him on the Exchange 2010 Maestro seminar series, where he is the “lab master” who has to dream up compelling and challenging lab exercises to complement the lectures that Paul Robichaux and I deliver.

I don’t know the other two authors (Robbie Allen and Alistair G. Lowe-Norris) but both seem to have solid writing records. I can’t imagine how difficult it must be to coordinate writing a book between four authors, so to bring something out that is so useful is a real achievement.

Now I’m off to ask Microsoft Press, who now publish in co-operation with O’Reilly Media, whether they plan to publish my Microsoft Exchange Server 2010 Inside Out book using this route, or maybe even as a Kindle version. Maybe that will convince me to buy an iPad…

– Tony

Update 23 September: Microsoft Press has confirmed that O’Reilly will make the Exchange 2010 book available through multiple platforms. I anticipate that a Kindle version and an iTunes version will be released and that you’ll also be able to eventually access the book via the Safari online book platform.

Posted in Active Directory, Exchange 2010, Writing