MEC 2012 Agenda appears (eventually)


Behind the scenes I have been critical of the folks organizing MEC, as I consider them to have been too slow to publish the agenda listing speakers and session content for a conference that’s coming up very soon (49 days, according to the counter on the MEC web site). The problem is that it’s difficult for potential attendees to gauge the value of the conference and to convince the powers-that-be that it’s worthwhile splashing out for travel, hotel, and other expenses to attend MEC if a solid agenda is unavailable. How else can you estimate the value of attendance? Sure, Microsoft has been pretty clear that MEC, “the lost conference”, is the coming-out party for Exchange 2013 and that they’ll deliver lots of Exchange 2013 content in Orlando, but that’s not the same as seeing a fully fleshed-out agenda. And although Exchange 2013 is burning up many blogs at present, it’s also true that a large percentage of the people who might attend MEC are really more concerned about how best to deal with Exchange 2007 or Exchange 2010. Or, heresy of heresies, Exchange 2003… So what are we to make of the MEC agenda published today? Here’s my 10,000-foot view:

  • Monday, September 24 starts with a keynote right out of the Microsoft playbook when Rajesh Jha, the overall chief for Exchange, talks about topics like “innovative end-user productivity improvements.” I read this to mean that you should expect a tightly-organized and scripted talk that communicates the marketing messages but little else of real interest. Of course, keynotes are often social events – an opportunity to drop by and see who else is attending the conference – but the real action will start afterwards.
  • The opening keynote is followed by the “technical keynote”, promising to provide “an understanding of the new Exchange Server 2013 architecture.” No speaker name is assigned to this important and what-should-be-interesting talk, which I anticipate will be given by someone like Kevin Allison. The promise is that following the talk, we should be able to “contextualize new theory and content”, possibly because of the mind-altering drugs that will be freely available at the back of the room (aka Microsoft Kool-Aid).
  • For the rest of Monday, you’re offered the chance to select three sessions from eight offered on a menu spanning the major components of Exchange 2013. At this point, I’d be leaning towards sessions such as “High Availability and Business Continuity”, “Apps for Outlook and OWA” (lots to learn here about the new extensibility model), and “Deployment and co-existence with Exchange 2010”. Of course, I’ll make a final choice on site when I know the speakers that have been assigned to the sessions. There’s no real point in wasting time at a conference going to a session given by a weak speaker who can’t communicate.
  • Interestingly, Microsoft has taken the time to indicate follow-up classroom sessions for each of the formal sessions on Monday. This allows you to learn something about a topic and then dive into it further at a follow-up session more akin to a classroom lesson. I think this is a good approach as formal conference sessions often create more questions than they answer, especially when new technology is covered, and the classroom sessions provide a way to answer those questions after you’ve had some time to think them through.
  • Tuesday and Wednesday are called “Nothing but Exchange”, meaning that no sessions will be presented about other Microsoft technology or by third-parties who want to explain how their products add value to the Exchange ecosystem. A range of sessions is on offer to fill out detail about the headline Exchange 2013 topics introduced on Monday. I believe that these sessions will all be given by Microsoft personnel from both the engineering group and other teams that work with Exchange, but it’s hard to say exactly without names being assigned to the sessions. Interesting sessions here include “Public folder migration”, “Exchange 2013 sizing”, and “eDiscovery across Exchange 2013, SharePoint, and Lync”. I’m sure that many people will bring details of their public folder infrastructures to see how they can approach the migration to “modern public folders” and it will be interesting to listen to the Q&A at this session. The session on how Outlook 2013 and Exchange 2013 work together might also be valuable for those considering how to justify an early desktop upgrade to Office 2013.
  • Some non-Microsoft speakers are allowed to talk about Exchange 2010 and Office 365, and it’s here that we find ten sessions given by eight Exchange MVPs (about 7.5% of the worldwide total of Exchange MVPs), including my session on “Making the call: on-premises or cloud?”, which will clearly be a highlight of the conference (once I get around to creating some slides).
  • No details have yet been released about the evening activities. I imagine that Microsoft will try and make these quite exciting and unique to mark MEC out as a different kind of conference.

Any conference is as valuable as you make it. Some approach a conference like a military exercise and have a carefully structured plan of everything that they want to do. Others take it more casually and attend sessions as the mood takes them on the day. Whatever your personal approach to conferences, there are enough potentially interesting sessions on the MEC agenda to justify the attendance of anyone who is responsible for the design or maintenance of medium to large-scale deployments of on-premises or hybrid Exchange. (If you’re running a small one- or two-server deployment, it’s harder to make the case as you should really be considering Office 365 at this point.)

It’s becoming increasingly difficult to find good content at technology conferences in a world where engineering groups are now so fast at sharing information about new technology and independent blogs take up the remaining slack by providing additional insight and commentary. As a product that has only just made it to a preview edition and is not expected to formally ship to customers until late this year, Exchange 2013 is still very new, and that’s the major reason why MEC works in 2012. Next year, when Exchange 2013 has been out in the wild, it’ll be more challenging to build a compelling agenda as the focus changes from describing new stuff to exploring how that stuff really works in production. Microsoft has made a big thing about bringing back MEC after ten years. The old MEC had many flaws, but it also had a great sense of community based on shared knowledge. The new MEC has a shot at delivering the same value, assuming that the speakers deliver in Orlando next month. With that in mind, I’d better go and start composing some slides!

Follow Tony @12Knocksinna


Ferry to St-Tropez; the easiest way to get there during the summer months


If you holiday on the French Riviera during the summer, you’ll know that the road from Ste-Maxime to St-Tropez can be one of life’s miseries as traffic moves at the pace of a diffident snail en route home from a wine-pleasured lunch. In short, no one with more than two brain cells correctly aligned will drive to St-Tropez during the summer months unless they absolutely have to make the journey.

Of course, St-Tropez has a reputation that attracts tourists. Perhaps Brigitte Bardot will make an appearance. Perhaps they’ll meet another film star, or perhaps they’ll be discovered by a producer looking for the next big thing. For whatever reason, people flock to the village to enjoy expensive parking, generally poor food at high prices, crushed crowds around the port, and the chance to buy a picture from the artists that occupy part of the quays.

There are some good parts of St-Tropez, like a nice walk along the pier to the lighthouse or exploring the citadel, and to be fair, there are a number of good restaurants to be discovered too. If you want to get to the village and don’t want to drive, why not take one of the frequent ferries that ply between Ste-Maxime and St-Tropez? Like the famous Sydney ferry that goes from Circular Quay to Manly, these ferries are an excellent way to get to a destination while also having the chance to enjoy a pleasant sea trip.

Approaching St. Tropez from a ferry

We recently used Bateaux Verts (literally, the “green boats”), a company that offers a service every 15 minutes from both towns for €13 return. This might seem expensive, but the service is very fast and efficient and you have to treat the trip as part of a day out. The seats at the back of the boat are the best as you can enjoy the sea air without being splashed, which sometimes happens up front as the boat makes its way across the bay (the Mediterranean does have waves and a fair swell can develop if the wind blows).

Lighthouse guarding the entrance to the port of St. Tropez

Along the way you’ll see sights like the cruise ships that follow a circular path around the Mediterranean, visiting ports along the way, and the ships’ tenders shuttling passengers in and out of St. Tropez. But the more interesting sight is the collection of yachts that anchor outside the port. These vary from humble single-mast traditional sailing yachts to massive 100-metre motor yachts (or private cruise ships), most of which seem to be registered in the Cayman Islands. The local English-language radio station is full of ads for yacht crews, maintenance, captains, and the like, and seeing the collections that accumulate at ports such as Antibes, Cannes, and St. Tropez, you come to understand a little of the life that their owners lead. Of course, many of the yachts are available for hire, complete with crew, should you have a spare $100,000 or so to pay for a monthly charter (and that’ll only get you a medium-grade mega-yacht).

The ferries that shuttle across the bay give you the chance to share the same waters as those who cruise in the mega-yachts. It’s a much more relaxing and peaceful way of making your way to St. Tropez in high summer, should you feel the need to join the beautiful people who assemble there!

Follow Tony @12Knocksinna


Exchange Unwashed blog digest June-July 2012


I was remiss in not posting a digest of June articles so here’s a bumper edition listing all of the articles posted to WindowsITPro.com during June and July. In passing, I should also point to my 4,000-word plus in-depth review of Exchange 2013 posted on WindowsITPro.com on July 23. This article had been in the works for quite some time and was ready to go when Microsoft surprised the industry by announcing a preview edition of Exchange 2013 alongside the other products in the Office 15 wave.

In any case, here is the digest of the 20 articles that appeared in June and July 2012:

WebReady security vulnerability reveals complexity of modern software engineering (July 31) discusses how software depends on many complex links. In this case, a security issue was discovered in a software library licensed to Microsoft by Oracle that affected the WebReady document viewing feature used in Outlook Web App (OWA). Microsoft hasn’t yet given an ETA for a permanent fix, so for now all you can do is disable the WebReady feature.

Workcycles and the Managed Folder Assistant (MFA) (July 26) is an article that’s been in my queue for about six months. It never quite popped to the top of the stack because other more pressing topics demanded commentary. This article discusses the change Microsoft made in Exchange 2010 SP1 to move background maintenance processing to a more sensible auto-throttled workcycle basis rather than confining work to a limited period.

First impressions of Outlook 2013 Preview (July 24) provides some initial reaction after using Outlook 2013 Preview for a week or so. This wasn’t just a test. I actually connected Outlook 2013 to my “production” Office 365 mailbox and used it exclusively to find out what I liked and didn’t like so much. See what you think.

Microsoft finally sees sense about multi-mailbox searches (July 19). I always thought that Microsoft’s decision to require an Enterprise CAL for any mailbox liable to be searched by a multi-mailbox discovery search was pretty dumb. After three years or so, the folks in Redmond saw the light of day and have changed the Product Use Rights (PUR) for Exchange 2010 (and Exchange 2013, I assume) from October 1, 2012 to remove this requirement. Score one for common sense.

Installing Exchange 2013 Preview (July 17) provides some notes about how to go about installing the Exchange 2013 Preview release on Windows 2008 R2 SP1 servers (I didn’t try the operation on the newly released-to-RTM Windows Server 2012). The upshot is that the install works fine as long as you pay attention to the prerequisites, including a number of hot fixes.

Exchange 2013 and Outlook 2013 preview versions debut (July 16) is an article written in a rush and with some surprise after Microsoft released Exchange 2013 and Outlook 2013 preview versions without notice. Of course, I have been tracking the development of these products for a long time, but the rush to release a preview edition to the public came as a complete surprise. Score one for the “let’s keep it secret” camp inside Microsoft marketing, as the surprise generated a reasonable amount of excitement that I’m sure Microsoft was very happy to see.

The Bad Item Conundrum (or, how much data would you like MRS to drop?) (July 12) – I like the Mailbox Replication Service (MRS) a lot and think it’s one of the real success stories in Exchange 2010. However, MRS is quite picky about “badly formed MAPI items” and will refuse to move them to a new mailbox. This is OK if you’re happy for MRS to clean up the sins of the past and remove these items. It’s not so good if a mailbox has a lot of items that MRS doesn’t like.

Research proves that stupidity is a major factor in 419 email scams (July 10). Don’t you just love it when hard-baked serious research proves what common sense tells you? Well, in this case, a Microsoft Research white paper demonstrates with lots of serious maths why spammers and scammers look for stupid people to respond to their pleas of untold wealth if only you’d get in touch.

Exchange Remote Connectivity Analyzer 1.4 is released – a wonderful tool (July 5). Version 1.4 of ExRCA has appeared and it’s a very nice tool – clearly something that all Exchange administrators should get to know. This article discusses some of the latest features and explains why ExRCA is so useful.

Counting mailboxes (July 3). Google got very excited when they reported that Gmail had accumulated 425 million mailboxes. I wasn’t so excited because I wondered just how they counted all the mailboxes and figured out which mailboxes are in use and which are not. The good news is that there are many scripts available on the network for administrators to download so that they can generate reports on the mailboxes that they manage – and this article points to a few that you might like to consider.

Dell wins Quest (July 2). On June 5, I noted in a previous article that talks between Dell and Quest had fallen through and that this probably wasn’t such a bad thing as many corporate acquisitions run into choppy waters during the integration process, including the loss of much of the talent from the acquired company. But Dell was persistent and they put enough money on the table in the end to persuade everyone that they should win the right to incorporate Quest’s large software portfolio into Dell’s line-up. We shall see how this goes as the integration process proceeds.

Managing customizations for Exchange (June 28). Exchange provides zero facilities to track customizations that you might make to components such as the transport configuration file. Many of the files are server-specific anyway, but given that ALL-IN-1 2.3 (1988) boasted a complete Customization Management system, don’t you think that we could have a better answer in Exchange some 24 years later?

Demands of cloud mean on-premises deployment strategy must evolve (June 26). Have you noticed that Microsoft issues updates at an increased rate these days and that some of the Roll-Up Updates for Exchange 2010 have begun to introduce features and functionality that might have been reserved for service packs or even new versions in the past? The big difference is that Microsoft has to feed the Office 365 monster with new features so that its rate of change is as impressive as its Google competitor. The upshot is that some on-premises administrators now struggle to match Microsoft’s new pace. Is this a problem? Well, decide for yourself.

Is Office 365 suitable for small companies? (June 21). A commentary in the UK Register asked whether Office 365 was suitable for small companies because of some difficulties that the author had encountered in their work as a consultant helping companies to migrate to Office 365. I found some good and not-so-good comments in the article. My own conclusion is that Office 365 is highly suitable for small companies, which is why I use it.

Mimecast issues “shape of email” report (June 19). Mimecast is a very reputable company working in the email space. They put out a report purporting to analyse how people use email, which is an interesting topic for debate given the many ways that we have to communicate today. My problem was the underlying source of data used to determine the statistics described in the report. See what you think!

Upgrading to Windows 8 Release Preview affects Outlook’s safe senders list (June 14). I tend to be cautious when someone reports a problem when running a pre-release version of software simply because it’s impossible to know whether the circumstances that they encountered would be the same for anyone else. You should be cautious too when reading this piece describing how my upgrade to Windows 8 Release Preview wiped out Outlook’s safe senders list, which seemed odd to me.

The Story of Send (June 12). I emphatically did not write this piece to fill space, nor did I do so because I was blown away by the wonders of Google’s cartooning skills. Rather, the serious message in the story was the power efficiencies of modern datacenters. At least, that’s what I thought the story was.

Exchange sessions at TechEd 2012 – Nothing much new, MEC is where the action will be (June 9). It’s difficult for the Exchange development group to get TechEd sessions as they compete with all of the other Microsoft engineering groups for slots on the agenda. Things are especially difficult when it’s a case of “same-old, same-old” and an engineering group is in the “can’t say anything about the new release” period leading up to the first public sighting of a new product, which is the situation that the Exchange group found themselves in for TechEd US 2012. And then there’s the small matter of MEC, which I believe should publish its line-up of sessions and speakers for the relaunched Exchange conference next week. We await the news eagerly.

Exchange Roll-Up Updates Are Becoming More Like Mini Service Packs (June 7). Exchange 2010 SP2 RU3 includes some important new code that eases cross-site failure conditions for clients. That all seems like great news, but this code was originally supposed to be in Exchange 2010 SP1 and was pulled at the last moment. And now it appears in a roll-up update!

Dell decides not to acquire Quest (June 5). This was the original note about Dell’s pursuit of Quest that eventually produced a result for Dell on July 2. I have been wrong before and anticipate being wrong in the future!

Follow Tony @12Knocksinna


How much is your email worth?


The nice people at Backupify have created a Gmail value calculator to help solve the question that I know has been keeping everyone awake at night: “Just how much is the email in my Gmail account worth?”

Apparently the answer is in the order of $3,588, based on the average U.S. annual salary ($45,230), the average number of messages in a Gmail account (5,768), and how long it supposedly takes to write an email – 1 minute and 43 seconds.
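Backupify doesn’t spell out the exact formula, but the figure is consistent with a simple hourly-rate-times-time model. Here’s a rough sketch of the arithmetic, assuming a standard 2,080-hour working year (my assumption, not Backupify’s stated method):

```python
# Back-of-the-envelope reconstruction (assumed model: value = hourly rate x time
# spent writing email; the 2,080-hour working year is my own assumption).
avg_salary = 45_230            # average U.S. annual salary used by the calculator
messages = 5_768               # average number of messages in a Gmail account
seconds_per_message = 103      # 1 minute and 43 seconds per email

hourly_rate = avg_salary / 2_080                        # ~$21.75 per hour
hours_writing = messages * seconds_per_message / 3_600  # ~165 hours
print(f"Estimated value: ${hourly_rate * hours_writing:,.0f}")  # ~$3,589
```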

I’m not sure about this data. I assume that the value given for an average U.S. salary is based on some U.S. statistical data and that it is reasonable to take such a value for what is very much a finger-in-the-air kind of exercise, even if many Gmail accounts are used by people (like students) who have little or no income, or serve as functional accounts that aren’t really owned by anyone (such as contact points for small businesses). I have more problems with using the other data points as the basis to calculate a value for email.

First, the same value is attributed to every message in the mailbox. My brief examination of my Gmail account reveals what could only be described as “a great deal of crap.” This comes about because I use Gmail as a “hold-all” kind of email account whose address I give out when I register for a web site. Gmail accordingly acts as the receptacle for all the incoming marketing communications (the most polite term I can assign to these messages) and other banal information that companies feel compelled to share with me. Gmail does a fine job of marking many of these messages as spam. Others remain waiting for me to conduct a periodic review and clear out.

Second, of the messages that remain, many belong to long-dead conversations that I wish I could clean up but never have, because of the inability of Gmail’s user interface to accommodate power users who hold to the old habits of “read, then decide to keep/file or delete.” Even though I have turned conversation view off, I still find it tiresome to delete all the old messages in conversation threads that contain duplicated and redundant information. I did try, but gave up after a while. If Google offered a client for Gmail that had half the functionality that Outlook provides for Exchange, I think I would be much happier. And yes, I know that Outlook can connect to Gmail via IMAP, but it’s so slow… The end result is that I have a lot of what I consider to be redundant and unwanted messages cluttering up my mailbox.

Third, one minute 43 seconds seems long for many of the messages that I send. I don’t think that the majority of my messages take this long to compose as I belong to the “if it can’t be said in a couple of sentences then it’s confusing” brigade (a theory that of course doesn’t apply to blogs). Most of my replies are one-liners that take ten seconds or less. Then again, I suppose that I have spent longer on other messages – but those are very much in the minority. And others might like to peruse, examine, pontificate, or otherwise dawdle over their messages. According to a white paper on the Backupify site, the 1 minute 43 second information appears to be derived from Nielsen online data, so I suppose it’s as good a guess as any, even if people like me weren’t covered by the survey.

Gmail value calculator

Anyway, I plugged my data into the calculator. After a short delay, the results popped out and I was told that my Gmail is worth $3,019.31. Some other analysis told me that I was “valuable” (thanks!), “old-school” (hmmm…), an introvert (news to anyone who knows me), and “verbose” (right on the money, but hard to figure out from Gmail perhaps?).

The fine people at Backupify didn’t go through this process for love of their fellow man. They’re doing it to convince people that their Gmail accounts are so valuable that everyone should promptly sign up for an online backup service, preferably that offered and managed by Backupify.

I have one nagging doubt about the underlying value proposition. Google delivers an excellent SLA for Gmail and has done so for the last few years. Google is a much bigger company than Backupify and has invested billions of dollars in many layers of redundancy (datacenter, network, servers, O/S, file system, storage, etc.). Both companies share a single obvious weakness (the Internet). With this in mind, and seeing that my Gmail is only worth a relatively small amount of money that I can’t really account for, why bother with a separate backup?

Follow Tony @12Knocksinna


Farewell EMC, welcome EAC – the end of Windows-based Exchange administration


In a conference room in Redmond, WA last February, Microsoft revealed to a group of Exchange MVPs that Exchange 2013 would only include a browser-based administration client. The era of traditional Microsoft Management Console (MMC)-based UIs such as the Exchange System Manager (ESM), used in Exchange 2000 and Exchange 2003, and the Exchange Management Console (EMC), used in Exchange 2007 and Exchange 2010, was at an end. The future, or so we were told, was in the web. The consensus of the group was that the change was a good thing, as long as no functionality was dropped.

And so it came to pass that the Exchange 2013 preview edition made available on July 16 includes the Exchange Administration Center (EAC), a much-enhanced version of the Exchange Control Panel (ECP) shipped in Exchange 2010 (hint: look at the URL for EAC shown below to see the close connection that it shares with ECP). ECP established many of the principles that EAC exploits, such as RBAC-controlled UI display, and it does indeed include the vast majority of the functionality that was in EMC plus all the UI necessary to manage new Exchange 2013 features such as “modern” public folders and Data Loss Prevention (DLP).

Exchange 2013 Administration Center (EAC)

As evident in the development of ECP, Microsoft has been on the path to browser-based administration for around six years now. You can understand why from their perspective as it makes a ton of financial, engineering, testing, and support sense to eliminate the Windows-based management tools. In 2011, Exchange 2010’s EMC ran into problems when the IE team upgraded a component that EMC used, only for administrators to find that the EMC window wouldn’t close in some circumstances. Perhaps EAC will be less prone to breakage caused by a change in a Windows component. Given Microsoft’s current focus on cloud-based delivery for its Office products, it also makes eminent sense for the Exchange team to discard any utility that’s tied to Windows and focus instead on creating a comprehensive management framework that works well across both on-premises and cloud environments. I see this happening in EAC and again, a hint to Microsoft’s direction is provided by the prominent “Office 365” label displayed by EAC.

What’s for sure is that Exchange benefits because EAC uses the multi-browser approach that Microsoft takes for Outlook Web App (OWA). Essentially, if a browser supports OWA premium, it will support EAC, so that means that you can use IE, Chrome, Safari (for Windows), and Firefox. On the other hand, don’t expect to be able to run EAC on Opera or other less mainstream browsers. I’m sure that some in the Exchange community will bemoan the demise of EMC. That was my original position until I realized the benefits of upgrading from what has become a fat, dumb, and not-so-happy management console to something capable of running on multiple platforms that doesn’t depend on complex machinations between IIS, Windows Remote Management (WinRM), and other Windows components to run properly and which can be as slow as a wet pig on a bad night (with all due respect to our porcine friends).

In short, Microsoft is dead right to consign the MMC-based consoles to the great byte wastebasket in the sky. And if you really need to get an MMC fix, Exchange 2013 preview still includes the “Exchange Toolbox”, a console to manage traditional public folders (those that have not yet been migrated to the all-singing, all-dancing modern variety introduced in Exchange 2013) as well as provide pointers to useful tools such as the Exchange Remote Connectivity Analyzer (ExRCA). EAC isn’t perfect. However, I’m sure it will get better as feedback flows back from people who use the Exchange 2013 preview.

And I’m positive that there will be quite a few debates on EAC and how it functions as a management console at the Microsoft Exchange Conference in Orlando next September. Maybe I’ll see you in Orlando to join the EAC debate there!

Follow Tony @12Knocksinna


Countdown accelerates to Microsoft Exchange Conference – worth attending?


As I write, there are 67 days to go before the new, improved, and so much better Microsoft Exchange Conference (MEC) returns after its ten-year hiatus. Even though I have attended far too many IT conferences in Orlando, I’ll be there when MEC swings into high gear on September 24, and not just because my beaming face appears on the MEC web site! Actually, looking at the photos (above), they resemble a line-up of mug-shots taken at a police station, but that’s not important right now… The photos were taken last February at the annual summit that Microsoft holds for the people it recognizes as Most Valuable Professionals (MVPs). Normally the summit is held at Microsoft’s HQ in Redmond, WA to make it easy to interact with the product engineering groups, and the MEC team took the chance to film a number of the MVPs (with our agreement) to help publicize the event.

To return to MEC, after this week’s Office 2013 announcements, including the availability of a preview version of Exchange 2013 (see my notes about installing Exchange 2013), some might wonder why they should spend all the money and effort to go to MEC. After all, you can download and install the preview versions of Exchange 2013 and the other Office 2013 products to gain experience and insight into the technology through your own test deployment. A mass of commentary has already appeared about the new features in Exchange 2013, like the Exchange Administration Center (EAC), the browser-based successor to the MMC-based Exchange Management Console (EMC), and you can expect to see a stream of similar articles appear in blogs, including the Exchange group’s own EHLO blog, over the next few months. With such information at our fingertips, is there any good reason to go to MEC?

The “Top 10 reasons” described on the MEC site is a good start, but I think that the answer comes down to a single word: “community”. If you place a high value on interacting with other professionals to learn about technology and to pursue the often over-hyped concept of “best practice”, then events like MEC are great places to meet people who can help you to understand, deploy, manage, and master technology. It’s also true that the opportunity to hear Exchange engineers describe the technology that they work on can provide a unique insight that is unobtainable elsewhere. Often insight comes from ignoring the formal PowerPoint slides, which will have been refined to the nth degree by the tender care of Microsoft’s graphic artists and PR teams, and listening instead to how the engineers describe technology, the weight and emphasis that they attach to specific points, the way that they answer questions (or not), their willingness to explore issues, and perhaps the chance to sit down with engineers over a coffee (or some other suitable beverage) afterwards to discuss their work in more detail.

Blogs and articles are great, but hearing about technology from the proverbial horse’s mouth is a different matter. I anticipate that Microsoft will flood MEC with engineers from the Exchange 2013 development teams who can answer questions such as “how can I get my existing 100,000 public folder implementation over to Exchange 2013”. At least, I hope that they do and don’t commit the faux pas of international MEC events in the late 1990s and early 2000s when Microsoft attempted to restrain travel costs by sending a small group of engineers and program managers to deliver sessions. All this resulted in was a fatal lack of quality when compared to the U.S. version of MEC. Most of the speakers did well when they discussed their own area of expertise, but asking someone who worked on (for example) the management or migration tools to discuss the finer points of high availability was a disaster waiting to happen, and it often did happen.

If only because they want the relaunched MEC to be successful, I don’t think Microsoft will make the same mistake in Orlando. If you attend MEC, I anticipate that you’ll have the chance to listen to engineers who work on all parts of Exchange 2013 and Exchange 2010 and cover the three platforms that Exchange runs on (on-premises, hybrid, and cloud).

But Microsoft is not the source of all good information. MEC will have many other interesting sources to tap, including MVPs as well as administrators and planners from the companies that participate in Microsoft’s Technology Adoption Program (TAP), who have had the chance to run early versions of Exchange 2013 in production. These people can be a rich vein of otherwise unobtainable information that provides the essential facts to underpin a successful deployment. They are also great contacts to have when you need further information in the future. After all, it’s much easier to begin a conversation with someone when you share a common experience.

It’s impossible to put a value on community, connections, and insight. At least, not a value that can satisfy the kind of management and beancounters with whom “Dilbert” has to cope. If you’re in the situation where you have to make a case to attend MEC, I think you should focus on some specific technical issues that your company has to face over the next few years and tie your attendance at MEC to the pursuit of information that will help to resolve those problems. That’s a reasonable approach to take – at least it was one that I was sympathetic to when I was signing travel requests!

Follow Tony @12Knocksinna


The ALL-IN-1 Mail Filter: Forerunner of modern email rules processing


Email systems had a different ebb and flow twenty-five years ago. An average mailbox received perhaps 10-15 messages daily and the messages were simpler with fewer attachments and properties. Most messages were in the order of one printed page or less, or roughly 2KB of ASCII text. Some email systems, such as DEC’s ALL-IN-1 (an early timeshared Office Automation system that had email capabilities), allowed users to compose message text with more complex editors (WPS-PLUS or WordPerfect) but the lingua franca was plain text as that’s all that you could reliably guarantee a recipient to be able to read if the message went “off-server”. When printed or displayed on-screen, a message usually still boasted that it was an “Interoffice Memorandum” or similar official-style communication. In the early days of email, we were still in the process of replacing typed memos and in many companies it was important for email to mimic the style and appearance of typed output.

Today’s users probably process ten times the mail traffic that we handled then. But even so, it seemed like a good idea to come up with methods to help users to process email automatically by allowing them to create a set of rules that prioritized messages by reference to keywords and other message properties and then take an action on messages to refile them into a folder or leave them in the Inbox for the user to process. This project became the ALL-IN-1 Mail Filter, an idea that’s described in U.S. patent 5377354 “Method and system for sorting and prioritizing electronic mail messages” granted on December 27, 1994. I am a co-inventor of this patent.

In patent-speak (written so only a lawyer could love the text), the patent covers:

A method and apparatus for prioritizing a plurality of incoming electronic mail messages for a user uses a user created and modified rules-control (12) which is stored in a rules-store (12). Incoming messages are stored in a message store (11) and are screened individually by a rules test unit (13). The rules-test unit has a comparator (52) which matches keywords which are chosen by the user while creating the rules, add supplies signals to an action list unit (54). By applying the user created rules for deciding which messages constitute the priority messages for the user, a priority assigning unit (45) within an action portion (35B) of the rules-store (12) assigns a priority number (say from 1 to 5, 1 being the highest priority for example) to each screened message. Responsive to the assigned priority number of the screened message, the message is sent to a main folder store or forwarded or put away as appropriate. The user created rules can be modified by the user using a conventional keyboard.

In human-speak, what we did was to create an ALL-IN-1 option to allow users to create a set of keywords that they thought were important and could be used to filter incoming messages. For example, you might have a keyword called “Knocksinna” because it was the code name for your current project. Each keyword was given a priority value with 1 being the highest. Each priority value was given an action chosen by the user. For example, “Place all priority 1 email in the Projects folder” or “Move all messages with priority 5 to the wastebasket”.
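To make the idea concrete, here is a minimal sketch of the keyword/priority/action matching just described. It is purely illustrative (written in Python; the actual implementation was a LISP program, as described below) and the particular keywords, priorities, actions, and messages are invented for the example:

```python
# Illustrative sketch only: the real Mail Filter was a LISP program on VAX/VMS,
# and these rules, actions, and messages are invented for the example.
DEFAULT_PRIORITY = 5

keyword_priorities = {        # keyword -> priority (1 is highest), chosen by the user
    "knocksinna": 1,
    "status report": 3,
}
priority_actions = {          # priority -> action, also chosen by the user
    1: "file in Projects folder",
    3: "leave in Inbox",
    5: "move to Wastebasket",
}

def prioritize(message_text: str) -> int:
    """Assign the highest (lowest-numbered) priority whose keyword appears."""
    text = message_text.lower()
    matches = [p for keyword, p in keyword_priorities.items() if keyword in text]
    return min(matches) if matches else DEFAULT_PRIORITY

def filing_instruction(message_text: str) -> str:
    """Produce the kind of filing instruction the ALL-IN-1 script then executed."""
    return priority_actions[prioritize(message_text)]

print(filing_instruction("Weekly update on the Knocksinna project"))   # file in Projects folder
print(filing_instruction("You may already have won a holiday!"))       # move to Wastebasket
```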

We then had a program that ran on the VAX/VMS minicomputer where ALL-IN-1 was installed. I believe that the program was written in LISP, a language that was often used for artificial intelligence (AI) projects. The program was written by my co-inventors, all of whom worked for one of DEC’s corporate software engineering groups based in Sophia-Antipolis, a technology park located between Nice and Cannes in the South of France.

Artificial Intelligence was all the rage in 1987-88 and lots of work was being done in “knowledge engineering” to determine just how much intelligence could be incorporated into computer programs. DEC’s European Technical Center was staffed by some pretty interesting people, one of whom was Andrew Buchanan. He specialized in AI and convinced me that AI could solve some common business problems, as opposed to the sample AI programs that were circulating. My favorite of these was one that allowed you to play Monopoly against the VAX. Unfortunately, either the AI wasn’t very developed, the computer was too stupid, or the human beings cheated a lot, because the computer invariably lost by swapping valuable properties for a song. We viewed the Mail Filter as a project that illustrated how AI could assist people with common tasks in an office environment.

In any case, the LISP program took a list of new messages extracted from the user’s ALL-IN-1 inbox and compared the information about the new messages with the user’s keyword priority list before deciding how to process the messages. The program wasn’t fully integrated into ALL-IN-1 at the code level, so it wrote out a text file containing details of how the messages should be filed. An ALL-IN-1 script then opened the text file and executed instructions to process each message as instructed.

All of this sounds pretty kludgy and indeed it was. No great breakthrough in software engineering was achieved to create the various components that interacted to process incoming email, but that wasn’t the point. We wanted to demonstrate an idea and if the idea proved worthwhile, we wanted to take it forward and incorporate it as a new feature in a future version of ALL-IN-1. Unfortunately that never happened. The code that we created worked and although it was slow (and buggy at the start), it did the job. There’s no doubt in my mind that it would have been feasible to increase speed and eradicate any remaining bugs had we persisted but that never happened because other development priorities got in the way.

ALL-IN-1 never had inbox rules functionality but lots of email products boast a similar feature today. For example, the rules processing of Microsoft Outlook is simply a far more developed version of what we created in 1988. The large list of other patents that acknowledge patent 5377354 is evidence of its influence since the patent was filed by DEC.

Why was the Mail Filter patent never asserted in court if so many other email systems developed similar functionality? Perhaps it’s because the patent was just one of many owned by DEC, Compaq, and HP (for such was the ownership chain due to mergers and acquisitions) and never seemed to be important in terms of the overall patent portfolio. On the other hand, it’s true that DEC, Compaq, and HP all had patent cross-licensing agreements with other technology companies that allowed those companies to access the IP. Microsoft was one of those companies so any infringement by Outlook was fully covered by the cross-licensing agreement.

HTC bought the Mail Filter patent from HP in 2011. Companies quite often trim their patent portfolio to remove or sell on patents that have expired or no longer seem interesting and I’m sure that this is what happened when HP decided to sell the patent to HTC.

Looking back, we probably should have done more to develop the idea contained in the Mail Filter. At least we captured the idea in the patent.

Follow Tony @12Knocksinna


Sanity slowly coming into focus in patent wars


Patents have been a lucrative source of highly profitable income for the technology companies that have accumulated sizable portfolios over the years. Companies earn income by licensing their patents to other companies or by prosecuting others who are felt to infringe a patent. At the same time, a patent portfolio is a wonderful way to gain access to the technology owned by another company through a cross-patent licensing agreement. And a counter-suit based on one or more patents in a portfolio is an excellent defensive mechanism when other companies attempt to sue for infringement. It’s easy to understand why technology companies like to have extensive patent portfolios gained through innovation by employees or acquisition from other companies. Much of the value in Google’s recent acquisition of Motorola Mobility, for instance, comes from the patents owned by Motorola Mobility, even if some of those patents are reasonably antiquated in the context of current technology. It’s also interesting to see how Amazon’s purported plan to create an Android-based smartphone is accompanied by an effort to assemble sufficient patents to ward off future lawsuits, much like those who ventured into the realm of vampires wore garlic.

Patents are much beloved by lawyers too. First, patents are written in a strange form that might not be readily understood by the man in the street and therefore requires the service of lawyers to write patent applications and interpret the text in the final form of the patents. Convoluted and complex sentences are a specialty. The lawyers would say that their text is constructed in such a way that the invention described in the patent is clear. However, the output is often difficult to read and understand if you’re not familiar with patents, which means that businesses engage lawyers to understand these documents. And of course, when infringements are considered, lawyers are needed to detect, prosecute, and defend the allegations. The result is a veritable torrent of professional fees from a multitude of cases taken by different technology companies against their peers. The number of cases has been steadily growing too as companies like Apple become more aggressive in using their patent portfolio to defend market segments like smartphones against rivals.

I’m not a lawyer, but my awareness and knowledge of patents was gained as a member of Alpine Engineering and Design, Inc.’s corporate Intellectual Property committee over a seven-year period, as well as through the responsibility for managing patents for a large business unit for six years. As such, you could say that the patent lawyers who supported those businesses educated me over that time. In addition, I am the co-inventor of a patent granted in the U.S. and Europe and have been involved as an expert witness or consultant in a number of recent cases, so I’ve been able to observe how patent lawsuits evolve from initial allegation of infringement onwards to prosecution and trial.

Observing the rush to the courts by companies who seek to preserve and uphold their rights is interesting, but a recent judgment by Richard Posner in the U.S. might bring some common sense to the cut and thrust of the lawsuits filed by technology companies. Posner, a judge of the U.S. Court of Appeals for the 7th Circuit (and co-author of an interesting blog), set a high bar for companies to prove and specify damages in a case argued between Apple and Motorola Mobility (now owned by Google) by finding that neither side proved a good case for damages to be awarded. The judge said that a declaratory judgment “would have no practical effect” and dismissed both claims with prejudice, which stops the claims being refiled in future. Posner noted that Apple failed in this case “despite its vast resources and superb legal team, to do so (quantify the benefits of infringement) in a minimally acceptable manner—failed whether because of mistakes in trial preparation (which even the best lawyers can make), or because too many cooks spoil the stew (Apple is represented by three law firms in this litigation), or maybe because the infringements did not deprive Apple of any profits…” Wow! Mind you, the lawyers might have known what was coming as Posner had derided some claims during the hearing, including an Apple claim that their patent covering unlocking a phone with a swipe had been infringed if a user tapped instead, saying “Apple’s argument that a tap is a zero-length swipe is silly. It’s like saying that a point is a zero-length line.”

Posner subsequently set some additional cats among the proverbial pigeons by questioning whether software should be covered by patents at all. Cue panic in the offices of companies who have assembled substantial patent portfolios, including the patent trolls (or, more charmingly, as they were called in a recent WSJ article, non-practicing entities or NPEs).

Another example of common sense occurred in the U.K. High Court when Mr. Justice Lloyd ruled against Apple in a suit brought against HTC alleging infringement of four patents relating to the iPhone, including a “slide-to-unlock” patent that’s deemed to be pretty fundamental to the operation of modern smartphones. In this case, the patent protecting sliding to unlock a smartphone was deemed partly invalid because it was “too obvious” and the judge ruled that HTC did not infringe the remaining part of the patent. Applying the test of obviousness sounds like eminent good sense to me. It would be nice if some of the patent examiners applied the same test when they accepted patent applications. However, in saying this, I admit that the examiners are under terrific pressure caused by too much work, possibly too few resources, and maybe just some stress from the lawyers who file the claims.

Apple ran into another problem in the U.K. High Court on July 9 when their case against Samsung, which alleged that the Galaxy Tab copied the iPad, was thrown out by Judge Birss, who said: “They do not have the same understated and extreme simplicity which is possessed by the Apple design,” and “They are not as cool. The overall impression produced is different.” The judge noted that the Galaxy Tab was substantially thinner than the iPad and that the Samsung devices had “unusual designs” on the back. All in all, another bad defeat for Apple’s lawyers.

Update (July 18): According to some reports, Apple is being forced by Judge Birss to publish a notice on their website and in several British newspapers to correct the “damaging impression [that] the South Korean-based company [Samsung] was copying Apple’s [iPad] product.” Apparently the notice has to stay on Apple’s site for six months – I can’t find any trace of it on Apple’s UK site, but I’m sure that it will only be a matter of time before it shows up.

Update (July 26): Apple won a stay on having to post the notice about Samsung not copying the iPad. The legal games continue.

One or two rulings do not bring the whole edifice of suit and counter-suit tumbling down. However, they should create some doubt in the minds of patent lawyers and the companies that they serve when the time comes to decide whether to press ahead with cases that could cost an awful lot of money and result in the same kind of outcome that Apple has recently experienced in the U.S. and U.K. I think that this would be a very good thing because many patents are pretty obvious or can be easily worked around if they are inadvertently infringed. In fact, many patent infringements aren’t really serious and are hardly worth prosecuting, so lawyers and companies alike might start to focus on the really important or fundamental patents that describe and protect genuine innovation. For example, I think Apple’s patent covering the use of sensors to indicate obvious water damage to electronic devices is worthy of protection because it addresses a real-life problem in an innovative manner.

The old adage that a patent portfolio is composed of 1-2% fundamental patents, up to 10% of commercially interesting patents (that can be used for cross-licensing or defensive purposes) and 90% of patents that don’t really add much except numerical weight (or rather tomes of obscure text) seems to be getting truer. Perhaps we’ll see more of the type of agreement reached by Facebook and Yahoo! last Friday to settle differences in what seemed to be a pretty crazy attempt by the previous Yahoo! CEO to extract a ton of money from Facebook. The agreement settles the case and lets the two companies engage in a more productive form of intellectual property sharing. It would be nice if a similar conclusion was reached more often.

I doubt that companies will cease suing each other for patent infringement as other judges will probably be more open to the pleading of those who think their patents have been infringed, but it’s nice to see some element of sanity being restored to the debate.

Follow Tony @12Knocksinna


Computing survey reports on UK market – 2% still using Exchange 2000?


The U.K. is Microsoft’s largest subsidiary and has always been an important and mature market for Exchange Server. The results of a survey about the pace of transition to Exchange 2010 reported by the well-respected Computing magazine are therefore interesting and deserve some commentary.

According to Computing, Exchange has 80% of the enterprise groupware market with competitors such as Lotus Domino limping on with just a few percentage points, probably in companies that have used Lotus Notes since the dawn of time. Computing did two surveys in August 2010 and February 2012 to assess how quickly enterprise customers were deploying Exchange 2010 and reports that with Exchange 2013 on the horizon: “history suggests that whatever the advances in functionality may be, there will be no rush to adopt by UK enterprise.”

They point to the fact that the February 2012 figures (some 30 months after the original release of Exchange 2010) show that only 42% of enterprise customers now run Exchange 2010. And although there’s been a marked increase from the 8% reported some 18 months previously, 56% of customers still run either Exchange 2003 or Exchange 2007 with another 2% stuck on Exchange 2000.

My mind still can’t quite get around the fact that 2% of UK enterprise customers run Exchange 2000 as I wonder why they have never upgraded. The base survey data is from “IT Decision Makers at large UK organizations”, so I assume that these individuals are visited regularly by Microsoft salespeople, whose scorecards must look pretty bad with Exchange 2000 still in the picture. Couldn’t these accounts have been persuaded to upgrade to at least Exchange 2003 rather than stay on software coded in the last millennium? In any case, the 2% has remained stable between the surveys so these customers must be set in their ways.

The report observes that many customers who run Exchange 2003 or Exchange 2007 are now “en route to upgrading to what is – numerous bug fixes and service packs later – a stable and mature platform.” The platform here refers to Exchange 2010, even if Computing is incorrect in some of the advantages listed for Exchange 2010, such as support for Outlook 2010 and virtualization, both of which can be gained with Exchange 2007. That being said, there’s no doubt in my mind that Exchange 2010 is a far superior product to Exchange 2007, if only because of its high availability features.

No mention is made of a potential shift away from Exchange 2010 as the target destination to either hybrid deployments or an “all-in” migration to Office 365. Perhaps this was due to the date of the survey, as Office 365 was still under the cloud (no pun intended) of some early reliability hiccups in February 2012 and Exchange 2010 SP2’s hybrid connectivity wizard hadn’t made an impact then.

I wonder if the advent of Exchange 2013 after its debut at the Microsoft Exchange Conference (MEC) in September will impact upgrade plans further, especially if, as has been the practice in previous versions, Exchange 2013 requires a client upgrade (Outlook 2013) to achieve its full potential or some features won’t be quite complete until a future service pack arrives. The fact that Outlook 2013, along with its Office 2013 counterparts, boasts a new user interface is likely to cause further delay due to the impact of any UI change on users and help desks. We shall see in due course.

The report noted that the main reasons for delaying a move to a new release were due to cost (36%) or fear of disruption to end users (27%). The need to deploy new or upgraded hardware is also a factor, as per the requirement to deploy Exchange 2010 on Windows 2008 R2 and to upgrade Client Access Servers (CAS) to take on the increased load generated by moving the MAPI endpoint for clients from mailbox servers to the CAS. Exchange 2013 is likely to cause some similar debates around choice of operating system (will Windows 8 Server be in the picture?), the kind of hardware platform used (dedicated servers or virtualization), and indeed the time required to deploy and then move user mailboxes from old servers to databases on new servers.

Interestingly, 34% of those surveyed reported that email performance had suffered some degradation in the immediate aftermath of the move to Exchange 2010. Thankfully this figure decreased over time to 3% at the time of the survey. I think this indicates the usual experience where administrators grapple with new features in the early days of deployment and gradually become more experienced and competent over time. It might also indicate common problems with new software, such as issues with third-party packages that interface to Exchange (backup products are a usual suspect here). Good planning and practice in a realistic test environment can reduce the potential for problems in the actual deployment but never eliminate snafus. It’s just the way that IT works.

It would be interesting to compare and contrast the data for the Exchange installed base in other geographies. Alas, the data isn’t available – at least, not publicly (I’m sure Microsoft has it).

Follow Tony @12Knocksinna


Adieu Minitel


France turned off its Minitel service on June 30, thirty years after the little beige boxes first appeared in French homes. Apparently almost 800,000 devices were still in use when the time came to flip the switch, down from the peak of 9 million devices in use in the mid-1990s, figures that are a tribute to the foresight of those who deployed and maintained the system since its introduction.

Looking at the Minitel devices now through the lens of modern computing design, you might wonder at the clunkiness of it all. But if you cast your mind back to the connectivity that people generally enjoyed in 1982, Minitel all seems so avant-garde, far removed from dial-up network connections over a 300 baud modem to bulletin boards accessed from the earliest PCs or other devices.

Integration was the big advantage enjoyed by Minitel, a factor exploited with success by Apple today. France Telecom designed and deployed the Minitel devices, connecting them into its own network and providing the essential infrastructure for third-party information providers to link their services and make them available to Minitel consumers. Sounds a lot like iPhone and the App Store today!

Sets were free because they replaced the traditional printed (heavyweight) telephone directory. The services were many and varied and very much a precursor of how the Internet developed. First up were public services and an online telephone directory. The ease by which you could search for a local doctor or dentist was amazing, even for technophobes. Next came services such as online horoscopes, airline booking, train arrivals and departures, and so on, all paid for via your telephone bill. France Telecom made it extremely easy to consume a lot of services without really noticing, another issue that the App Store has been criticized for in modern times.

But the thing that made Minitel was “Minitel Rosé”, literally “Pink Minitel”. Just like pornography accelerated the pace of development for inventions such as the VCR, DVD, and the web, sex services accessed via the black and white screen and chunky keyboard became very popular and accelerated the usage of Minitel across France. I remember visiting a French neighbor for dinner in 1988. After the meal, the men were brought into the host’s office to view his hobby of connecting to a popular pick-up forum. I had no idea that our host liked masquerading as a blond 16-year old female from Toulouse and spent hours chatting to others who lurked around the forum. To each his own, I guess.

France has been successful with many great national projects such as the TGV. Minitel was successful within France but never made the transition to other countries. Despite its lead in terms of vision and implementation, together with a successful nationwide deployment and evidence of a solid business model, Minitel just didn’t make it elsewhere. France tried, but no other national PTT (for such was the target market in the 1980s and 1990s) was tempted to use Minitel and eventually its technical lead and access to information succumbed to the advent of the world wide web and browsers that could run on just about any computing device.

So adieu Minitel, another great idea that had its time but never quite made it outside its home market.

Follow Tony @12Knocksinna
