Exchange Connections 2012 – Your chance to present


Following on from my recent post about the three major U.S.-based conferences that feature a lot of content about Microsoft Exchange Server, I note that the nice people at Penton Media have issued a call for papers for the Exchange Connections 2012 event, which is due to take place in Las Vegas, NV from October 29 through November 1. I’ve included the complete text of the Penton communication below. Exchange Connections is co-located with a number of other conferences, so I’ve also included the text of the call for papers for Windows Connections, just in case some readers are more interested in talking about Windows infrastructure components such as Active Directory.

The important points I take from this call are:

  1. A focus on practical sessions. I like this because these sessions convey immediate value to attendees.
  2. A note that presenters should come with content that hasn’t appeared elsewhere. In other words, they shouldn’t be busy applying a new slide template to their deck in the speaker room shortly before they get up to present. This happens far too often at conferences and becomes obvious when color palettes don’t quite match or you see logos leaking into white space. Or simply when you’ve heard the presentation a couple of times (or many!) before.
  3. High availability remains a key focus for administrators. I also see that Penton is looking for someone to deliver a one-day pre-conference session on the Exchange Management Shell, something that indicates a demand from attendees to get to grips with the shell.

In any case, to get your name into the speaker selection process, you need to come up with ideas for three sessions and send them off to Penton by April 30. I wouldn’t worry too much if you can’t come up with ideas for three good sessions – all conference organizers put this kind of thing into their call for papers as they like to have speakers do multiple sessions. If you have really compelling content but can only do one or two sessions, I’d still submit on the basis that you at least force the organizers to make a choice – and you never know.

Presenting at a major conference can be stressful. However, it’s a great experience that helps you to understand the material you present better than ever before and is a wonderful addition to any CV. Try it… you might just enjoy the experience!

– Tony (follow my ramblings on Twitter)

Exchange Connections 2012 Call for Papers

The Fall Exchange Connections conference provides Exchange Server administrators with an in-depth training opportunity in which they can learn about features and techniques that they can implement in their current Exchange environments. In addition, attendees will have an opportunity to learn about the latest Microsoft technologies, including Lync, Office 365, and Exchange 15.

Exchange Connections is collocated with Windows Connections, SharePoint Connections, SQL Server Connections, and DevConnections.

Audience

  • Exchange Administrators
  • IT Professionals (Systems Administrators, Architects, Network Administrators)
  • IT Managers and Business Decision Makers

Topics

We are looking for sessions that offer practical advice and walk attendees step-by-step through features, techniques, and troubleshooting in Exchange 2010 and 2007. Sessions should not be recycled from past Exchange Connections shows.

We are looking for breakout sessions of 60-75 minutes in the following topic areas:

  • Exchange Architecture & Infrastructure
  • High Availability/DAGs
  • Exchange Security Features, such as Role Based Access Control (RBAC) and retention policies
  • Interoperability (with SharePoint, Office 365, other versions of Exchange, etc.)
  • Exchange Management Shell (EMS)

Suggestions for additional topics are welcome.

We are also interested in proposals for deep-dive pre-conference and post-conference workshops (either half day or full day) on the following topics:

  • Getting Started with PowerShell for Exchange

Suggestions for additional pre-conference and post-conference sessions are welcome.

Submission Guidelines

We expect each speaker to present three breakout sessions at the show. Please submit the following information about each of your sessions to Megan Keller, Editorial Director, Penton Media Technology Group: megan.keller@penton.com.

  • An active session title that describes exactly what attendees will learn from your session
  • A 1-2 paragraph abstract that describes in detail what you will discuss during your session and the value the attendee will get out of attending that specific session
  • A biographical statement

Deadline

Submissions are due by Monday, April 30th. Final session selections will be made in early May.

Windows sessions wanted also

Windows Connections is collocated with Exchange Connections, SharePoint Connections, SQL Server Connections, and DevConnections.

Topics

We are looking for sessions that offer practical advice and walk attendees step-by-step through features, techniques, and troubleshooting in Windows Server and Windows OS. Sessions should not be recycled from past Windows Connections shows.

We are looking for breakout sessions of 60-75 minutes in the following topic areas:

  • Active Directory
  • Group Policy
  • DNS
  • Virtualization (including both Hyper-V and VMware)
  • System Center
  • Storage
  • Networking

First Exchange 2013 training course appears


December update: Now that Exchange 2013 is generally available, I want to make it quite clear that I neither endorse nor recommend the training mentioned below. This entry was posted to demonstrate the foolishness of launching a training course for a product some six months before it was actually available. If you are in the market for Exchange 2013 training, please take the time to review offerings from other reputable companies such as ClipTraining before you make your decision.

Subsequent to my April 13 WindowsITPro article speculating whether Microsoft will achieve Release-to-Manufacturing (RTM) status for Exchange 2013 in mid-November 2012, with full product availability following in early 2013, I was quite amused to find that the first evidence of Exchange 2013 training has been published online.

Exchange 2013 training coming to you soon

The dates for the courses are “TBD”, which didn’t come as a huge surprise seeing that Microsoft hasn’t yet made a public beta of the next major release of Exchange available. Everyone knows that a new version is on its way as part of the Office 15 wave, and I began to refer to it as Exchange 2013 after Mary Jo Foley published an article about SharePoint 2013, which is also part of Office 15. You couldn’t really have one product with a “2013” suffix if the others didn’t share the same. I therefore conclude that we will have Outlook 2013, PowerPoint 2013, etc. etc. etc.

I do hope that we won’t see a rush of books that appear soon after Microsoft ships Exchange 2013. It’s great to have a book in the market that explains all about new technology, but it has been the case with the last few versions of Exchange that a) changes have happened in the product between the final beta(s) and the RTM software and b) some functionality has not been complete until Microsoft ships the first service pack. Think of the introduction of Standby Continuous Replication (SCR) in Exchange 2007 SP1 or the complete rewrite of Outlook Web App (OWA) in Exchange 2010 SP1, plus the addition of new features like block mode replication.

Preparing training about new products is also difficult. Apart from the need to learn enough about new technology to be able to teach it at a sufficiently in-depth level, there is a lack of real-world experience of how the technology actually functions in production environments. No amount of Microsoft white papers or TechNet documentation can compensate for knowledge gleaned through external deployments, if only because these deployments invariably reveal flaws that Microsoft didn’t consider during development or testing. That’s why training developed over time and with the advantage of mature reflection is better than anything rushed out just to be able to say that a company can deliver courses about new technology.

But then again, the imperatives of the market mean that companies do have to get new courses out there to survive. As Exchange 2013 (or whatever it will be called) comes into focus, I wish those who develop training well and hope that they succeed in the noble task of helping others understand just what the Exchange development group has been working on since they completed Exchange 2010.

– Tony

Follow Tony’s rambling @12Knocksinna

Update April 17: I see that Computer-Based-Training (CBT) video programs for Exchange 2013 are now promised by a company based in Florida. Whatever about training courses, which can be built and delivered reasonably quickly, training videos require a whole new set of skills and much more preparation and post-production work to produce. It’s impressive that a company should be offering training videos about what’s still a non-existent product…


HP’s (eventual) push into cloud services


News that HP has launched its first public cloud services and will enter public beta on May 10 provoked my interest from many angles. As a former CTO of a major HP business, I had observed (and sometimes participated in) many attempts within the company to construct a cogent approach to cloud services. Despite much hype and bluster and a fair amount of wasted (and expensive) effort, nothing much had come from these attempts. HP just couldn’t make its mind up whether it would be a provider of hardware to cloud vendors or plunge into the maelstrom itself.

However, looking at what’s now on the table, I think that there are a couple of reasons why HP has a shot at being successful in creating a cloud business.

First, the right executive leadership is in place in Bill Veghte, who took over as HP’s chief strategy officer following the retirement of Shane Robison last year. The important thing here is that Veghte wears another hat as Executive Vice President of HP’s Software Business. The combination of being a member of HP’s executive committee, owning direct responsibility for a major business group, and having oversight over corporate strategy is a powerful mix that seems (to me) to be right for the job. The fact that Veghte leads HP Software, home of powerful automation technology (think of the datacenter automation software bought with Opsware) resulting from years of patient acquisition, is another pointer. If any HP business needs cloud services to succeed, it’s probably Software, if only to provide a replacement income stream for the loss of on-premises deployments of longstanding products such as OpenView as customers move work to cloud services.

Another important point is that Veghte has been through the creation of a major cloud infrastructure before at Microsoft. In fact, I attended a number of Veghte-led sessions in the 2007-2008 timeframe when he briefed HP executives about Microsoft’s datacenter investments and the way that they planned to create the essential software base (using Windows) to host cloud applications. I imagine that this background and knowledge has been invaluable in charting HP’s path to the creation of its cloud offerings.

Second, HP has reasonable experience in building and running cloud services such as Snapfish and Magcloud. Issues such as how to scale and manage the storage needed to accommodate the amount of data that cloud services typically have to cope with, plus customer billing and delivery, have been worked out since 2005. I am not saying that everything is perfect here, as clearly HP’s cloud efforts to date have been spotty and inconsistent, but a track record does exist.

Third, HP Labs has been researching cloud services for a long time in an effort to understand aspects such as automation, security, and management. It’s a couple of years since I sat on HP Labs project review boards, but I imagine that progress has been made since and that HP Labs has made a contribution to what is now being offered.

Fourth, HP has an incredible amount of hardware assets that it can draw upon to equip its datacenters. Servers and storage will probably be highlighted here, but HP also has a very solid networking business based around ProCurve and some recent acquisitions such as 3Com and TippingPoint. Cloud businesses depend on solid networking and it’s nice for a cloud provider to have a complete networking business available. On the server and storage side, it’s also probable that HP has learned a lot from its provision of hardware to other cloud providers. The servers used in cloud datacenters differ from the standard ProLiants sold to the corporate world in that they are much simpler and designed as rip-and-replace compute boxes. The same is true of storage, where massive racks of JBOD tend to be the preferred choice for cloud datacenters. I’m pretty sure that HP’s hardware teams have come up with some interesting technology for use in their datacenters.

Last, time was running out and it became imperative for HP to make a move into cloud services, if only to protect and enhance its reputation for the design and operation of large internal IT infrastructures that are now called “private cloud”. Remember, HP has to continue to sell a massive amount of hardware to corporations around the world to support its operations and this task would become a lot harder if HP couldn’t provide its cloud credentials. The same is true for HP Services, which became a massive managed services player following the acquisition of EDS in 2008. HP Services can now offer customers a range of public and private cloud services and this should help it better secure multi-year customer management contracts.

Of course there will be many bumps to navigate along the road to success. Amazon is already a big player in public cloud services and HP can expect to compete with solutions from IBM and other players. Microsoft, HP’s major partner in many other areas, will compete with Azure and Office 365. And some of the customer technical staff that HP has to convince might consider its chosen architecture (OpenStack and KVM) to be the wrong choices. Time will tell.

The net-net of HP’s plunge into public cloud services is that it’s a major strategic move for a massive company, and one that has been under consideration for over five years. We will soon find out whether the time spent waiting was a good move because it allowed the market to mature and all the necessary pieces to come together within HP, or whether the delay has compromised HP’s ability to succeed in what is becoming an increasingly competitive cloud services market.

-Tony

Follow Tony’s ramblings on @12Knocksinna


To MEC or not to TEC, where are my Connections?


I think that there’s a fair chance that I am the only person to have presented keynote sessions at each of the Microsoft Exchange Conference (MEC), The Experts Conference (TEC), and the Exchange Connections conference. As such I might be in a reasonable position to offer some guidance as to what value these conferences offer to technologists who are interested in learning more about Exchange. Here goes.

Quite understandably, MEC is occupying a lot of attention at present. In early March, Microsoft announced that they were bringing MEC back after a ten-year gap. It seems like the plan is to use MEC as the public launch event for Exchange 15 (or Exchange 2013, if you subscribe to the theory that this will be the name attributed to the product). At least, the indications of the agenda published on Mecisback.com point to sessions that are heavily focused on Exchange 15.

The combination of MEC returning, complete with the surrounding tales of the wonderful happenings that occurred at previous MECs, plus lots of sessions describing the technical innards of a new version of Exchange, is sufficient to give the Orlando-based event (Sept 24-26) the potential to be a big draw. The sole problem that might prevent Microsoft from selling out the 2,000-or-so seats for MEC is money. People who had already committed to attend other events such as TEC, TechEd, and Spring Connections before Microsoft announced the relaunch of MEC at the start of March might find that travel and training budgets do not permit a trip to MEC in 2012. We shall see.

Apart from attendees finding the necessary dollars to attend, another nagging doubt that I have revolves around MEC’s concentration on Exchange 2013. No one loves to hear about new technology more than I do, but potential conference attendees need help with more than news about new stuff. It’s more valuable to hear about how to use technology to solve business problems. Although some companies deploy new versions of Exchange early, the majority traditionally wait until the first service pack is available. A feeling persists that Microsoft server applications lack polish until they have had the chance to receive customer feedback and perhaps some additional time to complete features. Certainly this was the case with both Exchange 2007 and Exchange 2010. In addition, some of the folks who might have deployed a new version of Exchange on-premises now use Office 365 and don’t have to care too much about the technical details because Microsoft now takes care of upgrades, refreshes, and so on. Given these factors, it’s easy to imagine how some might struggle to convince their management that a trip to MEC is a good investment when their company probably won’t deploy Exchange 2013 until 2014-2015. Even so, I still think that MEC will be a sell-out and deliver a week of compelling content.

So what about TEC? Well, TEC has certainly delivered in terms of some excellent sessions from both Microsoft and external experts at the events that I have attended. They have a dedicated team that searches out solid speakers who boast a lot of experience on the topics that they talk about, which is always a good thing. On the other hand, the amount of Exchange content at TEC is small in comparison to a full-on conference such as MEC. TEC also covers SharePoint, virtualization, PowerShell, and the historical underpinnings in Active Directory and identity management that gave the conference its raison d’être when it was first set up by NetPro (a company that was subsumed into the much larger Quest group some years ago). A good bunch attend TEC year-on-year and some of the sessions from external speakers are as in-depth and helpful as you’d ever want to hear. At the same time, Quest also uses TEC to publicize its software and at times it can seem like a marketing-led event. This is understandable as all conferences have to be funded and attendee fees often don’t pay the full cost.

And then there’s Exchange Connections, which is run as part of a suite of events that focus on different aspects of Microsoft infrastructure technology (Windows, SharePoint, ASP, etc.). Connections took up the baton after Microsoft dropped MEC and provided the Exchange community with a place to gather to share news, views, and information. Many MVPs and other experts support Connections, as have Microsoft speakers over the years, providing a good mix of “here’s what we designed the product to do” and “here’s what the product actually does when deployed into production” sessions.

Although Connections has successfully provided a lot of excellent content for the Exchange community in the last 10 years, it now faces a simple but difficult challenge – it has to evolve rapidly to provide something special that is clearly distinct from MEC, a conference that takes place some five weeks earlier. I think it is unlikely that many will attend both conferences, so it seems like a straight contest between the two for attendee dollars. TEC is out of the equation this year because it occurs at the end of April and most attendee decisions have already been made. The influence of a revitalized MEC will be felt by TEC in 2013. At that time TEC will also need to come up with a new value proposition or it will become an also-ran when it comes to Exchange.

What do I think the independent conferences should focus on? The answer is pretty simple. They cannot take on Microsoft when it comes to describing technology that’s en route or the finer engineering details that are only known to those who have access to product code. Therefore, independent conferences must deliver content that is practical, hard-nosed, and insightful based on real-life deployment experience. In other words, while MEC tells people what’s coming or the Microsoft view of life, the universe, and Exchange, independent conferences tell the other side of the story. It will still be tough for them to compete against MEC because some attendees will always be drawn to a Microsoft event on roughly the same basis that no one was ever fired for buying IBM computers. But at least they’ll have a shot at attracting sufficient attendees to make an independent conference financially feasible.

And what of TechEd, Microsoft’s other event, which absorbed Exchange content after the last iteration of MEC? I have not attended TechEd for some years now as I considered it to have become a bloated event full of so-so presentations delivered by presenters who rarely satisfied. Additionally, the amount of marketing content on the TechEd agenda appeared to be increasingly important. There are notable exceptions of course, as great presenters and compelling sessions still do happen at TechEd, but these jewels have become increasingly rare. The upshot was that my desire to find an excuse to attend another TechEd declined gradually over the last few years. I’ll need a good reason to return.

I hope that Microsoft won’t fall into the trap of recycling sessions from TechEd to MEC and back again, which is what happened in the past. My hope is that the Exchange development group delivers new and compelling content at MEC. I think this is a fair assumption for 2012 as they’ll be talking about a new product. The key will be to maintain quality when the technology isn’t quite so new.

It’s both an interesting and complicated time to be making a decision about which conferences to include in your travel and training plans for the remainder of 2012 and on into 2013. Until it eventually delivers, MEC will remain an unknown but compelling prospect. TEC will happen as planned in 2012 and face into the future thereafter. The Fall 2012 Exchange Connections event is probably under the most pressure because it happens after MEC and therefore has to change now. At the end of 2012 we will have much more information upon which to base 2013 plans.

I think it’s important to have a selection of conferences to inform and educate people about technology. It would be a great pity if the long-awaited relaunch of MEC caused other conferences to fold. Competition usually drives development and evolution. Let’s hope that this occurs in the conference space too.

– Tony (follow my ramblings on Twitter)


March 2012 articles posted on WindowsITPro.com


Here is the digest of articles that I posted to WindowsITPro.com during March 2012:

iPhone meet Exchange 2010 posted on March 29 is a review of Steve Goodman’s new book iPhone with Microsoft Exchange Server 2010 – Business Integration and Deployment, which describes how administrators can take on the challenge of managing Apple iPhone and iPad devices that connect to Exchange 2010 through ActiveSync. The book is very readable and offers lots of good advice. Recommended!

Of vaults and retention policies posted on March 27 discusses some early attempts by Compaq (ex-DEC) ALL-IN-1 engineers to implement retention policies for Exchange 5.5 way back in a time when no one could even contemplate a 25GB mailbox.

Unreported and unloved: March 16 APAC Exchange Online outage – I wrote this article on March 22 as I was curious that an Office 365/Exchange Online outage had attracted no real attention from the mainline IT press and speculated that this was because it only affected users at the other end of the world. There’s just no respect shown to APAC…

Office 365 price reductions – but what about Plan P? – as an Office 365 subscriber, I like to see Microsoft reducing its monthly subscription fees – but I was less impressed when I discovered that they only dropped prices for Plan E (enterprise) subscriptions and left Plan P (professional) rates alone. As explained in this article posted on March 20, my conclusion is that Microsoft is quite happy to soak the cash cow of Plan P subscribers, who don’t really require much support after they manage to sign up and connect to Office 365, in favor of reducing Plan E prices to be more competitive with the likes of Google Apps. Understandable but infuriating.

Exchange 2013 anyone? On March 16, I noted that Microsoft watcher Mary Jo Foley had revealed that SharePoint 2013 was the name of the next release of this Office 15 wave product. Logic indicates that all of the Wave 15 products will share the same suffix, so I therefore conclude that the next major release of Exchange will be named Exchange 2013. You can post your bets now.

Six months of solid Office 365 performance (but…). It’s important to give credit where it’s due. On March 15 I noted that Office 365 had celebrated six months of reliable service (of course, by doing this I accept that I put a hex on Office 365 that duly exerted its woeful effect in APAC on March 16). However, my confidence in cloud services had been dented by the infamous leap year bug issue that afflicted Microsoft’s Azure operation on February 29 last…

Exchange 2010 uses two distinct types of circular logging. This article had been in my “pending” folder for quite a while because lots of other topics had arisen that forced their way into the publication queue. I was able to find a slot for the article on March 13. It describes the two types of circular logging that exist in Exchange 2010: the first is the traditional sort that has existed since Exchange 4.0 and was originally designed to preserve expensive disk space on servers; the second came about in Exchange 2010 as part of the Database Availability Group (DAG) feature and is used to ensure that all database copies have access to data that they might need before any log is truncated.
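
As a reminder, circular logging is a property of the mailbox database and can be toggled from the Exchange Management Shell. A minimal sketch (the database name “DB01” is illustrative; note that for a standalone database the change only takes effect after a dismount and remount):

# Turn on circular logging for a mailbox database
Set-MailboxDatabase -Identity "DB01" -CircularLoggingEnabled $true

# Confirm the current setting
Get-MailboxDatabase -Identity "DB01" | Format-List Name, CircularLoggingEnabled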

The early history of Enterprise Vault. Symantec’s Enterprise Vault (EV) product has enjoyed enormous success since its debut as the first archiving/HSM product for Exchange. This article, posted on March 8, describes the early days of EV and how it came about as a project conceived by Digital Equipment Corporation in Reading, UK as part of Digital’s “Alliance for Enterprise Computing” agreement with Microsoft.

MEC returns in September 2012: the Exchange community applauds! By now you’re probably aware that the Microsoft Exchange Conference (MEC) is returning after a ten-year hiatus and will take place at the Gaylord Conference Center in Orlando, Florida from September 24-27. Registration is expected to commence during the week of April 2. My March 6 article provides some background on MEC and why I think it was an important part of building the community around Exchange that contributed to the success of the product in its early days. We’ll wait and see how MEC 2012 pans out…

Why sharing real-life support tools is so important. My first post of the month (March 1) discusses why the release of a number of tools authored by members of Microsoft’s support organization is an important contribution to the community. The more the merrier in this category, I think!

March was interesting from a number of perspectives. MEC coming back, a new book, memories of the old days of Exchange and how products evolved through discussions between large corporations, and some technical details. Hopefully April will be as interesting. To find out, you can follow my WindowsITPro blog or receive my updates through Twitter.

Cheers!


DMARC and the continuing fight against spam


According to its web site, DMARC (Domain-based Message Authentication, Reporting and Conformance) is “a technical specification created by a group of organizations that want to help reduce the potential for email-based abuse by solving a couple of long-standing operational, deployment, and reporting issues related to email authentication protocols.” Anything that helps to eliminate spam is clearly a very good thing and deserves some attention, which was the premise of RunAs Radio’s Richard Campbell when he called me up to invite me to debate the topic for a program that you can download from the RunAs web site.

RunAs Radio is an interesting venture (Richard is also an interesting person, but that’s another story) that attempts to capture the thoughts of technologists about topics of interest as they arise. I’ve been interviewed several times (before the DMARC program, the most recent covered Office 365) and enjoyed the process because the conversation is never stilted and Richard adds as much to the debate as the interviewee, which is not always the case when discussing technology. The nature of the discussion often starts with a defined topic and then meanders to cover other material in a very natural way. See what you think.

Of course, plans have been made, recast, relaunched, and remade to stop spam for many years now. The block lists maintained by organizations such as Spamhaus are well respected and serve as a method to stop email sent by well-known spammers. We’ve used block lists for years and they form an essential part of the defenses erected by most large companies to protect their email systems. Deployed alongside other spam checks such as validating message recipients against internal directories, block lists are effective in preventing identifiable spam reaching its destination.

Unfortunately, those who send spam aren’t stupid. At least, those who continually evolve new spam techniques are not stupid – those who simply rehash the work done by others in the hope of convincing people that a message really is from their bank, contains an offer to help someone release millions of dollars from a blocked bank account, or will save someone from a fate worse than death are not in this category. Spammers set up and tear down email domains all the time, change the formatting and contents of messages, and work hard to make their messages seem as innocuous and believable as possible to the recipient.

Techniques like DMARC and its SPF (Sender Policy Framework) predecessor both aim to assist receiving email domains to identify incoming messages from authentic senders by checking details contained in DNS records. Essentially, a receiving server can check DNS to establish whether messages purporting to come from a domain were transmitted by an authorized server. If messages are proved to come from an authorized server all is well and the messages can be accepted. If not, the message becomes a candidate for more extensive testing to determine whether it is authentic or contains spam.

DMARC builds on the framework used by SPF to add features such as providing direction as to what action to take if messages arrive from an unauthorized server. For example, this DMARC record provides the following instructions:

_dmarc.contoso.com TXT v=DMARC1; p=reject; pct=100; rua=mailto:administrator@contoso.com

  • p=reject: States the policy for the domain “contoso.com”. In this case the policy is to “reject” incoming messages from unauthorized servers. It could also be “quarantine” or “none”.
  • pct=100: States the percentage of messages from contoso.com that are subject to filtering. In most cases it makes sense to filter all messages.
  • rua=address: States a reporting URI for aggregate results. In this case, an email address is provided so that the administrator of the domain is informed of any problems caused by messages coming from the domain. In other words, they can find out whether spam is detected!
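
If you’re curious about what a given domain publishes, both SPF and DMARC data are plain TXT records that you can query yourself. A minimal sketch using the Resolve-DnsName cmdlet that ships with Windows 8/Server 2012 (“contoso.com” is a placeholder; nslookup -type=TXT does the same job on older systems):

# SPF is published as a TXT record at the domain itself
Resolve-DnsName -Name "contoso.com" -Type TXT |
    Where-Object { $_.Strings -match "v=spf1" }

# The DMARC policy lives in a TXT record at the _dmarc subdomain
Resolve-DnsName -Name "_dmarc.contoso.com" -Type TXT |
    Where-Object { $_.Strings -match "v=DMARC1" }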

SPF has been around for a while and is reasonably widely deployed by large organizations. I suspect that the results reported in this interesting blog, showing that nearly half of the messages that pass through Microsoft’s Forefront Online service originate from sending domains without SPF records, might be influenced by the customer base that uses this service. In other words, the traffic between large companies, which are more likely to have dedicated email administrators with the time and interest to deploy SPF records, might produce different results compared to traffic between many smaller companies whose email is not managed with as much attention to detail.

I note that Office 365 creates DNS TXT records to provide SPF data for the tenant domains that it hosts. This seems like a very intelligent step to increase the amount of SPF-authenticated mail in the network. The format of the record defined by Office 365 is as follows:

contoso.com TXT v=spf1 include:outlook.com ~all

This means that any mail sent on behalf of the domain by an outlook.com (Office 365) mail server is OK. Interestingly, when I examine the DNS information for my domain (a service such as http://www.dnswatch.info/ makes this very easy), I see that a second TXT record has been added:

"mscid=gn+/B+3OqZTqN469IjS37t7msXGgPJlp+/ZCsoh0G6cXONpsOn8Du4SPouE1a9vUTLROlqFbQJdM/TSiogCKfA=="

I have no idea what this record means, and Internet searches reveal precious little about what it might be used for. Perhaps it’s another piece of authentication data.

In any case, the point is that the global email infrastructure is gradually tightening its controls to enforce authentication between domains in an effort to eliminate spam. I doubt that we shall ever fully eliminate spam, at least not in the foreseeable future, but it’s good that these efforts are proceeding (albeit slowly).

– Tony


Reverting to Outlook’s AutoArchive feature


A question arising from my post about some things to consider when you enable Exchange 2010 archive mailboxes asked how a user could revert to Outlook’s AutoArchive feature if they decided that they really didn’t want to use the archive mailbox. The answer proved a little harder to discover than I expected, so here are the details.

First, when you enable a mailbox with an archive (through the Exchange Management Console (EMC) or by running the Enable-Mailbox -Archive cmdlet from the Exchange Management Shell), Exchange populates a set of archive properties for the user’s Active Directory account and creates the archive mailbox in whatever target database is selected (same database, different database, or in Office 365). The fact that the user now has an archive mailbox is signaled by Exchange to Outlook through the AutoDiscover feature, which is responsible for keeping Outlook updated with details of resources available to a user.
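
For reference, here’s a minimal shell sketch of the two options (the mailbox and database names are illustrative; the -ArchiveDatabase parameter arrived with Exchange 2010 SP1):

# Create an archive mailbox in the same database as the primary mailbox
Enable-Mailbox -Identity "Tony Redmond" -Archive

# Or place the archive in a different mailbox database
Enable-Mailbox -Identity "Tony Redmond" -Archive -ArchiveDatabase "DB02"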

Outlook 2007 and 2010 invoke AutoDiscover when the client starts and once an hour thereafter to ensure that resources such as the access granted to a shared mailbox are picked up and made available to users. You can see details of the communication that occurs between Outlook and Exchange by enabling logging (in Outlook 2010, via File > Options > Advanced > “Enable troubleshooting logging”) and then examining the log file that Outlook creates the next time that the client starts. For Outlook 2010, the log file is located in \Users\UserName\AppData\Local\Temp\Olkdisc.txt (I haven’t used Outlook 2007 for a number of years, so you’ll have to figure this part out if that’s your client of choice).

Olkdisc.txt is an XML formatted file that can be opened with any text editor. Inside you’ll find details of the interaction between Outlook and Exchange as it first attempts to find the AutoDiscover access point and then retrieve information about resources. The piece we’re interested in looks like this:

<AlternativeMailbox>
<Type>Archive</Type>
<DisplayName>Tony's Online Archive</DisplayName>
<LegacyDN>/o=ExchangeLabs/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=Something.onmicrosoft.com-52094eea20/guid=afc1e472-0826-498e-b990-85de223e809d</LegacyDN>
<Server>DBXPRD0410.mailbox.outlook.com</Server>
</AlternativeMailbox>

Not much fun for humans, but exciting for Outlook 2010 because this information tells the client that I can access an alternative mailbox (i.e., not my primary mailbox) of type “archive” (if it were type “delegate”, we’d know that this entry referred to a shared mailbox). The display name is simple text and can be changed by an administrator using the Set-Mailbox cmdlet. The LegacyDN (Distinguished Name) provides a path to the archive mailbox, ending in a GUID that Exchange can use to locate the actual mailbox, and the Server element tells Outlook which server it needs to communicate with to access the mailbox. In this case, it’s an Exchange Online server because my mailbox is in the Office 365 cloud.
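
As noted, the display name that Outlook shows for the archive can be updated from the shell. A one-line sketch (the names are illustrative):

# Change the display name reported to clients for the archive mailbox
Set-Mailbox -Identity "Tony Redmond" -ArchiveName "Tony's Online Archive (2012)"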

When Outlook receives the AutoDiscover information it knows that an archive mailbox is available so it then proceeds to disable the AutoArchive menu option. This option is quite old (in functionality terms) and goes back to Outlook 2003 or thereabouts when it was provided as a method to allow users to move items out of their online mailbox into a local PST. Of course, in those days we couldn’t imagine what it might be like to have access to a 25GB mailbox, nor could the PC clients cope with mailboxes larger than 1GB because of slow disks, poor synchronization, and dial-up connections. And although Outlook 2003 made dramatic improvements through the introduction of cached Exchange mode and better synchronization techniques, it still made sense to move items out of the mailbox to a PST so that they were available even if the network connection to Exchange was down.

When you have an online archive, Outlook deems that you should use this feature rather than AutoArchive. However, one potential issue with online archives is that they can only be accessed online as Outlook does not synchronize items from the online archive into the OST. However much I hate to say this, it’s therefore conceivable that AutoArchive and PSTs will be a better solution for some users.

To remove the online archive and revert to AutoArchive, you have to disable the archive mailbox by running the Disable-Mailbox -Archive cmdlet. For example, this command disables my archive mailbox, no matter whether it is on a local database or in the cloud:

Disable-Mailbox -Identity "Tony Redmond" -Archive

There is an EMC option to disable an archive but some reports suggest that this doesn’t do everything necessary to convince Outlook that the archive is indeed gone. When in doubt, run a cmdlet…
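
If you want to confirm that the archive really is gone before AutoDiscover gets around to telling Outlook, a quick check of the mailbox’s archive properties does the job (a sketch; once the archive is disabled, these properties should be blank and ArchiveGuid should show as all zeroes):

# Inspect the archive-related properties of the mailbox
Get-Mailbox -Identity "Tony Redmond" | Format-List ArchiveName, ArchiveDatabase, ArchiveGuid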

The next time that AutoDiscover runs, it will note that the archive mailbox is no longer available and will not include this resource in the list returned to Outlook. Thus, the next time that Outlook starts, the AutoArchive option will be back in place and ready for the user to do whatever they want with PSTs (still a horrible file format, but useful at times).

Just another little corner of Exchange and Outlook interoperability exposed to blinding light.

– Tony


Exchange ActiveSync books – one coming, one in production


There’s no doubt that ActiveSync has been an enormous success for Microsoft. Every important mobile device vendor has licensed ActiveSync and although the implementation of the features enabled by ActiveSync varies from vendor to vendor, it’s still true to say that ActiveSync has now become a premier protocol when it comes to device connectivity, both for Exchange (every version since Exchange 2003) and Hotmail.

Because of this, it should come as no surprise that books are arriving to help administrators through the intricacies of mobile device management in an Exchange world. Steve Goodman’s new book iPhone with Microsoft Exchange Server 2010 – Business Integration and Deployment (also available on Amazon.co.uk) concentrates on connecting iPhones and iPads to Exchange 2010. I am writing a review of this book that I hope to publish on my “Exchange Unwashed” blog towards the end of March. It’s also my current recommended book of the month.

I also see that Paul Cunningham has started to write a book about ActiveSync for Exchange 2010 that looks as if it will deal with Windows Phone 7.5 and other devices. Indeed, Paul is asking administrators of Exchange deployments to take a survey to help him figure out what topics to cover in his book. Please do so if you can.

These books are ebooks or published on demand. This approach allows text to be developed, edited, formatted, and published faster than the traditional book publication process used for books such as Microsoft Exchange Server 2010 Inside Out, which can take up to a year to write and three months to edit and publish. eBooks also allow authors to respond more quickly to the needs of the market. In this case, both Steve and Paul are obviously responding to the needs of Exchange administrators to facilitate secure connectivity for an increasingly bewildering collection of mobile devices. It’s both a fun and challenging task.

– Tony


Windows 8: New O/S – new UI – not sure yet


After returning from the annual MVP Summit in Redmond, it seemed like a good time to apply the Windows 8 Consumer Preview to a PC. My first selected victim was an old HP NW8230 (circa 2005-2006 vintage), as it seemed to be a good test of Windows 8 to see how well it fared on older hardware. All software runs well on new super-fast hardware; efficiencies and improvements surface when old hardware is used.

Up to this point, the NW8230 was running Windows 7 SP1 quite happily (2GB RAM, 120GB HDD). The first step in the upgrade process is to run a compatibility report, which identified that most applications were supported by Windows 8. There were some minor hardware issues (the report indicated that new drivers might be required), with Microsoft Security Essentials identified as the one major problem that had to be resolved before Windows 8 could be installed. Microsoft Security Essentials is an excellent (and free) program that deals with the basics of anti-virus and other malware. I don’t quite know how other companies sell similar software for a fee, especially as Microsoft plans to include protection in Windows 8 as part of its Defender application (another nail in the coffin of third-party AV software?). In any case, it had to go now and so it did.

Once the PC was ready, Setup launched and did its stuff. The software was downloaded, checks were made, and programs were laid down. For whatever reason, Windows 8 failed to install the first time around. The installation program is capable of detecting failure and reinstalling the previous O/S without intervention. After Windows 7 SP1 was reinstalled, we rebooted the PC and started the Windows 8 installation again, which ran through to the finish without any further problems. Total time, including the download, was approximately an hour.

After using the older PC to check out Windows 8 (and noting that the performance was perfectly acceptable despite its age), I decided to install the new O/S on my HP Elitebook 8530w, which is the computer that I use (or rather “hammer”) on a daily basis. This computer ran Windows 7 SP1 Professional 64-bit with 8GB of RAM and a 256GB SSD and it’s done an excellent job for me over the last two years. Of course, before I did anything, I made sure that I had a solid backup. There’s nothing like heading into the future equipped with the comfort of a good backup.

Once again, the installation was reasonably smooth, with one small glitch when the installation program announced that it was going to reboot the PC after reporting that it had finished installing various files, only to spend five minutes or so attempting to access something on the DVD. At least, that’s what I diagnosed based on the noise coming from the drive. I took a risk and popped the drive open, and the installation program promptly rebooted the PC. All was well upon the restart and the installation proceeded through to the finish. I was pleased to find that all of my applications and settings were preserved.

Replete in its awful default color scheme and chunky boxes (does anyone else think that the new start screen appears to have been crafted with some rather thick crayons?), Windows 8 is now humming away. It certainly takes a little getting used to, but the human brain can be quickly coaxed into performing tasks differently. So it was with the change from Windows 3.1 to Windows 95 and so it will be from Windows 7 to Windows 8.

Windows 8 improves the O/S in many ways (for further detail, see the Windows 8 team blog, including this post which explains new concepts such as “charms” – or review the set of hotkeys supported by Windows 8), such as much faster boot-up times. I think I have figured out how the old desktop interacts with the new Metro interface and I quite like some of the applications. The Mail application, for instance, was able to connect to my Office 365 (Exchange Online), Hotmail, and Gmail accounts. At first, Windows complained that the security policy for the PC did not protect information well enough (password complexity, lock after a period, etc.) to allow it to connect to Office 365, but it offered to apply the necessary policies and all was well. Unsurprisingly, this application behaves very much like the Windows Phone Mail app and while I wouldn’t give up Outlook or Outlook Web App to use it, the Mail app will provide reasonable access for casual email users. Even though the Calendar application synchronized quickly with my Exchange calendar, I didn’t care much for the layout or color scheme selected for the application. It seemed awfully brash. The People application doesn’t seem to be as well laid out as its Windows Phone equivalent in terms of how it displays the feeds from Twitter, Facebook, and LinkedIn. Perhaps it will grow on me.

Windows 8 runs a compatibility advisor report before installing the software. This had identified the problem with Microsoft Security Essentials, but two other issues seemed to slip by. The first was with Zune, which was passed as OK but crashed after the installation. The problem was that Zune requires .NET Framework 3.5 to be installed on a PC. My Windows 7 SP1 installation included .NET Framework 4 Client Profile but this didn’t carry across to Windows 8. A quick download and install fixed the problem. The compatibility advisor also identified Twinbox, an add-in that integrates Twitter into Outlook, as “remove only”, which I took to mean that I should remove the application as it might cause problems under Windows 8. I decided to leave well alone and so far Twinbox has continued to operate as before. In fact, the only other issue I found was with a 2002-era game (Civilization III), which ran a tad slowly until I updated its properties to run in Windows 7 compatibility mode. The lack of problems with older applications is definitely a plus point for Windows 8.

I guess it will take time for online services to make sure that they work well with Windows 8. So far I haven’t had a problem with services such as online banking (in three countries) or other financial services. I did run into a problem with Citrix GoToMeeting, which informed me that my O/S and browser combination wasn’t supported! Skype upgraded successfully but caused a few problems thereafter as calls failed to hang up and forced the PC to be restarted to regain access to the application. Downloading and updating Skype with the latest software solved this problem.

One thing that I really do not like is the loss of the “Recently used documents” list in the desktop, which I assume is a side-effect of the replacement of the traditional Windows “Start” button with an Internet Explorer icon. This was pretty redundant in my case as I prefer to use Chrome, but it’s easy to unpin the icon from the taskbar. It has been my practice to refer to the recently used documents list frequently to pick up documents or other items that I have been working on, so its loss is regrettable. Pinning frequently used documents to their application (Word, Excel, etc.) helps, and right-clicking an application icon reveals a list of the last items that the application has processed, but it’s still jarring for a user to be forced to change the habits of years because new software removes (or improves, in the eyes of the developers) a useful feature.

Minor tweaks to the desktop aside, it’s the Metro UI that gets the most immediate attention because it’s what Windows 8 displays when a PC boots (you can suppress Metro with this tip or make many changes to the interface). I fully appreciate that this is a radical departure from the previous desktop metaphor, one that is intended to allow Windows to accommodate platforms such as touch-driven slates more easily and elegantly. My appreciation of the work is growing with experience as well as with the vast outpouring of tips and techniques around the world (for example, read the “imperfect view” of ex-Microsoft Distinguished Engineer Hal Berenson). Even with its prominence, I don’t spend much time at the Start screen or in Metro applications. Most of my time is spent working in “traditional” applications (Word, Excel, Outlook, PhotoShop, etc.). I assume this situation will change over time as application developers get to grips with Metro and discover how it can add value to their products.

I see some good in Metro and I like its look and feel on Windows Phone, where I find it to be a very approachable interface. However, I do have some concern about the application of a radical change of interface to popular desktop applications such as Microsoft Office. I’m restrained about what I can say here because of a Microsoft NDA, so I will keep my comments general and based on the information that’s available publicly today.

Perhaps my thoughts are captured simply thus: I hope that Microsoft doesn’t compromise the next generation of applications (including administrative tools) by an over-energetic application of Metro design elements everywhere. It would be a pity if corporate directives forced software engineers to introduce an interface that prevented companies from deploying new software simply because the change is too difficult (and ultimately expensive) for users, support teams, and training departments to handle. Microsoft already experienced problems with the introduction of the infamous “ribbon” in Office 2007, but I think that Metro has the potential to be even more disconcerting than the ribbon for the average user.

Of course, it’s early days yet and Microsoft will rightly point to the fact that all software goes through multiple iterations before settling into a final form. The screenshots and other indications of current Metro-style implementations for applications probably don’t accurately reflect what the final form will be. User interface designers will tweak layouts, fonts, and sizing based on internal and external feedback. I hope that they fix some issues that concern me such as the allocation of too much white space on screen and the subsequent loss of data that can be displayed in the remaining space. Perhaps maximizing the effective display of data is more important in administrative interfaces rather than user-centric programs. I can certainly appreciate that argument. On the other hand, I’ve heard some observers complain that Metro-style interfaces are almost blinding (emphasized to make a point, no doubt) because so much white space is used.

I also dislike the use of capitalization for menu choices. For years, the convention has been to move away from ALL CAPS to an elegant mixture of initial capitals followed by lower-case letters for elements such as options presented to users. Now we see Office offering options such as VIEW and FOLDER. It all seems so mainframe-like, a return to the past, and very much like as if the application is SHOUTING to gain attention.

Again, it’s early days yet and I’m sure that the folks in Redmond will make many improvements between now and the final ship date for both Windows 8 and the applications that will run on the new O/S. I’m sure too that I’ll get used to whatever design elements are deemed to be most suitable for my consumption by the powers that be.

Now off to continue my exploration of Windows 8 so that I learn more about it…

– Tony


New French breathalyzer law


[Updated to reflect changed regulations]

Driving in France can present some unique challenges. Drivers who are new to France and have to transit the massive roundabout at the Arc de Triomphe in Paris receive a rude introduction to some of the passion, terror, and luck that can surround the French driving experience. The Arc de Triomphe presents no more difficult a driving challenge than other large cities do, but it is different. Such is life.

Even though the élan and speed of French drivers can take time to comprehend, the fact remains that driving in France is much safer today than it was twenty years ago. As detailed in this report, road deaths in France fell from 7,720 to 5,332 between 2001 and 2004. The figure continued to fall to 3,994 in 2010 and is still hovering around this level. Another interesting way of looking at the issue is to review the top 20 departments for road deaths in France in 2010, which shows that Bouches-du-Rhône is highest in terms of numbers (150) while Charente-Maritime is highest in terms of deaths per million inhabitants (118.4). Both of these departments see a fair number of tourists and this might be a contributing factor.

Part of the success in reducing road deaths is due to the increased monitoring of speed by the police, backed up by a widespread network of fixed and moveable radar-controlled cameras. However, I suspect that a lot more is due to the steady reduction in the blood alcohol limit to its current level of 0.5 g/l. The France where many glasses of red wine are consumed over lunch, followed by a digestif and a slow drive home, is steadily disappearing because people can drink no more than one or two small glasses of wine and remain under the limit.

Although substantial progress has been made to reduce road deaths, French roads are still more dangerous than those of other European countries (the U.K. is about twice as safe) and further action is being taken by the government. A new development is that from July 1, 2012, France requires all drivers to have a breathalyzer device in the car (décret n° 2012-284 du 28 février 2012). The requirement to carry a breathalyzer adds to the collection of other safety equipment that must be in a car driven in France, including a red safety triangle, at least one reflective yellow vest (which must be in the car, not in the boot), a first aid kit (not strictly required but a great thing to have available), and a set of spare light bulbs (again, not absolutely required as long as you don’t have a blown bulb when the police stop you).

Originally the idea was that the French police would fine drivers who did not have a breathalyzer in their car after November 1, 2012. The requirement was to produce a suitable device, meaning that it must be stamped with “NF” to indicate that it meets the “norme française” (is deemed suitable for use in France). Failure to comply would result in an 11 Euro fine. The fine is not a lot, but you can bet that the time wasted while the police pore over your car to possibly locate other faults that might lead to more painful fines will heap insult onto the 11-Euro injury. However, various implementation problems created a situation where the French government decided in February 2013 that the police would no longer fine motorists for not having a suitable breathalyzer in their car, but the requirement to have the device in the car still exists. Confused? Join everyone else…

[Image: Contralco chemical breathalyzer]

The question then is where to obtain a suitable breathalyzer before you drive in France. Car ferry companies do a good business selling the devices in ferry ports and on board ship, at a higher price than in France. Another option is to buy through companies that can supply suitable devices in your home country before you travel to France. For example, the aptly named FrenchBreathalyzer.com is happy to sell disposable devices at £2.45 each in the U.K. or £10.99 for a five-tube kit. You can also buy both individual breathalyzers and packs on Amazon.co.uk.

By comparison to prices outside France, Feu Vert, a well-known car part company throughout France, sells the same disposable breathalyzer for €1.25 (£1.05), while Amazon.fr offers Lot de 3 éthylotests chimiques homologués sous sachets individuels (set of 3 breathalyzers) for €9.90. Mind you, when I visited Feu Vert on 9 May 2012, they only had a breathalyzer costing €1.75. I guess these must be better than the ones advertised online.

Interestingly, Amazon also offers fairly cheap reusable devices such as Clatronic – AT 3260 – Testeur d’alcool (£18.90) as well as the more expensive devices like the ETHYLOTEST Détecteur d’Alcool Electronique CA2000 PX-PRO GOLD (€119). The problem with buying any electronic device is that you absolutely have to make sure that it meets the NF standard as otherwise you might have problems if stopped by the police. In fact, given the love of all things bureaucratic in France, it’s fair to predict that a failure to meet the designated standard will lead to much unhappiness.

Feu Vert’s price is more in line with what you can expect in other French outlets (supermarkets such as Intermarche sell breathalyzers cheaper at around €1.10 each), although the price in autoroute rest stops is likely to be higher.

[Image: CA2000 PX-PRO electronic breathalyzer, certified NF]

Unless you buy a reusable device you will have to buy at least two disposable breathalyzers because if you’re stopped by the police and have to use one device, you have to have another to comply with the law when you start driving again. In fact, you’ll probably want to try out one of the breathalyzers “just to see how it works”, so you’ll end up buying at least three. Disposable breathalyzers do have an expiry date and you might prefer to buy a reusable breathalyzer. However, these cost quite a bit more (for instance, Feu Vert offers the same device as advertised on Amazon.fr (see picture) for €119.90, while other suitable models can range up to €500 and beyond, depending on their features).

Car hire companies that operate in France have to provide breathalyzers along with the high-visibility jackets and red warning triangle required by law. Normally you will find a breathalyzer in the glove compartment (only one, though). Of course, it remains to be seen whether these items “disappear” from hire cars as souvenirs or simply because people want to use the breathalyzers themselves. I doubt that les flics will accept the excuse that a previous renter removed the breathalyzer, so this now becomes one of the “must-check” items for car rentals, along with a walk-around to look for scrapes and bumps.

It will be interesting to see if other countries decide to follow France’s lead and introduce a mandatory requirement to carry a breathalyzer in every car. I guess it all depends on whether this initiative succeeds in driving down road deaths even further. Let’s hope that it does.

– Tony

Follow my ramblings on Twitter!

Update 7 May 2012: For those travelling from Ireland to France on the Irish Ferries ship “Oscar Wilde”, you can buy a two-breathalyzer set on the boat for €7.50. This is obviously more expensive than waiting until you actually get to France and can buy breathalyzers in a shop, but they are stamped “NF” and seem to meet all the requirements.

Update 9 July 2012: Halfords sells a two-item pack for €7.49. I noticed that the pack is prominently positioned beside the cash registers. The same product is available from their UK web site for STG5.99, so the Irish price uses a pretty reasonable conversion rate, unlike many other UK-based retailers that operate in Ireland.
