I was in the middle of a perfectly pleasant briefing on some high availability solutions last week when a slide was presented that asserted that “32% of recovery operations fail”. The statistic apparently came from research done by a market research company that I won’t name here. I’m sure that the research was conducted according to the usual norms and that the data was collated and assessed with rigor, so I didn’t have a problem with the data source.
My issue was entirely to do with context. Presenting a statistic like this without putting it into context makes the information worse than useless. The logic for including the information on the slide was to emphasize the need for companies to think about the software that they use for disaster recovery. This is a good thing to do, as every company should have a well-thought-out plan for disaster recovery that is proven to be effective, insofar as this is possible.
However, stating that 32% of all disaster recovery operations fail without informing the listener about essential surrounding information presents an incomplete and possibly misleading picture. That surrounding information includes the type of recovery operation that failed (a complete server, a disk or two, or maybe just a single file), the size of the computers and installations where the failure occurred (I assume that large enterprises have better recovery procedures than perhaps exist in small branch offices), the expertise and experience of the administrator who attempted the recovery, the application and type of data involved, and the tools used.
Apart from anything else, I think that CIOs and IT Directors would be under terrific pressure from the business if 32% of recovery operations failed, and that administrators who failed in all these attempts would be receiving the proverbial pink slips. You’d imagine that there is a good business reason that justifies the expense and effort of a recovery operation, and that the folks who pay for the work wouldn’t be best pleased if their requests for data recovery ended in failure 32% of the time.
Perhaps I am ultra-sensitive to this kind of thing after many years of being chastised by editors about leading statements that have appeared in my writing. In particular, I think the editors at Windows IT Pro Magazine do a fine job of reminding authors that they can’t simply scatter statements around articles without providing the evidence to prove to readers that the information rests on a solid foundation of fact. However, I also think that common sense should sound a loud warning bell when marketing (or technical) personnel representing a company that wants to sell its product put out leading statements backed up by nothing but waffle and hot air.
Low standards inevitably lead to uninteresting presentations and compound the risk of death by PowerPoint. Let’s keep presenters focused by making sure that any important statistic presented on a slide is backed up by hard data.
– Tony
Hi Tony,
I totally agree with you. Similar to your experience in this briefing, I have found that it is in the area of DR/HA that these kinds of statistics are thrown around so liberally. Generally it is to create a sense of fear in people that their precious company data is at risk, and that by buying this new product they will be protected. If only it were that simple.
Thanks for the article.
J