A Lack of Insight.
Most business reports are well-meaning and highly publicized, but also of hideously low standard. Typical of the genre, in my opinion, are the reports on office productivity highlighted in Insight magazine and linked here: Insight: Proof of link between office design and productivity
The font is designed for impact, as is the headline. Regrettably, the content is the product of pushing calculator buttons, writing down unprocessed numbers and developing unsubstantiated assertions. The whole has the intellectual rigidity of porridge.
Thus while it is great that 61% of workers like their workspace, we have no idea what this means. These employees like their workspace compared to what, for example: the back of the bus, the local Starbucks, their last place of employment? And, quite importantly, how and why was this survey sample selected?
Also, we can prove nothing. We can show strong evidence, or ‘be almost certain that…’, yet there is a tiny chance that the world is created just for you, dear reader. The rest of us are merely ciphers of your emotions and the moon is actually made of chocolate biscuits. There may be very strong evidence to the contrary but this slim possibility cannot be entirely overruled, especially — perhaps — for you now that you have considered it. There will never be incontrovertible proof to the contrary. This notion of proof, and its permanent absence, lies at the heart of the myriad versions of God, never mind office designs.
Statistics, then, must be as meaningful as possible. Otherwise they deserve all the disapprobation hurled their way. Stats have to pass the ‘so what?’ test. At least as importantly, they need to pass the ‘what does this mean?’ test. Spraying percentages around means almost nothing in statistical terms; percentages are merely descriptive and utterly worthless without backup. Consider the truism ‘Fewer than 50% of British hospitals are better than average: shock!’
As an example from the Insight piece discussed here, take “Over two-thirds (69 percent) claim that the basis of a successful (and hence productive) working environment is a mix of spaces that offer people a choice of places to work depending on their individual preferences and job function”.
A. What does this mean?
i. What mix are we talking about?
ii. What type of choice is on offer?
iii. How is productivity linked to success?
iv. How do you define success?
B. How representative is this sample?
i. Is there a vested interest in the respondents saying what they are saying?
ii. What alternatives are under discussion?
Therefore, please may we all do something meaningful with our data when we collect them. Many companies assiduously collate data, spending considerable time, effort and money. Then they don’t know what to do with those data apart from making basic sums, which are pretty meaningless. In the case of the Insight article, pretty and meaningless.
You should be seeing the following after you have invested your company’s resources in surveys:
- The distillation of just a few meaningful scales from the slew of individual questions. A scale is meaningful when its Cronbach’s alpha exceeds 0.7. If this is Double Dutch then you have, er, overwhelming evidence that your data are being improperly used.
- These scales should then be statistically assessed for associations.
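To make that first step concrete, here is a minimal sketch of computing Cronbach’s alpha with NumPy. The three “ambient comfort” items and the response figures are invented purely for illustration; they are not taken from the Insight data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items on the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented 1-5 ratings from eight respondents on three hypothetical items
# (light, temperature, noise) -- purely illustrative numbers.
responses = np.array([
    [4, 4, 3],
    [2, 3, 2],
    [5, 5, 4],
    [1, 2, 1],
    [3, 3, 3],
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
])

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")  # above 0.7 suggests a usable scale
```

Only once a candidate scale clears the 0.7 threshold is it worth carrying forward into the association tests described above.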
Without scales we have “29 percent do not like the design of their workplace. Although 60 percent are happy with the amount of natural light they have in their workplace and 51 percent are happy with the general temperature.”
It is almost certain, for example, that light and temperature should be on the same scale (rather than listed separately) and that these factors are inextricably linked with an employee’s sense of workplace design (see Knight & Haslam, 2010). With scales we can see (a) if these data are reliable, (b) if they are valid and (c) if they are linked.
Thus we might be able to say something like “control over one’s ambient working conditions is directly associated with one’s perception of workplace design.” And we have a concept of something useful. However without statistical investigation this phrase remains a hypothesis rather than a conclusion.
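As a sketch of the kind of statistical investigation meant here, one could test that hypothesis with a simple permutation test on the two scale scores. The scores below are invented for illustration, as are the variable names; a real analysis would use the reliable scales built from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented scale scores for ten respondents -- purely illustrative.
control_score = np.array([2.1, 3.4, 4.0, 1.8, 3.1, 4.5, 2.6, 3.8, 1.5, 4.2])
design_score  = np.array([2.4, 3.0, 4.2, 2.0, 3.3, 4.1, 2.2, 3.9, 1.9, 4.4])

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson correlation between two score vectors."""
    return float(np.corrcoef(x, y)[0, 1])

observed = pearson_r(control_score, design_score)

# Permutation test: shuffle one variable to simulate "no association"
# and count how often chance alone matches the observed correlation.
n_perm = 10_000
hits = sum(
    abs(pearson_r(rng.permutation(control_score), design_score)) >= abs(observed)
    for _ in range(n_perm)
)
p_value = hits / n_perm

print(f"r = {observed:.2f}, permutation p = {p_value:.4f}")
```

A small p-value would let the phrase graduate from hypothesis to supported conclusion; a large one would send us back to the drawing board.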
Once this is all done, the development of a meaningful model illustrates slews of collected data in one clear picture.
All of these steps are required. In the Insight report compilation we see none of them. This is not to decry the hard work that has gone into these reports. However as it stands the figures are of primary school standard (none of the maths here is beyond the average ten year old), they are partially interpreted and lack any sense of meaningful validity.
Years ago my PhD supervisor looked at sheets of paper that represented hours of my honest but essentially uninspired and straightforward endeavour. He kindly raised his eyes above the line of his black spectacles, handed back my sheaf of endeavour and said softly “Craig, don’t bring me shit.”
I offer the same advice to the compilers of myriad reports like these. Collect, think and analyse – percentages do not an index make. Take your time and do things properly. We may then have something to move us all on from the reinforced heuristic of the past century or so.