Business lingo is suboptimal. The word “suboptimal” is a good example. It entered corporate biz-speak in the 1980s as a euphemism for failure, and soon became standard management jargon. Likewise, problems have become “opportunities”, and weaknesses are now “challenges”.
Euphemisms that disguise failure, deflect blame, or soften criticism are socially useful. Frank conversations get uncomfortable among people who work together, and nearly impossible with their leaders. “Upward management” (another fine euphemism) requires diplomatic tact. Plain talk can be a career-limiting move (“CLM”).
But politeness gets in the way when discussing risk. A “suboptimal risk mitigation” sounds gentler than a “grave danger”, but it’s less likely to receive the attention it needs. Corporate survival should trump political correctness, but a “significant risk posture misalignment” sounds so much nicer than an “existential threat to our business”.
Executives aren’t the only perpetrators. Technobabble obfuscates hard truths too, especially when others feel reluctant to show ignorance or challenge the resident alpha nerd. Would you admit you don’t know what a “rule-driven pre-inspection packet drop anomaly” is while everyone around you nods knowingly? (Hint: It means your firewall engineer screwed up.)
We should demand clarity about risks, even if it means showing our ignorance. And we should speak plainly, even if we risk making someone uncomfortable. Don’t sugarcoat it.
We can all do this, but it’s especially important for leaders. We’re social animals. Leaders’ communication styles are mimicked by subordinates, then by their subordinates’ subordinates, and so on down the line. An organization’s culture, including its communication style, reflects its leaders. Plainspoken leaders set a good example. And when there is a risk or problem, good leaders want facts, not fuzz.
Many organizational cultures mask risk with deflective language, some to such an extent that even dire threats can’t be safely discussed. Usually this problem is proportional to the number of Dilbert cartoons posted on cubicle walls. An engineer at one such place compared his troubled company to the Titanic, suggesting its leaders would form a committee to “investigate a reported iceberg proximity event” before admitting they were sailing in the wrong direction.
We live in a dangerous world. If you see an iceberg ahead, call it an iceberg and change course. Anything else would be suboptimal.
Michael McCormick is an information security consultant, researcher, and founder of Taproot Security.