Decision-Making: The Cult of Haste and the Curse of Waste

Guest post by Martin Binks – former dean of Nottingham University Business School and Professor of Entrepreneurial Development at its Haydn Green Institute for Innovation and Entrepreneurship.

“Act in haste, repent at leisure” is a much-quoted maxim, but it appears to be one in which we have little faith. Acting in haste is not just a norm in the sphere of business: it is something that is championed and even celebrated. The ability to make decisions swiftly is far more likely to be hailed as a strength than it is to be cited as a potential failing.

Naturally, there are many occasions when speed is of the essence. A world without deadlines is a world without dynamism. Yet the line between imperative urgency and needless haste is often crossed, in large part because we seldom contemplate the vital difference between the two.

The distinction is undoubtedly one our students would do well to learn. They are, after all, the decision-makers of the future – a future liable to be increasingly characterised by uncertainty and change – and many of the myriad choices they will confront will demand far more than knee-jerk responses rooted in the mistaken belief that rapidity is the only true gauge of their problem-solving prowess.

So how might we help them develop a mindset that acknowledges and values the advantages of adopting a philosophy that is more comprehensive, rigorous and perhaps even ingenious? It seems to me that the issue revolves largely around the age-old phenomenon of “what if”.

Hindsight may well be a wonderful thing, as we are routinely assured, yet few are the occasions when we employ it to prove ourselves spectacularly right. It is instead usually called upon to provide a painful lesson, whether in the form of a wistful survey of the road not taken or a too-late reflection on how things might have been.

These miseries can frequently be traced back to a decision-making process founded on the belief that choices are made only with reference to some kind of prepared catalogue of ready-made, fully formed options. This is a dangerous misconception. Decisions should stem from ideas, and the best ideas are not chosen: they are conceived. Attempting to select a winner from a neat list of available alternatives is no better than attempting to select a winner in a horse race – which is to say, if we are being harsh, that it is no better than gambling.

Gambles may be required from time to time, of course, but they are no basis for a methodology. Similarly, the conscious irresponsibility that underpins so many decisions – for example, those we know only too well merit more deliberation than we are prepared to offer – is not a trait we should be happy to see, let alone encourage, in our students.

Accordingly, we need to turn the “what if” scenario on its head. We need to deal with “what ifs” before rather than after the event. We need to move away from a regret-driven culture of “What if we had done this?” and nurture a prescient culture of “What if we were to do this?”. In short, we need to replace hindsight with foresight.

The first step on this journey is to examine the root causes and component parts of any problem. This should take place before potential solutions are even considered, because deconstruction must necessarily precede reconstruction. We have to understand the matter at hand fully, particularly if it is complex, if we sincerely hope to address it by means of anything other than quick-fix incrementalism.

The way is then clear to generate ideas – lots of them – and, crucially, to proceed in the comforting knowledge that the vast majority will be bad. For the reality is that most people do not miraculously propose a bona fide “great idea”, because that is not how creativity works. We might set out with only a bit of a good idea, which, combined with a bit of another good idea and an improvement to a bad idea and a reaction to a thoroughly silly idea, will slowly form the makings of a feasible idea. It is only by producing numerous ideas and assessing the worth of every last one of them that we eventually recognise the cream of the crop.

Aside from its rigour and its focus on the long term, a notable appeal of this approach is that it is both mentally stimulating and fun. It underlines, too, that creativity is not the exclusive preserve of “visionaries” and “geniuses”: anyone has the capacity to challenge the conventional, to make connections and to think beyond the realm of tired clichés and lazy tropes.

Speaking of tropes, I am aware that advocacy of any decision-making procedure that is more time-consuming and, by extension, more costly is traditionally met with hoary warnings about return on investment. And it is true enough that, even in the presence of a determination to pre-empt each and every “what if”, there is always a chance that sizeable energy and funds might be devoted to radical concepts that ultimately come to nothing.

I cannot help feeling, though, that such an attitude is sadly typical of a broader propensity to prize the lure of short-term gains over the prospect of long-term benefits. Overall, I think we would do the wider world a much better service if we inculcated in our students a firm conviction that genuine waste lies not in pursuing novel ideas that might lead to dead ends but in deterring novel ideas in the first place.