By Chris Oldwood

Overload, 32(179):16, February 2024

Over-thinking is not over-engineering. Chris Oldwood presents some thought experiments to demonstrate why.

As the pendulum swings ever further towards being leaner and focusing on simplicity, I grow more concerned about how this is beginning to affect software architecture. By breaking our work down into ever smaller chunks and then focusing on delivering the next most valuable thing, how much of what is further down the pipeline is being factored into the design decisions we make today? And consequently, how much pain are we storing up for ourselves because we took the pejorative view of Big Design Up Front (BDUF) too far and ended up with No Design Up Front?

Wasteful thinking

Part of the thinking behind being leaner is an attempt to reduce the waste caused by speculative requirements, which in the past led many a project into a state of ‘analysis paralysis’: unable to decide what to build because the goalposts kept moving, or because the problem was so underspecified that there were simply too many options and everything had to be second-guessed. By focusing on delivering something simpler, much sooner, we begin to receive an initial return on our investment earlier, which helps shape the future design based on practical feedback from today rather than guesses about what we might need.

When we’re building those simpler features that sit nicely upon our existing foundations, we have much less need to worry about the cost of rework from getting it wrong, as it’s unlikely to be overly expensive. But as we move from independent features to those based around, say, a new ‘concept’ or ‘pillar’, we should not be afraid to spend a little more time looking further down the product backlog to see how any design choices we are considering now might play out later. Emergent Design is not a random walk but a set of educated guesses based on what we currently know, and strongly suspect, about the near future.

Thinking to excess?

The term ‘overthinking’ implies that we are doing more thinking than is actually necessary; trying to fit in everyone’s requirements and getting bogged down in analysis is definitely an undesirable outcome of spending too much time thinking about a problem. As a consequence, we are starting to think less and less up front about the problems we solve, to try to ensure that we only solve the problems we actually have, not the ones we think we’ll have in the future.

Solving problems that we are only speculating about can lead to over-engineering if they never materialise, or if they could have been solved more simply once the facts were eventually known.

But how much thinking is overthinking? If I have a feature to develop and only spend as much effort thinking as I need to solve that problem today then, by definition, any more thinking than that is ‘overthinking it’. Yet not thinking about the wider picture is exactly what leads to the kinds of architecture and design problems that begin to hamper us later in the product’s lifetime. And ‘later’ might not be measured in years; it could be weeks or even days if we are looking to build a set of related features that all sit on top of a new concept or pillar.

Building the simplest thing that could possibly work does not mean being naïve about the future.

The thinking horizon

Hence, it feels to me that some amount of overthinking is necessary to ensure that we don’t prematurely pessimise our solution and paint ourselves into a corner too quickly. We should at least factor related work further down the backlog into our thoughts, to help us see the bigger picture and shape today’s decisions so that they bias our thinking towards our anticipated future rather than an arbitrary one.

Acting on our impulses prematurely can lead to over-engineering if we implement what’s in our thoughts without having a fairly solid backlog to draw on, and over-engineering is wasteful. In contrast, a small amount of overthinking – thought experiments – is relatively cheap and helps maintain the integrity of the system’s architecture by narrowing the solution space to something more realistic. Very few software products need to scale to anything like what you read about in the technology news pages, despite what the optimists in the business might have you planning for.

The phrase ‘think globally, act locally’ is usually reserved for talking about the health of the planet, but I think it is fractal in nature: you can apply it at the software system level too, factoring in the design and architecture of the whole system even though you are only implementing a feature in one small part of it.

One has to be careful quoting old adages like ‘a stitch in time saves nine’ or ‘an ounce of prevention is worth a pound of cure’ because they can send the wrong message and lead us back to where we were before – stuck for eternity in The Analysis Phase. That said, I also want us to avoid ‘throwing the baby out with the bathwater’ and forgetting just how much thinking is required to achieve sustained delivery over the longer term.

Chris Oldwood is a freelance programmer who started out as a bedroom coder in the 80s writing assembler on 8-bit micros. These days it’s enterprise grade technology from plush corporate offices and the comfort of his breakfast bar. He has resumed commentating on the Godmanchester duck race but continues to be easily distracted by emails and DMs.
