Avoiding the entropy trap: How AI can quietly complicate everything

Anurag Vohra – Head of Customer Lifecycle, Trading and Analytics; Head of C&I India, NatWest Group argues that while AI aims to simplify work, it can quietly increase complexity by generating excessive outputs, tools, and noise that dilute decision-making.

By Anurag Vohra

As organisations invest heavily in artificial intelligence, the clear aim is simplification. We want faster decisions, cleaner processes, and less manual work. There is, however, a very real risk, a strange irony, that an organisation quietly becomes more complicated, not less. Suddenly we have more tools, more dashboards, more versions of the same report, and more notifications and calls to action than actual decisions. It is almost as if the more we try to organise things, the messier they become. That, in a word, is entropy.

 

Entropy, in simple terms, is the tendency of nature to move from order to disorder. A clean desk becomes messy. A simple process becomes a 12-step approval flow. Left unchecked, systems drift towards chaos. Much of the structure organisations have built exists precisely to control this chaos. AI, however, does not automatically reduce entropy; ideally it should, but the probability of it producing more chaos is not low. Let's see how.

 

In large organisations, entropy doesn't look like visible disorder. It shows up in far more subtle ways: multiple dashboards, multiple similar projects and multiple meetings to align those projects, emails, reports, presentations and updates that no one actually reads, and so on. Now, with AI, as the cost of creating output drops to nearly zero, volume increases; without strong filters, clarity drastically decreases.

 

There is (rightfully) a focus on understanding the mistakes AI can make, but an equally high risk that is much less talked about is that AI will enable so much activity that meaningful decisions become harder, not easier. If this seems abstract, let me go one layer deeper in the area I am closest to: software engineering. There, the entropy effect is becoming visible quite quickly. AI can now generate code. It can write unit tests. It can even refactor entire modules in seconds. Breathtaking!

 

Without the right discipline, though, it leads to a different problem:

- More code than anyone fully understands

- Repositories growing faster than ever

- Duplicate logic creeping in

- Inconsistent patterns emerging

 

You end up with systems that work but are increasingly hard to maintain, debug, or scale. The risk isn't just bad code - that can be controlled - it is the unowned complexity that has moved into production. That is why the fundamentals matter even more: clear architecture, strong code reviews, consistent standards, and engineers who can say no to unnecessary generation. AI can accelerate development, but it can just as easily accelerate technical debt. In the end, good engineering is still about clarity, not just output.
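One of the filters above, catching duplicate logic, is cheap to automate. As an illustrative sketch only (not a tool from the original piece, and the name normalisation is a deliberately crude assumption), a script can hash the syntax tree of every function in a repository, ignoring the function's own name, so that renamed copies of the same logic surface in review:

```python
import ast
import hashlib
from collections import defaultdict
from pathlib import Path

def body_fingerprint(func: ast.FunctionDef) -> str:
    """Hash a function's AST with its own name blanked out,
    so renamed copies of identical logic still collide."""
    dump = ast.dump(func, annotate_fields=False)
    # Crude normalisation: drop only the function's own name.
    dump = dump.replace(repr(func.name), "'_'", 1)
    return hashlib.sha256(dump.encode()).hexdigest()[:12]

def find_duplicate_functions(root: str) -> dict[str, list[str]]:
    """Return fingerprint -> locations for functions defined
    more than once under `root`."""
    seen = defaultdict(list)
    for path in Path(root).rglob("*.py"):
        tree = ast.parse(path.read_text(), filename=str(path))
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                loc = f"{path}:{node.lineno} {node.name}"
                seen[body_fingerprint(node)].append(loc)
    return {fp: locs for fp, locs in seen.items() if len(locs) > 1}
```

Run as a pre-merge check, a report like this turns "duplicate logic creeps in" from a vague worry into a concrete review item, which is exactly the kind of strong filter the argument calls for.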

 

The good news? The last statement above is true even when we move far away from engineering. Real simplification comes down to something far less exciting than artificial intelligence or even technology. It's about discipline.  

 

Organisations that manage this well tend to do a few things differently. They use fewer tools, but use them very deliberately. They prioritise decisions and clarity over output volume. Not every report needs to exist. Not every insight needs to be generated. These are organisations that keep asking, "What decision will this help us make?" They build really strong filters.

 

If the problem is a lack of good data, get your data in order. If your process is broken, fix the process. If the problem is behavioural, it may well need a behavioural solution. Being sophisticated enough to leverage the power of AI in understanding the problems, and sensible enough to use human judgement to carve out the path of highest impact and clarity – that is the superpower organisations can make their own.

AI can unlock a lot, but without the right culture and discipline, it can create noise faster than humans ever could.  
