From monolith to molecular: Why core banking must evolve

Core banking platforms are the lifeblood of financial services, encapsulating decades of expertise, stringent regulatory compliance, and hard-earned customer trust. These systems, conceived in the 1980s and 1990s and refined through the 2000s, are resilient, battle-hardened, and deeply trusted. They were not, however, designed for the demands of today’s digital-first landscape: real-time payments, embedded finance, continuous product experimentation, or AI-driven personalisation. What once served as the bedrock of banks is increasingly an obstacle to innovation in a world where customer experiences are measured against giants like Amazon and Apple.

Despite substantial investment in modernisation initiatives, banks have often achieved only incremental change that fails to address the underlying structural constraints. Merely overlaying modern customer expectations on frameworks designed for a slower, analogue era is a futile exercise. Instead, the industry needs to progress from monolithic platforms, through the modular architectures of the early cloud era, to a molecular, cloud-native approach that promises true agility and scalability.

The first cloud era: Monoliths in virtual clothing

The initial wave of cloud transformation in banking was essentially characterised by lift-and-shift strategies. This involved virtualising traditional core systems, which were originally designed for on-premises environments, and deploying them onto cloud infrastructure with minimal architectural change.

While this approach reduced the need for physical infrastructure management, it did little to alter the underlying economics. The limitations of a monolith persisted, even in the cloud, leading to inefficient capacity expansion and rising costs. Furthermore, many of these systems continued to rely on end-of-day or near-batch processing, limiting their ability to provide real-time insights or responsiveness.

Modular architectures: Progress or a false dawn?

The next step was the adoption of modular architectures, which promised greater flexibility by decomposing monolithic systems into functional modules like deposits, lending, or payments. However, these modular cores often introduced new forms of rigidity, such as incremental costs, licensing complexity, and integration overhead. Additionally, they failed to effectively isolate resource demand at a technical level, leading to distorted economics and inefficient use of infrastructure.

The microservices era: Valuable, but not sufficient

As banks ventured further into cloud-native territory, many adopted microservices architectures. While these provided significant benefits, including independent scaling of technical capabilities, improved deployment velocity, and greater team autonomy, they were not always the right solution. Over-decomposition led to unnecessary complexity, added latency, increased operational risk, and development overhead. Thus, while microservices remain a powerful tool, they are not a suitable standalone solution for modelling financial products.
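To make the over-decomposition trade-off concrete, here is a minimal sketch with entirely hypothetical service names and latencies (not drawn from any specific bank's design): a single product action orchestrated across many fine-grained services, where each hop adds a network round trip, a failure mode, and a deployment dependency.

```typescript
// Hypothetical illustration: one "open savings account" request fanning out
// across many fine-grained services. Per-hop latencies are assumptions.
type ServiceCall = { service: string; latencyMs: number };

const hops: ServiceCall[] = [
  { service: "customer-profile",    latencyMs: 12 },
  { service: "kyc-status",          latencyMs: 18 },
  { service: "product-catalogue",   latencyMs: 9  },
  { service: "interest-config",     latencyMs: 7  },
  { service: "fee-config",          latencyMs: 7  },
  { service: "ledger-provisioning", latencyMs: 25 },
  { service: "notification",        latencyMs: 11 },
];

// Sequential orchestration: end-to-end latency is the sum of every hop.
const totalLatencyMs = hops.reduce((sum, hop) => sum + hop.latencyMs, 0);

console.log(`${hops.length} service hops, ~${totalLatencyMs} ms end to end`);
// Seven hops for one product action: useful decoupling, but the orchestration
// itself becomes a source of latency and operational risk.
```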

Molecular banking: From modules and microservices to micro-components

Molecular banking, a more fundamental architectural shift, decomposes the core into its smallest functional components, allowing for greater configurability, combinability, and independent scalability in real time. This enables innovative forms of product design and offers several substantial benefits, including precision scaling and cost control, iterative product innovation, parallel experimentation, and customer-level personalisation.
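As a rough illustration of what molecular composition might look like in practice, the sketch below assembles deposit products from small, independently configurable components. The component names, parameters, and interfaces are hypothetical and not taken from any specific platform.

```typescript
// A minimal sketch of the molecular idea: a deposit product is assembled from
// small, independently configurable behaviours rather than defined inside a
// single monolithic module.
interface ProductComponent {
  name: string;
  // Transforms the account balance when the component's schedule fires.
  apply(balance: number): number;
}

const dailyInterest = (annualRate: number): ProductComponent => ({
  name: "daily-interest",
  apply: (balance) => balance * (1 + annualRate / 365),
});

const monthlyFee = (fee: number): ProductComponent => ({
  name: "monthly-fee",
  apply: (balance) => balance - fee,
});

const loyaltyBonus = (threshold: number, bonus: number): ProductComponent => ({
  name: "loyalty-bonus",
  apply: (balance) => (balance >= threshold ? balance + bonus : balance),
});

// Two products built from the same components with different parameters.
// Changing a product becomes a configuration change, not a core release.
const everydaySaver: ProductComponent[] = [dailyInterest(0.04)];
const premiumSaver: ProductComponent[] = [
  dailyInterest(0.05),
  monthlyFee(3),
  loyaltyBonus(10_000, 20),
];

const runSchedule = (components: ProductComponent[], opening: number): number =>
  components.reduce((balance, component) => component.apply(balance), opening);

console.log(runSchedule(premiumSaver, 12_000)); // components applied in order
```

Because each component is configured and scaled on its own, experimentation can run in parallel: two product variants can differ only in the parameters passed to the same components.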

Migration without the shock

Core transformation has typically been hindered by the risk associated with migration. However, molecular banking offers a safer path by allowing for incremental migration of products and capabilities while legacy and modern systems coexist, thereby reducing operational risk and spreading costs.
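One way to picture this coexistence is a strangler-fig-style routing layer that directs each product to whichever core currently owns it, so products move across one at a time. The sketch below is illustrative only; the interfaces, product names, and ownership table are assumptions rather than a vendor API.

```typescript
// Hypothetical adapters over the two cores; in practice these would wrap the
// legacy platform's and the new platform's real integration points.
interface CoreBankingSystem {
  postTransaction(accountId: string, amount: number): Promise<void>;
}

const legacyCore: CoreBankingSystem = {
  async postTransaction(accountId, amount) {
    console.log(`[legacy]    ${accountId} ${amount}`);
  },
};

const molecularCore: CoreBankingSystem = {
  async postTransaction(accountId, amount) {
    console.log(`[molecular] ${accountId} ${amount}`);
  },
};

// Products are migrated incrementally by flipping entries in this table.
const productOwnership: Record<string, "legacy" | "molecular"> = {
  "mortgage": "legacy",
  "current-account": "legacy",
  "everyday-saver": "molecular", // first product migrated
};

async function postTransaction(product: string, accountId: string, amount: number) {
  const core = productOwnership[product] === "molecular" ? molecularCore : legacyCore;
  await core.postTransaction(accountId, amount);
}

// Traffic for migrated products flows to the new core; everything else is untouched.
postTransaction("everyday-saver", "acc-123", 250).catch(console.error);
```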

Future-proofing for an AI-native era

As banking enters an era characterised by AI-enabled services, real-time decision-making, and hyper-personalised financial management, molecular banking provides a foundation inherently compatible with AI-native models. This enables product managers to evolve offerings with the same discipline and velocity that software teams apply to code, transforming the core system from a system of record into a system of continuous innovation.

Banks must be realistic about the limitations of past approaches. Neither cloud hosting nor modularisation fully addressed the structural constraints of the monolith. As customer expectations accelerate and AI-driven capabilities become standard, these compromises will become increasingly visible. Molecular banking represents a re-engineering of the core operating model, aligning core systems more closely with the realities of modern financial services.

Institutions that embrace molecular architectures will be better positioned to innovate continuously, personalise at scale, and adapt as technology and customer expectations evolve. Conversely, those that do not will remain constrained by core systems optimised for historical operating models, limiting their ability to effectively meet the demands of modern and future financial services.

Paul Payne, CTO at SaaScada

