AI Helps Insurers Break Free From Legacy Technology Debt

As an expert who has navigated the shifting tides of the insurance industry for over 40 years, Simon Glairy brings a seasoned perspective to the pressing challenge of application modernization. In an era where legacy systems are often viewed as both the backbone and the bottleneck of enterprise operations, he specializes in dismantling the “risk-avoidance loop” that keeps many carriers tethered to outdated technology. This conversation explores the delicate balance between maintaining stable core systems and the urgent need for digital agility, particularly as the industry faces a generational turnover of technical talent and the transformative potential of Agentic AI.

We delve into the operational risks of losing undocumented domain knowledge as teams from the 1980s retire, and why the traditional “if it ain’t broke, don’t fix it” mentality has become a dangerous paradox. Simon explains the strategic shift toward modernizing in place via the cloud—a pragmatic alternative to the runaway costs of full-scale replacements—and how deterministic data can act as a necessary guardrail for AI-driven code extraction. Finally, we examine the rise of outcome-based modernization contracts and identify the critical “boiling point” where inaction finally outweighs the perceived risks of transformation.

Many core policy and claims systems date back to the early 1980s, often maintained by teams that have long since departed. How does this loss of undocumented domain knowledge impact daily operations, and what specific steps can carriers take to capture this logic before their remaining experts retire?

When you walk into the IT department of a major carrier, you can almost feel the weight of these systems; some of them have origins tracing back to the early eighties or even the seventies. The daily operational impact is a stifling lack of agility, where simple updates to respond to regulatory changes become high-stakes gambles because no one truly understands the ripple effects within the code. We are seeing a quiet crisis where the people who know how these applications breathe—the ones who understand the “why” behind a specific claims logic written in 1984—are walking out the door into retirement. To capture this logic, carriers must move beyond passive documentation and start using Agentic AI to systematically extract and audit the embedded business rules. It is about turning that opaque technical debt into a transparent map of assets while you still have a few veterans left to validate the findings. This isn’t just a technical task; it is a race against time to preserve the institutional DNA that differentiates a firm in a competitive market.

Carriers often adopt a laggard view on investment because legacy systems are perceived as stable despite growing regulatory exposures. How can leadership reconcile the “if it ain’t broke” sentiment with the need for agility, and what metrics help quantify the cost of continued inaction?

The “if it ain’t broke” mentality is a comfortable delusion in an industry built on managing uncertainty, but it creates a self-reinforcing risk paradox where the fear of change eventually becomes the greatest threat. Leadership needs to understand that while the system might be “running,” it is failing the business every time a new product launch is delayed by months because the mainframe can’t handle the data structure. To reconcile this, we look at metrics like the “technical debt tax”—the percentage of the IT budget spent purely on keeping the lights on versus driving innovation—and the escalating cost of finding specialists who still understand 40-year-old technology. When you quantify the speed-to-market lag and the mounting costs of regulatory non-compliance, the stability of the legacy estate starts to look more like a slow-motion collision. It’s no longer about whether the system works today, but whether its rigidity is strangling the company’s ability to exist tomorrow.
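The "technical debt tax" described above is simple arithmetic, but making it explicit is often what moves a board. A minimal sketch of the calculation follows; the function name and all dollar figures are hypothetical, invented purely to illustrate the metric.

```python
# Hypothetical illustration of the "technical debt tax" metric:
# the share of the IT budget spent purely on keeping legacy
# systems running rather than on innovation.

def technical_debt_tax(run_cost: float, total_it_budget: float) -> float:
    """Percentage of the IT budget consumed by keep-the-lights-on work."""
    if total_it_budget <= 0:
        raise ValueError("total_it_budget must be positive")
    return 100.0 * run_cost / total_it_budget

# Invented example figures: $38M of a $50M annual IT budget goes to
# maintaining the legacy estate.
tax = technical_debt_tax(run_cost=38_000_000, total_it_budget=50_000_000)
print(f"Technical debt tax: {tax:.0f}% of IT budget")  # prints 76%
```

Tracked quarter over quarter alongside speed-to-market figures, a rising number like this turns "the system is stable" into a quantified cost of inaction.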

Full-scale core replacements frequently lead to runaway costs and the loss of differentiating business logic developed over decades. Why is “modernizing in place” via the cloud considered a more pragmatic alternative, and how does this approach better manage the transition of existing books of business?

I have seen far too many organizations embark on massive replacement programs only to find themselves drowning in runaway transformation costs and duplicated systems of record. The “modernize in place” approach is gaining traction because it respects the complexity of existing books of business rather than trying to force-migrate them into a rigid, off-the-shelf platform. By moving legacy platforms onto the cloud, carriers can improve scalability and data flow without the trauma of a “rip and replace” strategy that often leaves the most profitable legacy policies stranded on old hardware. This method allows for a strategic, controlled progression where you preserve the specialized business functionality that has been honed over decades. It effectively shifts modernization from being a catastrophic “reset” button to a continuous evolution that shortens the time to value and keeps the business operational throughout the transition.

Agentic AI allows for the systematic extraction of legacy code, yet LLMs can produce inaccurate results if left unchecked. How can organizations integrate deterministic contextual data to ensure modernization is repeatable, and what role do AI agents play in turning opaque liabilities into transparent assets?

Agentic AI is a game-changer because it allows us to analyze and migrate applications off expensive mainframes with a speed we couldn’t imagine a few years ago, but we have to be wary of the “hallucination” factor. To de-risk these initiatives, the secret is to wrap the power of Large Language Models in a protective layer of deterministic contextual data, which provides the hard facts and logic that the AI must follow. Think of it as a GPS for the AI: the LLM provides the engine and the creative pathfinding, but the deterministic data provides the actual map and the traffic laws it cannot break. AI agents act as tireless auditors, crawling through millions of lines of code to identify patterns and business rules that were previously hidden in the shadows. This process transforms a legacy system from a scary, undocumented liability into a clean, auditable set of modern assets that the business can finally manage with confidence.
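The "GPS" guardrail idea above can be made concrete: an extracted rule is only accepted when it exactly reproduces outcomes recorded from the legacy system. The sketch below assumes a hypothetical claims rule and invented test cases; it is not any vendor's actual pipeline, just one way the deterministic check could work.

```python
# Hedged sketch: validating an LLM-extracted business rule against
# deterministic cases captured from the legacy system. The rule and
# the recorded cases are hypothetical; the point is that AI-extracted
# logic is accepted only if it matches known legacy behaviour exactly.

from typing import Callable

# Ground truth: (claim_amount, policy_age_years) -> approved,
# recorded by replaying real inputs through the legacy mainframe.
LEGACY_CASES = [
    ((500.0, 1), True),
    ((500.0, 0), False),      # policies under a year old go to manual review
    ((25_000.0, 5), False),   # large claims always escalate
]

def candidate_rule(claim_amount: float, policy_age_years: int) -> bool:
    """Rule as proposed by the AI agent after reading the legacy code."""
    return policy_age_years >= 1 and claim_amount < 10_000.0

def validate(rule: Callable[[float, int], bool]) -> bool:
    """Deterministic guardrail: accept the rule only if it reproduces
    every recorded legacy outcome."""
    return all(rule(*inputs) == expected for inputs, expected in LEGACY_CASES)

print("rule accepted" if validate(candidate_rule) else "rule rejected")
```

Any mismatch sends the candidate back for re-extraction with the failing cases as context, so hallucinated logic never reaches the modernized codebase.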

Some firms treat modernization as an incremental project, while others seek outcome-based commitments from specialists. When is an incremental approach insufficient for a carrier’s timeline, and how do outcome-based contracts shift the financial and operational risk away from the insurer?

An incremental approach is like chipping away at a massive iceberg; it’s fine if you aren’t in a hurry and have a very stable market, but most carriers today are facing heat from every direction. If your timeline is being squeezed by a hard deadline—like a data center closure or a massive regulatory shift—then “chipping away” is simply insufficient and dangerous. This is where bringing in a modernization specialist for an outcome-based commitment changes the math entirely. In these arrangements, the vendor takes on the project risk, guaranteeing a specific result within a fixed budget and timeframe, which protects the carrier from the common trap of endless “consulting” hours. It shifts the burden of execution to those who have the specialized tools and AI-driven automation to ensure the transition doesn’t disrupt daily business-as-usual operations.

Delaying modernization is often compared to a frog in slowly heating water, where the danger becomes fatal only when it is too late. What are the early warning signs that a legacy estate has reached this "boiling point"?

The early warning signs are often subtle: a slight increase in the time it takes to produce a standard report, a growing difficulty in finding junior developers willing to touch the core system, or the “quiet panic” when a senior architect announces their retirement. You reach the boiling point when the energy required to maintain the status quo is so high that the organization has no resources left to actually innovate or escape the cycle. My forecast is that the next three to five years will see a massive shakeout where the “risk paradox” is finally broken by the sheer efficiency of Agentic AI. We are moving toward a world where modernization is no longer a once-in-a-generation trauma, but a normalized, automated process of continuous renewal. For insurers, the real danger is no longer the act of modernizing, but the continued opacity of their own systems; those who act now to make their core logic transparent will be the ones who survive the heat.

What is your forecast for legacy modernization?

I predict that within the next decade, the very concept of a “legacy system” will change because the barrier between old code and new platforms will be dissolved by near-real-time AI translation. We will see the end of the “rip and replace” era as carriers adopt a state of permanent evolution, where business logic is decoupled from the underlying hardware and can be ported across environments as easily as moving a file. The carriers who will dominate the market are those who stop viewing their core systems as static machines and start seeing them as fluid data assets that must be continuously refined. Ultimately, the industry will move away from the fear-based “risk-avoidance loop” and toward a model where technology is a transparent, agile partner rather than a mysterious, untouchable liability.
