Can Insurance Survive the Rise of Secondary Climate Perils?

Simon Glairy is at the forefront of a paradigm shift in how the global financial markets perceive and price climate risk. As a recognized authority in Insurtech and AI-driven risk assessment, Glairy has spent his career dissecting the structural failures of traditional insurance models that were designed for a world that no longer exists. His expertise lies in bridging the gap between historical actuarial science and the real-time, data-heavy demands of a volatile climate. In this discussion, we explore the erosion of traditional capital structures, the alarming expansion of the global protection gap, and why the industry must pivot from historical retrospection to high-resolution, predictive earth observation.

The following conversation examines the rising dominance of secondary perils over major catastrophes and the technical limitations of legacy modeling. It delves into the practicalities of building new risk assessment frameworks from scratch and why the next generation of insurance leaders will look more like data scientists than traditional underwriters.

Annual insured losses from events like severe convective storms and hail have recently approached $100 billion, consistently outpacing major hurricanes. Why are these frequent events now driving the global loss baseline, and how does this shift fundamentally alter the way we must approach capital structures?

For decades, the insurance industry was essentially obsessed with the “monster storm”—the Category 5 hurricane or the massive earthquake that happens once a generation. However, in 2025, we saw a staggering $98 billion in insured losses coming strictly from non-peak perils like hail, flooding, and wildfires. This represented the vast majority of the $108 billion in total natural catastrophe losses for that year, proving that the cumulative weight of frequent events is now a heavier burden than the singular, headline-grabbing disasters. This shift forces us to rethink capital structures because you can no longer build a stable portfolio by just hedging against a single 1-in-100-year event. We have to move toward a framework that accounts for this “new baseline” of volatility, where the aggregate loss from smaller, localized storms can bankrupt a carrier just as easily as a major hurricane.
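The arithmetic behind this "new baseline" argument can be illustrated with a toy Monte Carlo sketch. Every figure here is an illustrative assumption, not market data: 30 small events a year with exponentially distributed severities averaging $2 billion, compared against a single hypothetical $60 billion "monster storm" loss.

```python
import random

random.seed(7)

def simulate_year(n_events: int = 30, mean_loss: float = 2.0) -> float:
    """Toy annual aggregate from many small storms.
    Figures are in $bn and purely illustrative."""
    return sum(random.expovariate(1.0 / mean_loss) for _ in range(n_events))

years = [simulate_year() for _ in range(10_000)]
peak_event = 60.0  # hypothetical single 'monster storm' loss, $bn

# How often does the drumbeat of small events out-cost the big one?
worse_than_peak = sum(1 for y in years if y > peak_event) / len(years)
print(f"Years where frequent-event total exceeds a "
      f"${peak_event:.0f}bn peak loss: {worse_than_peak:.1%}")
```

Under these assumed parameters, roughly half of all simulated years see the aggregate of small events exceed the single peak loss, which is the point: capital held only against the rare headline event systematically understates the annual burden.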

Over half of all global economic losses from natural catastrophes currently go uninsured. What specific factors are causing this protection gap to widen even as technology advances, and what are the long-term financial consequences for property owners when traditional capacity pulls back from high-risk zones?

The reality is that nearly 57% of total economic losses—exceeding $181 billion in 2024—went uninsured, and that gap is widening because our industry is struggling to price the risks that are growing the fastest. When traditional insurers cannot accurately model a peril like a wildfire or a flash flood, they simply stop writing coverage in those areas to protect their own solvency. This creates a devastating ripple effect where a homeowner or a small business owner in California or the Midwest suddenly finds themselves holding all the risk on their own balance sheet. The financial consequences are visceral; when the 2025 Los Angeles wildfires caused $40 billion in insured losses, many property owners realized too late that their policies had hidden gaps or had been canceled entirely. Without a way to transfer this risk, these communities face a much slower recovery process, often taking months or years to rebuild what was lost in a single afternoon.

Legacy catastrophe models were originally built for peak perils with deep historical datasets, such as earthquakes. How do these technical limitations lead to a “dangerous feedback loop” in underwriting, and what are the specific challenges of modeling hyper-localized risks like soil moisture or fuel connectivity?

Legacy models are essentially looking in the rearview mirror, relying on decades of historical data that don’t reflect the current pace of climate change. This creates a dangerous feedback loop where an insurer can’t price a risk because they can’t model it, so they pull back, leaving the market underserved and underfunded. Modeling secondary perils is a different beast entirely because they are hyper-localized; you can’t just look at a general region and predict a wildfire’s behavior without understanding fuel connectivity or soil moisture levels. A house on one side of a ridge might be perfectly safe, while a house just 500 feet away is in a high-intensity fire path due to vegetation density. Traditional tools weren’t built to ingest this level of granular, non-linear data, which is why they failed so spectacularly during the $40 billion wildfire events in early 2025.
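The parcel-level variables mentioned above can be combined into a hazard score. The function below is a deliberately simplified sketch: the weights, the linear form, and the 45-degree slope cap are all illustrative assumptions, not a calibrated wildfire model.

```python
def wildfire_hazard_score(soil_moisture: float,
                          fuel_connectivity: float,
                          slope_deg: float) -> float:
    """Toy parcel-level hazard score. Inputs: soil moisture and fuel
    connectivity on [0, 1], slope in degrees. Weights are illustrative."""
    dryness = 1.0 - min(max(soil_moisture, 0.0), 1.0)  # drier -> riskier
    slope_factor = min(slope_deg / 45.0, 1.0)          # steeper -> faster spread
    return round(0.5 * dryness
                 + 0.35 * fuel_connectivity
                 + 0.15 * slope_factor, 3)

# Two parcels 500 feet apart can score very differently:
print(wildfire_hazard_score(0.35, 0.9, 20))  # dry, connected fuels on a ridge
print(wildfire_hazard_score(0.60, 0.1, 5))   # irrigated, sparse vegetation
```

Even this crude version makes the article's point concrete: regional averages wash out exactly the variables that separate the safe house from the one in the fire path.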

Integrating real-time satellite imagery and machine learning could potentially modernize risk assessment. What are the practical steps for building these models from scratch rather than patching legacy systems, and how do parametric structures help eliminate the delays typically found in traditional claims processes?

Building from scratch means we have to stop trying to “bolt on” climate data to old actuarial tables and instead start with the physical variables themselves. We need to ingest high-resolution terrain data, real-time satellite imagery, and vegetation mapping into machine learning models that can identify patterns no human actuary could see. This leads us directly to parametric insurance structures, which are a total game-changer because they pay out based on objective, measured triggers—like wind speed or flood depth—rather than a subjective loss adjustment. In a traditional system, a claimant might wait months for an adjuster to walk the property and verify damages, but with a parametric trigger, the payout can be initiated almost instantly. This speed is vital for recovery, ensuring that a farmer or business owner has the liquidity they need immediately after a $100-billion-loss year, rather than sinking into debt while waiting for paperwork to clear.
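The settlement logic of a parametric contract is simple enough to show directly. This is a minimal sketch, not a real product: the metric names, threshold, and payout amount are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ParametricPolicy:
    """A simple parametric contract: pays a fixed amount once a
    measured index crosses an agreed threshold. All values here
    are illustrative, not a real product."""
    trigger_metric: str   # e.g. "wind_speed_mph" or "flood_depth_m"
    threshold: float      # agreed trigger level
    payout: float         # fixed payout once triggered

    def settle(self, observed: float) -> float:
        # No loss adjuster: the observed measurement alone
        # determines whether the payout fires.
        return self.payout if observed >= self.threshold else 0.0

# A hypothetical policy triggering at sustained winds of 100 mph:
policy = ParametricPolicy("wind_speed_mph", 100.0, 250_000.0)
print(policy.settle(112.0))  # trigger exceeded, full payout
print(policy.settle(85.0))   # below threshold, no payout
```

Because `settle` depends only on an objectively measured input, the payout can be wired to a sensor feed or satellite-derived index and released the moment the trigger is crossed, which is the speed advantage described above.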

Modern underwriting now requires expertise in earth observation and data science rather than just traditional actuarial work. How should companies restructure their talent acquisition to better price risk at the individual parcel level, and what specific skill sets will define the industry’s leaders over the next decade?

The days of the traditional actuary being the sole gatekeeper of risk are over; the new industry leaders will be those who can speak the language of both finance and earth observation. Companies need to stop hiring solely from business schools and start looking for data scientists who understand how to manipulate terabytes of satellite data to price risk at the individual parcel level. This requires a skill set that includes machine learning, geospatial analysis, and a deep understanding of environmental physics. If you want to accurately price a convective storm or a wildfire, you need someone who knows how soil moisture interacts with local wind patterns, not just someone who can read a 30-year loss history. The carriers that invest in this type of specialized talent will be the only ones capable of writing the complex coverage that the market is currently starving for.

What is your forecast for the global insurance industry’s ability to close the protection gap?

I believe the protection gap will actually continue to widen in the short term, perhaps reaching even more staggering heights than the $181 billion we saw recently, before we see a significant correction. The industry is currently in a painful transition period where legacy carriers are pulling back because they haven’t yet mastered the data science required for today’s secondary perils. However, as parametric structures become more mainstream and AI-driven modeling proves its accuracy, we will see a new wave of capital enter the market specifically to cover these “unmodeled” risks. My forecast is that by the end of the decade, the most successful insurance platforms won’t look like traditional companies at all, but will operate as technology-first ecosystems that use real-time environmental data to provide instant, transparent coverage for the millions of people currently left behind.
