I’m thrilled to sit down with Simon Glairy, a renowned expert in insurance and Insurtech, particularly in risk management and AI-driven risk assessment. With years of experience navigating the insurance industry’s complex data challenges, Simon offers a unique perspective on how technology can transform fragmented systems into streamlined, decision-ready solutions. In our conversation, we dive into the data issues plaguing insurance carriers, explore platforms that act as invisible infrastructure, and discuss how these solutions address industry-specific needs while easing operational burdens. Join us as we unpack the future of insurance data management.
How did your journey in insurance and Insurtech shape your understanding of the data challenges carriers face today?
My career in insurance started in risk management, where I quickly saw how fragmented data systems created inefficiencies and risks across departments. Working on AI-driven risk assessment, I realized that without unified, reliable data, even the most advanced tools couldn’t deliver their full potential. Over the years, I’ve seen firsthand how disconnected core systems and manual reconciliations slow down everything from audits to executive decision-making. That experience pushed me to focus on solutions that bridge those gaps, ensuring data isn’t just collected but is actually usable for critical operations.
What do you see as the most pressing data problem for insurance carriers right now?
The biggest issue is fragmented data. Carriers often run multiple systems that don’t talk to each other, leading to different versions of the truth across Finance, Risk, and Claims. That’s not just an IT inconvenience; it’s a financial and operational liability. It slows down processes like audits, increases compliance risk, and delays insights for leaders. When regulators and rating bureaus demand higher standards, carriers can’t afford to piece together data manually or rely on conflicting reports.
Can you break down the concept of an intelligence layer for data management in simple terms?
Think of an intelligence layer as a behind-the-scenes translator for all the data in an insurance company. It pulls together information from disconnected systems, reconciles it, and makes sure everyone—from Finance to Underwriting—is working from the same, accurate set of numbers. Unlike a typical dashboard that just displays data, this layer acts as invisible infrastructure, organizing and unifying everything before it even reaches the end user. It’s crucial because it saves time and reduces errors without adding more tools to an already crowded tech stack.
How does such a platform stand out compared to traditional data tools that carriers might already use?
Traditional tools like generic business intelligence platforms or data warehouses often require a lot of customization and ongoing management by internal teams. They’re not built with insurance-specific needs in mind, like regulatory reporting or complex policy data. A purpose-built intelligence layer, on the other hand, is tailored for the industry’s unique operational and compliance demands. It’s designed to integrate seamlessly and handle the heavy lifting of data orchestration, so carriers don’t need to keep reinventing the wheel.
What specific benefits does a unified data platform bring to different departments within an insurance carrier?
For Finance, it means faster, more accurate reporting during close cycles without endless manual reconciliations. Risk teams get a clearer picture of exposures across the board, which is vital for sound decision-making. Claims departments benefit from streamlined data that speeds up processing and reduces errors. Executives, meanwhile, gain real-time insights without wading through conflicting departmental reports. Essentially, it puts every department on the same page, which boosts efficiency and cuts down on operational friction.
How does a fully managed data solution ease the burden on internal teams at insurance carriers?
A fully managed solution takes the day-to-day grind of data orchestration off the plates of IT and data teams. Instead of spending hours on system maintenance, permissions, or troubleshooting pipelines, they can focus on higher-value tasks like analysis or strategy. It’s like having an external partner handle the plumbing of your data infrastructure—everything runs smoothly behind the scenes, and internal staff aren’t bogged down by the technical minutiae. This is a game-changer for carriers with limited resources or outdated systems.
Who do you think benefits most from these kinds of data solutions in the current insurance landscape?
Right now, regional and super-regional U.S. carriers are prime candidates because they often deal with legacy systems and fragmented data but have the scale to invest in modernization. Workers’ Compensation carriers also stand out because they need tight data integration to track policy-level profitability and stay compliant. Additionally, self-insured organizations and public risk pools benefit hugely from standardized reporting and transparency across complex, multi-entity structures. These groups are under pressure to centralize data and prepare for technologies like AI, which makes unified platforms a natural fit.
What is your forecast for the future of data management in the insurance industry?
I believe we’re on the cusp of a major shift where data management becomes the backbone of competitive advantage in insurance. Over the next five to ten years, carriers that invest in unified, intelligent data infrastructure will pull ahead—faster decision-making, better compliance, and seamless adoption of AI will set them apart. Those sticking with fragmented systems will struggle to keep up with regulatory demands and market expectations. The focus will increasingly be on invisible, purpose-built solutions that deliver value without adding complexity, fundamentally changing how carriers operate.