Building a Future-Proof Data Warehouse for VGM’s Insurance Division

As VGM Group prepared to retire its legacy insurance platform and migrate to a new third-party-hosted system, the organization faced a pivotal challenge. How could years of critical insurance data be preserved while building a scalable, forward-looking reporting infrastructure, especially when both the old and new systems were still evolving?

SPR was engaged to architect a unified data warehouse capable of integrating and transforming data from two distinct platforms. The solution would improve reporting accuracy, streamline performance, and enable VGM’s internal analysts to confidently manage the system after delivery.

The Challenge

At the start of the project, VGM’s insurance division lacked a centralized data warehouse. Reporting and analytics were being executed directly against a transactional database connected to a legacy system. This approach introduced performance bottlenecks, increased risk to business operations, and limited the ability to scale.

Compounding the challenge, the new enterprise system, hosted by a third party, was still under development. The internal team was gaining familiarity with its data structures in real time, meaning many transformation requirements had not yet been defined. Despite this uncertainty, the warehouse needed to support both systems seamlessly and be ready for long-term use.

The Approach

Before the project began, the VGM team narrowed the scope to the data they knew would be critical for historical and ongoing reporting. This preparation allowed SPR to concentrate on key areas while also supporting the system implementation team.

The project kicked off with an in-depth exploration of the legacy data landscape. This involved mapping the structure, identifying inconsistencies, and designing a more organized and consistent schema for the future warehouse. Rather than performing a direct lift-and-shift, the team focused on transforming the data to improve reporting clarity, eliminate duplication, and resolve integrity issues.
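
No code survives in this case study, but the kind of legacy-data profiling described above typically begins with simple duplicate and null checks. The sketch below is a hypothetical illustration in Python with pandas; the connection string, table, and column names are placeholders, not VGM's actual schema.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string and table name, for illustration only.
engine = create_engine(
    "mssql+pyodbc://user:pass@legacy-server/insurance"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
policies = pd.read_sql_table("policy", engine)

# Exact duplicate rows are a common artifact in legacy transactional data.
dupes = policies[policies.duplicated(keep=False)]
print(f"{len(dupes)} duplicated rows out of {len(policies)}")

# Null rates per column highlight integrity issues worth resolving
# before the warehouse schema is finalized.
null_rates = policies.isna().mean().sort_values(ascending=False)
print(null_rates.head(10))
```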

The warehouse was built using Azure Data Factory (ADF), a low-code data integration tool selected for its flexibility and ease of maintenance. This platform aligned well with VGM’s internal team, which was composed primarily of data analysts rather than software developers. By designing pipelines in ADF, SPR ensured the solution would be supportable by VGM’s team without deep technical overhead.
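
ADF pipelines are typically authored in the visual designer, which is what makes the platform approachable for analysts, but the same pipelines can also be created programmatically. The sketch below uses Microsoft's azure-mgmt-datafactory SDK for Python purely as an illustration of a simple copy pipeline; all subscription, resource, and dataset names are invented, and it assumes the referenced datasets and linked services already exist in the factory.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSink,
    AzureSqlSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

# Hypothetical subscription, resource group, and factory names.
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One copy activity moving legacy rows into the warehouse; the referenced
# datasets and their linked services are assumed to exist already.
copy_step = CopyActivity(
    name="CopyLegacyPolicies",
    inputs=[DatasetReference(type="DatasetReference",
                             reference_name="LegacyPolicyDataset")],
    outputs=[DatasetReference(type="DatasetReference",
                              reference_name="WarehousePolicyDataset")],
    source=AzureSqlSource(sql_reader_query="SELECT * FROM dbo.Policy"),
    sink=AzureSqlSink(),
)

# Analysts can open and maintain the same pipeline in the ADF visual designer.
adf.pipelines.create_or_update(
    "rg-insurance-dw", "vgm-adf", "NightlyLegacyLoad",
    PipelineResource(activities=[copy_step]),
)
```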

Tackling Complexity Up Front

To minimize project risk, the most complex data sets were addressed early. This included ingesting and transforming data from the legacy system into the new warehouse structure. By validating the architecture with known data, the team established a reliable foundation before turning attention to the new system.
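
A common way to validate an architecture against known data, as described here, is to reconcile row counts and key aggregates between the legacy source and the new warehouse after each load. A minimal sketch of that pattern follows, with invented DSN and table names:

```python
import pyodbc

# Hypothetical DSNs; real connection details would come from configuration.
legacy = pyodbc.connect("DSN=legacy_insurance")
warehouse = pyodbc.connect("DSN=vgm_warehouse")

def scalar(conn, sql):
    """Run a query that returns a single value."""
    return conn.cursor().execute(sql).fetchone()[0]

# Reconcile a count and a key aggregate after each load; names are invented.
checks = {
    "policy row count": ("SELECT COUNT(*) FROM dbo.Policy",
                         "SELECT COUNT(*) FROM dw.FactPolicy"),
    "premium total": ("SELECT SUM(premium) FROM dbo.Policy",
                      "SELECT SUM(premium) FROM dw.FactPolicy"),
}

for name, (src_sql, dw_sql) in checks.items():
    src, dw = scalar(legacy, src_sql), scalar(warehouse, dw_sql)
    status = "OK" if src == dw else "MISMATCH"
    print(f"{name}: source={src} warehouse={dw} -> {status}")
```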

Because VGM was still learning the new platform, the team gathered information about the system ahead of each phase so that decisions could be made with the best available knowledge. Close collaboration was required to define mappings and transformation logic. The consultants worked iteratively with the client to integrate the new system's data into the tables already established, enabling unified reporting across both systems.
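
Integrating a second system into tables that already exist usually reduces to a column-level mapping from each source into the shared schema. The pandas sketch below is a simplified, hypothetical illustration; the field names are invented, and the engagement's real transformation logic ran in ADF pipelines.

```python
import pandas as pd

# Invented sample rows standing in for extracts from each system.
legacy_df = pd.DataFrame(
    {"POL_NO": ["A1"], "EFF_DT": ["2020-01-01"], "PREM_AMT": [1200.0]}
)
new_df = pd.DataFrame(
    {"PolicyId": ["B2"], "EffectiveDate": ["2024-06-01"], "WrittenPremium": [950.0]}
)

# Column-level mappings from each source into the shared warehouse schema.
LEGACY_MAP = {"POL_NO": "policy_number", "EFF_DT": "effective_date",
              "PREM_AMT": "premium"}
NEW_SYS_MAP = {"PolicyId": "policy_number", "EffectiveDate": "effective_date",
               "WrittenPremium": "premium"}

def conform(df: pd.DataFrame, mapping: dict, source: str) -> pd.DataFrame:
    out = df.rename(columns=mapping)[list(mapping.values())]
    out["source_system"] = source  # keep lineage for troubleshooting
    return out

# Both systems land in one structure, so reports query a single table.
unified = pd.concat(
    [conform(legacy_df, LEGACY_MAP, "legacy"),
     conform(new_df, NEW_SYS_MAP, "new")],
    ignore_index=True,
)
print(unified)
```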

Navigating Security Constraints

Access to the new system’s data posed a unique challenge due to strict security protocols and hosting by an external provider. Working in partnership with VGM’s technical and security teams, SPR established secure access pathways that aligned with internal governance. This allowed nightly data transfers to proceed without compromising enterprise policies or delaying development.
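
The case study does not specify the mechanism, but one common way to satisfy this kind of governance requirement in an Azure environment is to keep credentials for the externally hosted system in Azure Key Vault, so that pipelines and scripts never embed secrets. A minimal sketch, with hypothetical vault and secret names:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Hypothetical vault and secret names. Access is controlled centrally,
# so only approved identities can read the connection string.
vault = SecretClient(
    vault_url="https://vgm-dw-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)
conn_str = vault.get_secret("new-system-connection-string").value
# conn_str is handed to the nightly load at runtime, never stored in code.
```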

Optimizing for Performance, Cost, and the Future

Efficiency was a core design principle throughout the engagement. Much of the legacy architecture had been structured for small, transactional queries. In contrast, the warehouse required processing of large datasets on a recurring basis. To address this, the team optimized data pipelines, query structures, and transformation logic to reduce runtime and minimize compute costs, especially important for nightly batch loads.
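
One standard technique for reducing the runtime and compute cost of nightly batch loads is incremental, watermark-based loading: each run pulls only the rows changed since the last successful run instead of rescanning full tables. The sketch below illustrates the pattern with invented DSN and table names; it is not the engagement's actual pipeline logic, which lived in ADF.

```python
import pyodbc
from datetime import datetime, timezone

# Hypothetical DSNs and table names, for illustration only.
warehouse = pyodbc.connect("DSN=vgm_warehouse")
source = pyodbc.connect("DSN=new_system")
cur = warehouse.cursor()

# Read the high-water mark recorded by the previous successful run.
cur.execute("SELECT last_loaded_at FROM dw.LoadWatermark WHERE table_name = ?",
            "FactPolicy")
watermark = cur.fetchone()[0]

# Pull only rows modified since then, instead of rescanning the full table.
changed = source.cursor().execute(
    "SELECT PolicyId, EffectiveDate, WrittenPremium, ModifiedAt "
    "FROM Policy WHERE ModifiedAt > ?",
    watermark,
).fetchall()

# ... upsert `changed` into dw.FactPolicy here ...

# Advance the watermark only after the load succeeds.
cur.execute("UPDATE dw.LoadWatermark SET last_loaded_at = ? WHERE table_name = ?",
            datetime.now(timezone.utc), "FactPolicy")
warehouse.commit()
```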

As the project neared completion, significant effort was placed on knowledge transfer and documentation. The goal was to ensure VGM’s internal analysts could confidently operate, troubleshoot, and extend the solution going forward. This included detailed walkthroughs, in-depth working sessions, and visual documentation with screenshots and step-by-step guides. The client expressed appreciation for the transparency and hands-on enablement provided throughout the transition.

The Outcome

The result was a robust, secure, and scalable data warehouse tailored to the needs of VGM’s insurance division. The solution supports data from both legacy and new systems, offers cleaner and more accurate reporting, and is optimized for long-term efficiency and maintainability.

Key outcomes include:

  • Data Continuity: Preserved critical legacy data during system decommissioning
  • Improved Data Integrity: Resolved inconsistencies and duplication to enable more reliable reporting
  • Operational Efficiency: Optimized performance for nightly data loads, reducing both runtime and cost
  • System Flexibility: Unified legacy and future-state data into a single reporting structure
  • Sustainable Ownership: Equipped VGM’s team with the tools and training needed to support the platform independently

Why It Worked

This engagement was successful because of strong execution, collaborative leadership, and a clear focus on long-term value.

  1. Communication from Day One: From the outset, SPR worked with VGM to establish a shared cadence. Using defined sprints and transparent priorities, the teams maintained momentum while navigating ambiguity and change. Clear expectations helped avoid surprises and enabled prompt issue resolution.
  2. Partnership Over Handoff: Rather than simply delivering a solution, SPR collaborated closely with VGM’s team to co-create the architecture. The consultants functioned as embedded partners, facilitating access and responding quickly to changing requirements.
  3. Practical Design for Real Users: The warehouse was built not just for data storage but for real-world usability. The decision to use a low-code platform made the system accessible for non-developers and helped ensure continuity after project completion.
  4. Proactive Risk Management: By addressing the most complex technical elements early, the project avoided costly delays later in the timeline. Difficult data sets were prioritized in the first phases, allowing the team to validate architecture and performance early on.
  5. Emphasis on Optimization: Every pipeline was designed with cost, speed, and reliability in mind. Transformations were streamlined to support nightly runs, and expensive queries were replaced with efficient alternatives.
  6. Commitment to Enablement: Documentation and knowledge transfer were treated as deliverables, not afterthoughts. The client was left with a well-documented system, practical training, and the confidence to manage it independently; this outcome was confirmed by positive feedback during project retrospectives.
  7. Balancing Parallel Workstreams: Two large initiatives ran side by side, the data warehouse build and the insurance platform migration, both significant to the business unit. The teams stayed focused, organized, and committed to both efforts, with VGM consistently working one step ahead of SPR.

With a future-ready data warehouse now in place, VGM’s insurance division is positioned to move forward with confidence. The solution preserves historical insight, supports new systems, and enables fast, reliable reporting across the enterprise. Thanks to thoughtful planning and a collaborative delivery model, the organization now has both the technical foundation and internal capability to grow its analytics maturity on its own terms.