Consolidating Data From Multiple Sources
Management of a pool for county governments and their political subdivisions was making member pricing decisions almost daily using reports that were several months old, while operating under soft market conditions. To support those decisions, the pool’s managers needed quick access to integrated policy and claims information at multiple levels of detail and from multiple perspectives.
The pool had vast amounts of data captured in multiple computer systems, but each of these systems was a “stovepipe”: isolated and unable to integrate with the others. Policy processing activities such as rating, contract management, and reinsurance reporting are handled in-house using a custom-built system called RPMS. Claims processing for the Property, Crime, and Auto Physical Damage lines of business is also handled in-house and is supported by the P&CPlus product from DAVID Corporation. Automobile and General Liability claims are handled by a TPA that uses its own home-grown claim administration system. Public Official and Law Enforcement Liability claims are handled by a second TPA that hand-keys data for each claim into a FoxPro database. Workers’ Compensation claims are handled by a third TPA, which also uses its own home-grown, Oracle-based claim administration system.
This fragmentation of data created major problems in generating reports for management and for outside parties such as independent financial auditors, actuarial firms, pool board members, and the counties participating in the pool. Generating a report took two to four weeks because extracting the data, relating claims data to policy data by member and policy year, calculating profitability and loss ratios, and summarizing the results were all manual processes.
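The calculation itself is simple; the delay came from manually assembling data from five systems. A minimal sketch of the loss-ratio math per member and policy year (all member names and dollar figures here are hypothetical, not the pool’s data):

```python
# Loss ratio = incurred losses / earned premium, grouped by member and
# policy year. Illustrative records only; the real data lived in five
# separate "stovepipe" systems.
from collections import defaultdict

premiums = [
    # (member, policy_year, earned_premium)
    ("Adams County", 2001, 120_000.0),
    ("Baker County", 2001, 95_000.0),
]
claims = [
    # (member, policy_year, incurred_amount)
    ("Adams County", 2001, 30_000.0),
    ("Adams County", 2001, 54_000.0),
    ("Baker County", 2001, 19_000.0),
]

def loss_ratios(premiums, claims):
    earned = defaultdict(float)
    incurred = defaultdict(float)
    for member, year, amount in premiums:
        earned[(member, year)] += amount
    for member, year, amount in claims:
        incurred[(member, year)] += amount
    # One ratio per (member, policy year) with earned premium on record.
    return {key: incurred[key] / earned[key] for key in earned}

ratios = loss_ratios(premiums, claims)
# Adams County 2001: (30,000 + 54,000) / 120,000 = 0.70
```

Done by hand across member, year, and line of coverage, this is weeks of work; in a consolidated database it is a single query.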
SST developed the Risk360°/Data Warehouse module in conjunction with the pool to solve this problem: consolidating and integrating exposure, premium, and claim data from multiple source systems into a single database designed specifically to make reporting and analysis efficient and effective. The data warehouse contains policy, exposure, and claims data for all eight P&C lines of coverage. A complete history of changes in exposures and policy premiums is maintained to fully support as-of analyses. Claims data is captured at the financial-transaction level of detail and is cross-referenced to the policy that was in effect at the time the loss occurred. This design enables quick and easy profitability analyses by policy year, fiscal year, and member year on both a current and an as-of basis. Programs to extract, convert, transform, and load data into the data warehouse were created for all five “stovepipe” systems.
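The cross-reference described above can be sketched as a date-range lookup: each claim transaction is matched to the policy whose effective period contains the loss date. This is only an illustration of the idea, with hypothetical policy numbers, members, and dates:

```python
# Match a claim to the policy in effect on the date of loss.
# All identifiers and dates below are illustrative.
from datetime import date

policies = [
    # (policy_id, member, effective_date, expiration_date)
    ("POL-1", "Adams County", date(2000, 7, 1), date(2001, 7, 1)),
    ("POL-2", "Adams County", date(2001, 7, 1), date(2002, 7, 1)),
]

def policy_in_effect(policies, member, loss_date):
    """Return the id of the policy covering `member` on `loss_date`, or None."""
    for policy_id, m, effective, expiration in policies:
        if m == member and effective <= loss_date < expiration:
            return policy_id
    return None

# A loss on 2001-08-15 falls in the second policy period.
matched = policy_in_effect(policies, "Adams County", date(2001, 8, 15))
```

Because every claim transaction carries this link, losses roll up cleanly to the correct policy year regardless of when the claim payments were actually made, which is what makes the as-of analyses possible.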
Analysis and reporting of the data stored in the data warehouse is done with an integrated, state-of-the-art business intelligence tool that hides the complexity of the database behind a user interface that uses risk management terminology and organizes the data elements in a logical way. Pool managers query or locate the desired information by subject matter and then analyze or report that data using multidimensional analysis (also known as “slice and dice,” or “twisting a Rubik’s cube”), trend analysis, and spreadsheets. Reports showing claim trends, claim development factors, and loss triangles are thus easily produced.
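A loss triangle is straightforward to build once claims are stored at the transaction level: rows are accident years, columns are development ages, and each cell holds cumulative paid losses. A minimal sketch with made-up figures (not the pool’s data):

```python
# Build a cumulative loss development triangle from payment transactions.
# Figures are illustrative only.
from collections import defaultdict

transactions = [
    # (accident_year, development_age_in_years, payment)
    (2000, 1, 100.0), (2000, 2, 50.0), (2000, 3, 25.0),
    (2001, 1, 120.0), (2001, 2, 60.0),
    (2002, 1, 110.0),
]

def loss_triangle(transactions):
    incremental = defaultdict(float)
    for year, age, paid in transactions:
        incremental[(year, age)] += paid
    triangle = {}
    # Sorting by (year, age) guarantees each prior age is filled first.
    for (year, age) in sorted(incremental):
        prior = triangle.get((year, age - 1), 0.0)
        triangle[(year, age)] = prior + incremental[(year, age)]
    return triangle

tri = loss_triangle(transactions)
# Accident year 2000 develops 100.0 -> 150.0 -> 175.0
```

Claim development factors then fall out as ratios of adjacent cells in each row (e.g. 150.0 / 100.0 = 1.5 at age 1-to-2 for accident year 2000).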
With these tools now available, pool managers can transform data into information. They can underwrite, price, and evaluate the pool’s membership based on reliable and timely information. Loss-sensitive pricing, backed by timely information, enables pool management to respond to increasing competition. Ad-hoc reports can be generated internally without waiting for a TPA that may or may not be able to create them. Keeping pool board members informed is a much easier task.
When the data warehouse project was initiated, the quality of the data was unknown, although pool staff assumed it was correct and accurate. Through the process of loading data into the data warehouse, the pool was able to assess the integrity and completeness of the data held by its TPAs as well as by its internal policy and claim systems. It discovered that the assumption of good data quality was wrong and initiated a data cleanup project that took over a year, going back to the inception of each line of coverage.
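One kind of problem a load process can surface is a claim that does not match any policy period for its member. The check below is a hypothetical sketch of that idea, not the pool’s actual validation logic, and all records are illustrative:

```python
# Flag "orphan" claims: claims whose loss date is not covered by any
# policy on file for that member. Illustrative records only.
from datetime import date

policies = [
    # (policy_id, member, effective_date, expiration_date)
    ("POL-1", "Adams County", date(2000, 7, 1), date(2001, 7, 1)),
]
claims = [
    # (claim_id, member, loss_date)
    ("CLM-1", "Adams County", date(2000, 9, 1)),
    ("CLM-2", "Adams County", date(2003, 1, 15)),  # no matching policy period
]

def orphan_claims(policies, claims):
    """Return ids of claims with no covering policy for their member."""
    orphans = []
    for claim_id, member, loss_date in claims:
        covered = any(
            m == member and effective <= loss_date < expiration
            for _pid, m, effective, expiration in policies
        )
        if not covered:
            orphans.append(claim_id)
    return orphans
```

Running checks like this across every source system during each load is how data-quality problems become visible instead of silently flowing into reports.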