CIOReview | 19 AUGUST 2022

CXO INSIGHTS

DATA INTEGRATION IS ESSENTIAL TO WINNING

By Ruma Bhattacharyya, Vice President of Business Systems Analysis, Everest

Ruma Bhattacharyya is Vice President of Business Systems Analysis at Everest. In her role, she manages the lines of communication between clients, businesses, and the technology team to help ensure that projects are completed seamlessly. With over 15 years in the insurance industry, Ruma has experience in strategic implementation, integrated systems development, and company collaboration. She and her team play a key role in the continued development and delivery of many strategic and transformational policy, claims, and billing projects, and in maintaining business relationships. Prior to joining Everest in 2008, Ruma was a consultant at Verizon Wireless and Arch Insurance Group and served for seven years as a Senior Systems Engineer at Hewlett-Packard on the AT&T account. She holds a PhD in Chemistry from Kalyani University and currently lives in Hillsborough, New Jersey.

For businesses today, data proliferation can be extremely valuable: data is a critical component of prudent business decisions and operational efficiency. Yet processing that volume of data and seamlessly incorporating it into new and existing workflows is not easy, especially in highly regulated industries, where businesses must address a number of common data integration challenges. To benefit from this vast amount of data, companies need a solid data strategy in place, and proper data integration is the most important component of a data strategy plan.

Data Integration Challenges

The first challenge is usability. Data is king, but it is just data if it cannot speak the same language or be translated for different workflows and departments. Businesses often face inconsistent data formats and models, which is where many data integration challenges arise. Within the insurance industry, for example, the data received and saved by the policy management group often differs from how the claims operations team needs to present its data. To fix this, the data needs to be normalized so that it is compatible across the workflow. Once this usability challenge is identified, data normalization can be implemented either manually or through a data analytics tool, and the data can then be utilized and integrated into the workflow more easily.

Next, there is the issue of quality and consistency. Unfortunately, manual data entry is still prevalent, and a lack of companywide standard practices can lead to inaccurate, inconsistent, outdated, and/or duplicative data. To prevent this, data quality management, where someone is assigned to validate data before it is added to the system, must be enforced. Metadata management is another aspect of overseeing data quality that, when handled incorrectly, can pose serious challenges. Mismanagement can create an inability to share metadata across different applications, resulting in inefficient and error-prone processes. In fact, Gartner estimates that poor data quality costs organizations an average of $12.9 million every year.

Data silos are another common pitfall for data integration. Silos are created when data is accessible to one department but not easily available to other groups. When data is scattered across the enterprise, the risk of missing a crucial piece of data is always present.
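To make the normalization point above concrete, here is a minimal sketch of mapping two inconsistent record formats onto a common model in Python. The field names, date formats, and the PolicyRecord model are illustrative assumptions for the example, not a description of any actual policy or claims system.

    from dataclasses import dataclass
    from datetime import date, datetime

    # A hypothetical common model that both source systems are normalized into.
    @dataclass
    class PolicyRecord:
        policy_number: str
        insured_name: str
        effective_date: date

    def normalize_policy_system(row: dict) -> PolicyRecord:
        """Assume the policy admin system uses uppercase keys and MM/DD/YYYY dates."""
        return PolicyRecord(
            policy_number=row["POLICY_NO"].strip(),
            insured_name=row["INSURED"].title(),
            effective_date=datetime.strptime(row["EFF_DT"], "%m/%d/%Y").date(),
        )

    def normalize_claims_system(row: dict) -> PolicyRecord:
        """Assume the claims system uses lowercase keys and ISO dates."""
        return PolicyRecord(
            policy_number=row["policy"].strip(),
            insured_name=row["insured_name"].title(),
            effective_date=date.fromisoformat(row["effective"]),
        )

    # Records that mean the same thing but arrive in different shapes.
    policy_row = {"POLICY_NO": "PX-1001 ", "INSURED": "ACME FREIGHT", "EFF_DT": "01/15/2022"}
    claims_row = {"policy": "PX-1001", "insured_name": "acme freight", "effective": "2022-01-15"}

    # After normalization, both systems speak the same language.
    print(normalize_policy_system(policy_row) == normalize_claims_system(claims_row))  # True

Once every source is mapped into one common model, downstream workflows can consume the data without caring which system produced it.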
Proper workflow integration, which connects one application with another, typically via Application Programming Interfaces (APIs), can help break down data silos. By putting APIs in place, users enter data in one application and the data moves between applications automatically. This can increase employee productivity and minimize costly human errors; a brief sketch of this pattern appears at the end of this article.

Best Practices for Seamless Data Integration

While there are specific, targeted solutions for each of these challenges, the businesses best positioned to gain a competitive advantage
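As an illustration of the API-based workflow integration described above, the sketch below shows how a record entered in one application might be pushed into another without anyone re-keying it. The endpoints, field names, and absence of authentication are simplifying assumptions for the example, not a real integration.

    import requests

    # Hypothetical endpoints for two internal applications; real systems would
    # differ and would typically require authentication.
    POLICY_API = "https://policy.example.internal/api/policies"
    CLAIMS_API = "https://claims.example.internal/api/claims"

    def sync_policy_to_claims(policy_number: str) -> None:
        """Fetch a policy from the policy admin system and push the fields the
        claims application needs, so the data is entered only once."""
        resp = requests.get(f"{POLICY_API}/{policy_number}", timeout=10)
        resp.raise_for_status()
        policy = resp.json()

        claim_payload = {
            "policy_number": policy["policy_number"],
            "insured_name": policy["insured_name"],
            "effective_date": policy["effective_date"],
        }
        resp = requests.post(CLAIMS_API, json=claim_payload, timeout=10)
        resp.raise_for_status()

    if __name__ == "__main__":
        sync_policy_to_claims("PX-1001")

In practice, calls like these would be wrapped in the error handling, retries, and authentication appropriate to each application.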