Technology as an Enabler

When we approached the development of our FP&A platform architecture, our primary objective was to leverage technology to minimize the time and effort required for data compilation. We approached the architecture in three phases: data compilation and storage, data organization, and insight extraction.

Data Compilation and Storage

The technology model for data compilation and storage continues to evolve from the traditional on-premises setup to cloud-based infrastructures that provide comprehensive capabilities and applications at scale without breaking the bank. Emerging data integration processes, such as the creation of a "data factory" built with popular programming languages like Python, allow organizations to establish affordable ecosystems of data-driven ETL workflows and pipelines that upload disparate data stores to the cloud. When we implemented our own FP&A platform, we designed automated workflows using Azure services to ingest data feeds from ERP, CRM, POS and other event data portals that our client partners provided; a brief sketch of this kind of ingestion step appears below. With data feeds linked to cloud-based servers that a third-party provider manages and secures, we avoided upfront capital investments and were ready for the next phase.

Data Organization

As the breadth and depth of data continue to grow at a rapid pace, so does the need to process, clean and transform it for consumption by analytics and BI applications. For many organizations, leveraging big data also means having enough firepower when it comes time for computation. Having the right capabilities in place is paramount to getting insights into the hands of decision-makers quickly. Services such as data lakes, and frameworks like Hadoop and Spark coupled with machine-learning components, enable large amounts of data from any source to be processed and prepared at the speed needed to drive action today. In our FP&A platform implementation, we used these capabilities to build customized data structures that aggregate, transform and prepare disparate data sources for integration with our proprietary reporting and analytical processes; a brief sketch of this kind of preparation step also appears below.

Insight Extraction

Analytics applications are answering the demand for rapid insights by enhancing the UIs that sit on top of underlying analytical packages such as R or SAS. Business intelligence and data mining platforms such as Tableau are following suit by adding statistical packages to their already user-friendly interfaces. As a result, business users in any functional area can apply robust, statistically rigorous methodologies and garner more advanced insights and recommendations without a degree in data science or statistics. Again, in our FP&A platform implementation, the data organization process generated final data sets that allowed us to seamlessly feed information into the proper end-user platforms, including EPMs, Power BI and statistical packages. This enabled stakeholders to work with the data in near real time, efficiently and effectively.
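To make the ingestion pattern described under Data Compilation and Storage more concrete, here is a minimal Python sketch of an ETL step that cleans a source extract and lands it in Azure Blob Storage. It is an illustration only, not our production workflow: the connection-string environment variable, container and file names are hypothetical, and in practice a step like this would run inside an orchestrated pipeline (for example, an Azure data factory) rather than as a standalone script.

```python
# Minimal sketch of a Python ETL step that lands a source extract in cloud storage.
# Names (connection string variable, container, file names) are illustrative only.
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient


def land_extract(local_csv: str, container: str, blob_name: str) -> None:
    """Read a raw extract, apply a light cleanup, and upload it to Azure Blob Storage."""
    # Light transformation before upload: normalize column names, drop empty rows.
    df = pd.read_csv(local_csv)
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(how="all")

    # The connection string is assumed to be supplied via an environment variable.
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob = service.get_blob_client(container=container, blob=blob_name)
    blob.upload_blob(df.to_csv(index=False), overwrite=True)


if __name__ == "__main__":
    # Hypothetical feeds from ERP, CRM and POS systems.
    for source in ("erp_gl_extract", "crm_opportunities", "pos_daily_sales"):
        land_extract(f"{source}.csv", container="raw-feeds", blob_name=f"{source}.csv")
```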
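The Data Organization phase relies on Spark-style processing that aggregates and prepares disparate sources for downstream reporting. The sketch below illustrates that idea in generic PySpark; the paths, column names and business rules are invented for the example and are not drawn from our platform.

```python
# Illustrative Spark job: combine raw feeds, aggregate, and write a curated data set.
# Paths, schemas and business rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fpa-data-organization").getOrCreate()

# Read raw feeds previously landed in cloud storage (CSV here for simplicity).
sales = spark.read.csv("raw-feeds/pos_daily_sales.csv", header=True, inferSchema=True)
accounts = spark.read.csv("raw-feeds/crm_opportunities.csv", header=True, inferSchema=True)

# Clean and conform: standardize keys and filter out incomplete records.
sales = (
    sales.withColumn("account_id", F.col("account_id").cast("string"))
    .dropna(subset=["account_id", "amount"])
)
accounts = accounts.withColumn("account_id", F.col("account_id").cast("string"))

# Aggregate sales by account and month, then join in account attributes.
curated = (
    sales.withColumn("month", F.trunc("sale_date", "month"))
    .groupBy("account_id", "month")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("transactions"))
    .join(accounts.select("account_id", "segment", "region"), on="account_id", how="left")
)

# Persist as a columnar, analytics-ready data set for reporting and BI tools.
curated.write.mode("overwrite").parquet("curated/revenue_by_account_month")
```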
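As a final illustration of the Insight Extraction phase, the sketch below shows a generic last-mile step: loading the curated data set, adding a simple statistical signal, and exporting a tidy extract that a reporting layer such as Power BI or an EPM can pick up. The file names and the linear-trend model are assumptions made for the example, not a description of our proprietary analytical processes.

```python
# Illustrative last-mile step: pull the curated data set, add a simple statistical
# signal, and export a file that BI tools such as Power BI can consume.
# File names and the trend model are illustrative assumptions.
import pandas as pd
import statsmodels.api as sm

# Load the curated output produced by the data-organization step.
df = pd.read_parquet("curated/revenue_by_account_month")

# Fit a simple linear trend of revenue over time as an example statistical signal.
monthly = df.groupby("month", as_index=False)["revenue"].sum().sort_values("month")
monthly["t"] = range(len(monthly))
model = sm.OLS(monthly["revenue"], sm.add_constant(monthly["t"])).fit()
monthly["trend"] = model.predict(sm.add_constant(monthly["t"]))

# Export a tidy extract for the reporting layer (Power BI, EPM, statistical packages).
monthly.to_csv("exports/monthly_revenue_with_trend.csv", index=False)
print(model.summary())
```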
Shifting from 80/20 to 20/80

When it comes to distilling large amounts of data into actionable insights, time is almost always the most binding constraint. You may want to maximize the amount of time you spend analyzing and advising, but the reality is that data compilation can be such an enormous undertaking that you run out of time.

To shift to a more desirable ratio between compiling data and analyzing it, rethink how you approach data compilation and storage, data organization and insight extraction. Take advantage of emerging data integration technology that can help you conserve capital. Find services designed to integrate with analytics consumption and use them to customize your own data structures. Leverage applications and platforms built specifically for efficient, easy access to insights.

Such a comprehensive approach will help you evolve to the point where the bulk of your process time is spent building insights and developing strategies. Ultimately, the ratio is not just a number; it is a way to capitalize on technology to improve performance and drive success.