with Exploratory Data Analysis to help answer some of those initial hypotheses, help us better understand the domain, and guide our initial thinking on feature engineering (a minimal EDA sketch appears below).

We then brainstorm what a solution might look like, what the key requirements are, and how it would fit into the workflow of our employees or the journey of our customers, both as a prototype/MVP and in the future vision for the solution.

It is critical to create prototypes based on the feedback from these sessions and get those prototypes into the hands of stakeholders, subject matter experts from the business, and end users/customers. When building a recommendation engine, we started with high-fidelity wireframes and mockups of the experience; for forecasting models, it is often dashboards along with the parameters for any what-if analyses we should include; and for marketing mix models, our prototypes show how the models would enable decision making for investments across all channels.

How these get built varies. Typically, we start with whiteboard sketches, move to tools like Figma or Sketch, and then build higher-fidelity versions using Python libraries, R Shiny, or React. These don't need to be perfect and ready to industrialize. The focus should be on things that can be built quickly to show how the solution will work, and then on the minimum version needed to integrate with our models and pilot with our business units.

Finally, there's the development of the models themselves. We start broadly on model and feature selection and aim to down-select to 2-5 models for extensive tuning and feature engineering. Our goal is to balance more transparent and explainable models against overall model performance (accuracy, cost to implement, etc.). Depending on the difference in performance between our models, we may not be able to use transparent models. This leads us to spend time explaining our models using techniques such as LIME or Shapley values for our best-performing models (both the down-selection and the explanation steps are sketched below). There's no one-size-fits-all approach, but typically we find we'll spend as much time, if not more, ensuring our models and features are explainable as we will actually developing and selecting our models.

Phase 3, Implementation and Experimentation: Often our longest phase. The goal of this phase is to work with our business and stakeholders to introduce our prototype into the real world and evaluate it there. This requires working with the business to develop the parameters of our experiment design, understanding what a limited release would look like (where, and among which groups), and then evaluating the prototype solution against existing benchmarks or a control group (a sketch of such a read-out also follows below).

Based on the results, we build or refine the business case for our solution against the roadmap and effort needed to industrialize and expand it. Then we either go through expedited versions of Phase 1 and Phase 2 for the broader release, or we expand the MVP for further evaluation while building the next version of the solution.

This phase also almost always requires a rethink of business processes and incentives. A properly designed evaluation will provide feedback on how the team may need to change the way it works, or how the current incentive framework may be at odds with how we would want our solution to work.

What this looks like in practice: The process above depicts our actual project plan for a large Advanced Analytics MVP (think building a recommender system).
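To make the first step concrete, here is a minimal, hypothetical sketch of the kind of EDA pass described above. The file and column names are illustrative assumptions, not artifacts from our actual projects.

```python
# Hypothetical EDA pass: basic structural checks plus a first cut at an
# initial hypothesis. File and column names are illustrative assumptions.
import pandas as pd

df = pd.read_csv("customer_history.csv")

# Structural checks: size, dtypes, and where the missing data lives.
print(df.shape)
print(df.dtypes)
print(df.isna().mean().sort_values(ascending=False).head(10))

# Pressure-test an early hypothesis, e.g. "repeat buyers respond to offers
# at a higher rate", before committing to any feature engineering.
print(df.groupby("is_repeat_buyer")["responded_to_offer"].mean())
```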
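Phase 2 in miniature: a sketch, assuming scikit-learn and an illustrative binary target, of how a transparent model and a higher-capacity one can be compared on identical cross-validation folds before down-selecting.

```python
# Hypothetical down-selection: compare a transparent model against a
# higher-capacity one on the same folds.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("training_data.csv")               # assumed feature table
X, y = df.drop(columns=["converted"]), df["converted"]

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),  # transparent
    "gradient_boosting": GradientBoostingClassifier(),         # higher capacity
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC {scores.mean():.3f} +/- {scores.std():.3f}")

# If the transparent model lands within an acceptable margin of the best
# performer, we keep it; otherwise we budget time for post-hoc explanation.
```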
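If the opaque model wins by too wide a margin, post-hoc explanation comes next. A minimal sketch using Shapley values via the open-source shap package, reusing the same illustrative data as above:

```python
# Hypothetical explanation step: Shapley values for a tree-based model.
# Assumes the `shap` package and numeric features.
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

df = pd.read_csv("training_data.csv")               # same assumed table
X, y = df.drop(columns=["converted"]), df["converted"]

model = GradientBoostingClassifier().fit(X, y)
explainer = shap.TreeExplainer(model)               # efficient for tree ensembles
shap_values = explainer.shap_values(X)

# Global view: which features push predictions up or down, and how strongly.
shap.summary_plot(shap_values, X)
```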
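And Phase 3 in miniature: a sketch of a pilot read-out comparing a treated group against a control group with a Welch's t-test from SciPy. The metric and group labels are assumptions for illustration; the real test should follow whatever experiment design is agreed with the business.

```python
# Hypothetical pilot evaluation: pilot group vs. control group on one metric.
import pandas as pd
from scipy import stats

results = pd.read_csv("pilot_results.csv")          # one row per customer
pilot = results.loc[results["group"] == "pilot", "revenue"]
control = results.loc[results["group"] == "control", "revenue"]

# Welch's t-test: does not assume equal variances across the two groups.
t_stat, p_value = stats.ttest_ind(pilot, control, equal_var=False)

lift = pilot.mean() / control.mean() - 1
print(f"Lift vs. control: {lift:.1%} (p = {p_value:.3f})")
```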
The overall process looks very much like an "agile" approach but should fit into any solution development process. Our key goals are to ensure that:

· We deliver some form of insights deliverable or prototype for feedback every two weeks
· No project exceeds 16 weeks before going into Phase 3, with an ideal goal of reaching Phase 3 in 8-12 weeks, if not sooner
· We have at least two check-ins with our "steering committee" for feedback and assistance with any key decisions that need to be made

This ensures that we never go too long without getting broader feedback on our solution and keeps us honest about how we scope and develop it. Our goal is to get to something good enough to pilot with our business partners within one quarter.

While no process is perfect, we've found the approach above yields a great deal of success in leading our Advanced Analytics solutions to be adopted by our business partners and to actually drive change. We hope that more organizations will find success using design-led approaches to developing analytics projects, because sustained high rates of failure are a detriment to all analytics professionals.