Let's revisit the introductory story:

· The analytics team took full advantage of the available data, analytics and AI capabilities to pursue the insight requested. However, in communicating the results, they took the shortcut of including and presenting everything they knew, rather than translating it into what the target recipients, Chuck, the higher-level decision maker, and the other members of his executive team, needed to know to make their decision. The analytics team designed for their own comfort rather than their audience's benefit. The result was information overload: an abundance of available, possibly relevant, yet likely unnecessary information that distracted from the main insights.

· From a data visualization perspective, the analytics team repurposed the graphs, charts and tables from their analytical tools and workflows, possibly to gain efficiencies. They overlooked, however, that data depictions for analysis must be designed very differently than data presentations for communication. Analytics-focused data visualizations are aimed at maximizing the discovery and recognition of insight. Visual communication, in contrast, must be aimed at maximizing consumability, knowledge transfer and trust. While for analytical purposes perceptual and cognitive aspects dominate, for communication it is data-driven storytelling that must be emphasized.

· Similarly, PowerPoint is not a tool for creating reports but a presentation tool, which assumes the slides will be narrated by a presenter. Far too often, PowerPoint slides and similar presentation decks are overloaded with information, which reduces the effectiveness of the communication and the intended knowledge transfer.

· In preparing the presentation, the analytics team also ignored human limitations when it comes to information processing and retention: study results have varied over the years, but there is consensus that the human attention span can range from as little as 7 minutes up to about 12 minutes, depending on the number of distractions in the environment. There is further scientific evidence that our attention span has in fact been decreasing over recent years with the explosion of digital distractions in our work environments. What does this mean? Using the rule of thumb of two (2) minutes per slide for an effective presentation, a presentation should not have more than six (6) informational slides. Or, to put it differently, after six (6) slides you run the risk of losing your audience's attention, so you should ensure that the most important messages and insights appear within those first slides. This assumes that slides are not too densely populated: from cognitive psychology we know, for example, that humans cannot keep more than 7 +/- 2 chunks of information in their working memory at any point in time. Keep that in mind when you add information to your slides.

· Finally, the fact that the presentation of findings was static and "pre-canned" limited the audience's ability to ask farther-reaching questions and to engage in a closer discourse with the presenter(s), forcing the decision-making process into a time-consuming loop of recurrent, multi-week question, analysis, and presentation cycles.

While the listed shortcomings showcased by our story of a typical decision-making process seem common and habitual, we do have the knowledge, best practices and technologies to work towards eliminating them.
However, it will require a corresponding change in team and decision-making habits, and potentially in workflows, to take full advantage of those resources. Here are a few recommendations along the way. It is important to remember that even the differentiation of data representation for communication versus analysis can and should be further delineated:

· For communication, beyond narrated presentations and storytelling, there are different design affordances towards, e.g., creating awareness through static info charts for general audiences, versus creating dashboards for monitoring data with an emphasis on quickly detecting deviations from established baselines, versus creating static reports for auditing and review purposes by a very knowledgeable audience.

· For analysis, data visualization evangelist Stephen Few differentiates visual tools for analysis as faceted analytical displays to contrast them with dashboards, which predominantly suit monitoring and reporting purposes. (After all, how analytical can you get with the dashboard in your car, which is where the term stems from?) Even within an analytical context, one can observe different design requirements with increasing levels of interaction affordances when it comes to applications for:

o inquiry-based analysis, i.e., the problem space is well understood, you know what you are looking for and are looking for answers to concrete questions,

o investigative analysis, i.e., you try to understand your problem space, often to answer the "why" question, and

o exploratory analysis, i.e., you try to define the problem space to avoid blind spots or to identify white spaces.

In general, using the right tool based on data, analytical task and end user is key to gaining efficiencies and mitigating information overload, whether in analytical or communication contexts.