Visualizing Big Data to get the Big Picture

In My Opinion

By Stephen Skinner, CIO, First Team Real Estate

The exponential growth of big data has created significant challenges in finding the meaning hidden within the patterns of large, unstructured datasets. By placing data in a visual context, data visualization technologies can help us explore and discover the structure and patterns in today's seemingly endless streams of new data.

As a method of visual communication, data visualization is not new. There are many widely published examples from the 18th and 19th centuries, including the Periodic Table of the Elements, Charles Darwin's Tree of Life, and Charles Minard's statistical graphic of Napoleon's Russian campaign, among others.

Automated data visualization technologies, by contrast, are relatively new and have come into broad use only in recent years. Early examples include 3D finite element modeling and computational fluid dynamics. Processing large datasets of this type requires significant technical expertise and fast, large-scale computing systems such as massively parallel supercomputing arrays. National supercomputing centers were established in the 1980s to encourage the development and adoption of data visualization. Many of the early projects in this area were related to engineering, defense, molecular modeling, simulation, and the production of 3D computer animation. The educational institutions supporting these facilities were often responsible for the original development of supercomputer visualization software.