CIOReview
February 2016 | In My Opinion

Next Level Web Apps: Customer Telemetry

By Justin Baird, Director of Application Development, CubeSmart

In web applications today, thousands of telemetry data points are pulled into data warehousing systems and processed to give the company insight into how customers use the app. Organizations rely heavily on the offerings of Google, Microsoft, Adobe, and others to provide page-level insights into how a customer travels through an application, as well as information about how optimal the user's experience was. Page-level metrics such as time-on-page, user flow, and session duration are aggregated and married to geographic and demographic data; they are the general indicators of the effectiveness of the app's design.

Collecting this application telemetry comes at a cost, as every data point generated must be processed and stored. Done improperly, the collection itself can severely impact performance. The analytics and monitoring companies have been very successful in balancing the data provided for server and application performance against the impact on the customer experience.

As the demand for information becomes more sophisticated, the required granularity of data becomes finer. The next level of granularity for customer data is Customer Telemetry: recording nearly every customer movement and interaction within the page. Gathering information every time a user hovers over an image, clicks into a control, or spends time in an Asynchronous JavaScript and XML (AJAX) control yields the most granular insight into customer behavior. The volume of data generated by capturing this level of detail could reach tens of thousands of data points per customer, dwarfing all other types of telemetry data. Imagine the ability to recreate a customer's interaction with an app down to the mouse movements.
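The kind of fine-grained capture described above can be sketched as a small in-page buffer that records one timestamped data point per interaction. The names here (`TelemetryBuffer`, `record`, `replay`) are illustrative assumptions, not a specific vendor API:

```javascript
// Minimal sketch of a customer-telemetry buffer. Each user interaction
// (hover, click, focus, etc.) becomes one timestamped data point.
// Class and field names are illustrative, not from any particular product.
class TelemetryBuffer {
  constructor() {
    this.points = [];
  }

  // Record one interaction: which control, which event, and when.
  record(controlId, eventType, timestamp = Date.now()) {
    this.points.push({ controlId, eventType, timestamp });
  }

  // Replaying the buffer in timestamp order is what lets a UX designer
  // reconstruct the customer's path through the page.
  replay() {
    return [...this.points].sort((a, b) => a.timestamp - b.timestamp);
  }
}

// In a browser, hooks like the following would feed the buffer:
//   el.addEventListener('mouseenter', e => buffer.record(e.target.id, 'mouse-in'));
// Here we simulate a short session directly.
const buffer = new TelemetryBuffer();
buffer.record('hero-image', 'mouse-in', 1000);
buffer.record('hero-image', 'mouse-out', 1450);
buffer.record('signup-button', 'click', 1700);
console.log(buffer.replay().map(p => `${p.timestamp} ${p.controlId} ${p.eventType}`));
```

Even this toy session shows how a hover's dwell time (mouse-out minus mouse-in) falls out of the raw data points with no extra instrumentation.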
This enhanced level of detail allows companies to segment the user base and identify behavioral patterns on demand, instead of analyzing delayed data from a focus group. That agility enables quick identification and optimization of problematic user-experience elements, or enhancement of application flows. Collecting tens of thousands of data points per user without negatively affecting application performance is now achievable by leveraging commodity web server hardware and service offerings from many cloud providers.

The data acquisition process involves instrumenting every element on the page to track its mouse-in and mouse-out events. By recording the exact timestamp and control (as well as any other pertinent data) on those events, a UX designer or support person can recreate a customer's usage pattern on demand. If not properly implemented, however, generating this volume of data could cripple the performance of the application and would likely do more harm than good.

The proposed technology stack is operating system, platform, provider, and language agnostic, running on commodity hardware. The only specialized components necessary are the data warehousing system of choice and its supporting equipment.

Page Level

On the page level, each element would have events executed any time a user interacts with it. These include events such as mouse-over, hover, and click, as well as entry into text boxes and combo boxes. Each of these events would spawn a short-lived background thread. By capturing the user data into
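The performance caution above — that naive collection at this volume could cripple the app — can be sketched as a batching layer: event handlers do only a cheap synchronous enqueue, and data points are shipped to the warehouse in batches off the critical path. This is an illustrative sketch, not the article's implementation; in a real page the stubbed transport might be `navigator.sendBeacon` or a Web Worker:

```javascript
// Sketch of batching telemetry so collection never stalls the UI.
// Event handlers call enqueue(), which is O(1); the transport function
// (a stub here) receives whole batches and would POST them server-side.
class BatchingSender {
  constructor(flushFn, batchSize = 50) {
    this.flushFn = flushFn;   // receives an array of data points
    this.batchSize = batchSize;
    this.queue = [];
  }

  // Called from event handlers; must stay cheap so the UI never stalls.
  enqueue(point) {
    this.queue.push(point);
    if (this.queue.length >= this.batchSize) this.flush();
  }

  // Hand the current batch to the transport and reset the queue.
  flush() {
    if (this.queue.length === 0) return;
    const batch = this.queue;
    this.queue = [];
    this.flushFn(batch);
  }
}

// Demo: 120 simulated mouse events leave the page as batches of 50, 50, 20.
const batches = [];
const sender = new BatchingSender(b => batches.push(b), 50);
for (let i = 0; i < 120; i++) {
  sender.enqueue({ controlId: `el-${i}`, eventType: 'mouse-in', timestamp: i });
}
sender.flush(); // ship the remainder, e.g. on page unload
console.log(batches.map(b => b.length)); // [ 50, 50, 20 ]
```

The batch size is the knob that trades data freshness against the number of network round trips — the same balance the article credits the analytics vendors with striking at the page level.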