What It Takes to Deliver User Experience Management

UEM is one of the key aspects of Application Performance Management

User experience can boost or kill your revenue. Unhappy users are likely to abandon a service they struggle with and go to your competitors. To manage the experience of your users effectively, you need to monitor and understand their transactions across your mobile, web and enterprise applications. More important, and often overlooked, the practice of User Experience Management (UEM) does not end at the client application. Common user experience tools that look only at the client fail to deliver holistic UEM, just as performance management that relies only on server logs falls short. Neither approach alone will shed light on the true user experience.

UEM is one of the key aspects of Application Performance Management (APM). It can be realized through several distinct technologies. In this article, we discuss why none of them alone can provide sufficient visibility, and argue that true, end-to-end UEM should rely on data gathered through a combination of monitoring technologies.

Technologies for End-User Experience Monitoring
Although UEM is only one of the dimensions of APM, it is the one that gets the most attention. End-user experience is the point where business processes meet the technology stack. We can define three types of UEM technologies, each observing the user experience from a different perspective (see Figure 1):

Figure 1: Comparison of three perspectives of UEM

  1. Synthetic, Transaction-based UEM: Using synthetic scripts, we can monitor web and intranet applications from different locations on both the internet and internal WANs, through mobile or web client applications. This lets us simulate multiple end-user environments, even during off-peak or non-business hours, to check service levels against the SLA, and it yields constant, repeatable measurements that are ready for comparison and baselining (a minimal synthetic-check sketch follows this list). It resembles testing car safety with crash-test dummies: you learn a lot about the car, but it might not directly translate to the experience of actual passengers in real-life situations (see Figure 2).
  2. Endpoint Instrumentation: When we look at user experience from the perspective of the instrumented endpoint (mobile client or web browser), we also get data for analyzing user behavior (a sketch of a collector for such client-side measurements follows below). Nevertheless, performance analysis based on endpoint instrumentation alone is like looking only at the speedometer and a few other gauges on your car's dashboard: it tells you plenty about your speed, RPMs and fuel level, but if the engine blows a piston and stops the car, the dashboard alone won't tell you the source of the problem... it just says your speed is zero. Endpoint instrumentation is also highly dependent on the client's technology stack, with which it has to interact seamlessly.
  3. Network Packet Capture and Analysis within the Data Center: Analyzing network traffic across the whole data center, all the way to the end-user application, lets you correlate end-user experience with the actual state of the network and services; it remains the only feasible UEM option when the client front end cannot be instrumented but still uses TCP to connect to the data center (a packet-timing sketch appears after Figure 2). This perspective, however, remains blind to the impact of everything that happens outside the data center, e.g., third-party components. As with endpoint instrumentation, the car's instruments alone may not accurately reflect the quality of the journey.
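
To make the synthetic perspective concrete, here is a minimal sketch of a scripted check. It assumes a hypothetical target URL and SLA threshold (neither comes from the article); a production robot would replay a multi-step business transaction on a schedule from several locations.

```python
"""Minimal synthetic-check sketch (illustrative only).

The target URL, timeout and SLA threshold are hypothetical placeholders.
"""
import time
import urllib.request

TARGET_URL = "https://example.com/"   # hypothetical monitored transaction
SLA_THRESHOLD_SECONDS = 2.0           # hypothetical service-level target


def run_synthetic_check(url: str) -> float:
    """Execute one scripted request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()               # download the full body, as a real client would
    return time.perf_counter() - start


if __name__ == "__main__":
    elapsed = run_synthetic_check(TARGET_URL)
    status = "OK" if elapsed <= SLA_THRESHOLD_SECONDS else "SLA BREACH"
    print(f"{TARGET_URL}: {elapsed:.2f}s ({status})")
```

Run on a schedule from each probe location, such checks yield the constant, repeatable measurements described above.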

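For the endpoint perspective, the sketch below shows a minimal server-side collector for client-side timing beacons. It assumes a hypothetical agent (for example, a JavaScript snippet embedded in the page) that POSTs JSON such as {"page": "/checkout", "loadTimeMs": 1840} to a /beacon endpoint; the path, port and field names are illustrative and not tied to any particular product.

```python
"""Minimal beacon-collector sketch (illustrative only).

Assumes a hypothetical client-side agent that POSTs JSON timing payloads,
e.g. {"page": "/checkout", "loadTimeMs": 1840}, to /beacon on port 8080.
"""
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class BeaconHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/beacon":
            self.send_response(404)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        beacon = json.loads(self.rfile.read(length) or b"{}")
        # A real deployment would forward this to an analytics store;
        # here we simply print the client-reported load time.
        print(f"{beacon.get('page')}: {beacon.get('loadTimeMs')} ms")
        self.send_response(204)       # acknowledge the beacon without a body
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), BeaconHandler).serve_forever()
```
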
Figure 2: Automotive analogy for UEM: each perspective tells only part of the story
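
For the data-center perspective, the following sketch approximates server response time from captured traffic. It assumes the scapy library and a hypothetical capture file recorded at the data-center edge, pairing the first data-bearing client packet with the first data-bearing server reply on each TCP connection; commercial appliances do this continuously and far more thoroughly, so treat it purely as an illustration of the idea.

```python
"""Minimal packet-timing sketch (illustrative only).

Assumes scapy is installed and 'edge_capture.pcap' (a hypothetical file
recorded at the data-center edge) contains plain HTTP traffic on port 80.
"""
from collections import defaultdict

from scapy.all import IP, TCP, Raw, rdpcap

SERVER_PORT = 80                      # hypothetical service port
first_request = {}                    # connection 4-tuple -> request timestamp
response_times = defaultdict(list)    # server IP -> observed response times

for pkt in rdpcap("edge_capture.pcap"):
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        continue                      # only data-bearing TCP segments matter here
    ip, tcp = pkt[IP], pkt[TCP]
    if tcp.dport == SERVER_PORT:      # client -> server request data
        key = (ip.src, tcp.sport, ip.dst, tcp.dport)
        first_request.setdefault(key, float(pkt.time))
    elif tcp.sport == SERVER_PORT:    # server -> client response data
        key = (ip.dst, tcp.dport, ip.src, tcp.sport)
        if key in first_request:
            response_times[ip.src].append(float(pkt.time) - first_request.pop(key))

for server, times in response_times.items():
    print(f"{server}: avg response {sum(times) / len(times):.3f}s "
          f"over {len(times)} request/response pairs")
```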

To read more and learn additional insights, click here for the full article.

More Stories By Sebastian Kruk

Sebastian Kruk is a Technical Product Strategist, Center of Excellence, at the Compuware APM Business Unit.
