Your Ultimate Toolset for EUC Performance Tests

Measuring perceived remote end-user experience in an organized, valid and repeatable way


 

EUC Score measures and quantifies perceived end-user experience in remote application and digital workspace environments, both on premises and in the cloud. The underlying, vendor-independent test methodology combines simulated workloads, telemetry data collected on the system under test, screen videos captured on the endpoint device, and a distinctive visualization of the results: fast, precise, repeatable and intuitive.

The EUC Score Toolset was designed for proactive synthetic testing scenarios. Good examples are pre-production benchmarking and performance testing, ad-hoc system diagnostics and analysis, quality assurance and control, service level agreement management, and end-user computing research.


From the Community - For the Community and the Enterprise

If you are an individual or a small organization producing EUC benchmarking content and sharing it publicly at no cost, you can request access to the EUC Score Toolset as part of a non-commercial Community Subscription that is free of charge. If you are an enterprise conducting EUC benchmarking tests for commercial or End-User Computing product-related purposes, you must purchase an EUC Score Subscription. We define an enterprise as an organization with either more than 50 PCs or users, or with more than one million U.S. dollars in annual revenues. Please contact us for details: info

Confused by the "jargon"? Check out the technical or special terms used in the context of EUC benchmarking tests.

End-Users First - Six Benefits of Using EUC Score

Diagnose end-user pain symptoms and solve IT support sorrows with proactive synthetic testing

 

Identify potential pain
Pre-production capabilities, performance and load testing

Examine existing pain
Ad-hoc diagnostics in production environments

Prevent new pain
What-if analysis and comparison of new system designs and migration scenarios

Quantify pain relief success
Before-and-after analysis of system optimizations and software updates

Measure chronic pain
DaaS and VDI service level agreement management

Deliver less pain by design
EUC software quality assurance and quality control

 

EUC Score Test Methodology

When interacting with virtual desktops or remote applications, several usability and performance aspects are highly relevant to most users: fast logon, short application load times, a high refresh rate on the endpoint display, great graphics quality, unnoticeable user interface response delays, support for all common media formats, and high session reliability. Only systems that come close to this ideal allow users to become naturally immersed in the digital workspace across a range of endpoint devices with different capabilities and form factors.

We measure or score perceived user experience in virtual desktops or remote applications by running host-side synthetic workloads.

Learn about the three EUC Score test phases.

Design

Build a target environment and select automated workloads that simulate virtual desktop users doing realistic work.

Perform

Orchestrate automated test runs, capture client-side screen videos and collect host-side telemetry data.

Analyze

Analyze collected results by combining screen videos of the primary user session with telemetry data collected on the host machine.

 

Our benchmarking approach strictly follows best practices established in international research projects. The underlying test methodology is based on a range of pre-selected, adaptable application scenarios representing typical use cases. Both fully automated and manual test sequences are executed under closely controlled conditions and recorded as screen videos. These videos are correlated with relevant feature and performance factors, using metrics closely tied to the actual remote end-user experience on popular endpoint devices.
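To illustrate what correlating screen videos with telemetry can look like in principle, here is a minimal Python sketch that pairs client-side video frame timestamps with the nearest host-side telemetry sample via nearest-timestamp matching. All function names, field layouts, and sample values are assumptions made for illustration; they are not part of the EUC Score Toolset.

```python
from bisect import bisect_left

def nearest_sample(telemetry, t):
    """Return the telemetry sample whose timestamp is closest to t.

    telemetry: list of (timestamp_seconds, value) tuples, sorted by timestamp.
    """
    timestamps = [ts for ts, _ in telemetry]
    i = bisect_left(timestamps, t)
    if i == 0:
        return telemetry[0]
    if i == len(telemetry):
        return telemetry[-1]
    before, after = telemetry[i - 1], telemetry[i]
    # On a tie, prefer the earlier sample.
    return before if t - before[0] <= after[0] - t else after

def annotate_frames(frame_times, telemetry):
    """Pair each video frame timestamp with the nearest telemetry sample."""
    return [(t, nearest_sample(telemetry, t)) for t in frame_times]

# Hypothetical data: telemetry sampled once per second (time, CPU %)
# versus a handful of video frame timestamps.
telemetry = [(0.0, 12.5), (1.0, 40.0), (2.0, 95.0)]
frames = [0.0, 0.5, 1.4, 1.6, 2.0]
annotated = annotate_frames(frames, telemetry)
```

In a real analysis the telemetry stream would carry many counters per sample and both clocks would need to be synchronized first, but the core idea stays the same: every video frame gets the host-side measurements that were valid at that moment.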

It is important to note that our EUC Score performance benchmarking methodology works across the boundaries of on-premises and cloud environments - it doesn't matter where the different test components are located.

Confused by the "jargon"? Check out the technical or special terms used in the context of EUC Score experiments.


Commonly Used EUC Score Lab Equipment

This article introduces you to the mandatory and optional equipment required for setting up an EUC Score test lab. The list of equipment includes recommended reference client devices, video capture devices for screen recording, WAN emulators for simulating different network conditions, and devices for measuring user input delay.