EUC Score Test Methodology

Overview

EUC Score introduces a benchmarking methodology and toolset focused on what a remote user actually sees on the endpoint screen while measuring the time-correlated load metrics generated on the host platform. Test engineers and IT infrastructure architects can apply the underlying commonly accepted best practices and "recipes" for benchmarking, quality management and performance optimization in End-User Computing (EUC) and Desktop-as-a-Service (DaaS) environments.

Our benchmarking approach is based on valid experimental evidence and rational discussion. Lab experiments conducted with the EUC Score toolset allow you to test EUC theories and provide the basis for scientific knowledge. In other words, the findings from a properly designed EUC Score experiment can make the difference between guesswork and solid facts. Our approach can also call for a new EUC theory, either by showing that an accepted theory is incorrect or by exhibiting a new phenomenon that is in need of explanation. EUC Score test engineers may also want to investigate a phenomenon simply because it looks interesting.

It is critically important to pose a testable EUC question, state a hypothesis or theory, and then design an EUC Score experiment with the goal of answering the question or refining the theory. It is our strong belief that the lessons learned from EUC Score experiments allow us to deliver a better remote end-user experience for consumers and business users.

The prerequisites for performing simple EUC Score experiments are as follows:

  • Provide a physical PC or a virtual machine with a Windows operating system, acting as the target system.
  • Prepare a user account that allows you to log in to the console of the target system.
  • Download the EUC Score Base package and install it on the target system.

The prerequisites for performing advanced EUC Score experiments are as follows:

  • Provide a physical PC or a virtual machine with a Windows operating system, acting as the target system. Windows must be configured in such a way that it allows remote connections.
  • Prepare or create one or multiple user accounts that allow you to connect to the target system. Ideally, one of these user accounts has administrative privileges.
  • Download the EUC Score Base package, the EUC Score Enterprise package and the Sync Player package for installation on the target system.
  • Provide a physical endpoint device with the remoting software installed that allows remote desktop connections to the target system.
  • Provide a screen video capture device and a video recorder PC with OBS Studio installed.

The Terminology page includes the technical or special terms used in the context of EUC Score experiments.

Phase 1: Design and Build a Test Environment

Building the EUC test lab environment represents an important phase in each EUC Score experiment. Test environments are built by EUC benchmarking experts or test engineers. This includes the following steps:

  1. Define a test goal. This includes deciding whether it is a single-user or multi-user test setup.
  2. Build or provide one or multiple target systems, including guest VMs.
  3. Optional: Include Active Directory, file server resources and other backend services in the test infrastructure.
  4. Optional: Apply system optimization settings required by the test setup.
  5. Install the EUC Score Base package on the target system. It includes some Simloads, the Simload Runner, and a number of supplementary tools.
  6. Optional (Enterprise): Install the EUC Score Enterprise package on the target system. It adds more Simloads, the Avatar tray application, and EUC Score PowerShell modules. Some Simloads included in this package require the additional installation of applications and DirectX libraries.
  7. Review the installed Simloads with the help of the Simload Gallery. Check if all required third-party applications are working as expected and do not show "first start of application" or "accept license agreement" dialog boxes.
  8. Provide a physical endpoint device with remoting software. Document the specifications of this endpoint device.
  9. Prepare data collection: Adapt the Simload Runner or the Avatar telemetry configuration file according to the target system and test setup (see the configuration sketch after this list).
  10. Recommended, but not mandatory: Connect a screen video capture device to the monitor output of the endpoint device.
  11. Recommended, but not mandatory: Provide a screen video recorder PC with OBS Studio installed and connected to the output of a video capture device.
  12. Optional: Set up and configure network components, such as internet routers and WAN emulators.
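
Step 9 touches the telemetry configuration file. Here is a minimal PowerShell sketch of such an adaptation, assuming a hypothetical installation root and a hypothetical XML element; the actual schema of TelemetryDataConfig.xml may differ, so check the file shipped with your installation.

    # Minimal sketch: adapt a telemetry configuration file to the target system.
    # $installRoot and the <TargetSystem> element are hypothetical assumptions.
    $installRoot = 'C:\EUCScore'
    $configPath  = Join-Path $installRoot 'Tools\Telemetry\TelemetryDataConfig.xml'

    [xml]$config = Get-Content -Path $configPath

    # Keep a per-host backup before changing anything.
    Copy-Item -Path $configPath -Destination "$configPath.$env:COMPUTERNAME.bak"

    # Tag the collected data with the current host name (hypothetical element).
    $config.TelemetryDataConfig.TargetSystem = [string]$env:COMPUTERNAME
    $config.Save($configPath)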

Check out the lab equipment page for more details on video capture devices and WAN emulators.

It is important to note that the EUC Score toolset does not include a component or an agent that requires system privileges. Only if the test setup requires multi-user support must the toolset be deployed with admin privileges. In single-user or VDI scenarios, standard user privileges are sufficient; if needed, the toolset can even be deployed by simple copy-and-paste. This makes it a low-touch deployment.
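
As a minimal sketch of such a low-touch deployment, the following PowerShell unpacks a downloaded package into a user-writable folder without admin rights. The package file name and target folder are assumptions, not fixed EUC Score paths.

    # Minimal sketch of a copy-based deployment under standard user privileges.
    $package = Join-Path $env:USERPROFILE 'Downloads\EUCScore-Base.zip'   # assumed name
    $target  = Join-Path $env:USERPROFILE 'EUCScore'                      # user-writable

    # Expand-Archive needs no admin rights when the destination is user-writable.
    Expand-Archive -Path $package -DestinationPath $target -Force

    # Quick sanity check of what was deployed.
    Get-ChildItem -Path $target -Recurse -Depth 1 | Select-Object FullName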

EUC Score is compatible with all remoting protocols. So far, we have tested remote sessions connected over Microsoft RDP, Azure Virtual Desktop RDP SxS, Citrix HDX, VMware Blast, Teradici PCoIP, AWS NICE DCV, Frame FRP and several HTML5 protocols. Screen resolution is limited by the video capture device; the highest we can go with today's frame grabbers is 4K at 60 fps.

Phase 2: Perform Test Runs

This project phase focuses on performing controlled and reproducible EUC Score experiments. Selected test sequences are executed in a closely controlled way and optionally recorded as screen videos using a video capture device or frame grabber connected to a physical endpoint device's video output.

Here are the steps to successfully perform EUC Score test runs and collect benchmark datasets:

  1. Connect user session(s) from the endpoint device.
  2. Important: Collect system information by running the SL0-TestScreen Simload and Check-System.cmd in the Scripts folder. The results are stored in the Results folder.
  3. Check that the telemetry settings in Tools\Telemetry.ini (Simload Runner) or Tools\Telemetry\TelemetryDataConfig.xml (Avatar) are working as expected.
  4. Select the Simloads used during the test sequence. The Simload Gallery helps you to make the right choices.
  5. Optional: Run secondary user sessions (“noisy neighbors”) from a load generator, creating a constant pre-defined background workload.
  6. Run selected primary Simloads and collect datasets.
  7. Optional (requires Enterprise package): Run selected primary Simloads from the Avatar command line and record the screen videos. Rename each screen video file to reflect its Simload name (see the renaming sketch after this list).
  8. Upload the benchmark datasets collected on the target system to an EUC Score data repository for further analysis.
  9. Store a copy of the telemetry configuration file as part of the benchmark dataset.
  10. Preview selected benchmark datasets.
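
The renaming in step 7 can be scripted. Here is a minimal PowerShell sketch, assuming OBS Studio writes MKV files to the user's Videos folder and that the recordings were produced in the same order as the Simloads ran; apart from SL0-TestScreen, the Simload names are placeholders.

    # Minimal sketch: rename screen recordings so each file reflects its Simload.
    $recordings = Get-ChildItem -Path (Join-Path $env:USERPROFILE 'Videos') -Filter '*.mkv' |
        Sort-Object LastWriteTime
    $simloads = @('SL0-TestScreen', 'SL1-Placeholder')   # in recording order

    for ($i = 0; $i -lt [Math]::Min($recordings.Count, $simloads.Count); $i++) {
        Rename-Item -Path $recordings[$i].FullName -NewName ($simloads[$i] + '.mkv')
    }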

The EUC Score toolset includes two host-side components for collecting telemetry data every second. One is built into the Simload Runner; the other, named Telemetry Collector, is integrated into the Avatar and also exposed through PowerShell. Both allow modifying the performance counter set through configuration files, but the set should be limited to fewer than 25 counters to keep resource consumption low. The supplementary tools included in the base installation package, such as Telemetry Tracker, use the Simload Runner method to collect telemetry data.
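
To illustrate what such a per-second collector does conceptually, here is a minimal PowerShell sketch built on the standard Get-Counter cmdlet. The counter selection, sample count and output path are illustrative, not the EUC Score defaults.

    # Minimal sketch: collect a small counter set once per second for one minute.
    $counters = @(
        '\Processor(_Total)\% Processor Time',
        '\Memory\Available MBytes',
        '\System\Processes'
    )   # keep the set well below 25 counters

    Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 60 |
        ForEach-Object {
            [pscustomobject]@{
                Timestamp = $_.Timestamp
                Values    = $_.CounterSamples.CookedValue -join ';'
            }
        } |
        Export-Csv -Path .\telemetry-sample.csv -NoTypeInformation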

The Telemetry Collector component built into the Avatar is particularly interesting for GPU-accelerated user sessions, as it hooks into the Windows Driver API to collect GPU data even when no performance counters are available. This is a unique selling point of EUC Score.
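
A quick way to check whether counter-based GPU telemetry is possible at all on a given target system is to list the GPU counter sets, as in this sketch; where nothing is returned, a counter-based collector is blind, which is exactly the gap the driver-level approach closes.

    # Minimal sketch: list the GPU performance counter sets, if any are exposed.
    Get-Counter -ListSet 'GPU*' -ErrorAction SilentlyContinue |
        Select-Object CounterSetName, Description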


Phase 3: Visualize and Analyze the Test Results

The results collected in the previous phase are fed into Sync Player, a side-by-side video player with data overlay. It presents screen videos and performance data in a way that is easy to understand and interpret. This enables the analysis of the most important remote end-user experience factors, such as user interface response times, screen refresh cycles (frame rates), supported graphics formats and media types, perceived graphics and media performance, noticeable media distortion caused by codecs, and performance under varying network conditions.

Below are the individual steps required to analyze the test results:

  1. On the Windows PC used to analyze EUC Score benchmark datasets, use the data files collected during an experiment and stored in the EUC Score data repository to create a Simload folder structure containing the benchmark datasets, the score result file and the system information files (see the folder structure sketch after this list).
  2. Optional: Install the Sync Player package on the same Windows PC.
  3. Optional (requires Sync Player): Create and edit the Sync Player build file for each test setup.
  4. Optional (requires Sync Player): Run the PowerShell build scripts to create Sync Player clips.
  5. Analyze test results, either by opening the telemetry files in Excel or by watching the Sync Player clips.
  6. Optional (requires Sync Player): Add findings to the Sync Player build file. Use this terminology to describe screen artifacts observed in the screen video recordings.
  7. Optional (requires Sync Player): Recreate Sync Player clips with the findings.
  8. Draw your conclusions and produce a summary of your findings.
  9. Share or publish the results.
  10. Suggest next steps or start all over again.
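
As a minimal sketch of the folder structure preparation in step 1, the following PowerShell copies per-Simload data from a repository share to an analysis PC. The repository path and layout are assumptions; the exact structure expected by Sync Player is defined by its build files.

    # Minimal sketch: build a per-Simload folder structure on the analysis PC.
    $repo     = '\\fileserver\eucscore-repo\Experiment-042'   # hypothetical share
    $analysis = 'C:\EUCScore-Analysis'

    foreach ($simload in Get-ChildItem -Path $repo -Directory) {
        $dest = Join-Path $analysis $simload.Name
        New-Item -ItemType Directory -Path $dest -Force | Out-Null
        # Copy benchmark datasets, score result file and system information files.
        Copy-Item -Path (Join-Path $simload.FullName '*') -Destination $dest -Recurse -Force
    }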

Check out some community experiments and their results, visualized by Sync Player.