EUC Score introduces a benchmarking methodology and a toolset focused on what a remote user really sees on the endpoint screen while measuring the time-correlated load metrics generated on the host platform. Test engineers and IT infrastructure architects can apply the underlying, commonly accepted best practices and "recipes" for benchmarking, quality management and performance optimization in End-User Computing (EUC) and Desktop-as-a-Service (DaaS) environments.
Our benchmarking approach is based on valid experimental evidence and rational discussion. Lab experiments conducted with the EUC Score toolset allow you to test EUC theories and provide the basis for scientific knowledge. In other words, the findings from a properly designed EUC Score experiment can make the difference between guesswork and solid facts. Our approach can also call for a new EUC theory, either by showing that an accepted theory is incorrect or by revealing a new phenomenon that needs explanation. EUC Score test engineers may also want to investigate a phenomenon simply because it looks interesting.
It is critically important to pose a testable EUC question, state a hypothesis or theory, and then design an EUC Score experiment with the goal of answering the question or refining the theory. It is our strong belief that the learnings from EUC Score experiments allow us to deliver a better remote end-user experience for consumers and business users.
The prerequisites for performing simple EUC Score experiments are as follows:
The prerequisites for performing advanced EUC Score experiments are as follows:
The Terminology page includes the technical or special terms used in the context of EUC Score experiments.
Building the EUC test lab environment represents an important phase in each EUC Score experiment. Test environments are built by EUC benchmarking experts or test engineers. This includes the following steps:
Check out the lab equipment page for more details on video capture devices and WAN emulators.
It is important to note that the EUC Score toolset does not include a component or agent that requires system privileges. Admin privileges are needed only when the test setup requires multi-user support; in single-user or VDI scenarios, standard user privileges are sufficient. The toolset can even be deployed by simple copy-and-paste if needed. This makes it a low-touch deployment.
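As an illustration, the following Python sketch shows what such a low-touch deployment could look like: the toolset folder is copied into a user-writable location and launched with standard user privileges. The source path, target path and executable name are placeholders, not the actual toolset layout.

```python
# Minimal sketch of a low-touch, user-mode deployment (no elevation required).
# All paths and the executable name are hypothetical placeholders.
import shutil
import subprocess
from pathlib import Path

source = Path(r"\\fileserver\staging\euc-score-toolset")  # hypothetical staging share
target = Path.home() / "EUCScore"                         # user-writable location

# Copy the toolset into the user profile; no installer or admin rights needed.
shutil.copytree(source, target, dirs_exist_ok=True)

# Start a toolset executable in the user's session with standard privileges.
subprocess.Popen([str(target / "SimloadRunner.exe")])
```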
EUC Score is compatible with all remoting protocols. So far, we have tested remote sessions connected over Microsoft RDP, Azure Virtual Desktop RDP SxS, Citrix HDX, VMware Blast, Teradici PCoIP, AWS NICE DCV, Frame FRP and several HTML5 protocols. Screen resolution is limited only by the video capture device; the highest we can go with today's frame grabbers is 4K at 60 fps.
This project phase focuses on performing controlled and reproducible EUC Score experiments. Selected test sequences are executed in a closely controlled way and optionally recorded as screen videos using a video capture device or frame grabber connected to a physical endpoint device's video output.
Here are the steps to successfully perform EUC Score test runs and collect benchmark datasets:
The EUC Score toolset includes two host-side components for collecting telemetry data every second. One is built into Simload Runner; the other, named Telemetry Collector, is integrated into the Avatar and is also exposed through PowerShell. Both allow the performance counter set to be modified through config files, but the set should be kept below 25 counters to limit resource consumption. The supplementary tools included in the base installation package, such as Telemetry Tracker, use the Simload Runner method to collect telemetry data.
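To illustrate the idea of a small, configurable counter set sampled once per second, here is a hedged Python sketch built around the standard Windows typeperf utility. The config file name, its one-counter-per-line format and the output file are assumptions made for this example; they do not reflect the toolset's actual config files or collection mechanism.

```python
# Illustrative only: sample a configurable set of Windows performance counters
# once per second using the built-in typeperf command-line tool.
import subprocess
from pathlib import Path

# Hypothetical config file with one performance counter path per line,
# e.g. \Processor(_Total)\% Processor Time
counters = [
    line.strip()
    for line in Path("counters.cfg").read_text().splitlines()
    if line.strip() and not line.startswith("#")
]

# Keep the set small; fewer than 25 counters limits the sampling overhead.
assert len(counters) < 25, "trim the counter set to limit resource consumption"

# One sample per second (-si 1), 300 samples (-sc 300), written to a CSV file.
subprocess.run(
    ["typeperf", *counters, "-si", "1", "-sc", "300",
     "-f", "CSV", "-o", "telemetry.csv", "-y"],
    check=True,
)
```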
The Telemetry Collector component built into the Avatar is particularly interesting for GPU-accelerated user sessions because it hooks into the Windows Driver API to collect GPU data even when no performance counters are available. This is a unique selling point of EUC Score.
The results collected in the previous phase are fed into Sync Player, a side-by-side video player with data overlay that presents screen videos and performance data in a way that is easy to understand and interpret. This allows the analysis of the most important remote end-user experience factors, such as user interface response times, screen refresh cycles (frame rates), supported graphics formats and media types, perceived graphics and media performance, noticeable media distortion caused by codecs, and performance influenced by varying network conditions.
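To make the idea of time-correlated screen videos and telemetry more concrete, the following Python sketch aligns per-second telemetry samples with video frame timestamps, roughly what a side-by-side review with data overlay needs. The CSV column names are hypothetical and do not reflect Sync Player's actual data format.

```python
# Sketch of matching per-second telemetry rows to captured video frames
# by timestamp. Column names ("timestamp", "cpu_percent") are assumptions.
import csv
from datetime import datetime

def load_telemetry(path):
    """Map whole-second timestamps to telemetry rows read from a CSV file."""
    with open(path, newline="") as f:
        rows = csv.DictReader(f)
        return {
            datetime.fromisoformat(row["timestamp"]).replace(microsecond=0): row
            for row in rows
        }

def overlay_value(telemetry, frame_time, column="cpu_percent"):
    """Return the telemetry value to overlay on a frame captured at frame_time."""
    row = telemetry.get(frame_time.replace(microsecond=0))
    return row[column] if row else None
```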
Below are the individual steps required to analyze the test results:
Check out some community experiments and their results, visualized by Sync Player.