
Category: Benchmarking

A Chrome-plated example

A couple of weeks ago, we talked about how benchmarks have to evolve to keep up with the changing ways people use their devices. One area where we are expecting a lot of change in the next few months is Chromebooks.

These web-based devices have become very popular, even outselling Macs for the first time in Q1 of this year. Chromebooks run Google Apps and a variety of third-party Chrome apps that also run on Windows, Mac, and Linux systems.

Back in May, Google announced that Android apps would be coming to Chromebooks. This exciting development will bring a lot more applications to the platform. Now, Google has announced that they will be “moving away” from the Chrome apps platform and will be phasing out Chrome app support on other platforms within the next two years.

Clearly, the uses of Chromebooks are likely to change a lot in coming months. Interestingly, part of the rationale Google gives for this decision is the development of powerful new Web APIs, which will have implications for WebXPRT as well.

As we’ve said before, we’ll be watching and adapting as the applications change.

Eric

Apples to apples?

PCMag published a great review of the Opera browser this week. In addition to looking at the many features Opera offers, the review included performance data from multiple benchmarks, which look at areas such as hardware graphics acceleration, WebGL performance, memory consumption, and battery life.

Three of the benchmarks have a significant, though not exclusive, focus on JavaScript performance: Google Octane 2.0, JetStream 1.1, and WebXPRT 2015. The three benchmarks did not rank the browsers the same way, and in the past, we've discussed some of the reasons why this happens. In addition to the differences in the tests themselves, there are sometimes differences in approach that are worth considering.

For example, consider the test descriptions for JetStream 1.1. You'll immediately notice that the tests are much lower-level than the ones in WebXPRT. However, note these phrases from a few of the test descriptions:

  • code-first-load “…This test attempts to defeat the browser’s caching capabilities…”
  • splay-latency “Tests the worst-case performance…”
  • zlib “…modified to restrict code caching opportunities…”


While the XPRTs test typical performance for higher-level applications, the tests in JetStream are tweaked to stress devices in very specific ways, some of which are not typical. The information these tests provide can be very useful for engineers and developers, but may not be as meaningful to the typical user.

I have to stress that both approaches are valid, but they are doing somewhat different things. There’s a cliché about comparing apples to apples, but not all apples are the same. If you’re making a pie, a Granny Smith would be a good choice, but for snacking, you might be better off with a Red Delicious. Knowing a benchmark’s purpose will help you find the results that are most meaningful to you.

Eric

The things we do now

We mentioned a couple of weeks ago that the Microsoft Store added an option to indicate holographic support, which we selected for TouchXPRT. So, it was no surprise to see Microsoft announce that next year, they will release an update to Windows 10 that enables mainstream PCs to run the Windows Holographic shell. They also announced that they're working with Intel to develop a reference architecture for mixed-reality-ready PCs. Mixed-reality applications, which combine the real world with a virtual one, demand sophisticated computer vision and the ability to learn about the world around them.

As we’ve said before, we are constantly watching how people use their devices. One of the most basic principles of the XPRT benchmarks is to test devices using the same kinds of work that people do in the real world. As people find new ways to use their devices, the workloads in the benchmarks should evolve as well. Virtual reality, computer vision, and machine learning are among the technologies we are looking at.

What sorts of things are you doing today that you weren’t a year ago? (Other than Pokémon GO – we know about that one.) Would you like to see those sorts of workloads in the XPRTs? Let us know!

Eric

An anniversary update

The Windows 10 Anniversary Update release is scheduled for August 2, and we’ve been running the XPRTs on the Windows Insider preview builds. While we can’t publish performance data from developer builds, we’re happy to say that WebXPRT and TouchXPRT run well on the Anniversary Update.

The story for HDXPRT 2014 is more complicated. Back in May, we reported that it would not run on more recent versions of Windows. However, we’ve identified steps that enable HDXPRT to run on the current stable Windows 10 build, as well as the latest Anniversary Update preview. It’s running well, but it’s possible that testers will encounter other issues as Microsoft releases new builds.

We have included the steps below. We’re considering an update to HDXPRT 2014 that will incorporate these changes. If you have any comments or suggestions related to HDXPRT, please let us know.

Justin

Summary
In addition to the normal system configuration requirements for HDXPRT, testers must also overwrite HDXPRT’s CPU-Z files with newer versions and change the default browser from Microsoft Edge to Internet Explorer. After configuring the system for HDXPRT testing, testers may encounter errors related to administrative privileges when attempting to launch Microsoft Edge. Returning User Account Control settings to their default pre-configuration state resolves the problem.

Process
1. Install the latest version of CPU-Z.
      a. Open any browser and download the latest version of CPU-Z for Windows
          (currently CPU-Z 1.76).
      b. Install CPU-Z on the system, using the default settings and installation path.
2. Install the HDXPRT 2014 benchmark using the default installation process. Reboot the system
    after installation.
3. Copy all the files from the C:\Program Files\CPUID\CPU-Z\ directory to C:\Program Files
    (x86)\HDXPRT\bin, overwriting the existing CPU-Z files.
4. Change the default browser from Microsoft Edge to Internet Explorer:
      a. Open the Windows Settings app and select System/Default apps.
      b. Under Web browser, click the Edge icon, and select Internet Explorer from the list.
      c. At the Before you switch window, click Switch anyway.
      d. Close the Settings app.
5. Adjust SmartScreen and security settings:
      a. Open Internet Explorer.
      b. Go to Settings/Internet options/Security, and make the following changes for the Internet
           and Trusted Sites zones:
            i. Select Custom Level.
            ii. Disable SmartScreen Filter.
            iii. Under Launching applications and unsafe files, click Enable (not Secure).
            iv. Click OK, and click Apply. If a warning message appears, click Yes.
6. Restart the system.
7. Open HDXPRT and run the benchmark normally.
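For testers who script their benchmark setup, step 3 above (overwriting HDXPRT's bundled CPU-Z files with the newer ones) can be automated. This is a minimal Python sketch, not part of HDXPRT itself; the paths in the comment are the defaults from the steps above, and the `overwrite_cpuz_files` helper is our own hypothetical name:

```python
import shutil
from pathlib import Path

def overwrite_cpuz_files(src_dir, dest_dir):
    """Copy every file from src_dir into dest_dir, replacing files that already exist."""
    src = Path(src_dir)
    dest = Path(dest_dir)
    copied = []
    for f in src.iterdir():
        if f.is_file():
            shutil.copy2(f, dest / f.name)  # copy2 preserves timestamps and metadata
            copied.append(f.name)
    return copied

# Default paths from steps 1-3; adjust if you installed to a different location.
# Run from an elevated prompt, since both are under Program Files:
# overwrite_cpuz_files(r"C:\Program Files\CPUID\CPU-Z",
#                      r"C:\Program Files (x86)\HDXPRT\bin")
```

The same file copy can, of course, be done by hand in File Explorer, as described in step 3.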

If, after installing HDXPRT, you encounter an error related to administrative permissions when trying to open Microsoft Edge, return User Account Control to its default setting, and restart the system. The default User Account Control setting is the third notch from the bottom: “Notify me only when apps try to make changes to my computer.”

Open source?

We’re proud of the BenchmarkXPRT Development Community and its accomplishments over the last five years. We’re also thankful for the contributions the members of the community have made. One of the benefits of membership is access to the source code for all the XPRT performance tools. This has meant that the code is available to anyone willing to take the easy step of joining the community.

Behind our decision to use this model rather than a more traditional, open-source model was the need to control derivative works. The license agreement for the source allows members to modify the source, but not to claim that the results from that derivative code are XPRT results. For example, as a member, you may download the TouchXPRT source and modify the workloads for your specific purposes, but you can’t refer to the results as TouchXPRT results.

After much thought and discussion, we have come to believe that we can protect the benchmarks’ reputation within a traditional, open-source framework. While our original concerns are still valid, we think that the success and stature of the XPRTs are such that we can make them available via open source.

However, before we take this step, we want to hear the thoughts, concerns, and opinions of both our community members and the wider public.

Please note that if we do make the code open source, the other benefits of being a member—access to requests for comment, design documents, and community previews—will not change.

Please let us know what you think. Email us or contact us on Twitter.

Bill

Seeing the future

Back in April we wrote about how Bill’s trip to IDF16 in Shenzhen got us thinking about future benchmarks. Technologies like virtual reality, the Internet of things, and computer vision are going to open up lots of new applications.

Yesterday I saw an amazing article that talked about an automatic computer vision system that is able to detect early-stage esophageal cancer from endoscopy images. These lesions can be difficult for physicians to detect, and the system did very well when compared to four experts who participated in the test. The article contains a link to the original study, for those of you who want more detail.

To me, this is the stuff of science fiction. It’s a very impressive accomplishment. Clearly, new technologies are going to lead to many new and exciting applications.

While this type of application is more specialized than the typical XPRT, things like this get us really excited about the possibilities for the future. Have you seen an application that impressed you recently? Let us know!

Eric
