BenchmarkXPRT Blog

Category: WebXPRT

Apples to apples?

PCMag published a great review of the Opera browser this week. In addition to looking at the many features Opera offers, the review included performance data from multiple benchmarks, which look at areas such as hardware graphics acceleration, WebGL performance, memory consumption, and battery life.

Three of the benchmarks have a significant, though not exclusive, focus on JavaScript performance: Google Octane 2.0, JetStream 1.1, and WebXPRT 2015. The three benchmarks did not rank the browsers the same way, and in the past, we've discussed some of the reasons why this happens. In addition to the differences in tests, there are also sometimes differences in approach that are worth considering.

For example, consider the test descriptions for JetStream 1.1. You’ll immediately notice that the tests are much lower level than the ones in WebXPRT. However, consider these phrases from a few of the test descriptions:

  • code-first-load “…This test attempts to defeat the browser’s caching capabilities…”
  • splay-latency “Tests the worst-case performance…”
  • zlib “…modified to restrict code caching opportunities…”

While the XPRTs test typical performance for higher-level applications, the tests in JetStream are tweaked to stress devices in very specific ways, some of which are not typical. The information these tests provide can be very useful for engineers and developers, but may not be as meaningful to the typical user.

I have to stress that both approaches are valid, but they are doing somewhat different things. There’s a cliché about comparing apples to apples, but not all apples are the same. If you’re making a pie, a Granny Smith would be a good choice, but for snacking, you might be better off with a Red Delicious. Knowing a benchmark’s purpose will help you find the results that are most meaningful to you.

Eric

An anniversary update

The Windows 10 Anniversary Update release is scheduled for August 2, and we’ve been running the XPRTs on the Windows Insider preview builds. While we can’t publish performance data from developer builds, we’re happy to say that WebXPRT and TouchXPRT run well on the Anniversary Update.

The story for HDXPRT 2014 is more complicated. Back in May, we reported that it would not run on more recent versions of Windows. However, we’ve identified steps that enable HDXPRT to run on the current stable Windows 10 build, as well as the latest Anniversary Update preview. It’s running well, but it’s possible that testers will encounter other issues as Microsoft releases new builds.

We have included the steps below. We’re considering an update to HDXPRT 2014 that will incorporate these changes. If you have any comments or suggestions related to HDXPRT, please let us know.

Justin

Summary
In addition to the normal system configuration requirements for HDXPRT, testers must also overwrite HDXPRT’s CPU-Z files with newer versions and change the default browser from Microsoft Edge to Internet Explorer. After configuring the system for HDXPRT testing, testers may encounter errors related to administrative privileges when attempting to launch Microsoft Edge. Returning User Account Control settings to their default pre-configuration state resolves the problem.

Process
1. Install the latest version of CPU-Z.
      a. Open any browser and download the latest version of CPU-Z for Windows
          (currently CPU-Z 1.76).
      b. Install CPU-Z on the system, using the default settings and installation path.
2. Install the HDXPRT 2014 benchmark using the default installation process. Reboot the system
    after installation.
3. Copy all the files from the C:\Program Files\CPUID\CPU-Z\ directory to the C:\Program Files
    (x86)\HDXPRT\bin directory, overwriting the existing CPU-Z files.
4. Change the default browser from Microsoft Edge to Internet Explorer:
      a. Open the Windows Settings app and select System/Default apps.
      b. Under Web browser, click the Edge icon, and select Internet Explorer from the list.
      c. At the Before you switch window, click Switch anyway.
      d. Close the Settings app.
5. Adjust SmartScreen and security settings:
      a. Open Internet Explorer.
      b. Go to Settings/Internet options/Security, and make the following changes for the Internet
           and Trusted Sites zones:
            i. Select Custom Level.
            ii. Disable SmartScreen Filter.
            iii. Under Launching applications and unsafe files, click Enable (not Secure).
            iv. Click OK, and click Apply. If a warning message appears, click Yes.
6. Restart the system.
7. Open HDXPRT and run the benchmark normally.

If, after installing HDXPRT, you encounter an error related to administrative permissions when trying to open Microsoft Edge, return User Account Control to its default setting, and restart the system. The default User Account Control setting is the third notch from the bottom: “Notify me only when apps try to make changes to my computer.”
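Step 3 of the process above can also be scripted. This is a minimal Python sketch, not part of HDXPRT itself; the function takes the source and destination directories as parameters, with the defaults from the steps above shown in the commented call, so you can adjust for nonstandard installs:

```python
import shutil
from pathlib import Path

def overwrite_cpuz_files(src_dir, dst_dir):
    """Copy every file from src_dir into dst_dir, overwriting any duplicates."""
    src = Path(src_dir)
    dst = Path(dst_dir)
    copied = []
    for item in src.iterdir():
        if item.is_file():
            shutil.copy2(item, dst / item.name)  # copy2 preserves timestamps
            copied.append(item.name)
    return copied

# Default paths from the steps above; adjust if you installed elsewhere.
# overwrite_cpuz_files(r"C:\Program Files\CPUID\CPU-Z",
#                      r"C:\Program Files (x86)\HDXPRT\bin")
```

Run the script from an elevated prompt, since both default directories live under Program Files.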

Getting it right

Back in April, Bill announced that we are working on a cross-platform benchmark. We asked for your thoughts and comments, and the response has been terrific! We really appreciate all the great ideas.

We’ve been using code from MobileXPRT and TouchXPRT as the basis for some experiments. In his post, Bill talked about the difficulty of porting applications. Even though we have expertise in porting applications, it’s proving more difficult than we originally thought. Benchmarks are held to a higher standard than most applications: it’s not enough for the code to run reliably and efficiently; it must also compare the different platforms fairly.

One thing we know for sure: getting it right is going to take a while. However, we owe it to you to make sure that the benchmark is reliable and fair on all platforms it supports. We will, of course, keep you informed as things progress.

In the meantime, keep sending your ideas!
Eric

One benchmark to test them all

It’s no secret that the XPRTs are a great way to get device results you can count on. Tens of thousands of people on six continents have used the XPRTs to help them make smart buying choices, and over a thousand media outlets have quoted XPRT results when reporting on the hottest tech. WebXPRT has always been the “go-to” XPRT, because you can use it to test the widest range of devices. WebXPRT runs in the browser, however, so browser performance influences the results.

For a long time, our members and others have asked for a tool that would let you compare application performance on any type of device. People want a cross-platform XPRT that runs on devices the same way apps do.

We’re excited to announce that we’re going to create just that tool! Specifically, we’re going to create a version of MobileXPRT that runs on Android, iOS, and Windows.

This will not be easy. At one point in my career, I was in charge of a group that ported applications between platforms, and I learned from hands-on experience that doing that job well is very difficult. It’s not enough to simply make the application run; it also has to run efficiently on each type of system. MobileXPRT works at the application level, so we’ll have to deal with the many differences in the operating system architectures and APIs. We’ll have to make sure the code runs well on all three target OSes.

We’re willing to do all this work because the need for such a tool has never been greater. More and more devices hit the market all the time, and choosing the ones you want is tougher than ever. iPhone or Android phone? Windows tablet, Android tablet, or iPad?

The coming MobileXPRT will let buyers around the world answer those questions.

We’re not going to do this work in isolation. We will reach out to the OS vendors, because we want their input, comments, and help. We’ll make the source available to them, and we welcome their critiques and guidance in creating the best possible version for each OS.

Of course, we very much want your input, too. Do you have any thoughts about what you’d like to see in a cross-platform XPRT? If so, let us know!

Bill

Last week in the XPRTs
We published the XPRT Weekly Tech Spotlight on the Apple iPhone SE.
We added one new MobileXPRT ’15 result.
We added seven new WebXPRT ’15 results.

Focusing the spotlight

As you may have heard, the Samsung Galaxy S7 is featured in the XPRT Weekly Tech Spotlight this week. As we were testing it, we noticed that our WebXPRT scores were about 8 percent lower than those reported by AnandTech.

The folks at AnandTech do a good job on their reviews, so we wanted to understand the discrepancy in scores. The S7 comes in a couple of models, so we started by verifying that our model was the same as theirs. It was.

The next step was to check their configuration against ours, and this is where we found the difference. Both phones were running the same version of Android, but the S7 AnandTech tested used Chrome 48 while the S7 we tested came preloaded with Chrome 49. In our testing, we’ve noticed that upgrading from Chrome 48 to Chrome 49 has a noticeable performance impact on certain devices. On the Samsung Galaxy S6, the scores went down about 10 percent. In all cases we’ve seen, the decrease is driven largely by the Stock Option Pricing workload.
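Score comparisons like the ones above boil down to a simple percentage change. A minimal Python sketch, using hypothetical scores rather than our actual results:

```python
def percent_change(old_score, new_score):
    """Percentage change from old_score to new_score (negative = regression)."""
    return (new_score - old_score) * 100.0 / old_score

# Hypothetical WebXPRT scores before and after a browser update:
print(percent_change(100, 92))   # -8.0  (about 8 percent lower)
print(percent_change(200, 180))  # -10.0 (about 10 percent lower)
```

When you compare published scores against your own, computing the gap this way makes it easier to decide whether a difference is noise or something worth investigating, such as a browser-version change.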

This isn’t the first time we’ve written about browser versions affecting results. WebXPRT is a browsing benchmark, and the browser has a legitimate impact on performance. When you’re comparing results, it’s always important to look at all the factors involved.

Justin

Last week in the XPRTs

We published the XPRT Weekly Tech Spotlight on the Samsung Galaxy S7.
We added two new BatteryXPRT ’14 results.
We added one new MobileXPRT ’15 result.
We added four new WebXPRT ’15 results.

Women develop new perspectives for the XPRTs

Last weekend, we had the great privilege of co-hosting the first XPRT Women Code-a-Thon with the Seattle chapter of ChickTech. We couldn’t be happier with the results!

Our goal was to bring together a group of women and invite them to develop ideas for new device workloads—workloads that we might include in future versions of MobileXPRT and WebXPRT. The 20 participants—some working individually, and others working as teams—not only met that goal, they did a great deal more.

On the coding front, the participants achieved an impressive amount of work in a very short time. Though we awarded only three prizes, everyone generated interesting and useful ideas. Our prizes went to the following people:

1st place: Viveret Steele, for a 3D-modeling workload

2nd place: Annmarie Aidoo, for a geolocation workload

3rd place: Molly Fallen and Alex Trimble, for an audio-enhancement workload

These four people went home with checks, but winning wasn’t what motivated anyone to participate. Everyone was excited about developing software and working with others. The social side of the event proved to be as meaningful as the technical. People talked, formed friendships and mentoring relationships, and discussed seeking other events like this one. Two people said the event changed their lives.

In the weeks ahead, we’ll be sharing some more information about the event. In the meantime, we’re proud to have been part of it.

Jennie Faries

Last week in the XPRTs

We published the XPRT Weekly Tech Spotlight on the Microsoft Surface 3.
We added two new CrXPRT ’15 results.
We added two new MobileXPRT ’13 results.
We added six new WebXPRT ’15 results.
