
Category: Performance benchmarking

Upcoming experiments

Next week, we’ll be releasing the design overview for WebXPRT 2015. WebXPRT 2013 has been an enormous success, having been run tens of thousands of times.

One of the big improvements we are considering for WebXPRT 2015 is adding experimental tests. A big reason for WebXPRT’s success is that it runs on almost every Web-enabled device. We consider it essential to preserve this broad compatibility. However, there are interesting Web technologies that simply are not available on all devices.

Our proposal is to add experimental tests to WebXPRT. These tests would be optional and would not be included in the Overall score, so WebXPRT would still be able to compare the performance of widely different devices. We are looking at technologies such as Web Workers, WebGL, and asm.js (a highly optimizable subset of JavaScript that browsers can compile ahead of time).
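To make the idea concrete, here is a minimal TypeScript sketch of the kind of feature detection an optional test could perform before running. The function and test names are our illustration, not actual WebXPRT code.

```typescript
// Hypothetical feature checks an optional, experimental test might run
// before attempting its workload; these names are our illustration, not
// WebXPRT code.

function hasWebWorkers(): boolean {
  // Web Workers let a page run JavaScript off the main thread.
  return typeof Worker !== "undefined";
}

function hasWebGL(): boolean {
  // WebGL exposes GPU-accelerated 3D rendering through a <canvas> context.
  const canvas = document.createElement("canvas");
  return canvas.getContext("webgl") !== null ||
         canvas.getContext("experimental-webgl") !== null;
}

// Experimental tests that a device does not support would simply be
// skipped and left out of the Overall score.
const experimentalTests = [
  { name: "Web Workers test", supported: hasWebWorkers() },
  { name: "WebGL test", supported: hasWebGL() },
];

for (const t of experimentalTests) {
  console.log(`${t.name}: ${t.supported ? "will run" : "skipped"}`);
}
```

A device that lacks one of these technologies would simply skip the corresponding experimental test, keeping the Overall score comparable across devices.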

In addition to adding experimental tests, we are looking at ways to improve the UI, add automation, add new tests, update old tests, and more!

If you are a community member, you’ll get a notice when the overview is available. We will definitely want to know what you think! If you are not a member, it’s a great time to join.

If you have any thoughts on these ideas, or have ideas of your own, please let us know!

Eric

Comment on this post in the forums

CrXPRT is here!

Today we are releasing the CrXPRT 2014 Community Preview (CP1). As mentioned in a previous post, CrXPRT contains performance and battery life tests. The performance suite includes five scenarios utilizing Web browsing and JavaScript workloads, plus Portable Native Client (PNaCl) and WebGL-based scenarios. The battery life test incorporates all of the performance workloads and adds video playback, audio playback, and HTML5 gaming scenarios.

The battery life test in CP1 builds on the lessons we learned developing BatteryXPRT 2014 for Android. In fact, we’ve been able to shorten the testing time: BatteryXPRT 2014 requires 5.5 hours to estimate battery life, while CP1 can produce an estimate in only 3.5 hours. The battery test in CP1 still requires that the device be put in developer mode, so we’re investigating the new Chrome OS battery status APIs. We hope these will make it possible to remove this restriction in a future release.
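We haven’t settled on which API will prove suitable, but as one rough illustration, the W3C Battery Status API that Chrome already exposes can be queried along these lines (a TypeScript sketch of ours, not CrXPRT code):

```typescript
// A rough sketch (ours, not CrXPRT code) of reading battery state through
// the W3C Battery Status API that Chrome exposes.
async function logBatteryState(): Promise<void> {
  // navigator.getBattery() resolves to a BatteryManager; the cast avoids
  // relying on lib.dom typings that may not declare it.
  const battery = await (navigator as any).getBattery();

  console.log(`Charging: ${battery.charging}`);
  console.log(`Level: ${(battery.level * 100).toFixed(0)}%`);
  console.log(`Estimated seconds to empty: ${battery.dischargingTime}`);

  // A battery-life test could sample these values before and after a
  // workload instead of relying on developer-mode shell access.
  battery.addEventListener("levelchange", () => {
    console.log(`Level changed: ${(battery.level * 100).toFixed(0)}%`);
  });
}

logBatteryState().catch((err) => console.error("Battery API unavailable:", err));
```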

The estimates for battery life are generally pretty accurate. However, we have seen runs where the battery life results were much higher than expected. We are continuing to investigate this. If you see an anomalous result, please let us know. It is worth noting that the performance scores have been very consistent.

Because this is a community preview, you have to be a community member to download it. However, joining is very easy.

Check out the new CrXPRT, and let us know what you think!

Eric

Comment on this post in the forums 

Seeing the whole picture

In past posts, we’ve discussed how people tend to focus on hardware differences when comparing performance or battery life scores between systems, but software factors such as OS version, choice of browser, and background activity often influence benchmark results on multiple levels.

For example, AnandTech recently published an article explaining how a decision by Google Chrome developers to speed up Web page rendering may have introduced a tradeoff between performance and battery life. To improve performance, Chrome asks Windows to use a 1ms timer interrupt interval instead of the default 15.6ms. Unlike applications that wait for the default interval, Chrome wakes up more often and gets its work done sooner.

The tradeoff for that increased performance is that waking the OS more frequently can diminish the effectiveness of a system’s built-in power-saving features, such as the tick-less kernel and timer coalescing in Windows 8, or efficiency innovations in a new chip architecture. In this case, because of the OS-level interactions between Chrome and Windows, a faster browser could end up having a greater impact on battery life than one might initially suspect.

The article discusses the limitations of its test in detail, particularly Chrome 36’s inability to natively support the same HiDPI resolution as the other browsers, but the point we’re drawing out here is that accurate testing means taking all relevant factors into consideration. People are used to the idea that changing browsers may affect Web performance, but much less is said about a browser’s impact on battery life.

Justin

Comment on this post in the forums

Looking for a bargain?

There are many benefits to being a member of the community: the XPRT community previews, the source code for the benchmarks, the monthly newsletter, and more. To join the community, all you’ve had to do up until now is sign up and pay a one-time $20 fee. Our goal with the fee was to make sure that people who joined were serious.

Today, we’re announcing a change. We recognize that, for some companies, getting that $20 fee reimbursed can be a hassle. So, if you work for a device maker, OEM, chip manufacturer, or retailer, you’ll be able to join the community for free.

Here’s how it works: simply fill out the form, use your company e-mail address, and click the option to be considered for a free membership. We’ll send you an email within one business day to verify that the address is real, and then we’ll activate your membership.

Simple, right?

Justin

Comment on this post in the forums

More details to come

As we’ve been saying for the past couple of months, we’re working on a benchmark for Chrome OS. The experimentation phase is winding down, and we are starting to shape the code into a usable benchmark. The design will leverage existing WebXPRT tests, of course, but we’re going far beyond that. The benchmark will include video playback, 3D modeling via WebGL, and even an HTML5 game. It will also exercise Chrome OS’s native execution capability through the Portable Native Client (PNaCl), the recommended tool chain for Native Client; using PNaCl also lets the benchmark run on more platforms.
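For readers unfamiliar with PNaCl, here is a rough TypeScript sketch of how a page can detect PNaCl support and load a module. The manifest name is hypothetical, and this is our illustration rather than code from the benchmark.

```typescript
// A rough sketch (ours, not the benchmark's code) of detecting PNaCl support
// and embedding a PNaCl module; "workload.nmf" is a hypothetical manifest name.

function pnaclSupported(): boolean {
  // Chrome advertises Portable Native Client through this MIME type.
  return navigator.mimeTypes.namedItem("application/x-pnacl") !== null;
}

function embedPnaclModule(manifestUrl: string): void {
  // A PNaCl module is loaded with an <embed> element that points at a
  // .nmf manifest describing the portable executable (.pexe).
  const embed = document.createElement("embed");
  embed.setAttribute("type", "application/x-pnacl");
  embed.setAttribute("src", manifestUrl);
  embed.setAttribute("width", "0");
  embed.setAttribute("height", "0");
  document.body.appendChild(embed);
}

if (pnaclSupported()) {
  embedPnaclModule("workload.nmf"); // hypothetical manifest name
} else {
  console.log("PNaCl not available; native workload skipped.");
}
```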

As we mentioned before, we’re including a battery test as part of the new benchmark. So far, we haven’t found a way to remove the requirement to put the device in developer mode for the battery test.

Next week, we’ll publish a design document for the community to review. As always, the design document is based on the comments and suggestions we received combined with our own research and experimentation.

Eric

Comment on this post in the forums

It makes a difference

Ars Technica reported this week that they tested the developer preview of Android L and saw a whopping 36 percent improvement in battery life! Google made improving battery life a priority, and it sounds like they are succeeding. I can’t wait to test Android L with BatteryXPRT.

This is a spectacular example of how a change in software can change benchmark results, but it’s hardly unique. I’ve written before about how background activity on a phone depressed my friend’s WebXPRT scores. AnandTech used both IE 11 and Chrome 30 to test the Surface Pro 2 with a variety of benchmarks, including WebXPRT, SunSpider, Octane, Browsermark, and others. Browser choice had a noticeable impact on results – about a 40 percent difference for WebXPRT and a 76 percent difference for SunSpider!

People are generally pretty aware that changing the hardware changes performance. However, sometimes they lose track of software differences. When you compare scores, it’s not always possible to keep all the variables the same, but it’s crucial to know what the differences are.

In other BenchmarkXPRT news, we’re making some final adjustments to HDXPRT 2014, and the general release is just around the corner.

Eric

Comment on this post in the forums
