

We haven’t mentioned this in a while

I had a conversation with a community member yesterday who wanted to know whether we would test his device with one of the XPRTs. The short answer is “Absolutely!” The somewhat longer answer follows.

If you send us a device you want us to test, we will do so, with the appropriate set of XPRTs, free of charge. You will know that an impartial, third-party lab has tested your device using the best benchmarking practices. After we share the results with you, you will have three options: (1) to keep the results private, (2) to have us make the results public immediately in the appropriate XPRT results databases, or (3) to delay releasing the results until a future date. Regardless of your choice, we will keep the device so that we can use it as part of our testbed for developing and testing future versions of the XPRTs.

When we add the results to our online databases, we will cite Principled Technologies as the source, indicating that we stand behind the results.

The free testing includes no collateral beyond publishing the results. If you would like to publicize them through a report, an infographic, or any of the other materials PT can provide, just let us know and the appropriate person will contact you to discuss how much those services would cost.

If you’re interested in getting your device tested for free, contact us at BenchmarkXPRTSupport@principledtechnologies.com.

Eric

Question we get a lot

“How come your benchmark ranks devices differently than [insert other benchmark here]?” It’s a fair question, and the reason is that each benchmark has its own emphasis and tests different things. When you think about it, it would be more surprising if all benchmarks agreed.

To illustrate the phenomenon, consider this excerpt from a recent browser shootout in VentureBeat:

While this looks very confusing, the simple explanation is that the different benchmarks are testing different things. To begin with, SunSpider, Octane, JetStream, Peacekeeper, and Kraken all measure JavaScript performance. Oort Online measures WebGL performance. WebXPRT measures both JavaScript and HTML5 performance. HTML5Test measures HTML5 compliance.

Even with benchmarks that test the same aspect of browser performance, the tests differ. Kraken and SunSpider both test the speed of JavaScript math, string, and graphics operations in isolation, but they run different sets of tests to do so. Peacekeeper profiles the JavaScript from sites such as YouTube and Facebook.

WebXPRT, like the other XPRTs, uses scenarios that model the types of work people do with their devices.
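
To make the contrast concrete, here is a minimal TypeScript sketch. None of this code comes from SunSpider, Kraken, WebXPRT, or any other benchmark; the function names and workloads are invented purely for illustration. The first function times a single kind of operation in isolation, the way a microbenchmark does, while the second mixes string handling, data manipulation, and math the way a usage scenario does.

// Purely illustrative sketch: not code from any actual benchmark.

// Microbenchmark style: time one kind of operation in isolation.
function timeIsolatedMath(iterations: number): number {
  const start = Date.now();
  let sum = 0;
  for (let i = 1; i <= iterations; i++) {
    sum += Math.sqrt(i) * Math.log(i); // nothing but math
  }
  const elapsed = Date.now() - start;
  console.log(`isolated math: ${elapsed} ms (checksum ${sum.toFixed(2)})`);
  return elapsed;
}

// Scenario style: mix the kinds of work a real task involves, so
// string handling, array operations, and math all shape the result.
function timeScenario(iterations: number): number {
  const start = Date.now();
  const labels: string[] = [];
  let total = 0;
  for (let i = 1; i <= iterations; i++) {
    labels.push(`photo-${i}`.toUpperCase()); // string work
    if (labels.length > 100) labels.shift(); // data handling
    total += Math.sqrt(i);                   // some math, too
  }
  const elapsed = Date.now() - start;
  console.log(`scenario mix: ${elapsed} ms (kept ${labels.length} labels, sum ${total.toFixed(2)})`);
  return elapsed;
}

timeIsolatedMath(1_000_000);
timeScenario(1_000_000);

A device with very fast math performance could win the first timing yet fall behind on the second if its string or memory handling lags, which is exactly the kind of reordering you see when different benchmarks rank the same set of devices.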

It’s no surprise that the order changes depending on which aspect of the Web experience you emphasize, in much the same way that the most fuel-efficient cars might not be the ones with the best acceleration.

This is a bigger topic than we can deal with in a single blog post, and we’ll examine it more in the future.

Eric

The benefits of membership

We have a couple of goodies for community members coming tomorrow.

The TouchXPRT 2016 design overview will tell you what we’re planning for the upcoming community preview. Thanks to everyone who’s contributed ideas. Let us know if the design overview omits anything you’d like to see in the benchmark.

The MobileXPRT 2015 source code and build instructions will be available as well. Community members have access to the source for all the XPRT benchmarks. Making the source available is a pillar of the community model.

Look for the design overview and source code in the members’ area.

If you aren’t yet a member, this is a great time to join!

Eric

Impressed and excited

A couple of weeks ago we talked about some initiatives we’ve been exploring. This week, we’re happy to be able to talk about a new project. We’re sponsoring a senior project with the computer science department at North Carolina State University (NCSU).

As part of their education, small teams of seniors work with local companies on various programming projects. The students are expected to put significant time into these projects, and the tasks aren’t easy. You can get a sense of the range and complexity of these projects by looking at past project proposals.

We submitted a proposal to NCSU for creating an experimental benchmark test; they accepted it and assigned a team of students to us. I’ve met with the team a couple of times now, and I’m impressed and very excited about what they’re going to do. We’re not ready to talk about the project yet, but if they’re successful, we’ll make the new test available on the BenchmarkXPRT site. We might even include it in a future XPRT!

Our hope is that if this project is successful, we can replicate it at other schools and help train the next generation of benchmark developers. As a bonus, the BenchmarkXPRT community will get some fresh perspectives and some new experimental test tools.

Eric

BenchmarkXPRT in China

Last week, we talked about some of the changes we’re making to the BenchmarkXPRT site to make it easier to use. This week, we’d like to talk a bit about improvements we’ve been making to support our users in China.

As you may remember, the first of the XPRTs to have a Chinese UI was BatteryXPRT. We’ve since released WebXPRT 2015 and MobileXPRT 2015, both of which also have Chinese UIs. We’re also in the process of getting MobileXPRT 2015 listed in several major Chinese app stores. (MobileXPRT 2013 is currently available from Xiaomi and Zhushou 360.)

In other words, we’re always thinking of ways to enhance the XPRT experience for our users in China. To improve download speeds, we’ve long hosted WebXPRT on a mirror site in Singapore. Recently, based on feedback from our users and our own analysis, we’ve changed the way the privacy notice is displayed on that site. The change allows you to run WebXPRT without loading Google Analytics, which means faster load times for all users.

We will continue to work to improve our localization. This is an area where we can use the help of the community. If you have translation skills and want to contribute the strings for a UI in your language, let us know.

Eric

Last week in the XPRTs

We added a new TouchXPRT result
We added a new HDXPRT result

MobileXPRT 2015 is here!

Today, we’re releasing MobileXPRT 2015, the latest version of our tool for evaluating the performance of Android devices. The BenchmarkXPRT Development Community has been using a community preview for several weeks, but now anyone can run MobileXPRT and publish their results.

MobileXPRT 2015 is compatible with systems running Android 4.4 and above. It is a 64-bit app, but it will work on both 32-bit and 64-bit hardware. The new release includes the same performance workloads as MobileXPRT 2013, but not the UX tests. If you need the UX tests, MobileXPRT 2013 will continue to be available here.

MobileXPRT 2015 is available at MobileXPRT.com and on the Google Play store. Alternatively, you can download the app using either of the links below:


After trying out MobileXPRT 2015, please submit your scores here and send any comments to BenchmarkXPRTsupport@principledtechnologies.com. We’re eager to see how you’ll use this tool!
