
Category: MobileXPRT

Diversity

Fall is beautiful in North Carolina. The temperature is dropping. The leaves are changing color, making the hills scarlet and orange. And, of course, the stores have been decorated for Christmas since Halloween.

As we head into the biggest shopping season of the year, it’s a great time to be getting XPRT results from the hottest devices. In the last few weeks, we’ve published results from

  • tablets such as the Apple iPad Air, Google Nexus 7 2, and both the Microsoft Surface 2 and Microsoft Surface Pro 2
  • phones such as the Apple iPhone 5c, Apple iPhone 5s, and LG G2
  • devices you might not have expected, such as the Amazon Kindle Fire HDX, Barnes and Noble Nook HD+, and NVIDIA Shield

The diversity of devices is nice to see. The results come from PT testing, the press, and benchmark users. Note that you don’t have to be a community member to submit results. The person who submitted the MobileXPRT Nook HD+ results was not a member. If you’ve tested something interesting, send the results on!

Eric

Comment on this post in the forums

Broadening our appeal

As I mentioned last week, we’ve asked the PT design team to help improve the XPRT benchmarks. I’m learning a lot working with them. As someone who’s been involved with benchmarking a long time, it can be a shock to realize that there are people who think “ms” is a magazine, “geomean” has something to do with the environment, and “UX” sounds like it would be a great name for a band. But the fact is that most consumers don’t need to know any of these terms, so our benchmarks shouldn’t rely on them, either.
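(For readers who, like the design team, are new to "geomean": many benchmarks, the XPRTs included, combine subtest results with a geometric mean so that no single subtest dominates the overall score. A minimal Python illustration, with hypothetical subtest values:)

```python
import math

def geomean(values):
    """Geometric mean: the nth root of the product of n values.
    Computed via logs to avoid overflow on large products."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# hypothetical subtest scores; not real XPRT results
subtest_scores = [120.0, 95.0, 240.0]
print(round(geomean(subtest_scores), 1))  # prints 139.9
```

Unlike an arithmetic average, doubling any one subtest score here changes the overall result by the same factor, which is why benchmarks favor it.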

This collaboration is already paying off. The PT design team rewrote the MobileXPRT FAQ, making it much more extensive, accessible, and fun to read. We think the new FAQ is greatly improved, and it’s certainly more informative. We’ll be upgrading the FAQs for the other benchmarks in the near future.

Our efforts are going far beyond FAQs. Data presentation, graphics, the basic UI design philosophy—everything is on the table. Let us know what you think by emailing benchmarkxprtsupport@principledtechnologies.com or by posting on the forums.

Eric

Comment on this post in the forums

Staying out in the open

Back in July, Anandtech publicized some research about possible benchmark optimizations in the Galaxy S4. Yesterday, Anandtech published a much more comprehensive article, “The State of Cheating in Android Benchmarks.” It’s well worth the read.

Anandtech doesn’t accuse any of the benchmarks of being biased—it’s the OEMs who are supposedly doing the optimizations. I will note that none of the XPRT benchmarks are among the whitelisted CPU tests. That being said, I imagine that everyone in the benchmark game is concerned about any implication that their benchmark could be biased.

When I was a kid, my parents taught me that it’s a lot harder to cheat in the open. This is one of the reasons we believe so strongly in the community model for software development. The source code is available to anyone who joins the community. It’s impossible to hide any biases. At the same time, it allows us to control derivative works. That’s necessary to avoid biased versions of the benchmarks being published. We think the community model strikes the right balance.

However, any time there is a system, someone will try to game it. We’ll always be on the lookout for optimizations that happen outside the benchmarks.

Eric

Comment on this post in the forums

Sources

If you’ve checked out the MobileXPRT and WebXPRT pages recently, you’ve probably noticed that the number of results has started to grow. The results are coming from three sources:

  • Internal testing at PT.
  • Results submitted by the public.
  • Results published on the Web. We link back to the source from these results. Results published online include results in reports PT publishes for clients and reviews by other parties.

While we are excited about the growing number of results, we do sanity check them. We compare the results with other runs for the same device when available, or with similar devices if not.
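(The comparison described above could be sketched roughly as follows. This is purely an illustrative Python sketch with a hypothetical 10% tolerance, not PT’s actual procedure:)

```python
def within_tolerance(new_score, reference_scores, tolerance=0.10):
    """Flag a submitted result that deviates more than `tolerance`
    from the median of previously accepted runs for the same (or a
    similar) device. Hypothetical helper for illustration only."""
    if not reference_scores:
        return True  # nothing to compare against yet
    ref = sorted(reference_scores)[len(reference_scores) // 2]  # median
    return abs(new_score - ref) / ref <= tolerance

# e.g. prior accepted runs for the same device
prior = [98.0, 101.0, 99.5]
print(within_tolerance(100.0, prior))  # prints True
print(within_tolerance(150.0, prior))  # prints False
```

A real check would likely weigh sample size and device similarity, but the idea is the same: an outlier prompts a closer look rather than automatic publication.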

The source code for the benchmarks is available, and we encourage experimentation. However, it should go without saying that valid runs must come from the builds of the benchmarks the development community has published. We can’t compare two results generated by different builds.

That being said, if you change the code and get an interesting result, by all means do contact us. You may have discovered something that we’ll want to include in a future version.

Keep the results coming and keep experimenting!

Eric

Comment on this post in the forums

Improvements

We recently made some changes to how MobileXPRT installs. Previously, when you installed MobileXPRT from Google Play, it downloaded the user experience (UX) tests from the Principled Technologies servers. This required you to have the Unknown sources option set, which allows installation of non-market apps, even if you installed MobileXPRT from Google Play.

We have removed this restriction. MobileXPRT now installs the UX tests from Google Play, so installation is cleaner. Note that the UX tests are not intended to be installed outside of MobileXPRT.

An unrelated fix now makes sure that MobileXPRT cleans out all of its content when you uninstall it. This fix required the location of the results file to move. See Submit your MobileXPRT 2013 results! for details.

If you prefer, you can still install MobileXPRT from the PT site. However, when installing from the PT site, you will still need the Unknown sources option set.

We continue looking for ways to improve the benchmarks. If there’s anything you’d like to see us improve, please post to the forums or send an e-mail to benchmarkxprtsupport@principledtechnologies.com.

Eric

Comment on this post in the forums

On to the next thing

Last week, we released MobileXPRT 2013 to the public and published it as a free app on Google Play. On Monday, we will release the source code to the community. It hasn’t been long since we released the source code for MobileXPRT CP 1.1, but it’s an important part of the community model that the source for the current version be available to the community.

While putting the finishing touches on MobileXPRT, we’ve also been hard at work on HDXPRT 2013. The feedback on HDXPRT made it clear that the benchmark should be smaller, faster, and easier to install. We have been working to keep all the value of the benchmark and to update the workloads to reflect current usage, even as we slim it down.

Speaking of HDXPRT, as we mentioned in The show is in previews, HDXPRT 2012 has issues running on Windows 8.1. However, we have had some success getting HDXPRT to run on Windows 8.1 by using beta drivers from Intel, AMD, and NVIDIA. We are still investigating and hope to have a general workaround soon.

There’s lots more stuff in the pipeline. Exciting times ahead!

Eric

Comment on this post in the forums
