
Category: Benchmarking

Here’s to 100 more!

This week’s Essential Phone entry marks the 100th device that we’ve featured in the XPRT Weekly Tech Spotlight! It’s a notable milestone for us as we work toward our goal of building a substantial library of device information that buyers can use to compare devices. In celebration, I thought it would be fun to share some Spotlight-related stats.

Our first Spotlight entry was the Google Pixel C way back on February 8, 2016, and we’ve featured a wide array of devices since then:

  • 33 phones
  • 16 laptops
  • 16 tablets
  • 16 2-in-1s
  • 6 small-form-factor PCs
  • 5 desktops
  • 5 game consoles
  • 3 all-in-ones

In addition to a wide variety of device types, we try to include a wide range of vendors. So far, we’ve featured devices from Acer, Alcatel, Alienware, Amazon, Apple, ASUS, BLU, CHUWI, Dell, Essential, Fujitsu, Google, HP, HTC, Huawei, Intel, LeEco, Lenovo, LG, Microsoft, NVIDIA, OnePlus, Razer, Samsung, Sony, Syber, Xiaomi, and ZTE. We look forward to adding many more to that list during the year ahead.

XPRT Spotlight is a great way for device vendors and manufacturers to share PT-verified specs and test results with buyers around the world. If you’re interested in sending in a device for testing, please contact XPRTSpotlight@PrincipledTechnologies.com.

There’s a lot more to come for XPRT Spotlight, and we’re constantly working on new features and improvements for the page. Are there any specific devices or features that you would like to see in the Spotlight? Let us know.

Justin

Machine learning in 2018

We are almost to the end of 2017 and, as you have probably guessed, we will not have a more detailed proposal of our machine learning benchmark ready by the end of the year.

The key aspects of the benchmark proposal we wrote about a few months ago haven’t changed, but we are running behind schedule. We are still hoping to have the proposal ready in Q1 2018 and the tool based on that proposal later in the year. We will keep you posted.

In the meantime, we hope you enjoy the recent CGP Grey videos explaining machine learning as much as we did. There are actually two videos: the first gives a general overview, and the second does a better job of looking at the current state of machine learning. They focus mainly on the training aspects of machine learning rather than the inference aspects we're investigating with AIXPRT/MLXPRT.

From all of us in the BenchmarkXPRT Development Community, we hope you and yours have a wonderful holiday and a great start to 2018!

Bill

The WebXPRT 3 Community Preview is here!

Today we’re releasing the WebXPRT 3 Community Preview (CP). As we discussed in the blog last month, the new version of WebXPRT updates the photo-related workloads with new images and adds a new deep learning task to the Organize Album workload. We also added an optical character recognition task to the Local Notes workload, and combined a portion of the DNA Sequence Analysis scenario with a writing sample/spell check scenario to simulate an online homework hub in the new “Online Homework” workload.

Also, longtime WebXPRT users will immediately notice a completely new, but clean and straightforward, UI. We’re still tweaking aspects of the UI and implementing full functionality for certain features such as social media sharing and German language translation, but we don’t anticipate making any significant changes to the overall test or individual workloads before the general release.

As with all community previews, the WebXPRT 3 CP is available only to BenchmarkXPRT Development Community members, who can access the link from the WebXPRT tab in the Members’ Area.

After you try the WebXPRT 3 CP, please send us your comments. Thanks and happy testing!

Justin

News about WebXPRT and BatteryXPRT

Last month, we gave readers a glimpse of the updates in store for the next WebXPRT, and now we have more news to report on that front.

The new version of WebXPRT will be called WebXPRT 3. WebXPRT 3 will retain the convenient features that made WebXPRT 2013 and WebXPRT 2015 our most popular tools, with more than 200,000 combined runs to date. We’ve added new elements, including AI, to a few of the workloads, but the test will still run in 15 minutes or less in most browsers and produce the same easy-to-understand results that help users compare browsing performance across a wide variety of devices.

We’re also very close to publishing the WebXPRT 3 Community Preview. For those unfamiliar with our open development community model, BenchmarkXPRT Development Community members have the ability to preview and test new benchmark tools before we release them to the general public. Community previews are a great way for members to evaluate new XPRTs and send us feedback. If you’re interested in joining, you can register here.

In BatteryXPRT news, we recently started to see unusual battery life estimates and high variance when running battery life tests at the default length of 5.25 hours. We think this may be due to changes in how new OS versions report battery life on certain devices, but we’re conducting extensive testing to learn more. In the meantime, we recommend that BatteryXPRT users adjust the test run time to allow for a full rundown.

Do you have questions or comments about WebXPRT or BatteryXPRT? Let us know!

Justin

How to submit results for the WebXPRT Processor Comparison Chart

The WebXPRT 2015 Processor Comparison Chart is in its second month, and we’re excited to see that people are browsing the scores. We’re also starting to receive more WebXPRT score submissions for publication, so we thought it would be a good time to describe how that process works.

Unlike sites that publish any results they receive, we hand-select results from internal lab testing, end-of-test user submissions, and reliable tech media sources. In each case, we evaluate whether the score is consistent with general expectations. For sources outside of our lab, that evaluation includes checking whether the submission provides enough detailed system information for us to judge whether the score is plausible. We do this for every score on the WebXPRT results page and the general XPRT results page.

If we decide to publish a WebXPRT result, that score automatically appears in the processor comparison chart as well. If you would like to submit your score, the submission process is quick and easy. At the end of the WebXPRT test run, click the Submit button below the individual workload scores, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT run on my personal system.

[Screenshot: WebXPRT results submission form]

After you submit your score, we’ll contact you to confirm the name we should display as the source for the data. You can use one of the following:

  • Your first and last name
  • “Independent tester,” if you wish to remain anonymous
  • Your company’s name, provided that you have permission to submit the result in its name. If you want to use a company name, we ask that you provide your work email address.

We will not publish any additional information about you or your company without your permission.

We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!

Justin

Nothing to hide

I recently saw an article in ZDNet by my old friend Steven J. Vaughan-Nichols about how NetMarketShare and StatCounter reported a significant jump in the operating system market shares for Linux and Chrome OS. One frustration Vaughan-Nichols alluded to in the article was the lack of transparency into how these firms calculated market share, which left him unable to gauge how reliable the figures were. Because neither NetMarketShare nor StatCounter disclosed their methods, there’s no sure way for interested observers to verify the numbers. Steven prefers the data from the federal government’s Digital Analytics Program (DAP). DAP makes its data freely available, so you can run your own calculations. Transparency generates trust.

Transparency is a core value for the XPRTs. We’ve written before about how statistics can be misleading. That’s why we’ve always disclosed exactly how the XPRTs calculate performance results and how BatteryXPRT calculates battery life. It’s also why we make each XPRT’s source code available to community members. We want to be open and honest about how we do things, and our open development community model fosters the kind of constructive feedback that helps to continually improve the XPRTs.

We’d love for you to be a part of that process, so if you have questions or suggestions for improvement, let us know. If you’d like to gain access to XPRT source code and previews of upcoming benchmarks, today is a great day to join the community!

Eric
