
Category: Benchmark metrics

TouchXPRT: a great tool for evaluating Windows performance

From time to time, we remember that some XPRT users have experience with only one or two of the benchmark tools in our portfolio. They might have bookmarked a link to WebXPRT they found in a tech review or copied the HDXPRT installer package from a flash drive in their lab, but are unaware of other members of the XPRT family that could be useful to them. To spread the word on the range of capabilities the XPRTs offer, we occasionally highlight one of the XPRT tools in the blog. Last week, we discussed CrXPRT, a benchmark for evaluating the performance and battery life of Chrome OS devices. Today, we focus on TouchXPRT, our app for evaluating the performance of Windows 10 devices.

While our first benchmark, HDXPRT, is a great tool for assessing how well Windows machines handle media creation tasks using real commercial applications, it’s simply too large for most Windows tablets, 2-in-1s, and laptops with limited memory. To test those devices, we developed the latest version of TouchXPRT as a Universal Windows Platform (UWP) app. As a Windows app, TouchXPRT is quick and easy to install. It runs five tests that simulate common photo, video, and music editing tasks; measures how quickly the device completes each of those tasks; and provides an overall score. A full run takes about 15 minutes on most devices. Labs can also automate testing using the command line or a script.

Want to run TouchXPRT?

Download TouchXPRT from the Microsoft Store or from TouchXPRT.com. The TouchXPRT 2016 release notes provide step-by-step instructions. To compare device scores, go to the TouchXPRT 2016 results page, where you’ll find scores from many Windows 10 devices.

Want to dig into the details?

Check out the Exploring TouchXPRT 2016 white paper. In it, we discuss the TouchXPRT development process, its component tests and workloads, and how it calculates individual workload and overall scores. We also provide instructions for automated testing.

BenchmarkXPRT Development Community members also have access to the TouchXPRT source code, so consider joining the community today. There’s no obligation, and membership is free for anyone at a company or organization with an interest in benchmarks.

If you’ve been looking for a Windows performance evaluation tool that’s easy to use and has the flexibility of a UWP app, give TouchXPRT a try and let us know what you think!

Justin

All about the AIXPRT Community Preview

Last week, Bill discussed our plans for the AIXPRT Community Preview (CP). I’m happy to report that, despite some last-minute tweaks and testing, we’re close to being on schedule. We expect to take the CP build live in the coming days, and will send a message to community members to let them know when the build is available in the AIXPRT GitHub repository.

As we mentioned last week, the AIXPRT CP build includes support for the Intel OpenVINO, TensorFlow (CPU and GPU), and TensorFlow with NVIDIA TensorRT toolkits to run image-classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32, FP16, and INT8 levels of precision. Although the minimum CPU and GPU requirements vary by toolkit, the test systems must be running Ubuntu 16.04 LTS. You’ll be able to find more detail on those requirements in the installation instructions that we’ll post on AIXPRT.com.

We’re making the AIXPRT CP available to anyone interested in participating, but you must have a GitHub account. To gain access to the CP, please contact us and let us know your GitHub username. Once we receive it, we’ll send you an invitation to join the repository as a collaborator.

We’re allowing folks to quote test results during the CP period, and we’ll publish results from our lab and other members of the community at AIXPRT.com. Because this testing involves so many complex variables, we may contact testers if we see published results that seem to be significantly different than those from comparable systems. During the CP period, we’ll provide detailed instructions on the AIXPRT results page on how to send in your results for publication on our site. For each set of results we receive, we’ll disclose all of the detailed test, software, and hardware information that the tester provides. In doing so, our goal is to make it possible for others to reproduce the test and confirm that they get similar numbers.

If you make changes to the code during testing, we ask that you email us and describe those changes. We’ll evaluate whether those changes should become part of AIXPRT. We also require that users not publish results from modified versions of the code during the CP period.

We expect the AIXPRT CP period to last about four to six weeks, placing the public release around the end of March or beginning of April. In the meantime, we welcome your thoughts and suggestions about all aspects of the benchmark.

Please let us know if you have any questions. Stay tuned to AIXPRT.com and the blog for more developments, and we look forward to seeing your results!

JNG

Out with the old, and in with the new

What we now know as the BenchmarkXPRT Development Community started many years ago as the HDXPRT Development Community forum. At the time, the community was much smaller, and HDXPRT was our only benchmark. When a member wanted to run the benchmark, they submitted a request, and then received an installation DVD in the mail.

With hundreds of members, more than a half dozen active benchmarks, and the online availability of all our tools, the current community is a much different organization. Instead of the original forum, most of our interaction with members takes place through the blog, the monthly newsletter, direct email, and our social media accounts. Because of the way the community has changed, and because the original forum is no longer very active, we believe that the time and resources that we devote to maintaining the forum could be better spent on building and maintaining other community assets. To that end, we’ve decided to end support for the original BenchmarkXPRT forum.

As always, community members’ voices are an important consideration in what we do. If you have any questions or concerns about the decision to close down the original forum, please let us know as soon as possible.

On another note, we want to thank the community members who’ve participated in the HDXPRT 4 Community Preview. Testing has gone well, and we’re planning to release HDXPRT 4 to the public towards the end of next week!

Justin

Engaging AI

In December, we wrote about our recent collaboration with students from North Carolina State University’s Department of Computer Science. We challenged the students to create a software console that includes an intuitive user interface, computes a performance metric, and uploads results to our database. The specific objective was to make it easy for testers to configure and run an implementation of the TensorFlow framework. In general, we hoped that the end product would model some of the same basic functions we plan to implement with AIXPRT, our machine-learning performance evaluation tool, currently under development.

The students did an outstanding job, and we hope to incorporate some of their work into AIXPRT in the future. We’ve been calling the overall project “Engaging AI” because it produced a functional tool that can help users interact with TensorFlow, and it was the first time that the students had an opportunity to work with AI tools. You can read more details on the Engaging AI page. We also have a new video that describes the project, including the new skillsets the students acquired to achieve success.


Finally, interested BenchmarkXPRT Development Community members can access the project’s source code and additional documentation on our XPRT Experiments page. We hope you’ll check it out!

Justin

Principled Technologies and the BenchmarkXPRT Development Community release MobileXPRT 3, a free performance evaluation app for Android devices

Durham, NC, February 1— Principled Technologies and the BenchmarkXPRT Development Community have released MobileXPRT 3, a free app that gives objective information about how well a tablet, smartphone, or any other Android device handles common tasks. Anyone can go to MobileXPRT.com to compare existing performance results on a variety of devices, and to download the app for themselves. MobileXPRT 3 is also available in the Google Play Store.

MobileXPRT 3 is a benchmark that evaluates the capabilities of Android devices by running six performance scenarios (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, Detect Faces to Organize Photos, and Scan Receipts for Spreadsheet). It also provides an overall measure by generating a single performance score. “MobileXPRT is a popular, easy-to-use benchmark run by manufacturers, tech journalists, and consumers all around the world,” said Bill Catchings, co-founder of Principled Technologies, which administers the BenchmarkXPRT Development Community. “We believe that MobileXPRT 3 is a great addition to MobileXPRT’s legacy of providing relevant and reliable performance data for Android devices.”

MobileXPRT is part of the BenchmarkXPRT suite of performance evaluation tools, which includes WebXPRT, TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. The XPRTs help users get the facts before they buy, use, or evaluate tech products such as computers, tablets, and phones.

To learn more about the BenchmarkXPRT Development Community, go to www.BenchmarkXPRT.com.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing and learning & development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, USA. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Justin Greene

BenchmarkXPRT Development Community
Principled Technologies, Inc.
1007 Slater Road, Ste. 300
Durham, NC 27703

BenchmarkXPRTsupport@PrincipledTechnologies.com

An update on the AIXPRT Request for Comments preview

As we approach the end of the original feedback window for the AIXPRT Request for Comments preview build, we want to update folks on the status of the project and what to expect in the coming weeks.

First, thanks to those who’ve downloaded the AIXPRT OpenVINO package and sent in their questions and comments. We value your feedback, and it’s instrumental in making AIXPRT a better tool. We’re currently working through some issues with the TensorFlow and TensorRT packages, and hope to add support for those to the RFC preview build repository very soon.

We’re also hoping to have a full-fledged community preview (CP) ready in mid to late February. Like our other community previews, the AIXPRT CP would be solid enough to allow folks to start quoting numbers. We typically make our benchmarks available to the general public four to six weeks after the community preview period begins, so if that schedule holds, it would place the public AIXPRT release around the end of March.

In light of the schedule described above, you still have time to gain access to the AIXPRT RFC preview build and give your feedback, so let us know if you’d like to check it out. The installation and testing process can take less than an hour, but getting everything properly set up can take a few tries. We are hard at work trying to make that process more straightforward. We welcome your input on all aspects of the benchmark, including workloads, ease of use, metrics, scores, and reporting.

Thanks for your help!

Justin
