
Making AIXPRT easier to use

We’re glad to see so much interest in the AIXPRT CP2 build. Over the past few days, we’ve received two questions about the setup process: 1) where to find instructions for setting up AIXPRT on Windows, and 2) whether we could make it easier to install Intel OpenVINO on test systems.

In response to the first question, testers can find the relevant instructions for each framework in the readme files included in the AIXPRT install package. Instructions for Windows installation are in section 3 of the OpenVINO and TensorFlow readmes. Whether you’re running AIXPRT on Ubuntu or Windows, be sure to read the “Known Issues” section of the readme, as it may cover issues relevant to your specific configuration.

The readme files for each respective framework in the CP2 package are located here:

  • AIXPRT_0.5_CP2\AIXPRT_OpenVINO_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_TensorRT_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning

We’re also working on consolidating the instructions into a single central document that will make it easier for everyone to find what they need.

In response to the question about OpenVINO installation, we’re working on an AIXPRT CP2 package that includes a precompiled version of OpenVINO R5.0.1 for easy installation on Windows via a few quick commands, and a script that installs the necessary OpenVINO dependencies. We’re currently testing the build, and we’ll make it available to testers as soon as possible.
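
Once the precompiled package is in place, a quick way to confirm that the OpenVINO Python bindings are reachable is a check like the one below. This is a minimal sketch, not part of the AIXPRT package: it assumes the IECore API from 2019-and-later OpenVINO releases, while the R5-era API exposes IEPlugin/IENetwork instead, so adjust the import for your version.

    # Quick sanity check that OpenVINO's Python bindings are on the path.
    # Note: IECore ships with OpenVINO 2019 R1 and later; R5-era releases
    # expose IEPlugin/IENetwork instead, so adjust imports to your release.
    try:
        from openvino.inference_engine import IECore
    except ImportError as err:
        raise SystemExit(f"OpenVINO Python bindings not found: {err}")

    ie = IECore()
    print("Inference devices detected:", ie.available_devices)  # e.g., ['CPU', 'GPU']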

The tests themselves will not change, so the new build will not affect existing results from Ubuntu or Windows. We hope it will simply streamline the setup and testing process for many users.

We appreciate each bit of feedback that we receive, so if you have any suggestions for AIXPRT, please let us know!

Justin

Check out the new XPRTs around the world infographic!

If you’ve followed the XPRT blog for a while, you know that we occasionally update the community on some of the reach metrics we track by publishing a new version of the “XPRTs around the world” infographic. The metrics we track include completed test runs, benchmark downloads, and mentions of the XPRTs in advertisements, articles, and tech reviews. Gathering this information gives us insight into how many people are using the XPRT tools, and updating the infographic helps readers and community members see the impact the XPRTs are having around the world.

This week, we published a new infographic, which includes the following highlights:

  • The XPRTs have been mentioned more than 13,900 times on over 4,000 unique sites.
  • Those mentions include more than 10,300 articles and reviews.
  • Those mentions originated in more than 629 cities located in 67 countries on six continents. New cities of note include Bangalore, India; Donetsk, Ukraine; Lima, Peru; and Santiago, Chile.
  • The BenchmarkXPRT Development Community now includes 230 members from 76 companies and organizations around the world.

In addition to the growth in web mentions and community members, the XPRTs have now delivered more than 520,000 real-world results! We’re grateful to everyone who’s helped us get this far. Your participation is vital to achieving our goal: providing benchmark tools that are reliable, relevant, and easy to use.

Justin

Improvements to the AIXPRT results table

Over the last few weeks, we’ve gotten great feedback about the kinds of data points people are looking for in AIXPRT results, as well as suggestions for how to improve the AIXPRT results viewer. To make it easier for visitors to find what they’re looking for, we’ve made a number of changes:

  • You can now filter results by categories such as framework, target hardware, batch size, and precision, and you can designate minimum throughput and maximum latency scores. When you select a value from a drop-down menu or enter text, the results update immediately to reflect the filter. (A sketch of the same filtering logic appears after this list.)
  • You can search for variables such as processor vendor or processor speed.
  • The viewer displays eight results per page by default and lets you change this to 16, 48, or Show all.
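
For readers who download results data and want to slice it offline, the viewer’s filters are easy to reproduce. Here’s a minimal sketch using pandas; the file name and column names (Framework, Target, BatchSize, Precision, Throughput, Latency) are illustrative assumptions, not the viewer’s actual export format.

    import pandas as pd

    # Hypothetical export of the AIXPRT results table; the file name and
    # column names below are illustrative, not part of the actual viewer.
    results = pd.read_csv("results.csv")

    # Mirror the viewer's drop-down filters plus the throughput/latency limits.
    filtered = results[
        (results["Framework"] == "OpenVINO")
        & (results["Target"] == "CPU")
        & (results["BatchSize"] == 1)
        & (results["Precision"] == "fp32")
        & (results["Throughput"] >= 100.0)  # minimum throughput
        & (results["Latency"] <= 50.0)      # maximum latency
    ]

    # Sorting works like the column headers: ascending first, descending second.
    print(filtered.sort_values("Throughput", ascending=False).head(8))  # default page size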

The following viewer features, which were already present, can help you navigate more efficiently:

  • Click the tabs at the top of the table to switch from ResNet-50 network results to SSD-MobileNet network results.
  • Click the header of any column to sort the data on that variable. One click sorts A to Z; a second click sorts Z to A.
  • Click the link in the Source column to visit a detailed page on that result. The page contains additional test configuration and system hardware information and lets you download results files.

We hope these changes will improve the utility of the results table. We’ll continue to add features to improve the experience. If you have any suggestions, please let us know!

Justin

We want to hear your thoughts about the AIXPRT development schedule

We released the second AIXPRT Community Preview (CP2) about two weeks ago. The main additions in CP2 were the ability to run certain test configurations in Windows (OpenVINO CPU/GPU and TensorFlow CPU), the option to download the installer package from the AIXPRT tab in the XPRT Members’ Area, and a demo mode.

We’re also investigating ways to support TensorFlow GPU and TensorFlow-TensorRT testing in Windows, and we’d like to eventually add support for TensorRT testing in Ubuntu and Windows. If development and pre-release testing go as planned, we may roll out some of these extra features by the end of June. However, it’s possible that getting all the pieces that we want in place will require a multi-step release process. If so, we’re considering two approaches: (1) issuing a third community preview (CP3) and (2) preparing a general availability (GA) release, to which we would add features over the months following the release. Neither of these paths is likely to affect test results from the currently supported configurations.

Would you like to work with another community preview, or would it be better for us to move straight to a GA release and add features as they become ready? We want to follow the approach that the majority of community members prefer, so please let us know what you think. As always, we also welcome any questions, concerns, or suggestions regarding the AIXPRT development process.

Justin

Transparent goals

Recently, Forbes published an article discussing a new report on phone battery life from Which?, a UK consumer advocacy group. In the report, Which? states that they tested the talk time battery life of 50 phones from five brands. During the tests, phones from three of the brands lasted longer than the manufacturers’ claims, while phones from another brand underperformed by about five percent. The fifth brand’s published battery life numbers were 18 to 51 percent higher than Which? recorded in their tests.
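
To give a concrete sense of the largest gap, here is a quick worked example with made-up figures (the report’s per-phone numbers aren’t reproduced here): a claim that is 51 percent higher than the measured value implies the measured time is the claim divided by 1.51.

    # Hypothetical illustration of a battery-life claim 51% higher than measured.
    # The hours below are made-up figures, not from the Which? report.
    claimed_hours = 24.0
    overstatement = 0.51                       # claim is 51% above the measured value
    measured_hours = claimed_hours / (1 + overstatement)
    print(f"Implied measured talk time: {measured_hours:.1f} hours")  # ~15.9 hours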

Folks can read the article for more details about the tests and the brands. While the report raises some interesting questions, and the article provides readers with brief test methodology descriptions from Which? and one manufacturer, we don’t know enough about the tests to say which set of claims is correct. Any number of variables related to test workloads or device configuration settings could significantly affect the results. Both parties may be using sound benchmarking principles in good faith, but their test methodologies may not be comparable. As it is, we simply don’t have enough information to evaluate the study.

Whether the issue is battery life or any other important device spec, information conflicts, such as the one that the Forbes article highlights, can leave consumers scratching their heads, trying to decide which sources are worth listening to. At the XPRTs, we believe that the best remedy for this type of problem is to provide complete transparency into our testing methodologies and development process. That’s why our lab techs verify all the hardware specs for each XPRT Weekly Tech Spotlight entry. It’s why we publish white papers explaining the structure of our benchmarks in detail, as well as how the XPRTs calculate performance results. It’s also why we employ an open development community model and make each XPRT’s source code available to community members. When we’re open about how we do things, it encourages the kind of honest dialogue between vendors, journalists, consumers, and community members that serves everyone’s best interests.

If you love tech and share that same commitment to transparency, we’d love for you to join our community, where you can access XPRT source code and previews of upcoming benchmarks. Membership is free for anyone with a verifiable corporate affiliation. If you have any questions about membership or the registration process, please feel free to ask.

Justin
