

Making AIXPRT easier to use

We’re glad to see so much interest in the AIXPRT CP2 build. Over the past few days, we’ve received two questions about the setup process: 1) where to find instructions for setting up AIXPRT on Windows, and 2) whether we could make it easier to install Intel OpenVINO on test systems.

In response to the first question, testers can find the relevant instructions for each framework in the readme files included in the AIXPRT install package. Instructions for Windows installation are in section 3 of the OpenVINO and TensorFlow readmes. Whether you’re running AIXPRT on Ubuntu or Windows, be sure to read the “Known Issues” section of the readme, as it may list issues relevant to your specific configuration.

The readme files for each respective framework in the CP2 package are located here:

  • AIXPRT_0.5_CP2\AIXPRT_OpenVINO_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_TensorRT_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
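
If you’d rather script the unpacking step, here’s a minimal Python sketch that extracts the OpenVINO package and lists the readme files. The archive and folder names come from the paths above; the download location is an assumption, so adjust it to wherever you saved the package.

    import zipfile
    from pathlib import Path

    # Assumed download location of the CP2 package; adjust to match your system.
    package_dir = Path(r"C:\AIXPRT_0.5_CP2")
    archive = package_dir / "AIXPRT_OpenVINO_0.5_CP2.zip"

    # Extract the framework package next to the archive.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(package_dir)

    # The readmes live under AIXPRT\Modules\Deep-Learning in the extracted tree.
    readme_dir = package_dir / "AIXPRT" / "Modules" / "Deep-Learning"
    for item in readme_dir.iterdir():
        if "readme" in item.name.lower():
            print(item)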


We’re also working on consolidating all of these instructions into a single, central document that will make it easier for everyone to find what they need.

In response to the question about OpenVINO installation, we’re working on an AIXPRT CP2 package that includes a precompiled version of OpenVINO R5.0.1, which testers will be able to install on Windows with a few quick commands, along with a script that installs the necessary OpenVINO dependencies. We’re currently testing the build, and we’ll make it available to testers as soon as possible.

The tests themselves will not change, so the new build will not influence existing results from Ubuntu or Windows. We hope it will simply facilitate the setup and testing process for many users.

We appreciate each bit of feedback that we receive, so if you have any suggestions for AIXPRT, please let us know!

Justin


AIXPRT Community Preview 2 is almost here!

In last week’s blog, we predicted that the second AIXPRT Community Preview (CP2) would be ready for release later this month. Since then, the development process has accelerated, and we now expect to release CP2 as early as tomorrow, May 10.

Those who have access to the existing AIXPRT Community Preview GitHub repository will be able to download CP2 the same way as before. In addition to making the build available on GitHub, we’ll also post CP2 on an AIXPRT tab in the XPRT Members’ Area (login required). If you don’t have a BenchmarkXPRT Development Community membership, please contact us and we’ll help you register.

Testing with AIXPRT CP2 in Ubuntu will be the same as with the first CP, and none of the CP2 changes will affect results. In Windows, testers will be able to use OpenVINO to target a system’s CPU and GPU, and TensorFlow to target CPUs. We’re still investigating ways to support TensorFlow GPU and TensorFlow-TensorRT testing in Windows.
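
For readers who want a concrete picture of what switching targets might involve, here’s a rough Python sketch of editing an AIXPRT-style JSON configuration to point OpenVINO workloads at the GPU. The file path and the "framework" and "hardware" field names are assumptions for illustration only, not the actual CP2 schema; the readme for each framework documents the real configuration files.

    import json
    from pathlib import Path

    # Hypothetical config path and field names -- the real CP2 schema may differ,
    # so check the framework readme for the actual configuration files.
    config_path = Path("AIXPRT/Config/default_config.json")
    config = json.loads(config_path.read_text())

    # Point the OpenVINO workloads at the GPU instead of the CPU
    # (on Windows, OpenVINO can target either).
    for workload in config.get("workloads", []):
        if workload.get("framework") == "OpenVINO":
            workload["hardware"] = "gpu"  # "cpu" is the other supported target

    config_path.write_text(json.dumps(config, indent=4))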

We’re also continuing to work on the improvements to the AIXPRT results viewer that we mentioned last week. We won’t be able to implement all of the changes by tomorrow, but rather than waiting until we’re finished, we’ll be rolling out improvements as they become ready.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. If you have any questions or comments, please let us know.

Justin

An update on AIXPRT development

It’s been almost two months since the AIXPRT Community Preview went live, and we want to provide folks with a quick update. Community Preview periods for the XPRTs generally last about a month. Because of the complexity of AIXPRT and some of the feedback we’ve received, we plan to release a second AIXPRT Community Preview (CP2) later this month.

One of the biggest additions in CP2 will be the ability to run AIXPRT on Windows. AIXPRT currently requires test systems to run Ubuntu 16.04 LTS. This is fine for testers accustomed to Linux environments, but presents obstacles for those who want to test in a traditional Windows environment. We will not be changing the tests themselves, so this update will not influence existing results from Ubuntu. We plan to make CP2 available for download from the BenchmarkXPRT website for people who don’t wish to deal with GitHub.

Also, after speaking with testers and learning more about the kinds of data points people are looking for in AIXPRT results, we’ve decided to make significant adjustments to the AIXPRT results viewer. To make it easier for visitors to find what they’re looking for, we’ll add filters for key categories such as batch size, toolkit, and latency percentile (e.g., 50th, 90th, 99th), among others. We’ll also allow users to set desired ranges for metrics such as throughput and latency.
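
As a rough illustration of the kind of filtering we have in mind (not the viewer’s actual implementation), here’s a short Python sketch that narrows a set of results by toolkit, batch size, and latency percentile, and applies a throughput floor. The field names and sample rows are placeholders, not published results.

    # Illustrative only: placeholder field names and made-up sample rows,
    # not the results viewer's actual data model or any published scores.
    results = [
        {"toolkit": "OpenVINO",   "batch_size": 1, "percentile": 99, "throughput": 180.0, "latency_ms": 6.1},
        {"toolkit": "TensorFlow", "batch_size": 8, "percentile": 90, "throughput": 220.0, "latency_ms": 38.5},
        {"toolkit": "OpenVINO",   "batch_size": 1, "percentile": 50, "throughput": 195.0, "latency_ms": 5.2},
    ]

    def filter_results(rows, toolkit=None, batch_size=None, percentile=None,
                       min_throughput=None, max_latency_ms=None):
        """Keep only the rows that match the requested categories and ranges."""
        for row in rows:
            if toolkit is not None and row["toolkit"] != toolkit:
                continue
            if batch_size is not None and row["batch_size"] != batch_size:
                continue
            if percentile is not None and row["percentile"] != percentile:
                continue
            if min_throughput is not None and row["throughput"] < min_throughput:
                continue
            if max_latency_ms is not None and row["latency_ms"] > max_latency_ms:
                continue
            yield row

    # Example: OpenVINO results at batch size 1 with throughput of at least 150.
    for row in filter_results(results, toolkit="OpenVINO", batch_size=1, min_throughput=150.0):
        print(row)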

Finally, we’re adding a demo mode that displays images and other information on screen while a test is running, to give users a better idea of what is happening. Although we haven’t seen results change when running in demo mode, users should not publish demo results or use them for comparison.

We hope to release CP2 in the second half of May and a GA version in mid-June. However, this project has more uncertainties than we usually encounter with the XPRTs, so that timeline could easily change.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. As always, we appreciate your suggestions. If you have any questions or comments about AIXPRT, please let us know.

Bill

Answering questions about the AIXPRT Community Preview

Over the last two weeks, we’ve received a few questions about the AIXPRT Community Preview. Specifically, community members have asked about the project’s focus, possible future steps, and the results table. We decided to answer each of these here in the blog, since others are likely to have the same questions. We encourage folks to submit any new questions they may have.

PT previously stated that AIXPRT would be focused on edge devices. The current published results are from desktops and laptops. Is the focus of AIXPRT changing?

In the past, we did say that the focus of AIXPRT would be edge inference devices. After much feedback, we’ve come to understand that this focus is probably too restrictive. PCs and laptops are running machine learning inference, and a significant amount of inference takes place on servers in the cloud until phones become capable enough to handle those workloads. We now see all of these devices as potential targets for AIXPRT.

How did you choose the current results in your database?

We ran the AIXPRT CP on some of the systems we used during development and testing. We will continue to publish additional results as we test available systems in our lab. We’d love to get results from the community that cover a wider base of devices.

Will you be publishing results from servers?

We welcome server results submissions from the community, and will review them for publication on our site.

Will AIXPRT ever be available for Windows systems?

This is a possibility we’re actively exploring, and we hope to be able to share more about it soon.

What’s the best way to navigate the results table?

AIXPRT can run three toolkits, utilize two networks, and target CPU or GPU hardware. Together, these configuration options produce a lot of data points. To make it easier to handle all these variables, we’re working to improve the navigation, sorting, and filtering capabilities of the results table. In the meantime, a few tips:

  • There are two tabs at the top of the table, one for the ResNet-50 network and one for the SSD-MobileNet network. You can click the tabs to move between results for these networks.
  • Clicking any of the column headers will sort the data in that column A-Z (with the first click) or Z-A (with a second click).
  • To see whether an individual test targeted a system’s CPU or GPU, read the description in the Summary column, e.g., Intel Core i7-7600U GPU / OpenVINO.
  • Clicking the entry in the Source column will take you to a more detailed page listing additional test configuration and system hardware information.


We’ll continue to share more information about AIXPRT in the coming weeks. Do you have additional questions or comments about AIXPRT? Let us know.

Justin

More, faster, better: The future according to Mobile World Congress 2019

More is more data, which the trillions of devices in the coming Internet of Things will be pumping through our air into our (computing) clouds in hitherto unseen quantities.

Faster is the speed at which tomorrow’s 5G networks will carry this data—and the responses and actions from our automated assistants (and possibly overlords).

Better is the quality of the data analysis and recommendations, thanks primarily to the vast army of AI-powered analytics engines that will be poring over everything digital the planet has to say.

Swimming through this perpetual data tsunami will be we humans and our many devices, our laptops and tablets and smartphones and smart watches and, ultimately, implants. If we are to believe the promise of this year’s Mobile World Congress in Barcelona—and of course I do want to believe it, who wouldn’t?—the result of all of this will be a better world for all humanity, no person left behind. As I walked the show floor, I could not help but feel and want to embrace its optimism.

The catch, of course, is that we have a tremendous amount of work to do between where we are today and this fabulous future.

We must, for example, make sure that every computing node that will contribute to these powerful AI programs is up to the task. From the smartphone to the datacenter, AI will end up being a very distributed and very demanding workload. That’s one of the reasons we’ve been developing AIXPRT. Without tools that let us accurately compare different devices, the industry won’t be able to keep delivering the levels of performance improvements that we need to realize these dreams.

We must also think a lot about how to accurately measure all other aspects of our devices’ performance, because the demands this future will place on them are going to be significant. Fortunately, the always evolving XPRT family of tools is up to the task.

The coming 5G revolution, like all tech leaps forward before it, will not come evenly. Different 5G devices will end up behaving differently, some better and some worse. That fact, plus our constant and growing reliance on bandwidth, suggests that maybe the XPRT community should turn its attention to the task of measuring bandwidth. What do you think?

One thing is certain: we at the BenchmarkXPRT Development Community have a role to play in building the tools necessary to test the tech the world will need to deliver on the promise of this exciting trade show. We look forward to that work.
