
Tag Archives: TensorFlow

Making AIXPRT easier to use

We’re glad to see so much interest in the AIXPRT CP2 build. Over the past few days, we’ve received two questions about the setup process: 1) where to find instructions for setting up AIXPRT on Windows, and 2) whether we could make it easier to install Intel OpenVINO on test systems.

In response to the first question, testers can find the relevant instructions for each framework in the readme files included in the AIXPRT install package. Instructions for Windows installation are in section 3 of the OpenVINO and TensorFlow readmes. Whether you’re running AIXPRT on Ubuntu or Windows, be sure to read the “Known Issues” section of the readme, as it may list issues relevant to your specific configuration.

The readme files for each framework in the CP2 package are located here (a short extraction sketch follows the list):

  • AIXPRT_0.5_CP2\AIXPRT_OpenVINO_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFLow_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
  • AIXPRT_0.5_CP2\AIXPRT_TensorFlow_TensorRT_0.5_CP2.zip\AIXPRT\Modules\Deep-Learning
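If you’d rather pull the readmes out programmatically, here is a minimal Python sketch. It is not part of AIXPRT; the archive names come from the list above, and the output folder ("readmes") is just a placeholder.

```python
import zipfile
from pathlib import Path

# Sub-archives inside the extracted AIXPRT_0.5_CP2 package (names taken from the list above).
PACKAGES = [
    "AIXPRT_OpenVINO_0.5_CP2.zip",
    "AIXPRT_TensorFLow_0.5_CP2.zip",
    "AIXPRT_TensorFlow_TensorRT_0.5_CP2.zip",
]

def extract_readmes(package_dir, dest="readmes"):
    """Pull any readme under AIXPRT/Modules/Deep-Learning out of each sub-archive."""
    Path(dest).mkdir(exist_ok=True)
    for name in PACKAGES:
        archive = Path(package_dir) / name
        if not archive.exists():
            print(f"skipping {name}: not found")
            continue
        with zipfile.ZipFile(archive) as zf:
            for member in zf.namelist():
                normalized = member.replace("\\", "/")
                if "Modules/Deep-Learning" in normalized and "readme" in normalized.lower():
                    zf.extract(member, str(Path(dest) / archive.stem))
                    print(f"extracted {member} from {name}")

if __name__ == "__main__":
    extract_readmes("AIXPRT_0.5_CP2")
```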


We’re also working on consolidating these instructions into a single central document, so that everyone can more easily find what they need.

In response to the question about OpenVINO installation, we’re working on an AIXPRT CP2 package that includes a precompiled version of OpenVINO R5.0.1 for easy installation on Windows via a few quick commands, and a script that installs the necessary OpenVINO dependencies. We’re currently testing the build, and we’ll make it available to testers as soon as possible.
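In the meantime, one quick sanity check after installing OpenVINO is to confirm that its Python bindings can be imported. The sketch below is not part of AIXPRT and assumes you have already run OpenVINO’s environment-setup script so that the bindings are on your Python path.

```python
# Quick check that OpenVINO's Python inference-engine bindings are importable.
# Assumes OpenVINO's environment-setup script has already been run in this shell.
import sys

try:
    import openvino.inference_engine as ie
except ImportError as err:
    sys.exit(f"OpenVINO Python bindings not found: {err}")

print("OpenVINO inference engine loaded from:", ie.__file__)
```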

The tests themselves will not change, so the new build will not influence existing results from Ubuntu or Windows. We hope it will simply make the setup and testing process easier for many users.

We appreciate each bit of feedback that we receive, so if you have any suggestions for AIXPRT, please let us know!

Justin

News on AIXPRT development


We want to hear your thoughts about the AIXPRT development schedule

We released the second AIXPRT Community Preview (CP2) about two weeks ago. The main additions in CP2 were the ability to run certain test configurations in Windows (OpenVINO CPU/GPU and TensorFlow CPU), the option to download the installer package from the AIXPRT tab in the XPRT Members’ Area, and a demo mode.

We’re also investigating ways to support TensorFlow GPU and TensorFlow-TensorRT testing in Windows, and we’d like to eventually add support for TensorRT testing in Ubuntu and Windows. If development and pre-release testing go as planned, we may roll out some of these extra features by the end of June. However, it’s possible that getting all the pieces that we want in place will require a multi-step release process. If so, we’re considering two approaches: (1) issuing a third community preview (CP3) and (2) preparing a general availability (GA) release, to which we would add features over the months following the release. Neither of these paths is likely to affect test results from the currently supported configurations.

Would you like to work with another community preview, or would it be better for us to move straight to a GA release and add features as they become ready? We want to follow the approach that the majority of community members prefer, so please let us know what you think. As always, we also welcome any questions, concerns, or suggestions regarding the AIXPRT development process.

Justin

AIXPRT Community Preview 2 is almost here!

In last week’s blog, we predicted that the second AIXPRT Community Preview (CP2) would be ready for release later this month. Since then, the development process has accelerated, and we now expect to release CP2 as early as tomorrow, May 10.

Those who have access to the existing AIXPRT Community Preview GitHub repository will be able to access CP2 the same way as before. In addition to making the build available on GitHub, we’ll also post CP2 on an AIXPRT tab in the XPRT Members’ Area (login required). If you don’t have a BenchmarkXPRT Development Community membership, please contact us and we’ll help you register.

Testing with AIXPRT CP2 in Ubuntu will be the same as with the first CP, and none of the CP2 changes will affect results. In Windows, testers will be able to use OpenVINO to target a system’s CPU and GPU, and TensorFlow to target CPUs. We’re still investigating ways to support TensorFlow GPU and TensorFlow-TensorRT testing in Windows.
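For anyone curious what “TensorFlow targeting the CPU” means in practice, the sketch below shows one way to keep a session on the CPU even when a GPU is present. It is not AIXPRT code, and it assumes the TensorFlow 1.x API that was current at the time.

```python
import tensorflow as tf  # TensorFlow 1.x API

# Hide any GPUs from this session so all ops run on the CPU.
config = tf.ConfigProto(device_count={"GPU": 0})

with tf.Session(config=config) as sess:
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[5.0], [6.0]])
    print(sess.run(tf.matmul(a, b)))  # executes on the CPU
```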

We’re also continuing to work on the improvements to the AIXPRT results viewer that we mentioned last week. We won’t be able to implement all of the changes by tomorrow, but rather than waiting until we’re finished, we’ll be rolling out improvements as they become ready.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. If you have any questions or comments, please let us know.

Justin

All about the AIXPRT Community Preview

Last week, Bill discussed our plans for the AIXPRT Community Preview (CP). I’m happy to report that, despite some last-minute tweaks and testing, we’re close to being on schedule. We expect to take the CP build live in the coming days, and will send a message to community members to let them know when the build is available in the AIXPRT GitHub repository.

As we mentioned last week, the AIXPRT CP build includes support for the Intel OpenVINO, TensorFlow (CPU and GPU), and TensorFlow with NVIDIA TensorRT toolkits to run image-classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32, FP16, and INT8 levels of precision. Although the minimum CPU and GPU requirements vary by toolkit, the test systems must be running Ubuntu 16.04 LTS. You’ll be able to find more detail on those requirements in the installation instructions that we’ll post on AIXPRT.com.
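For readers who haven’t worked with these networks, the sketch below shows roughly what a single ResNet-50 image-classification inference looks like in TensorFlow’s Keras API. It is only an illustration, not the AIXPRT workload itself: it uses the stock ImageNet weights, and “dog.jpg” is a placeholder image path.

```python
import numpy as np
from tensorflow.keras.applications.resnet50 import (
    ResNet50, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

# Load the stock ImageNet-trained ResNet-50 (FP32 weights).
model = ResNet50(weights="imagenet")

# "dog.jpg" is a placeholder; substitute any image you have on hand.
# (Loading the image requires the Pillow package.)
img = image.load_img("dog.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

# Run one inference and print the top three predicted labels.
preds = model.predict(x)
for _, label, score in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.3f}")
```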

We’re making the AIXPRT CP available to anyone interested in participating, but you must have a GitHub account. To gain access to the CP, please contact us and let us know your GitHub username. Once we receive it, we’ll send you an invitation to join the repository as a collaborator.

We’re allowing folks to quote test results during the CP period, and we’ll publish results from our lab and other members of the community at AIXPRT.com. Because this testing involves so many complex variables, we may contact testers if we see published results that differ significantly from those of comparable systems. During the CP period, we’ll provide detailed instructions on the AIXPRT results page explaining how to send us your results for publication on our site. For each set of results we receive, we’ll disclose all of the detailed test, software, and hardware information that the tester provides. In doing so, our goal is to make it possible for others to reproduce the test and confirm that they get similar numbers.
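As a rough idea of the kind of software and hardware detail that makes results reproducible, a tester might collect something like the following to accompany a submission. This is a generic sketch, not the AIXPRT submission format.

```python
import json
import platform

# Basic system details to report alongside benchmark results.
info = {
    "os": platform.platform(),
    "machine": platform.machine(),
    "processor": platform.processor(),
    "python": platform.python_version(),
}

# Record the framework version if it is installed.
try:
    import tensorflow as tf
    info["tensorflow"] = tf.__version__
except ImportError:
    info["tensorflow"] = "not installed"

print(json.dumps(info, indent=2))
```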

If you make changes to the code during testing, we ask that you email us and describe those changes. We’ll evaluate whether those changes should become part of AIXPRT. We also require that users not publish results from modified versions of the code during the CP period.

We expect the AIXPRT CP period to last about four to six weeks, placing the public release around the end of March or beginning of April. In the meantime, we welcome your thoughts and suggestions about all aspects of the benchmark.

Please let us know if you have any questions. Stay tuned to AIXPRT.com and the blog for more developments, and we look forward to seeing your results!

JNG

Preparing for the AIXPRT Community Preview

Thanks to everyone who downloaded the AIXPRT Request for Comments (RFC) preview build. Next week, we’re planning to publish the AIXPRT Community Preview (CP). The AIXPRT CP build includes support for the Intel OpenVINO, TensorFlow (CPU and GPU), and TensorFlow with NVIDIA TensorRT toolkits to run image-classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32, FP16, and INT8 levels of precision. As with the RFC build, the test systems must be running Ubuntu 16.04 LTS. The minimum CPU and GPU requirements vary according to the toolkit being used, and we will publish more details about the hardware minimums next week.
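As a quick illustration of why the precision level matters: lower-precision weights and activations take less memory (and usually less compute) per element, which is why results are reported separately at FP32, FP16, and INT8. The sketch below only compares storage costs and is unrelated to the AIXPRT harness.

```python
import numpy as np

# Per-element storage cost of the three precision levels AIXPRT reports,
# using a 1000x1000 weight tensor as an example.
shape = (1000, 1000)
for dtype in (np.float32, np.float16, np.int8):
    itemsize = np.dtype(dtype).itemsize
    total_mb = shape[0] * shape[1] * itemsize / 1e6
    print(f"{np.dtype(dtype).name}: {itemsize} bytes/element, "
          f"{total_mb:.1f} MB for a {shape[0]}x{shape[1]} tensor")
```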

As with our other community previews, we think the AIXPRT CP candidate is solid enough to allow folks to start quoting test results. During CP periods, we generally allow members to publish their own results, but wait until the build is available to the public before we post results on our site. Because community feedback is especially important for AIXPRT, we will handle things a bit differently. During the CP period, we’ll publish results that we produce as well as those from other members of the community, which you’ll be able to view at AIXPRT.com.

We’ll also provide detailed instructions for publishing results and sending them to us. Because of the high number of variables in each potential test configuration, we’ll ask testers to disclose more test, software, and hardware information than in the past. We will make this information available along with the results on AIXPRT.com. Our goal is to make it possible for others to reproduce these tests and confirm that they get similar results.

Our CP periods typically last four to six weeks before we make the benchmark available to the general public. If that schedule holds, it would place the public AIXPRT release around the end of March. During the CP period, we welcome your thoughts and suggestions about all aspects of the benchmark.

Also, we normally restrict access to our CPs to BenchmarkXPRT Development Community members. However, because we’re seeking broad input from experts in this field, we’ll gladly make the CP available to anyone interested in participating who has a GitHub account. To gain access, please contact us and let us know your GitHub username. Once we receive it, we’ll send you an invitation to join the repository as a collaborator.

Please let us know if you have any questions. We look forward to hearing your feedback.

Bill
