
Category: AIXPRT

Engaging AI

In December, we wrote about our recent collaboration with students from North Carolina State University’s Department of Computer Science. We challenged the students to create a software console that includes an intuitive user interface, computes a performance metric, and uploads results to our database. The specific objective was to make it easy for testers to configure and run an implementation of the TensorFlow framework. In general, we hoped that the end product would model some of the same basic functions we plan to implement with AIXPRT, our machine-learning performance evaluation tool, currently under development.

The students did an outstanding job, and we hope to incorporate some of their work into AIXPRT in the future. We’ve been calling the overall project “Engaging AI” because it produced a functional tool that can help users interact with TensorFlow, and it was the first time that the students had an opportunity to work with AI tools. You can read more details on the Engaging AI page. We also have a new video that describes the project, including the new skillsets our students acquired to achieve success.


Finally, interested BenchmarkXPRT Development Community members can access the project’s source code and additional documentation on our XPRT Experiments page. We hope you’ll check it out!

Justin

An update on the AIXPRT Request for Comments preview

As we approach the end of the original feedback window for the AIXPRT Request for Comments preview build, we want to update folks on the status of the project and what to expect in the coming weeks.

First, thanks to those who’ve downloaded the AIXPRT OpenVINO package and sent in their questions and comments. We value your feedback, and it’s instrumental in making AIXPRT a better tool. We’re currently working through some issues with the TensorFlow and TensorRT packages, and hope to add support for those to the RFC preview build repository very soon.

We’re also hoping to have a full-fledged community preview (CP) ready in mid to late February. Like our other community previews, the AIXPRT CP would be solid enough to allow folks to start quoting numbers. We typically make our benchmarks available to the general public four to six weeks after the community preview period begins, so if that schedule holds, it would place the public AIXPRT release around the end of March.

In light of the schedule described above, you still have time to gain access to the AIXPRT RFC preview build and give your feedback, so let us know if you’d like to check it out. The installation and testing process can take less than an hour, but getting everything properly set up can take a few tries. We are hard at work trying to make that process more straightforward. We welcome your input on all aspects of the benchmark, including workloads, ease of use, metrics, scores, and reporting.

Thanks for your help!

Justin

New XPRTs for the new year

Happy 2019! January is already a busy time for the XPRTs, so we want to share a quick preview of what community members can expect in the coming months.

The MobileXPRT 3 community preview (CP) is still open, but draws to a close on January 18th. If you are not familiar with the updates and changes we implemented in the newest version of MobileXPRT, you can read more in the blog. Members can access the CP APK on the MobileXPRT tab in the Members’ Area. We also posted an installation guide that provides both a general overview of the app and detailed instructions for each step. The entire process takes about five minutes on most devices. If you haven’t already, give it a try!

We also recently published the first AIXPRT Request for Comments (RFC) preview build, an early version of one of the tools we’re developing to evaluate machine learning performance. You can find more details in Bill’s most recent blog post and on AIXPRT.com. Only BenchmarkXPRT Development Community members have access to our RFCs and the opportunity to provide feedback. However, because we’re seeking broad input from experts in this field, we’ll gladly make anyone interested in participating a member. To gain access to the AIXPRT repository, please send us a request.

Work on the HDXPRT 4 CP candidate build continues, and we hope to publish the preview for community members this month. We appreciate everyone’s patience as we work to get this right. We think it will be worth the wait.

On a general note, I’ll be traveling to CES 2019 in Las Vegas next week. CES is a great opportunity for us to survey emerging tech and industry trends, and I look forward to sharing my thoughts from the show. If you’ll be there and would like to discuss any aspect of the XPRTs in person, let me know.

Justin

The AIXPRT Request for Comments preview build

In the next few days, we’ll be publishing the first AIXPRT tool as a Request for Comments (RFC) preview build, an early version of one of the AIXPRT tools we’re developing to help evaluate machine learning performance.

We’re inviting folks to run the workload and send in their thoughts and suggestions. Only BenchmarkXPRT Development Community members have access to our RFCs and the opportunity to provide feedback. However, because we’re seeking broad input from experts in this field, we’ll gladly make anyone interested in participating a member.

This AIXPRT RFC preview build includes support for the Intel OpenVINO computer vision toolkit to run image classification workloads with ResNet-50 and SSD-MobileNet v1 networks. The test reports FP32 and FP16 levels of precision. The system requirements are:

  • Operating system: Ubuntu 16.04
  • CPU: 6th through 8th generation Intel Core or Xeon processors, or Intel Pentium N4200/5, N3350/5, or N3450/5 processors with Intel HD Graphics
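To give a rough sense of the kind of performance metric an image-classification benchmark like this reports, here is a minimal throughput-timing sketch in Python. Everything here is hypothetical and for illustration only: `measure_throughput` and the stand-in `run_inference` callable are not part of the AIXPRT harness, which actually drives OpenVINO inference on real networks.

```python
import time

def measure_throughput(run_inference, num_images, warmup=5, iterations=20):
    """Return images per second for repeated calls to run_inference().

    run_inference stands in for one batched inference pass over
    num_images images; this helper is illustrative, not AIXPRT code.
    """
    for _ in range(warmup):
        run_inference()            # warm-up passes, excluded from timing
    start = time.perf_counter()
    for _ in range(iterations):
        run_inference()
    elapsed = time.perf_counter() - start
    return (iterations * num_images) / elapsed

# Stand-in workload so the sketch runs without any ML framework installed:
fps = measure_throughput(lambda: sum(i * i for i in range(100_000)), num_images=32)
print(f"{fps:.1f} images/sec")
```

A real harness would replace the lambda with a call into the inference engine and would typically report separate numbers for each precision level (FP32 and FP16).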


We welcome input on all aspects of the benchmark, including scope, workloads, metrics and scores, user experience, and reporting. We will add support for TensorFlow and TensorRT to the AIXPRT RFC preview build during the preview period. We are accepting feedback through January 25th, 2019, after which we’ll collect and evaluate responses before publishing the next build. Because this is an RFC release, we ask that testers do not publish scores or use the results for comparison purposes.

We’ll send out a community announcement when the RFC preview build is officially available, and we’ll also post an announcement and RFC preview build user guide on AIXPRT.com. We’re hosting the AIXPRT RFC preview build in a dedicated GitHub repository, so please contact us at BenchmarkXPRTsupport@principledtechnologies.com to gain access.

This is just the next step for AIXPRT. With your help, we hope to add more workloads and other frameworks in the coming months. We look forward to receiving your feedback!

Bill

XPRT collaborations: North Carolina State University

For those of us who work on the BenchmarkXPRT tools, a core goal is involving new contributors and interested parties in the benchmark development process. Adding voices to the discussion fosters the collaboration and innovation that lead to powerful benchmark tools with lasting relevance.

One vehicle for outreach that we especially enjoy is sponsoring a student project through North Carolina State University. Each semester, the Senior Design Center in the university’s Department of Computer Science partners with external companies and organizations to provide student teams with an opportunity to work on real-world programming projects. If you’ve followed the XPRTs for a while, you may remember previous student projects such as Nebula Wolf, a mini-game that shows how well different devices handle games, and VR Demo, a virtual reality prototype workload based on a room escape scenario.

This fall, a team of NC State students is developing a software console for automating machine learning tests. Ideally, the tool will let future testers specify custom workload combinations, compute a performance metric, and upload results to our database. The project will also assess the impact of the TensorFlow framework on performance scores. In fact, the console will perform many of the same functions we plan to implement with AIXPRT.

The students have worked very hard on the project, and have learned quite a bit about benchmarking practices and several new software tools. The project will wrap up in the next couple of weeks, and we’ll share additional details as soon as possible. Early next year, we’ll publish a video about the experience.

If you’d like to join the NC State students and hundreds of other XPRT community members in the future of benchmark development, please let us know!

Justin

Notes from the lab: Updates on HDXPRT 4, MobileXPRT 3, and AIXPRT

The next couple of months will be very busy with XPRT activity, so we want to update readers on what to expect. Depending on a number of factors, we expect to release HDXPRT 4 and MobileXPRT 3 community previews (CPs) within the next four to six weeks. We’re also hoping to publish an early AIXPRT Request for Comments (RFC) build on GitHub within the same time frame. Here’s a little more detail about each of these developments.

HDXPRT 4: We originally planned to release the HDXPRT 4 CP several weeks ago. As we recently discussed in the blog, a lot has changed in the Windows 10 development world within a short period of time, and Microsoft has released a number of new Redstone 5/October 2018 Update builds in quick succession. While our HDXPRT 4 CP candidate testing went well overall, we observed some inconsistent workload scores when testing on some of the new Windows builds. Since then, we believe we’ve narrowed down the list of possible causes to a few specific graphics driver versions, but we’re still testing to make sure there are no other immediate issues. As soon as we’re confident in that assessment, we’ll release the CP along with any relevant information about the affected graphics drivers.

MobileXPRT 3: MobileXPRT 3 development is progressing nicely, and we’re close to completing a CP candidate build. We’ll test that build extensively on our library of Android phones and tablets, and barring any unforeseen issues, we plan to release the CP in the next few weeks.

AIXPRT: AIXPRT is the umbrella name for a set of tools we’re developing to help evaluate machine learning performance. After a great deal of research, we’re getting closer to releasing a build – tentatively called the AIXPRT RFC – for community members and other interested parties to download and review. For a number of reasons, the AIXPRT RFC process will be a little different from our normal XPRT RFC and CP process. We’ll be offering more information on the AIXPRT RFC build over the next several weeks.

We’re grateful to everyone who’s contributed in any way to each of these projects, and we look forward to sharing the benchmarks with the world. If you have any questions about the XPRTs, please don’t hesitate to ask!

Justin
