Category: Source code

Gain a deeper understanding of WebXPRT 4 with our results calculation white paper

More people around the world are using WebXPRT 4 now than ever before. It’s exciting to see that growth, which also means that many people are visiting our site and learning about the XPRTs for the first time. Because new visitors may not know how the XPRT family of benchmarks differs from other benchmarking efforts, we occasionally like to revisit the core values of our open development community here in the blog—and show how those values translate into more free resources for you.

One of our primary values is transparency in all our benchmark development and testing processes. We share information about our progress with XPRT users throughout the development process, and we invite people to contribute ideas and feedback along the way. We also publish both the source code of our benchmarks and detailed information about how they work, unlike benchmarks that use a “black box” model.

For WebXPRT 4 users who are interested in knowing more about the nuts and bolts of the benchmark, we offer several information-packed resources, including our focus for today, the WebXPRT 4 results calculation and confidence interval white paper. The white paper explains the WebXPRT 4 confidence interval, how it differs from typical benchmark variability, and the formulas the benchmark uses to calculate the individual workload scenario scores and overall score on the end-of-test results screen. The paper also provides an overview of the statistical methodology that WebXPRT uses to translate raw timings into scores.

In addition to the white paper’s discussion of the results calculation process, we’ve provided a results calculation spreadsheet that shows the raw data from a sample test run and reproduces the calculations WebXPRT uses to generate both the workload scores and the overall score.
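
To give a concrete sense of the kind of math involved, here is a minimal Python sketch of a calibration-style calculation: raw timings are normalized against a reference time, workload scores roll up into a geometric mean, and a confidence interval is estimated from the spread across iterations. The timings, calibration value, and scaling constant below are placeholders we invented for illustration; the white paper and spreadsheet document the exact values and statistical method WebXPRT 4 actually uses.

```python
import math
import statistics

# Hypothetical raw timings (ms) for one workload across several iterations,
# plus placeholder calibration and scaling constants -- not WebXPRT's values.
raw_times_ms = [742.0, 751.5, 738.2, 760.1, 745.9]
calibration_time_ms = 750.0
scale_factor = 100.0

# Normalize each timing against the calibration time so that faster runs
# produce higher scores.
iteration_scores = [scale_factor * calibration_time_ms / t for t in raw_times_ms]
workload_score = statistics.mean(iteration_scores)

# Roll individual workload scores (the rest are placeholders) into an
# overall score via a geometric mean.
workload_scores = [workload_score, 98.7, 103.2, 95.4, 101.1, 99.8]
overall_score = math.prod(workload_scores) ** (1 / len(workload_scores))

# Estimate a 95% confidence interval from the iteration-to-iteration spread,
# using a simple normal approximation.
std_err = statistics.stdev(iteration_scores) / math.sqrt(len(iteration_scores))
ci_95 = 1.96 * std_err

print(f"Workload score: {workload_score:.1f} +/- {ci_95:.1f} (95% CI)")
print(f"Overall score:  {overall_score:.1f}")
```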

In potential future versions of WebXPRT, it’s likely that we’ll continue to use the same—or very similar—statistical methodologies and results calculation formulas that we’ve documented in the results calculation white paper and spreadsheet. That said, if you have suggestions for how we could improve those methods or formulas—either in part or in whole—please don’t hesitate to contact us. We’re interested in hearing your ideas!

The white paper is available on WebXPRT.com and on our XPRT white papers page. If you have any questions about the paper or spreadsheet, WebXPRT, or the XPRTs in general, please let us know.

Justin

Working with the WebXPRT 4 source code

In our last blog post, we discussed the WebXPRT 4 source code and how you can contact us to request free access to the build package. In this post, we’ll address two questions that users sometimes ask about code access. The first question is, “How do I build a local instance of WebXPRT?” The second is, “What can I do with it?”

How to build a local WebXPRT 4 instance

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package, which contains all the necessary source code files and installation instructions. You’ll need a system to use as a server, along with familiarity with Apache, PHP, and MySQL configuration, to follow the build instructions. WebXPRT 4 uses a LAMP (Linux, Apache, MySQL, and PHP) setup on the server side, but it’s also possible to set up an instance with a WAMP or XAMPP stack.

The build instructions walk you through the setup process step by step. If you’re familiar with LAMP stack configuration, the build and configuration process should take about two to three hours, depending on whether your LAMP-related extensions and libraries are current.
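
As a quick sanity check before you begin, the short Python sketch below looks for the Apache, PHP, and MySQL command-line tools on your server’s PATH and prints their versions. This script is our own illustration rather than part of the build package, and the binary names are common defaults (for example, Apache ships as apache2 on Debian/Ubuntu and httpd on RHEL-style systems), so adjust them to match your environment.

```python
import shutil
import subprocess

# Typical binary names and version flags for each LAMP component; edit these
# to match your distribution or WAMP/XAMPP installation.
COMPONENTS = {
    "Apache": (["apache2ctl", "apachectl", "httpd"], "-v"),
    "PHP": (["php"], "--version"),
    "MySQL": (["mysql", "mariadb"], "--version"),
}

def check(name, binaries, flag):
    """Print the first installed binary found for a component, or report it missing."""
    for binary in binaries:
        if shutil.which(binary):
            proc = subprocess.run([binary, flag], capture_output=True, text=True)
            output = (proc.stdout + proc.stderr).strip()
            version = output.splitlines()[0] if output else "version unknown"
            print(f"[ok]      {name}: {version}")
            return True
    print(f"[missing] {name}: none of {binaries} found on PATH")
    return False

if __name__ == "__main__":
    results = [check(name, binaries, flag)
               for name, (binaries, flag) in COMPONENTS.items()]
    if not all(results):
        print("Install the missing components before following the build instructions.")
```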

What you can do with a local WebXPRT 4 instance

We allow users to set up their own WebXPRT 4 instances for purposes of review, internal testing, or experimentation.

One use-case example is internal OEM lab testing. Some labs use WebXPRT to conduct extensive testing on preproduction hardware, and they follow stringent security guidelines to prevent any hardware or test information from leaving the lab. Even though we have our own strict policies about how we handle the small amount of data that WebXPRT gathers during testing, a local WebXPRT 4 instance gives those labs an extra layer of security for sensitive tests.

We do ask that users publish results only from tests that they run on WebXPRT.com. As we mentioned in our most recent post, benchmarking requires a product that is consistent to enable valid comparisons over time. We allow people to download the source, but we reserve the right to control derivative works and which products can use the name “WebXPRT.” That way, when people see WebXPRT scores in tech press articles or vendor marketing materials, they can run their own tests on WebXPRT.com and be confident that they’re using the same standard for comparison.

If you have any questions about using the WebXPRT 4 source code, let us know!

Justin

Accessing the WebXPRT 4 source code

If you’re new to the XPRTs, you may not be aware that we provide free access to XPRT benchmark source code. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry. We’re also inviting constructive feedback that can help ensure that the XPRTs continue to improve and contribute to a level playing field for all the types of products they measure.

While we offer free access to the XPRT source code, we’ve decided to provide the code upon request instead of through a permanent download link. This approach prevents bots and other malicious actors from downloading the code. It also lets us interact with users who are interested in the source code and answer any questions they may have. We’re always keen to hear what others think about the XPRTs and the types of work they measure.

We recently received some questions about accessing the WebXPRT 4 source code, which made us realize that we needed to provide a clearer way for people to request the code. In response, we added a “Request WebXPRT 4 source code” link to the gray Helpful Info box on WebXPRT.com (see the screenshot below). Clicking the link lets you email the BenchmarkXPRT Support team directly to request the code.

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package. For those users who wish to set up a local instance of WebXPRT 4 for their own internal testbeds, the package will contain all the necessary files and installation instructions. We allow folks to set up their own instances for purposes of review, internal testing, or experimentation, but we ask that users publish only test results from the official WebXPRT 4 site.

While we offer free access to XPRT source code, our approach to derivative works differs from some traditional open-source models that encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

If you have any questions about accessing the WebXPRT 4 source code, let us know!

Justin

Accessing XPRT source code

We recently received a question from a member of the tech press about whether we would be willing to supply them with the WebXPRT 4 source code, along with instructions for setting up a local instance of the benchmark for their internal testbed. We were happy to help, and they are now able to automate WebXPRT 4 runs within their own isolated network.
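
For anyone curious what that kind of automation can look like, here is a rough sketch that drives a hypothetical local instance with Playwright, a third-party browser-automation library that WebXPRT does not require. The URL and element selectors are placeholders we made up; you would substitute values that match your own instance, or use whatever automation tooling your lab already relies on.

```python
from playwright.sync_api import sync_playwright

LOCAL_INSTANCE_URL = "http://webxprt.lab.example/"  # hypothetical local instance
START_SELECTOR = "text=Run"            # placeholder for the button that starts a run
RESULT_SELECTOR = "#overall-score"     # placeholder for the overall-score element

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    page = browser.new_page()
    page.goto(LOCAL_INSTANCE_URL)
    page.click(START_SELECTOR)
    # A full run takes a while, so allow a generous timeout (in milliseconds).
    page.wait_for_selector(RESULT_SELECTOR, timeout=60 * 60 * 1000)
    print("Overall score:", page.inner_text(RESULT_SELECTOR))
    browser.close()
```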

If you’re a new XPRT tester, you may not be aware that we provide free access to the source code for each of the XPRT benchmarks. Publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to access and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While XPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

Accessing XPRT source code is a straightforward process. The source code for CloudXPRT is freely available in our CloudXPRT GitHub repository. If you’d like to download and review the source code for WebXPRT 4 or any of the other XPRTs, or get instructions for how to build one of the benchmarks, all you need to do is contact us at benchmarkxprtsupport@principledtechnologies.com. Your feedback is valuable!

Justin

The ongoing evolution of the BenchmarkXPRT Development Community

This November will mark the tenth anniversary of the BenchmarkXPRT Development Community, which we originally called the HDXPRT Development Community. Since the early days of HDXPRT, our community has grown to include about 275 members from over 85 companies and organizations, and we’ve added seven benchmarks to the XPRT family. We initially mailed HDXPRT DVDs to testers interested in a new way to evaluate PC performance, and now thousands of users around the world download our benchmarks and rely on them to help measure the performance of everything from tablets to laptops to high-end datacenter hardware.

As the XPRTs continue to grow and evolve, we’ve worked to make sure that the resources that we offer—and the ways we offer them—continue to meet the needs of XPRT testers and community members. As we expand in the AI and datacenter spaces with AIXPRT and CloudXPRT, our user group is becoming larger and more diverse than ever. We have already made some changes to better serve this expanding group, and will be making additional changes over the months ahead.

The first set of changes relates to our community membership model. Originally, membership in the BenchmarkXPRT Development Community required a $20 fee and provided access to preview versions of new benchmarks, the ability to submit ideas for future benchmarks, and regular updates through our monthly newsletter and community announcements. To remove the financial obstacle to joining, we introduced a fee waiver process a few years ago.

Also, we know that some OEM employees and members of the tech press are interested in the XPRTs, but are unable to join the community for one reason or another. With these people in mind, we recently experimented with making the CloudXPRT Preview publicly available. Releasing preview builds to all who are interested makes it more likely that users will incorporate the XPRTs into their test suites, and we have decided to adopt this practice for other benchmarks going forward.

In the coming months, we’ll be updating parts of our website to increase access to XPRT content. For example, certain content such as source code for most of the XPRTs is currently available only to members. We plan to remove the login requirement for access to this material.

Please keep in mind that membership in the BenchmarkXPRT Development Community continues to offer exclusive opportunities. Members can join groups such as the CloudXPRT Results Review Group and offer direct input into the design of future benchmarks. Members also receive our monthly newsletters.

If you have any questions about the XPRTs or community membership, please feel free to ask!

Justin

Now available: An updated CloudXPRT Preview build and source code

Today, we published an updated CloudXPRT Preview build (v0.97), along with the build’s source code. The new build fixes a few minor bugs and makes several improvements that facilitate installation, setup, and testing. The fixes do not affect CloudXPRT test results, so results from the new build are comparable to results from the original build (v0.95). You can find more detailed information about the changes in last week’s blog post.

The CloudXPRT Preview v0.97 source code is available to the public via the CloudXPRT GitHub repository. As we’ve discussed in the past, publishing XPRT source code is part of our commitment to making the XPRT development process as transparent as possible. By allowing all interested parties to download and review our source code, we’re encouraging openness and honesty in the benchmarking industry and are inviting the kind of constructive feedback that helps to ensure that the XPRTs continue to contribute to a level playing field.

While the CloudXPRT source code is available to the public, our approach to derivative works differs from some open-source models. Traditional open-source models encourage developers to change products and even take them in different directions. Because benchmarking requires a product that remains static to enable valid comparisons over time, we allow people to download the source, but we reserve the right to control derivative works. This discourages a situation where someone publishes an unauthorized version of the benchmark and calls it an “XPRT.”

We encourage you to download and review the source and send us any feedback you have. Your questions and suggestions may influence future versions of CloudXPRT.

If you have any questions about CloudXPRT or the source code, please let us know!

Justin
