
Category: Benchmark metrics

Updating our WebXPRT 4 browser performance comparisons with new gear

Once or twice per year, we refresh an ongoing series of WebXPRT comparison tests to see if recent updates have reordered the performance rankings of popular web browsers. We published our most recent comparison in January, when we used WebXPRT 4 to compare the performance of five browsers on the same system.

This time, we’re publishing an updated set of comparison scores sooner than we normally would because we moved our testing to a newer reference laptop. The previous system, a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM, is now several years old, and we wanted to transition to a system more in line with current mid-range laptops. Testing on a capable mid-tier laptop makes it more likely that our comparison scores will fall within the range a typical user would see today.

Our new reference system is a Lenovo ThinkPad T14s Gen 3 with an Intel Core i7-1270P processor and 16 GB of RAM. It’s running Windows 11 Home, updated to version 23H2 (22631.3527). Before testing, we started from a clean system image and installed all current Windows updates. After the update process was complete, we turned off updates to prevent them from interfering with test runs. We then ran WebXPRT 4 three times on each of five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. In Figure 1 below, each browser’s score is the median of its three test runs.
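If you’d like to aggregate your own runs the same way, the short TypeScript sketch below shows the median calculation; the scores in it are placeholders for illustration, not our actual results.

```typescript
// Placeholder run scores, not our actual results.
const runs: Record<string, number[]> = {
  Brave: [251, 254, 256],
  Chrome: [262, 259, 263],
  Edge: [258, 261, 257],
  Firefox: [255, 258, 256],
  Opera: [253, 256, 252],
};

// Median: the middle value after sorting (or the mean of the two
// middle values for an even number of runs).
function median(values: number[]): number {
  const s = [...values].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

for (const [browser, scores] of Object.entries(runs)) {
  console.log(`${browser}: ${median(scores)}`);
}
```

Using the median rather than the mean keeps a single unusually fast or slow run from skewing a browser’s reported score.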

In our last round of tests—on the Dell XPS 13—the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced close scores, with Edge taking a small lead among the four. Each of the Chromium browsers significantly outperformed Firefox, with the slowest of the Chromium browsers (Brave) outperforming Firefox by 13.5 percent.

In this round of tests—on the Lenovo ThinkPad T14s—the scores were very tight, with a difference of only 4 percent between the last-place browser (Brave) and the winner (Chrome). Interestingly, Firefox no longer trailed the four Chromium browsers—it was squarely in the middle of the pack.

Figure 1: The median scores from running WebXPRT 4 three times with each browser on the Lenovo ThinkPad T14s.

Unlike previous rounds, which showed a higher degree of performance differentiation between the browsers, the scores from this round are close enough that most users wouldn’t notice a difference. Even if the difference between the highest and lowest scores were substantial, the quality of your browsing experience would still often depend on factors such as the types of things you do on the web (e.g., gaming, media consumption, or multi-tab browsing), the impact of extensions on performance, and how frequently the browsers issue updates and integrate new technologies. It’s important to keep such variables in mind when considering how browser performance comparison results may translate to your everyday web experience.

Have you tried using WebXPRT 4 to test the speed of different browsers on the same system? If so, we’d love for you to tell us about it! Also, please tell us what other WebXPRT data you’d like to see!

Justin

Want to see your WebXPRT 4 results on WebXPRT.com? Here’s how to submit them for review

In a recent post, we discussed some key features that the WebXPRT 4 results viewer tool has to offer. In today’s post, we’ll cover the straightforward process of submitting your WebXPRT 4 test results for possible publication in the viewer.

Unlike sites that publish all submissions, we publish only results that meet our evaluation criteria. Those results can come from OEM labs, third-party labs, reliable tech media sources, or independent user submissions. What matters to us is that the scores are consistent with general expectations and, for sources outside of our labs and data centers, that they include enough detailed system information for us to determine whether the score makes sense. That said, if your scores are a little different from what you see in our database, please don’t hesitate to send them to us for consideration. It costs you nothing.

The actual result submission process is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. Please be as specific as possible when filling in the system information fields. Detailed device information helps us assess whether individual scores represent valid test runs.

Figure 1 below shows how the form would look if I submitted a score at the end of a recent WebXPRT 4 run on one of the test systems here in our lab.

Figure 1: A screenshot of the WebXPRT 4 end-of-test results submission screen.

After you submit your score, we’ll contact you to confirm how we should display the source of the result in our database. You can choose one of the following:

  • Your first and last name
  • “Independent tester” (for users who wish to remain anonymous)
  • Your company’s name, if you have permission to submit the result in its name. If you want to use a company name, please provide a valid company email address that corresponds with the company name.

As always, we will not publish any additional information about you or your company without your permission.

We look forward to seeing your scores! If you have questions about WebXPRT 4 testing or results submission, please let us know!

Justin

The WebXPRT 4 results viewer: A powerful tool for browsing hundreds of test results

In our recent blog post about the XPRT results database, we promised to discuss the WebXPRT 4 results viewer in more detail. We developed the results viewer to serve as a feature-rich interactive tool that visitors to WebXPRT.com can use to browse the test results that we’ve published on our site, dig into the details of each result, and compare scores from multiple devices. The viewer currently has almost 700 test results, and we add new PT-curated entries each week.

Figure 1 shows the tool’s default display. Each vertical bar in the graph represents the overall score of a single test result, with bars arranged left-to-right, from lowest to highest. To view a single result in detail, hover over a bar to highlight it, and a small popup window will display the basic details of the result. You can then click to select the highlighted bar. The bar will turn dark blue, and the dark blue banner at the bottom of the viewer will display additional details about that result.

Figure 1: The WebXPRT 4 results viewer tool’s default display

In the example in Figure 1, the banner shows the overall score (237), the score’s percentile rank (66th) among the scores in the current display, the name of the test device, and basic hardware configuration information. If the source of the result is PT, you can click the Run info button in the bottom right-hand corner of the display to see the run’s individual workload scores. If the source is an external publisher, you can click the Source link to navigate to the original site.
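To make the percentile figure concrete, here is a minimal TypeScript sketch of a standard percentile-rank calculation over a set of displayed scores. The data is invented, and this is an illustration of the general technique rather than the viewer’s actual code.

```typescript
// Percentile rank of a score among the displayed scores: the share of
// scores at or below it. The example scores are invented.
function percentileRank(score: number, allScores: number[]): number {
  const atOrBelow = allScores.filter((s) => s <= score).length;
  return Math.round((atOrBelow / allScores.length) * 100);
}

const displayedScores = [112, 145, 180, 201, 237, 250, 266, 280, 301, 322];
console.log(`${percentileRank(237, displayedScores)}th percentile`); // 50th here
```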

The viewer includes a drop-down menu that lets users quickly filter results by major device type categories, plus a tab with additional filtering options, such as browser type, processor vendor, and result source. Figure 2 shows the viewer after I used the device type drop-down filter to select only laptops.

Figure 2: Screenshot from the WebXPRT 4 results viewer showing results filtered by the device type drop-down menu.

Figure 3 shows the viewer as I use the filter tab to explore additional filter options, such as processor vendor.

Figure 3: Screenshot from the WebXPRT 4 results viewer showing the filter options available with the filter tab.
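To make the filtering behavior in Figures 2 and 3 concrete, here is a minimal TypeScript sketch of how stacked filters over a set of results might work. The TestResult type and sample data are hypothetical, not the viewer’s actual implementation.

```typescript
// Hypothetical result record; the viewer's real data model may differ.
interface TestResult {
  device: string;
  deviceType: "laptop" | "desktop" | "phone" | "tablet";
  processorVendor: string;
  browser: string;
  source: string;
  overallScore: number;
}

// Filters compose as simple predicates that are ANDed together.
type Filter = (r: TestResult) => boolean;

function applyFilters(results: TestResult[], filters: Filter[]): TestResult[] {
  return results.filter((r) => filters.every((f) => f(r)));
}

// Invented sample data for illustration.
const allResults: TestResult[] = [
  { device: "ThinkPad T14s Gen 3", deviceType: "laptop",
    processorVendor: "Intel", browser: "Chrome", source: "PT", overallScore: 262 },
  { device: "Hypothetical desktop", deviceType: "desktop",
    processorVendor: "AMD", browser: "Firefox", source: "PT", overallScore: 310 },
];

// Example: laptops with Intel processors, as in Figures 2 and 3.
const filtered = applyFilters(allResults, [
  (r) => r.deviceType === "laptop",
  (r) => r.processorVendor === "Intel",
]);
console.log(filtered.map((r) => r.device)); // ["ThinkPad T14s Gen 3"]
```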

The viewer also lets you pin multiple runs, which is helpful for making side-by-side comparisons. Figure 4 shows the viewer after I pinned four runs and viewed them on the Pinned runs screen.

Figure 4: Screenshot from the WebXPRT 4 results viewer showing four pinned runs on the Pinned runs screen.

Figure 5 shows the viewer after I clicked the Compare runs button. The overall and individual workload scores of the pinned runs appear in a table.

Figure 5: Screenshot from the WebXPRT 4 results viewer showing four pinned runs on the Compare runs screen.

We hope that you’ll enjoy using the results viewer to browse our WebXPRT 4 results database and that it will become one of your go-to resources for device comparison data.  

Are there additional features you’d like to see in the viewer, or other ways we can improve it? Please let us know, and send us your latest test results!

Justin

Want to know how your device performs? Explore the XPRT results database

If you only recently started using the XPRT benchmarks, you may not know about one of the free resources we offer: the XPRT results database. Our results database currently holds more than 3,650 test results from over 150 sources, including global tech press outlets, OEM labs, and independent testers. It serves as a treasure trove of current and historical performance data across all the XPRT benchmarks and hundreds of devices. By comparing your device’s scores on the same XPRTs with these published results, you can get a sense of how well your device performs.

We update the results database several times a week, adding selected results from our own internal lab testing, reliable media sources, and end-of-test user submissions. (After you run one of the XPRTs, you can choose to submit the results, but don’t worry—this is opt-in. Your results do not automatically appear in the database.) Before adding a result, we also look at any available system information and evaluate whether the score makes sense and is consistent with general expectations.
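Purely as a toy illustration of what a “does this score make sense” check could look like, here is a TypeScript sketch with invented score ranges. Our actual review is a manual process that also weighs system details and the result source, not an automated filter.

```typescript
// Toy plausibility check. The ranges are invented for illustration;
// the real review is human judgment informed by system information.
const expectedScoreRange: Record<string, [number, number]> = {
  "entry-level laptop": [80, 200],
  "mid-range laptop": [150, 350],
};

function looksPlausible(deviceClass: string, score: number): boolean {
  const range = expectedScoreRange[deviceClass];
  return range !== undefined && score >= range[0] && score <= range[1];
}

console.log(looksPlausible("mid-range laptop", 237)); // true
console.log(looksPlausible("mid-range laptop", 900)); // false: would get a closer look
```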

There are three primary ways that you can explore the XPRT results database.

The first is by visiting the main BenchmarkXPRT results browser, which displays results entries for all of the XPRT benchmarks in chronological order (see the screenshot below). You can filter the results by selecting a benchmark from the drop-down menu. You can also type values, such as a vendor name (e.g., Dell) or the name of a tech publication (e.g., PCWorld), into the free-form filter field. For results we’ve produced in our lab, clicking “PT” in the Source column takes you to a page with additional configuration information for the test system. For sources outside our lab, clicking the source name takes you to the original article or review that contains the result.

The second way to access our published results is by visiting the results page for an individual XPRT benchmark. Start by going to the page of the benchmark that interests you (e.g., CrXPRT.com) and looking for the blue View Results button. Clicking the button takes you to a page that displays results for only that benchmark. You can use the free-form filter on the page to filter those results, and you can use the Benchmarks drop-down menu to jump to the other individual XPRT results pages.

The third way to view our results database is with the WebXPRT 4 results viewer. The viewer provides an information-packed, interactive tool with which you can explore data from the curated set of WebXPRT 4 results we’ve published on our site. We’ll discuss the features of the WebXPRT 4 results viewer in more detail in a future post.

You can use any of these approaches to compare the results of an XPRT on your device with our many published results. We hope you’ll take some time to explore the information in our results database and that it proves to be helpful to you. If you have ideas for new features or suggestions for improvement, we’d love to hear from you!

Justin

XPRT mentions in the tech press

One of the ways we monitor the effectiveness of the XPRT family of benchmarks is to regularly track XPRT usage and reach in the global tech press. Many tech journalists invest a lot of time and effort into producing thorough device reviews, and relevant and reliable benchmarks such as the XPRTs often serve as indispensable parts of a reviewer’s toolkit. Trust is hard-earned and easily lost in the benchmarking community, so we’re happy when our benchmarks consistently achieve “go-to” status for a growing number of tech assessment professionals around the world.

Because some of our newer readers may be unaware of the wide variety of outlets that regularly use the XPRTs, we occasionally like to share an overview of recent XPRT-related tech press activity. For today’s blog, we want to give readers a sampling of the press mentions we’ve seen over the past few months.

Recent mentions include:

Each month, we send out a BenchmarkXPRT Development Community newsletter that contains the latest updates from the XPRT world and provides a summary of the previous month’s XPRT-related activity, including new mentions of the XPRTs in the tech press. If you don’t currently receive the monthly BenchmarkXPRT newsletter but would like to join the mailing list, please let us know! There is no cost to join, and we will not publish or sell any of the contact information you provide. We will send only the monthly newsletter and occasional benchmark-related announcements, such as news about patches or new releases.

Justin

Working with the WebXPRT 4 source code

In our last blog post, we discussed the WebXPRT 4 source code and how you can contact us to request free access to the build package. In this post, we’ll address two questions that users sometimes ask about code access. The first question is, “How do I build a local instance of WebXPRT?” The second is, “What can I do with it?”

How to build a local WebXPRT 4 instance

After we receive your request, we’ll send you a secure link to the current WebXPRT 4 build package, which contains all the necessary source code files and installation instructions. You will need a system to use as a server, and you will need to be familiar with Apache, PHP, and MySQL configuration to follow the build instructions. WebXPRT 4 uses a LAMP (Linux, Apache, MySQL, and PHP) setup on the server side, but it’s also possible to set up an instance with a WAMP or XAMPP stack.

The build instructions include a step-by-step methodology for setup. If you are familiar with LAMP stack configuration, the build and configuration process should take about two to three hours, depending on whether your LAMP-related extensions and libraries are current.
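Once the stack is configured, a quick HTTP check can confirm that your local instance is serving pages. The TypeScript sketch below (runnable with Node 18 or later) assumes a placeholder localhost URL, so substitute whatever host and path you configured.

```typescript
// Smoke test for a local WebXPRT instance. The URL is a placeholder;
// substitute the host and path from your own Apache configuration.
const BASE_URL = "http://localhost/webxprt4/";

async function checkInstance(url: string): Promise<void> {
  const response = await fetch(url);
  console.log(`${url} -> HTTP ${response.status}`);
  if (!response.ok) {
    throw new Error("Local instance did not return a success status");
  }
}

checkInstance(BASE_URL).catch((err) => console.error(err.message));
```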

What you can do with a local WebXPRT 4 instance

We allow users to set up their own WebXPRT 4 instances for purposes of review, internal testing, or experimentation.

One use-case example is internal OEM lab testing. Some labs use WebXPRT to conduct extensive testing on preproduction hardware, and they follow stringent security guidelines to avoid the possibility of any hardware or test information leaving the lab. Even though we have our own strict policies about how we handle the small amount of data that WebXPRT gathers from tests, a local WebXPRT 4 instance provides those labs with an extra layer of security for sensitive tests.

We do ask that users publish results only from tests that they run on WebXPRT.com. As we mentioned in our most recent post, benchmarking requires a product that is consistent to enable valid comparisons over time. We allow people to download the source, but we reserve the right to control derivative works and which products can use the name “WebXPRT.” That way, when people see WebXPRT scores in tech press articles or vendor marketing materials, they can run their own tests on WebXPRT.com and be confident that they’re using the same standard for comparison.

If you have any questions about using the WebXPRT 4 source code, let us know!

Justin
