VPT provides a Report Comparison tool that lets users analyze and identify differences between test run reports. Up to five test run reports can be compared at a time.
Go to Report Comparison
Report Comparison can be accessed from two different places:
1. Reports Page – Navigate to Reports Page and select Compare Reports
2. Test Run Summary Report Page – Navigate to a project and select a test run. Click the three-dot menu in the top-right corner, then click Report Comparison.
The user is taken to the Report Comparison page, where they can select test runs to compare. If navigation was from a test run summary page, that test run report is already selected in the Test Run filter, all the tabs are expanded, and data is visible. Users can select additional test runs from the filter.
Report Comparison Page
On the Report Comparison page, you can select up to five test runs and compare results such as Run Highlights, Average Throughput, Average Response Time, Maximum Response Time, Minimum Response Time, and Total Bandwidth.
In the Test Runs field, you can either select a test run name from the list or type a name to filter it. The filter is not case sensitive.
The report comparison page has multiple tabs below the Report Comparison tab. Users can select any of these tabs to view the comparison data for their selected test runs.
Test Runs Selection field
The Test Runs input field lets users select any available test run from the dropdown or type a test run name manually.
The test run list is grouped by project, in ascending order of creation date. Typing in the field filters the test run names by the matched characters.
When a user selects a test run name from the list, that test run is added to the comparison.
A maximum of five test runs can be selected for comparison. Once five are added, the remaining entries in the dropdown list are disabled.
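The filtering and selection-cap behavior described above can be sketched generically. This is an illustrative sketch only; the function names and data are assumptions, not VPT's actual implementation:

```python
# Illustrative sketch of case-insensitive filtering and a five-run
# selection cap; names and data here are hypothetical, not VPT code.

MAX_SELECTED = 5

def filter_test_runs(test_runs, query):
    """Return test run names containing the query, case-insensitively."""
    q = query.lower()
    return [name for name in test_runs if q in name.lower()]

def can_select_more(selected):
    """Remaining dropdown entries are disabled once five runs are selected."""
    return len(selected) < MAX_SELECTED

runs = ["Login Load Test", "Checkout Stress Test", "login smoke test"]
print(filter_test_runs(runs, "LOGIN"))  # matches both login runs
```

Matching on lowercased strings mirrors the documented behavior: typing "LOGIN" or "login" filters to the same set of test runs.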
Run Highlights
The Run Highlights tab allows you to compare test run highlights. Select anywhere on the tab to view a comprehensive summary of the test run details.
The Run Highlights table contains the Test Run Name and the metrics described in the Metrics Captured article. The metrics can be sorted according to the user's preference, making it easy to analyze and compare data across test runs.
Average Throughput
The Average Throughput tab allows you to compare throughput data across tests. Select anywhere on the tab to display the corresponding graph.
The average throughput graph shows the average number of requests sent to the target application per second during a test. Data from each test run is represented by a different-colored line in the graph.
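As a rough illustration of the metric (not VPT's internal calculation), average throughput can be expressed as total requests divided by test duration:

```python
# Hypothetical sketch: average throughput as requests per second.
def average_throughput(total_requests, duration_seconds):
    """Average number of requests sent per second over the test."""
    return total_requests / duration_seconds

print(average_throughput(12000, 600))  # 12,000 requests over 10 minutes -> 20.0/s
```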
Average Response Time
The Average Response Time tab shows the average response time of all transactions during a test run. Data from each test run is represented by a different-colored line in the graph.
Maximum Response Time
The Maximum Response Time tab shows the maximum response time of all transactions during a test run. Data from each test run is represented by a different-colored line in the graph.
Minimum Response Time
The Minimum Response Time tab shows the minimum response time of all transactions during a test run. Data from each test run is represented by a different-colored line in the graph.
Total Bandwidth
The Total Bandwidth tab shows the total amount of data received in responses from the target application. Data from each test run is represented by a different-colored line in the graph.
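The response-time and bandwidth metrics above can be illustrated with a simple aggregation over per-transaction samples. This is a hedged sketch; the sample data and field names are assumptions, not VPT output:

```python
# Illustrative aggregation of per-transaction samples into the compared
# metrics; the data and names are hypothetical, not VPT's actual format.

def summarize(samples):
    """samples: list of (response_time_ms, response_bytes) tuples."""
    times = [t for t, _ in samples]
    return {
        "avg_response_time_ms": sum(times) / len(times),
        "max_response_time_ms": max(times),
        "min_response_time_ms": min(times),
        "total_bandwidth_bytes": sum(b for _, b in samples),
    }

samples = [(120, 2048), (95, 1024), (310, 4096)]
print(summarize(samples))
# avg 175.0 ms, max 310 ms, min 95 ms, total 7168 bytes
```

Each compared test run would produce one such summary row, which is what the comparison tabs place side by side.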