MAN-06 Test Management
Manual for managing manual and automated tests with Ketryx Platform
Last updated
© 2024 Ketryx Corporation
This manual describes the setup and processes for managing both manual and automated tests with Ketryx Platform.
Ketryx supports managing tests via its standard Test Case and Test Execution items, via third-party test management solutions such as Xray, and integrates with automated test runs through GitHub Actions or any tool that can call the Ketryx API.
Tests are defined using the Test Case item type in Ketryx, where the Steps and Expected behavior for each test case are declared. Each Test Case is associated with one or more Tested items. Like Requirements and Software Item Specs, Test Cases are "long-lived" items that are introduced in a certain version and remain in effect unless/until they are marked as obsolete in a certain version.
Before a product version can be released in Ketryx, all Requirements and other configuration items need to trace to tests (per the configured traceability mode) and all those tests need to be executed, unless they are excluded from the release test plan.
Manual test executions are represented by the Test Execution item type in Ketryx, which contains the Steps and Expected behavior from a Test Case, along with the actual Observed behavior and a Test result. Test Executions are "point-wise" items associated with one particular version, as defined by the Introduced in version field.
Test Executions can be created in the following ways:
On the Ketryx Test management page by selecting Test Cases, selecting a particular version, and pressing Create executions for selection.
In Jira by using the New Test Execution in the Traceability widget of a particular Test Case, or by manually creating a Test Execution issue from scratch. In these cases, Introduced in version needs to be set manually on the Jira issue.
If Test Executions are created for Test Cases that have automated tests associated with them, they will inherit the Observed behavior from the logs of those automated tests (see the section on automated tests below).
By default, Ketryx assumes that all available Test Cases in a project are executed in each version where they are effective (based on their Introduced in version and Obsolete in version fields). Sometimes, though, only a subset of those tests is actually relevant for a release, depending on what parts of the system changed and need to be (re-)tested. You can define a more granular release test plan on the Test management page in Ketryx by selecting a version, a subset of the available tests, and choosing Manage release test plan > Exclude selected tests. Note that, by default, all tests in a project are included.
After excluding tests from the release test plan, you can include them again by selecting them and choosing Manage release test plan > Include selected tests.
Ketryx does not require excluded Test Cases to be executed, while still considering their tested items to be properly traced to tests.
A non-default release test plan needs to be approved by the relevant approval steps defined by the Test Plan approval rules. The approval state of a test plan by each approval step can be tracked in the sidebar on the Test management page. To approve a test plan, select the relevant version on the Test management page and click Approve in the sidebar. If there are any changes to the test plan (e.g., more Test Cases are excluded), it needs to be re-approved before the version can be released.
The test plan of a released version cannot be changed anymore.
Requirements and specifications only associated with excluded tests are indicated as excluded in the Requirements traceability matrix page and document.
Xray Test Plans are associated with a particular version using Jira's Fix versions field (or another version field in a custom configuration)
Xray Test Plans contain Xray Test Executions
Xray Test Executions contain one or more test runs, where each test run executes a particular Xray Test
Xray Tests can be associated with their tested items through Jira's native tests link
To fetch data from Xray, Ketryx needs an API key, which can be created in the Xray app settings (in Jira under Apps > Manage your apps > XRAY > API Keys).
In Ketryx, enter the respective Client ID and Client Secret in the organization settings under Organization > Connections. This requires the Manage connections permission.
If Jira is configured with Xray issue types, the Ketryx-specific issue type scheme will be created with a Ketryx Test Execution instead of the usual Test Execution issue type, to disambiguate them.
With Xray, you will usually maintain one or more Xray Test Plans in Jira and associate them with a version. To replicate the same selection of executed tests in the release test plan in Ketryx, do the following on the Test management page:
Select the relevant version to be released
Activate the filter Not contained in Xray test plan
Select all filtered tests by clicking the checkbox in the top-left corner of the table
Exclude the selected tests by choosing Manage release test plan > Exclude selected items
Confirm the exclusion and approve the test plan
To interact with Ketryx programmatically, such as from GitHub Actions or via the HTTP-based API, you need an API key. API keys can be managed under Organization > API keys. This is only allowed for organization owners and needs to be confirmed with an electronic signature.
An API key's name only serves as a "nickname" to recognize it later. It is not needed in the actual authentication mechanism.
An API key can have the following permissions:
Report test results: allows reporting of automated test results to Ketryx
Retrieve release status: allows retrieving a version's release status and other checks from Ketryx
After creating an API key, a secret token of the form KXTK_... is displayed. Copy this token and store it in a secure place. For security reasons, it cannot be retrieved again later. This token is needed to authenticate programmatic requests to Ketryx.
API keys remain active until they are revoked.
In addition, the Ketryx project ID (e.g., KXPRJ49GQYFQ5RR9KRTPWTRTC39YZ9W) must be passed to the action's project parameter.
By default, the reported build and tests are associated with a project version in Ketryx based on the build commit SHA. Any version whose release ref pattern (as configured in the project settings) resolves to the given commit is considered relevant for the build. To override this default, you can pass an explicit value (either the full version name or the Ketryx version ID) to the version parameter.
If you have several "parallel" builds that report test results, you can set the build-name parameter to disambiguate them. For each unique build name associated with a version, the most recent build is considered the effective build, overriding any previous builds (and associated automated tests) with the same name. For instance, this can be useful if you have a CI build that runs unit tests, while other CI builds run end-to-end tests, and they should not override each other.
To interact with Ketryx from other CI/CD platforms or local systems, you can use its HTTP-based API. API requests are authorized using the HTTP Authorization header with the secret token from a Ketryx API key.
Note the following when using the API at /api/v1/builds:
The Ketryx project ID must be passed via the project parameter.
Either the version or the commitSha parameter must be set. If a commit SHA is given, any version whose release ref pattern (as configured in the project settings) resolves to the given commit is considered relevant for the build.
Build artifacts (including test result files) should be uploaded via the API at /api/v1/build-artifacts first, which returns a file ID that can be passed to the /builds API. Artifacts can be associated with the overall build or with individual tests.
Artifacts of type cucumber-json or junit-xml reported at the overall build level are automatically parsed by Ketryx and yield automated test executions.
Test results can also be reported directly using the tests parameter, explicitly specifying a testedItem, result, title, artifacts, etc.
Just like the build-name parameter in the GitHub Action (described above), a buildName can be specified to disambiguate several parallel builds that should not override each other.
Ketryx detects automated tests in various test file formats and associates them with tested configuration items based on annotations that mention the item's ID. That ID can be the:
Item's Jira issue key (e.g., SAMD-45)
Full Ketryx item ID (e.g., KXITM5QJ97Z4X3X91AVRSZYZ7JHHPDK)
Using a Git-based item's itemId to associate automated tests with a configuration item is not supported at the moment.
The details of how tests are annotated depend on the test file format, as described below.
The tested items can be:
Test Cases (which is typically the case for end-to-end tests that are predefined in Jira and approved).
Requirements (which is typically the case for unit, integration or end-to-end tests that are not managed in Jira).
Software Item Specifications (which is typically the case for unit, integration or end-to-end tests that are not managed in Jira).
Risks (which is typically the case for unit, integration or end-to-end tests that are not managed in Jira).
Ketryx does not impose a limitation on which or how many configuration items are referenced as tested items in an automated test case. If you want to enforce that all automated tests are associated with a predefined Test Case (e.g., so that you can approve the test steps before they are executed), you can activate the option Require automated tests to be associated with test cases in the project settings.
Use the @implements tag only to reference an existing Jira Test Case item; in any other case, use the @tests tag.
Use the json reporter ("formatter") to generate a JSON report to send to Ketryx.
Generate a JUnit XML report to send to Ketryx.
Use the junitxml output format to generate a JUnit XML report to send to Ketryx.
Generate a JUnit XML report to send to Ketryx.
Most test frameworks in common programming languages (such as JavaScript and Python, described above) support generating a JUnit XML report. Ketryx reads item IDs from <property> elements, attributes (tested-item-id or assertions), or annotations in test names. All of the following examples would be recognized:
A name annotation at the level of the <testsuite> element is also recognized, and associates all included test cases with the given element (unless they have a more specific association).
The Test management page and testing report documents show both manual Test Executions and automated test executions.
The detail page for each automated test shows what build it is associated with, the tested items (typically Test Cases or Software Item Specifications), the log output of the test, artifacts (e.g., images) uploaded with the test, and other relevant details.
Automated tests can also be reached from the detail page of each project build.
If a Test Case has both automated tests associated with it and manual test executions, the manual test executions take precedence. This allows you to manually "override" an automated test result.
You can select Test Cases with automated test executions on the Test management page and choose Create executions for selection to create corresponding Test Execution items. Those Test Executions will inherit the Steps and Expected behavior from the Test Case, and the logs (as Observed behavior) and Test result from the automated test execution. Test Executions still need to be approved and put into a controlled state as usual; so this is a way to manually review and sign off on individual automated tests.
To require a manual review of all effective automated tests, activate the option Require a manual test execution for each effective automated test execution in the project settings under Test management. This will prevent automated test results from being considered directly for full traceability; instead, there has to be a manual Test Execution created through Create executions for selection for the effective automated test of each Test Case.
Tests from other referenced projects are excluded by default. To opt into executing them, you can explicitly include them in the release test plan. Note that the corresponding Test Execution still needs to happen in the current (referencing) project, as opposed to the referenced project.
Ketryx integrates with Xray by fetching data from Xray Tests, Test Executions, and Test Plans and mapping them to Ketryx items:
Workflows in GitHub Actions can report builds and test results to Ketryx using the Ketryx GitHub Action.
The API key's secret token should be stored as an encrypted secret in GitHub, and exposed to the Ketryx GitHub Action as in the following:
Use an annotation of the form @tests:ID or @implements:ID on a Scenario in Gherkin syntax.
In JavaScript test frameworks, mention the tested item ID in the test name itself:
In pytest, use the record_property fixture to store the tested item ID as additional metadata on a test case:
Use @tests:ID annotations in comments above test methods.