MAN-06 Test Management

Manual for managing manual and automated tests with Ketryx Platform

1. Introduction

This manual describes the setup and processes for managing both manual and automated tests with Ketryx Platform.

Ketryx supports managing tests via its standard items Test Case and Test Execution, via third-party test management solutions such as Xray for Jira, and via automated test runs reported through GitHub Actions or any tool that can call the Ketryx API.

2. Manual Tests

2.1. Creating and Executing Tests

Tests are defined using the Test Case item type in Ketryx, where the Steps and Expected behavior for each test case are declared. Each Test Case is associated with one or more Tested items. Like Requirements and Software Item Specs, Test Cases are "long-lived" items that are introduced in a certain version and remain in effect unless/until they are marked as obsolete in a certain version. See WI-04 Test Case for details.

Before a product version can be released in Ketryx, all Requirements and other configuration items need to trace to tests (per the configured traceability mode) and all those tests need to be executed, unless they are excluded from the release test plan.

Manual test executions are represented by the Test Execution item type in Ketryx, which contains the Steps and Expected behavior from a Test Case, along with the actual Observed behavior and a Test result. Test Executions are "point-wise" items associated with one particular version, as defined by the Introduced in version field.

Test Executions can be created in the following ways:

  1. On the Ketryx Test management page by selecting Test Cases, selecting a particular version, and pressing Create executions for selection.

  2. In Jira by using the New Test Execution in the Traceability widget of a particular Test Case, or by manually creating a Test Execution issue from scratch. In these cases, Introduced in version needs to be set manually on the Jira issue.

See WI-05 Test Execution for details.

If Test Executions are created for Test Cases that have automated tests associated with them, they will inherit the Observed behavior from the logs of those automated tests (see the section on Automated tests below).

2.2. Defining a Release Test Plan

By default, Ketryx assumes that all available Test Cases in a project are executed in each version where they are effective (based on their Introduced in version and Obsolete in version fields). Sometimes, though, only a subset of those tests is actually relevant for a release, depending on what parts of the system changed and need to be (re-)tested. You can define a more granular release test plan on the Test management page in Ketryx by selecting a version and a subset of the available tests, and choosing Manage release test plan > Exclude selected tests. Note that, by default, all tests in a project are included.

After excluding tests from the release test plan, you can include them again by selecting them and choosing Manage release test plan > Include selected tests.

Tests from other referenced projects are excluded by default. To opt into executing them, you can explicitly include them in the release test plan. Note that the corresponding Test Execution still needs to happen in the current (referencing) project, as opposed to the referenced project.

Ketryx does not require excluded Test Cases to be executed, but still considers their tested items to be properly traced to tests.

A non-default release test plan needs to be approved by the relevant approval groups defined by the Test Plan approval rules. To approve a test plan, select the relevant version on the Test management page and choose Manage release test plan > Approve test plan. If the test plan changes (e.g., more Test Cases are excluded), it needs to be re-approved before the version can be released.

The test plan of a released version can no longer be changed.

Requirements and specifications only associated with excluded tests are indicated as excluded in the Requirements traceability matrix page and document.

2.3. Xray Tests

Ketryx integrates with Xray Test Management for Jira by fetching data from Xray Tests, Test Executions, and Test Plans and mapping them to Ketryx items:

  1. Xray Test Plans are associated with a particular version using Jira's Fix versions field (or another version field in a custom configuration)

  2. Xray Test Plans contain Xray Test Executions

  3. Xray Test Executions contain one or more test runs, where each test run executes a particular Xray Test

  4. Xray Tests can be associated with their tested items through Jira's native tests link

2.3.1. Xray Setup

To fetch data from Xray, Ketryx needs an API key, which can be created in the Xray app settings (in Jira under Apps > Manage your apps > XRAY > API Keys).

In Ketryx, enter the respective Client ID and Client Secret in the organization settings under Organization > Connections. This requires the Manage connections permission.

If Jira is configured with Xray issue types, the Ketryx-specific issue type scheme will be created with a Ketryx Test Execution instead of the usual Test Execution issue type, to disambiguate them.

2.3.2. Using Xray Test Plans

With Xray, you will usually maintain one or more Xray Test Plans in Jira and associate them with a version. To replicate the same selection of executed tests in the release test plan in Ketryx, do the following on the Test management page:

  1. Select the relevant version to be released

  2. Activate the filter Not contained in Xray test plan

  3. Select all filtered tests by clicking the checkbox in the top-left corner of the table

  4. Exclude the selected tests by choosing Manage release test plan > Exclude selected items

  5. Confirm the exclusion and approve the test plan

3. Automated Tests

3.1. Creating a Ketryx API key

To interact with Ketryx programmatically, such as from GitHub Actions or via the HTTP-based API, you need an API key. API keys can be managed under Organization > API keys. This is only allowed for organization owners and needs to be confirmed with an electronic signature.

An API key's name only serves as a "nickname" to recognize it later. It is not needed in the actual authentication mechanism.

An API key can have the following permissions:

  1. Report test results: allows reporting of automated test results to Ketryx

  2. Retrieve release status: allows retrieving the release status of a version and other checks from Ketryx

After creating an API key, a secret token of the form KXTK_... is displayed. Copy this token and store it in a secure place. For security reasons, it cannot be retrieved again later. This token is needed to authenticate programmatic requests to Ketryx.

API keys remain active until they are revoked.

See also API authentication.

3.2. Setting up GitHub Actions

Workflows in GitHub Actions can report builds and test results to Ketryx using the Ketryx GitHub Action.

The API key's secret token should be stored as an encrypted secret in GitHub, and exposed to the Ketryx GitHub Action as in the following:

api-key: ${{ secrets.KETRYX_API_KEY }}

In addition, the Ketryx project ID (e.g., KXPRJ49GQYFQ5RR9KRTPWTRTC39YZ9W) must be passed to the action's project parameter.

By default, the reported build and tests are associated with a project version in Ketryx based on the build commit SHA. Any version whose release ref pattern (as configured in the project settings) resolves to the given commit is considered relevant for the build. To override this default, you can pass an explicit value (either the full version name or the Ketryx version ID) to the version parameter.

If you have several "parallel" builds that report test results, you can set the build-name parameter to disambiguate them. For each unique build name associated with a version, the most recent build is considered the effective build, overriding any previous builds (and associated automated tests) with the same name. For instance, this can be useful if you have a CI build that runs unit tests, while other CI builds run end-to-end tests, and they should not override each other.
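Putting these parameters together, a workflow step using the Ketryx GitHub Action might look like the following sketch (the action reference and the secret name KETRYX_API_KEY are assumptions for illustration; see the action configuration documentation for exact parameter names):

```yaml
# Sketch of a CI step reporting a build and its test results to Ketryx.
# The action reference (Ketryx/ketryx-github-action@v1) and the secret
# name KETRYX_API_KEY are assumptions, not verified values.
- name: Report build to Ketryx
  uses: Ketryx/ketryx-github-action@v1
  with:
    api-key: ${{ secrets.KETRYX_API_KEY }}
    project: KXPRJ49GQYFQ5RR9KRTPWTRTC39YZ9W
    # Optional: disambiguate parallel builds (e.g., unit tests vs.
    # end-to-end tests) so they do not override each other.
    build-name: unit-tests
```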

See the action configuration documentation for details.

3.3. Using the HTTP-based API

To interact with Ketryx from other CI/CD platforms or local systems, you can use its HTTP-based API. API requests are authorized using the HTTP Authorization header

Authorization: Bearer KXTK_...

with the secret token from a Ketryx API key.

Note the following when using the API at /api/v1/builds:

  1. The Ketryx project ID must be passed via the project parameter.

  2. Either the version or the commitSha parameter must be set. If a commit SHA is given, any version whose release ref pattern (as configured in the project settings) resolves to the given commit is considered relevant for the build.

  3. Build artifacts (including test result files) should be uploaded via the API at /api/v1/build-artifacts first, which returns a file ID that can be passed to the /builds API. Artifacts can be associated with the overall build or with individual tests.

  4. Artifacts of type cucumber-json or junit-xml reported at the overall build level are automatically parsed by Ketryx and yield automated test executions.

  5. Test results can also be reported directly using the tests parameter, explicitly specifying a testedItem, result, title, artifacts, etc.

  6. Just like the build-name parameter in the GitHub Action (described above), a buildName can be specified to disambiguate several parallel builds that should not override each other.
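As a sketch of the request shape described in the list above, the following helper assembles the headers and JSON body for a POST to /api/v1/builds (the base URL and the example values are assumptions; the helper only builds the request, it does not send anything):

```python
import json

# Base URL is an assumption for illustration.
KETRYX_BUILDS_URL = "https://app.ketryx.com/api/v1/builds"

def build_request(project_id, commit_sha=None, version=None,
                  build_name=None, tests=None, api_token="KXTK_..."):
    """Assemble headers and JSON body for a POST to /api/v1/builds."""
    if not (commit_sha or version):
        # The API requires either version or commitSha to be set.
        raise ValueError("Either version or commitSha must be set")
    payload = {"project": project_id}
    if commit_sha:
        payload["commitSha"] = commit_sha
    if version:
        payload["version"] = version
    if build_name:
        # Disambiguates parallel builds, like build-name in the GitHub Action.
        payload["buildName"] = build_name
    if tests:
        # Each test explicitly specifies testedItem, result, title, etc.
        payload["tests"] = tests
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Content-Type": "application/json",
    }
    return headers, json.dumps(payload)

headers, body = build_request(
    "KXPRJ49GQYFQ5RR9KRTPWTRTC39YZ9W",
    commit_sha="0123abcd",
    build_name="unit-tests",
    tests=[{"testedItem": "SAMD-45", "result": "PASS",
            "title": "Mobile app login"}],
)
```

The headers and body can then be passed to any HTTP client (curl, urllib, etc.) to perform the actual request.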

See the build API documentation for details.

3.4. Associating Automated Tests with Configuration Items

Ketryx detects automated tests in various test file formats and associates them with tested configuration items based on annotations that mention the tested item's ID.

The details of how tests are annotated depend on the test file format, as described below.

The tested item can be a Test Case (which is typically the case for end-to-end tests that are predefined in Jira and approved) or a Software Item Spec (which is typically the case for unit tests that are not managed in Jira). If you want to enforce that all automated tests are associated with a predefined Test Case (e.g., so that you can approve the test steps before they are executed), you can activate the option Require automated tests to be associated with test cases in the project settings.

3.4.1. Cucumber

Use an annotation of the form @tests:ID or @implements:ID on a Scenario in Gherkin syntax.

Use the @implements tag only to reference an existing Jira Test Case item; in all other cases, use the @tests tag.

Feature: Mobile app

  @tests:SAMD-45
  Scenario: Test mobile app login
    Given User is on login screen
    Then User can log in successfully

Use the json reporter ("formatter") to generate a JSON report to send to Ketryx.
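For example, with cucumber-js the report might be generated as follows (the output path is arbitrary):

```
# Generate a Cucumber JSON report (output path is arbitrary)
cucumber-js --format json:cucumber-report.json
```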

3.4.2. JavaScript

In Jest, Mocha, and similar JavaScript test frameworks, mention the tested item ID in the test name itself:

describe('Mobile app login @tests:SAMD-45', () => {
    it('logs in user successfully', () => {
        // ...
    });
});

Generate a JUnit XML report to send to Ketryx.

3.4.3. Python

In pytest, use the record_property callback to store the tested item ID as additional metadata on a test case:

def test_mobile_app_login(record_property):
    record_property('tested-item-id', 'SAMD-45')
    # ...

Use the junitxml output format to generate a JUnit XML report to send to Ketryx.
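For example (the report path is arbitrary):

```
# Generate a JUnit XML report from pytest (output path is arbitrary)
pytest --junitxml=report.xml
```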

3.4.4. Java, Kotlin and Swift

Use the @tests:ID annotations in comments above test methods. Learn more about how to associate automated tests with configuration items in Java, Kotlin and Swift in the Git-based Configuration Items manual.

Generate a JUnit XML report to send to Ketryx.

3.4.5. JUnit XML

Most test frameworks in common programming languages (such as JavaScript and Python described above) support generating a JUnit XML report. Ketryx reads item IDs from <property> elements, attributes (tested-item-id or assertions), or annotations in test names. All the following examples would be recognized:

<testsuites>
  <testsuite name="Suite 1">
      <testcase name="Test 1">
          <properties>
              <property name="tested-item-id" value="SAMD-45" />
          </properties>
      </testcase>
      <testcase name="Test 2">
          <properties>
              <property name="assertions" value="SAMD-45" />
          </properties>
      </testcase>
      <testcase name="Test 3" tested-item-id="SAMD-45" />
      <testcase name="Test 4" assertions="SAMD-45" />
      <testcase name="Test 5 @tests:SAMD-45" />
  </testsuite>
</testsuites>

A name annotation at the level of the <testsuite> is also recognized, and associates all included test cases with the given element (unless they have a more specific association).
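For example, the following illustrative report would associate both Test 1 and Test 2 with SAMD-45 via the suite-level annotation:

```
<testsuites>
  <testsuite name="Mobile app login @tests:SAMD-45">
      <testcase name="Test 1" />
      <testcase name="Test 2" />
  </testsuite>
</testsuites>
```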

3.5. Reporting of Automated Tests

The Test management page and testing report documents show both manual Test Executions and automated test executions.

The detail page for each automated test shows what build it is associated with, the tested item (typically a Test Case or a Software Item Spec), the log output of the test, artifacts (e.g., images) uploaded with the test, and other relevant details.

Automated tests can also be reached from the detail page of each project build.

4. Automated Tests with Manual Review

If a Test Case has both automated tests associated with it and manual test executions, the manual test executions take precedence. This allows you to manually "override" an automated test result.

You can select Test Cases with automated test executions on the Test management page and choose Create executions for selection to create corresponding Test Execution items. Those Test Executions will inherit the Steps and Expected behavior from the Test Case, and the logs (as Observed behavior) and Test result from the automated test execution. Test Executions still need to be approved and put into a controlled state as usual; so this is a way to manually review and sign off on individual automated tests.

To require a manual review of all effective automated tests, activate the option Require a manual test execution for each effective automated test execution in the project settings under Test management. This will prevent automated test results from being considered directly for full traceability; instead, there has to be a manual Test Execution created through Create executions for selection for the effective automated test of each Test Case.


© 2024 Ketryx Corporation