WI-04 Test Case

Work Instruction for Test Case configuration items

1. Introduction

1.1. Purpose

This Work Instruction provides the tasks required as part of the Test Case (i.e., Test Case Specification) configuration item lifecycle.

1.2. Scope

This Work Instruction covers the complete Test Case lifecycle, from creation to obsolescence.

1.3. Records and evidence

Records for each Test Case are retained according to the records and retention policy. Test Cases are used to generate the following artifacts:

  • Change request verification report

  • Risk control matrix (risk control measures only)

  • Risk management file

  • Risk matrix

  • Test plan (with details)

  • Testing report

  • Traceability matrix

1.4. Responsibilities

As listed in the procedure description, each task in the Test Case item's lifecycle is completed by a member of one of the following approval groups. Where any organization member may perform the task, Anyone is listed.

  • Item Assignee: The person who authors the Test Case and is responsible for it. This organization member manages and completes the Test Case lifecycle activities. The Item Assignee can change over time.

  • R&D Leads: The person accountable for the Test Case. The R&D Lead verifies that the Test Case provides sufficient test coverage of the tested Requirement(s) and/or Software Item Specification(s). The R&D Lead is consulted to approve the Test Case.

  • Quality Managers: The person responsible for verifying that the Test Case is correctly documented. The Quality Manager is consulted to approve the Test Case.

2. Procedure description

2.1. Step 1: Log into Jira

Anyone

Log into your Jira organization, e.g., your-company.atlassian.net.

2.2. Step 2: Create

Anyone

  1. Define an appropriate title in the Summary field.

  2. Define an appropriate Assignee (the Item Assignee).

  3. Provide additional preliminary information in the extra fields. (All fields can still be edited later.)
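For teams that script issue creation instead of using the Jira UI, this step can be sketched against the Jira Cloud REST API (POST /rest/api/3/issue). The project key ("TC"), the issue type name ("Test Case"), and the account id below are illustrative assumptions; substitute the values configured in your own Jira organization.

```python
# Sketch: build the request body for creating a Test Case issue via the
# Jira Cloud REST API (POST /rest/api/3/issue). The project key, issue
# type name, and account id are assumptions, not prescribed values.

def build_create_payload(summary: str, assignee_account_id: str) -> dict:
    """Return the JSON body for a Jira 'create issue' request."""
    return {
        "fields": {
            "project": {"key": "TC"},            # assumed project key
            "issuetype": {"name": "Test Case"},  # assumed issue type name
            "summary": summary,
            "assignee": {"id": assignee_account_id},
        }
    }

payload = build_create_payload(
    "Login screen rejects bad password",
    "5b10a2844c20165700ede21g",  # hypothetical Atlassian account id
)
print(payload["fields"]["summary"])
```

In practice this body would be POSTed to https://your-company.atlassian.net/rest/api/3/issue with suitable authentication; as in the manual flow, all fields can still be edited later.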

2.3. Step 3: Navigate to the issue page

Anyone

Using the popover shown by Jira or through other means, navigate to the Jira page of the Test Case, e.g., your-company.atlassian.net/browse/TC-1.

2.4. Step 4: Change status to In Progress

Anyone

Change the issue status to In Progress using the issue status selector.
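If the status change is scripted, note that the Jira REST API addresses transitions by id rather than by status name: available transitions come from GET /rest/api/3/issue/{key}/transitions, and the chosen id goes into the POST body. The transition ids and names below are assumptions about the workflow, and the response is simulated.

```python
# Sketch: pick the workflow transition whose target status is "In Progress"
# from a (simulated) GET /rest/api/3/issue/{key}/transitions response, and
# build the body for POST .../transitions. Ids and names are assumed.

def find_transition_id(transitions: list, target_status: str):
    """Return the id of the transition leading to the given status name."""
    for t in transitions:
        if t["to"]["name"] == target_status:
            return t["id"]
    return None

# Simulated response payload; a real one comes from the Jira REST API.
transitions = [
    {"id": "11", "name": "Start work", "to": {"name": "In Progress"}},
    {"id": "31", "name": "Resolve", "to": {"name": "Resolved"}},
]

tid = find_transition_id(transitions, "In Progress")
body = {"transition": {"id": tid}}
print(body)
```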

2.5. Step 5: Draft/Author Test Case

Item Assignee

As needed, fill in the information in the fields of the Test Case.

Ensure that at least the following fields are filled out:

  • Summary (title)

  • Assignee

  • Description

  • Steps

  • Expected behavior

  • Introduced in version (unless the Test Case was introduced at the start of the project)

  • Tested items. These constitute the design input for this Test Case.

  • Test type. Is this a Verification or a Validation case? If it is the former, what kind of Verification?

The Test Case should be SMART (Specific, Measurable, Achievable, Relevant, and Testable). The Test Case should also:

  • be traceable to all other needed items, with clearly defined interfaces,

  • not conflict with another Test Case,

  • conform to any provided design input.

2.6. Step 6: Change status to Resolved (Ready for Review)

Anyone

Once the Test Case is completed and ready for design verification, change its status to Resolved.

2.7. Step 7: Review as Owner

Item Assignee

Review the Test Case to verify:

  • The Test Case is traceable to all other needed items, and all interfaces are defined.

  • The Test Case is Specific, Measurable, Achievable, Relevant, and Testable (SMART).

  • The design output (the Test Case and its siblings) conforms to the design input (the Test Case's Tested items).

  • Read through the Test Case and ensure the following fields are correctly filled out:

    • Steps: Are the steps clear, and are they comprehensive enough to arrive at the expected behavior?

    • Expected behavior: Is the expected behavior clear and does it sufficiently cover all the desired behaviors of the tested items?

    • Description: Is the description clear?

    • Tested items: Are the correct items linked, and is the list complete?

    • Introduced in version: Is it linked to the correct version?

    • Test type: Is this a Verification or a Validation case? If it is the former, what kind of Verification?

If the verification fails, reopen the ticket and, if needed, provide a comment on the reason it failed, then go to Step 5. If verification passes, approve the Test Case as seen in the screenshot below and continue to Step 10.

2.8. Step 8: Review as R&D Lead

R&D Lead

Review the Test Case to verify:

  • The Test Case is traceable to all other needed items, and all interfaces are defined.

  • The Test Case is Specific, Measurable, Achievable, Relevant, and Testable (SMART).

  • The design output (the Test Case and its siblings) conforms to the design input (the Test Case's Tested items).

  • Read through the Test Case and ensure the following fields are correctly filled out:

    • Steps: Are the steps clear, and are they comprehensive enough to arrive at the expected behavior?

    • Expected behavior: Is the expected behavior clear, and does it sufficiently cover all the desired behaviors of the tested items?

    • Description: Is the description clear?

    • Tested items: Are the correct items linked, and is the list complete?

    • Introduced in version: Is it linked to the correct version?

    • Test type: Is this a Verification or a Validation case? If it is the former, what kind of Verification?

If the verification fails, reopen the ticket and, if needed, provide a comment on the reason it failed, then go to Step 5. If verification passes, approve the Test Case as seen in the screenshot below and continue to Step 10.

2.9. Step 9: Review as Quality Manager

Quality Manager

Review the Test Case to verify:

  • The Test Case is traceable to all other needed items, and all interfaces are defined.

  • The Test Case is Specific, Measurable, Achievable, Relevant, and Testable (SMART).

  • The design output (the Test Case and its siblings) conforms to the design input (the Test Case's Tested items).

  • Read through the Test Case and ensure the following fields are correctly filled out:

    • Steps: Are the steps clear, and are they comprehensive enough to arrive at the expected behavior?

    • Expected behavior: Is the expected behavior clear, and does it sufficiently cover all the desired behaviors of the tested items?

    • Description: Is the description clear?

    • Tested items: Are the correct items linked, and is the list complete?

    • Introduced in version: Is it linked to the correct version?

    • Test type: Is this a Verification or a Validation case? If it is the former, what kind of Verification?

If the verification fails, reopen the ticket and, if needed, provide a comment on the reason it failed, then go to Step 5. If verification passes, approve the Test Case as seen in the screenshot below and continue to Step 10.

2.10. Step 10: Review as Quality Control

Quality Control

Review the Test Case to verify:

  • The Test Case is traceable to all other needed items, and all interfaces are defined.

  • The Test Case is Specific, Measurable, Achievable, Relevant, and Testable (SMART).

  • The design output (the Test Case and its siblings) conforms to the design input (the Test Case's Tested items).

  • Read through the Test Case and ensure the following fields are correctly filled out:

    • Steps: Are the steps clear, and are they comprehensive enough to arrive at the expected behavior?

    • Expected behavior: Is the expected behavior clear, and does it sufficiently cover all the desired behaviors of the tested items?

    • Description: Is the description clear?

    • Tested items: Are the correct items linked, and is the list complete?

    • Introduced in version: Is it linked to the correct version?

    • Test type: Is this a Verification or a Validation case? If it is the former, what kind of Verification?

If the verification fails, reopen the ticket and, if needed, provide a comment on the reason it failed, then go to Step 5. If verification passes, approve the Test Case as seen in the screenshot below and continue to Step 11.

2.11. Step 11: Transition to a controlled state

Ketryx

Only Ketryx can move a Test Case to a controlled and effective state by transitioning its status to Closed. Ketryx moves the Test Case to a Closed state after all approval rules have been passed, i.e., all required groups have approved the Test Case.

Ketryx automatically adds a comment to the Jira issue with a link to the effective controlled record in Ketryx.

2.12. Step 12: Change

Item Assignee

Following a Change Request (i.e., the issue needs to be modified), reopen the Test Case to create a new record, and go back to Step 4.

2.13. Step 13: Mark as obsolete

Item Assignee

To mark a Test Case as obsolete (i.e., no longer effective, starting with a given version),

  • reopen it for change (Step 12),

  • set the version in which it becomes obsolete (i.e., the first version in which it is no longer effective) in the Obsolete in version field,

  • resolve the issue (Step 6),

  • approve the item (Steps 7-10).
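The effectivity window defined by the Introduced in version and Obsolete in version fields can be sketched as a simple check. Version ordering is simplified here to a fixed release list, which is an illustrative assumption; Ketryx's own version comparison is authoritative.

```python
# Sketch: is a Test Case effective in a given version? "Obsolete in
# version" names the FIRST version in which the item is no longer
# effective. The ordered release list is an illustrative assumption.

RELEASES = ["1.0", "1.1", "2.0", "2.1"]  # assumed release order

def is_effective(version, introduced=None, obsolete=None):
    """True if the item is effective in `version`."""
    i = RELEASES.index(version)
    if introduced is not None and i < RELEASES.index(introduced):
        return False  # not yet introduced
    if obsolete is not None and i >= RELEASES.index(obsolete):
        return False  # obsolete from this version onward
    return True

print(is_effective("1.1", introduced="1.0", obsolete="2.0"))  # True
print(is_effective("2.0", introduced="1.0", obsolete="2.0"))  # False
```

An empty Introduced in version (introduced=None) matches the schema's rule that such an item is effective from the start of the project.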

3. Procedure flow diagram

4. Item schema

  • Description (rich text): The Test Case description should describe the purpose of the test. It should be clear what the test is trying to achieve.

  • Introduced in version (version reference): The first version this item is effective in. If empty, the item is considered effective from the start of the project.

  • Obsolete in version (version reference): The version the item becomes obsolete in, i.e., the first version in which this item is no longer effective.

  • Steps (rich text): The steps to follow to complete the test. The more detailed the instructions, the better. If any pre-requirements are necessary for the execution of a test, these should be explicitly mentioned.

  • Expected behavior (rich text): What the user should expect to happen on the screen once the test has been carried out. Being as clear as possible eliminates misunderstandings that might arise when the tester compares their observations to the described expectations.

  • Tested items (-> Requirement, Software Item Spec, Hardware Item Spec, Anomaly, CAPA, Change Request): The items this Test Case verifies or validates. It may be multiple items.

  • Test type: The type of the test. May be one of Verification, Verification (regression), Verification (unit), Verification (integration), Verification (system), or Validation.

    • Software Validation: The process of evaluating software during or at the end of the development process to determine whether it satisfies specified requirements (IEEE Std 610).

    • Software Verification: The process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase (IEEE Std 610).
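As an illustration only, the schema above can be modeled as a record type. The field names and the test-type enumeration mirror the list; the class names, defaults, and example values are hypothetical, and Ketryx/Jira remain the authoritative definition of the schema.

```python
# Sketch: the Test Case item schema as a Python dataclass. Names are
# illustrative; the Ketryx/Jira configuration defines the real schema.
from dataclasses import dataclass, field
from enum import Enum

class TestType(Enum):
    VERIFICATION = "Verification"
    VERIFICATION_REGRESSION = "Verification (regression)"
    VERIFICATION_UNIT = "Verification (unit)"
    VERIFICATION_INTEGRATION = "Verification (integration)"
    VERIFICATION_SYSTEM = "Verification (system)"
    VALIDATION = "Validation"

@dataclass
class TestCase:
    summary: str
    description: str
    steps: str
    expected_behavior: str
    test_type: TestType
    tested_items: list = field(default_factory=list)  # Requirement, Spec, ...
    introduced_in_version: str = None  # None = effective from project start
    obsolete_in_version: str = None    # first version no longer effective

tc = TestCase(
    summary="Login screen rejects bad password",
    description="Verify authentication failure handling.",
    steps="1. Open the login screen. 2. Enter a wrong password.",
    expected_behavior="An error message is shown; no session is created.",
    test_type=TestType.VERIFICATION_SYSTEM,
)
print(tc.test_type.value)
```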

5. Traceability to other configuration items

The following relations can be defined from a Test Case to other configuration items:

  • Test Case tests CAPA, Change Request, Hardware Item Spec, Requirement, Software Item Spec

The following relations can be defined from other configuration items to a Test Case:

  • Test Case is executed by Test Execution

  • Test Case risk-controls Risk


© 2024 Ketryx Corporation