Info |
---|
These flows are being developed in late July 2020, and are intended for use with the SAF and PLACES Jira products (i.e. the GPS Mobile App and Safe Places). This is an early draft of the processes we hope to follow. Feedback welcome. |
Contents:
...
New Features workflow
New Features are tracked in Jira by either Stories or Epics (with just a couple of minor differences between them in terms of process).
Once a feature is ready for test, the Developer should:
...
Triage QA will analyze the ticket, determine the required testing, and create a number of “QA:” subtasks (details below), which are then passed to Testers in state READY FOR TEST
...
Assign themselves to the “QA:” subtask ticket
Once you begin to work on the new feature, transition status to IN TEST (a scripted example of this transition follows the exceptions below)
Update the ticket with details of the testing you completed for the new feature.
Once testing is complete, debrief with the test lead (see “debriefs” below).
After the debrief is complete, transition status to DONE
Exceptions:
If you find bugs in the new feature, raise them in Jira as described here: How to raise bugs found in Testing
If you find a blocking bug that means you cannot continue with the testing, transition status to BLOCKED and link the Jira tickets to show that the bug blocks the testing.
If you get stuck on a ticket and cannot progress for any other reason, assign the ticket to “Triage QA”, where one of the leads will pick it up and review it
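The status transitions above are normally made directly in the Jira UI. For anyone who prefers to script them, here is a minimal sketch of moving a “QA:” subtask to IN TEST via Jira’s standard REST transitions endpoint. The issue key, environment variables, and transition name are illustrative assumptions, not part of our actual setup.

```python
import os
import requests

JIRA_URL = os.environ["JIRA_URL"]        # e.g. https://example.atlassian.net (assumption)
AUTH = (os.environ["JIRA_EMAIL"], os.environ["JIRA_API_TOKEN"])
ISSUE_KEY = "SAF-123"                    # hypothetical "QA:" subtask key

# Ask Jira which transitions are currently available for this issue.
url = f"{JIRA_URL}/rest/api/2/issue/{ISSUE_KEY}/transitions"
transitions = requests.get(url, auth=AUTH).json()["transitions"]

# Find the transition that moves the subtask into the IN TEST status.
in_test = next((t for t in transitions if t["name"].lower() == "in test"), None)
if in_test is None:
    raise SystemExit(f"No 'In Test' transition available for {ISSUE_KEY}")

# Perform the transition.
resp = requests.post(url, json={"transition": {"id": in_test["id"]}}, auth=AUTH)
resp.raise_for_status()
print(f"{ISSUE_KEY} moved to IN TEST")
```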
...
We run debriefs for “QA:” subtasks. We don’t typically run debriefs for bug fixes, but can do so at the tester’s or test lead’s discretion.
The purpose of a debrief is to review and agree on:
What has been learned in testing, and whether any further testing should be planned
Whether the documentation of the testing done is adequate
Whether any of the bugs found should block us from considering this testing to be “DONE” (typically because the bugs were major, or because significant further testing will be needed once the bugs are fixed).
Any testability issues, or impediments, that we should be looking to resolve
A debrief can take several forms, depending on the experience of the tester and the complexity of what was tested. This could be:
a face-to-face video conference, reviewing the testing & bugs in detail
or, in simpler cases, a brief Slack conversation.
The debrief must reach a clear position on whether or not the required testing can be considered “DONE”. If further testing is needed, it must also agree whether that testing will be done under this ticket, or whether a new ticket will be raised to cover it.
...
Assign the tickets they are working on to themselves
If they don’t have any work assigned, review the work in the “Test Queue”, assign it to themselves, and do it! Where set, use ticket priority as a hint for the order in which to tackle things (see the query sketch after this list).
Test queue is here:
If the tester cannot complete an assigned ticket in a reasonable timeframe, and that work could reasonably be assigned to someone else, pass that work back to “Test Queue” so that someone else can pick it up
If a tester finds work assigned to “Test Queue” that is not clearly enough defined to progress, they should assign it back to “Triage QA” with a comment explaining the issue
For “QA:” subtasks, always debrief with a test lead before marking the task as DONE
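For reference, here is a minimal sketch of pulling the Test Queue programmatically rather than from the board, using Jira’s standard REST search API. The JQL (a shared “Test Queue” assignee and a “Ready for Test” status), the environment variables, and the field names are assumptions for illustration, not a description of our actual configuration.

```python
import os
import requests

JIRA_URL = os.environ["JIRA_URL"]        # e.g. https://example.atlassian.net (assumption)
AUTH = (os.environ["JIRA_EMAIL"], os.environ["JIRA_API_TOKEN"])

# Assumption: queued work sits with a shared "Test Queue" assignee and is ready for test.
jql = 'assignee = "Test Queue" AND status = "Ready for Test" ORDER BY priority DESC'

resp = requests.get(
    f"{JIRA_URL}/rest/api/2/search",
    params={"jql": jql, "fields": "summary,priority"},
    auth=AUTH,
)
resp.raise_for_status()

# Print the queue, highest priority first, so a tester can pick the next ticket.
for issue in resp.json()["issues"]:
    fields = issue["fields"]
    priority = fields["priority"]["name"] if fields.get("priority") else "No priority"
    print(f'{issue["key"]}  [{priority}]  {fields["summary"]}')
```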
...
If any other experienced testers would like to contribute as test leads, please do let us know.
See also the related article with guidance for Test Leads:
Flows for Testing Work in Jira - Additional Info for Test Leads