docs: Adding a testplan folder with a README and TEMPLATE for testplan (#4306)
Added a testplan folder in docs, with a README and a TEMPLATE for test plans. We want community involvement with test plans: a test plan written as early as possible, before implementation starts, works like a checklist, so as soon as we finish the implementation we can compare it to the test plan to check for any bugs. Even before implementation is finished, a developer who reads the test plan beforehand can prevent most bugs, which is why we believe the earlier we write a test plan, the more bugs we can prevent. Co-authored-by: Igor <38665104+ihaid@users.noreply.github.com>
This commit is contained in:
parent ed9bde8f69
commit 596124f0f6

166  docs/testplan/README.md  (new file)

@@ -0,0 +1,166 @@

# How To Create a Basic Testplan!

![](https://github.com/storj/storj/raw/main/resources/logo.png)

### Why write Testplans?

We believe test plans are important. A test plan written as early as possible, before implementation starts, works like a checklist: as soon as we finish the implementation, we can compare it to the test plan to check for any bugs. Even before implementation is finished, a developer who reads the test plan beforehand can prevent most bugs, which is why we believe the earlier we write a test plan, the more bugs we can prevent!

### Hi! Here is a guide on how to write a basic **testplan**. It will go over how to:

- figure out a **_test name_**

- group similar test cases into something that we can turn into a task (**_test scenarios_**)

- add a **_description_** of what the test does

- add **_comments_** related to the test, e.g. links to additional information such as blueprints or design drafts (even if they are private), or comments requesting a feature that relates to the test!

- take all this info and place it into a neat **testplan template**!
## Test Name

#### Let's say we have these tests in mind for the multinode dashboard UI

1. Seeing if creating a new node works with correct information

2. Seeing if creating a new node works with incorrect information

3. Seeing if creating a new node works with an already existing node with the same information

<!-- end of the list -->

Now we can simplify these tests into concise test names, so 1, 2 and 3 become:

1. Add new node with correct information

2. Add new node with incorrect information

3. Add new node with existing node information

## Test Scenarios

Using the aforementioned tests, we have these test cases in mind for the multinode dashboard UI:

- Add new node with correct information

- Add new node with incorrect information

- Add new node with existing node information

Here we can see that we are able to group these test cases into a task, or **test scenario**. Since each of these tests falls under the **add a new node button**, we can label them under the scenario **New Node Button Functionality**. We would also add a **number ID** to make the test plan easier to read, as sketched below.
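
For illustration only (the exact layout is up to the author), grouping the three test cases under a single numbered scenario might look like this:

```
1. New Node Button Functionality
   1.1 Add new node with correct information
   1.2 Add new node with incorrect information
   1.3 Add new node with existing node information
```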
## Test Description

Now, building on the test cases, we can describe what should happen in each case (the **EXPECTED OUTCOME**), e.g. using a conditional if-then statement. So we can take our previous test cases

1. Add new node with correct information

2. Add new node with incorrect information

3. Add new node with existing node information

<!-- end of the list -->

**_and add a conditional statement so we know how to test each one and what to look for to consider the test passing or failing:_**


1. If a user clicks on the add new node button and inputs **correct node** info, then the user **should be able to see the node ID, disk space used, disk space left, bandwidth used, earned currency, version and status of said node**

2. If a user clicks on the add new node button and inputs **incorrect node** info, then the user **should not be able to see the node ID, disk space used, disk space left, bandwidth used, earned currency, version or status of said node, and should instead see an error message stating the node information was incorrect**

3. If a user clicks on the add new node button and inputs **an existing node on their dashboard**, then the user **should simply receive an error message stating their node is already on the dashboard**

<!-- end of the list -->

> So if we conduct test 2 and see that **we aren't able to view the node ID, disk space used, disk space left, bandwidth used, earned currency, version or status of said node on the dashboard**, but **_no_** error message stating the node information was incorrect is shown, then the test failed

## Comments

In this section of the testplan, users can add their own **comments: why the tests were added, improvements needed, and what the actual outcomes were** whenever they differ from the expected outcomes. For example, going back to **test case 2**: inputting incorrect node information correctly prevented us from seeing what the multinode dashboard usually shows, but we still weren't shown an **error message stating the node information was incorrect**. In this case the user can comment that, although they could not see information like the node ID, disk space used, etc., they were supposed to see the error message stating the node information was incorrect but did not.
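
As a rough sketch (not a required format), the comment for test case 2 might be recorded along these lines:

```
Expected: an error message stating the node information was incorrect
Actual:   node details were hidden as expected, but no error message appeared
Comment:  error message is missing; raising this as a follow-up issue
```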
## Testplan Template and How to Submit Testplan

If you have followed along this far, you should have a pretty awesome high-level testplan! Once you have replaced each category under each column in the template below, the next step

```
| Test Scenario   | Test Case     | Test Description  | Comments      |
| :-------------- | :------------ | :---------------- | :------------ |
| Test Scenario 1 | Test Case 1.1 | TC1.1 Description | TC1.1 Comment |
|                 | Test Case 1.2 | TC1.2 Description | TC1.2 Comment |
| Test Scenario 2 | Test Case 2.1 | TC2.1 Description | TC2.1 Comment |
|                 | Test Case 2.2 | TC2.2 Description | TC2.2 Comment |
```
is to head to the [testplan repo](https://github.com/storj/storj/tree/main/docs/testplan) and open a pull request for said feature [using our template here](TEMPLATE.md)! Give the pull request an appropriate title containing the keywords **Test Plan**, and add the **Test Plan** label.
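
As an illustration only (these rows are hypothetical and simply reuse the multinode dashboard examples from this guide), a filled-in table might look roughly like this:

```
| Test Scenario                    | Test Case                                       | Test Description                                                                                     | Comments                       |
| :------------------------------- | :---------------------------------------------- | :---------------------------------------------------------------------------------------------------- | :----------------------------- |
| 1. New Node Button Functionality | 1.1 Add new node with correct information        | If correct node info is entered, then node ID, disk space, bandwidth, earned currency, version and status are shown | -                              |
|                                  | 1.2 Add new node with incorrect information      | If incorrect node info is entered, then an error message states the node information was incorrect     | Record the actual outcome here |
|                                  | 1.3 Add new node with existing node information  | If an already-added node is entered, then an error message states the node is already on the dashboard | -                              |
```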
## Support
If you have any questions or suggestions please reach out to us on [our community forum](https://forum.storj.io/) or file a support ticket at [https://support.storj.io](https://support.storj.io/).

73  docs/testplan/TEMPLATE.md  (new file)

@@ -0,0 +1,73 @@

# Storj Basic Testplan Template
## [Title]

> Here we give the title for the Testplan, e.g. if the feature is a **"New Button"**, the title could be **New Button Testplan**

## [Background]

> An introduction to the Testplan, i.e. what the Testplan is going to cover, or the feature to be implemented, along with a link to the design document or Figma design (even if it is private)

## [Test Scenarios]

> What task or feature are we trying to test? Label each scenario with an ID for easy tracking, e.g. 1. Example Feature

## [Test Cases]

> Determine test cases for said scenario, e.g. positive and negative testing

> Label each test case with an ID for easy tracking, e.g. Test Case X 1.1,
> Test Case Y 1.2, Test Case Z 1.3, etc.

## [Description]

> Here users describe what happens in the test cases, e.g. what the user does in each test and what the user should expect as the end result

> This can be written as a conditional statement, e.g. if the user clicks this button, then a modal should open

## [Comments]

> In this section users can add their own comments, e.g. improvements needed, actual outcomes for said tests, feature requests, etc.

## [Test Plan Table]

> In this section users should take the input from the previous four sections and arrange it into a table

```
| Test Scenario      | Test Case     | Test Description  | Comments      |
| :----------------- | :------------ | :---------------- | :------------ |
| 1. Test Scenario 1 | Test Case 1.1 | TC1.1 Description | TC1.1 Comment |
|                    | Test Case 1.2 | TC1.2 Description | TC1.2 Comment |
| 2. Test Scenario 2 | Test Case 2.1 | TC2.1 Description | TC2.1 Comment |
|                    | Test Case 2.2 | TC2.2 Description | TC2.2 Comment |
```
## [Open Issues]

> In this section, separate from the comments, any issues from the design document can be brought up. Any questions or issues that could help the test plan in the future should be brought up as well.