Attention to detail doesn't just mean being aware of every little nit-picky speck of your brilliant piece of artistry. Attention to detail means understanding how each detail relates to each other detail. Your program isn't a list of discrete bullet points. Your program is all of the interactions of the entire set of points taken as a whole.
You may remember me criticizing the design of testlink's XML-RPC API a few posts ago. For this example I'm going to use testlink again. This time I'm going after the user experience of creating a test plan.
The process of creating a test plan within testlink is meant to follow a specific sequence of steps:
- Create any new test cases that are needed
- Create the platforms the tests are to run on
- Create the test plan
- Assign the platforms to be tested to the test plan
- Assign the test cases to be run to the test plan
- You're done!
So, pretty straightforward. Here's what happens behind the curtain (minus a lot of irrelevant details, mainly a ton of housekeeping work with the table nodes_hierarchy), with a sketch in code after the list:
- Fill the database tables testcases and testsuites to create a tree of test suites and cases
- Fill the database table platforms
- Fill the database table testplans
- Fill a join table associating platforms with testplans
- Fill a join table associating pairs of platforms and testcases with testplans
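To make the ordering concrete, here's a minimal sketch of that happy path in Python. The table and column names are mine, not testlink's (the real schema, nodes_hierarchy housekeeping included, is far more involved); the only point is that the platform rows exist before the case/platform pairs get written.

```python
# Toy in-memory model of the tables involved; all names are hypothetical.
test_cases = []            # the test case / test suite tree, flattened
platforms = []             # the platforms table
test_plans = []            # the testplans table
plan_platforms = set()     # join table: (plan, platform)
plan_case_runs = set()     # join table: (plan, platform, test case)

def create_plan_in_intended_order(plan, cases, plats):
    test_cases.extend(cases)
    platforms.extend(plats)
    test_plans.append(plan)
    for p in plats:                      # platforms go in first...
        plan_platforms.add((plan, p))
    for p in plats:                      # ...so every case can be paired
        for c in cases:                  # with every platform
            plan_case_runs.add((plan, p, c))

create_plan_in_intended_order("v1.0", ["login", "logout"], ["linux", "windows"])
print(sorted(plan_case_runs))   # 4 rows: 2 cases x 2 platforms
```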
Still straightforward, right? Except the UI of testlink doesn't require users to associate any platforms with a test plan before assigning test cases. This is reasonable, since a lot of projects aren't multiplatform. Even if a project is multiplatform, there's no obvious reason test cases can't be assigned first if mostly the same tests will run regardless of platform. At least, that's how it looks from the information the UI presents.
This is what happens if test cases are associated first:
- Fill the database tables testcases and testsuites
- Fill the database table platforms
- Fill the database table testplans
- Fill a join table associating testcases with testplans
- Fill a join table associating platforms with testplans
The problem lies in the fact that the second-to-last step doesn't associate the test cases in the test plan with platforms, since that information isn't available yet. In most cases the final step doesn't correct the error, leaving testlink in a messed-up state that a regular user can't easily fix.
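Replaying the same toy model in the order the UI permits makes the hole visible. The names are hypothetical again, and I'm using None to stand in for however the real schema records a case row that has no platform to pair with:

```python
plan_platforms = set()     # join table: (plan, platform)
plan_case_runs = set()     # join table: (plan, platform, test case)

def assign_cases(plan, cases):
    plats = [p for (pl, p) in plan_platforms if pl == plan]
    if not plats:
        plats = [None]     # no platforms yet: rows are written unpaired
    for p in plats:
        for c in cases:
            plan_case_runs.add((plan, p, c))

def assign_platforms(plan, plats):
    for p in plats:
        plan_platforms.add((plan, p))   # never revisits plan_case_runs

assign_cases("v1.0", ["login", "logout"])       # cases first...
assign_platforms("v1.0", ["linux", "windows"])  # ...platforms second
print(plan_case_runs)
# {('v1.0', None, 'login'), ('v1.0', None, 'logout')} -- the cases are in
# the plan but paired with no platform: the messed-up state.
```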
Now, in testlink's defence, there is an attempt to detect the problem condition, but it requires the user to notice an easily ignored "critic" message and follow the instructions it provides: first add a single platform to the test plan, save, then add the others. Assuming the user follows the instructions, they'll have their existing test cases paired up with that one platform. All of the other platforms added to the test plan need to have their test cases flagged manually. For most real-world applications this is highly tedious.
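As far as I can tell, those recovery instructions boil down to something like the following sketch (same hypothetical model): the first platform added to a plan inherits the unpaired test case rows, and every platform added after that starts empty.

```python
plan_platforms = set()
plan_case_runs = {("v1.0", None, "login"), ("v1.0", None, "logout")}  # bad state

def add_platform_with_recovery(plan, platform):
    first = not any(pl == plan for (pl, _) in plan_platforms)
    plan_platforms.add((plan, platform))
    if first:
        # re-point the unpaired rows at the newly added platform
        unpaired = {r for r in plan_case_runs if r[0] == plan and r[1] is None}
        for (pl, _, c) in unpaired:
            plan_case_runs.discard((pl, None, c))
            plan_case_runs.add((pl, platform, c))

add_platform_with_recovery("v1.0", "linux")    # first platform: inherits both cases
add_platform_with_recovery("v1.0", "windows")  # later platform: gets nothing
print(plan_case_runs)  # windows has no cases; they must be flagged by hand
```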
My first change is to automatically associate the existing test cases with any platform added to the test plan. My assumption is that if you're targeting multiple platforms, you want the majority of your features to work the same on most platforms, with a few exceptions. Adding those exceptions would be a lot less tedious than adding everything.
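In the same toy model, my change looks roughly like this: any platform added to a plan automatically inherits every test case already in it, cleaning up any unpaired rows along the way.

```python
plan_platforms = set()
plan_case_runs = {("v1.0", None, "login"), ("v1.0", None, "logout")}  # bad state

def add_platform_auto_associate(plan, platform):
    plan_platforms.add((plan, platform))
    cases = {c for (pl, _, c) in plan_case_runs if pl == plan}
    for c in cases:
        plan_case_runs.discard((plan, None, c))   # drop the unpaired row, if any
        plan_case_runs.add((plan, platform, c))

add_platform_auto_associate("v1.0", "linux")
add_platform_auto_associate("v1.0", "windows")
print(sorted(plan_case_runs))   # every added platform paired with every case
```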
Now, there's a problem with my approach which isn't a problem in the original. What happens if someone makes a test plan with no platforms, runs it, then decides to port their software to multiple platforms, and reflects this by adding all the supported platforms to their test plan in one go?
Feel free to comment with what you think the logic error in my fix is (there is one). Hint: the instructions provided when testlink detects the problem (add one platform, which associates with the existing test cases, then all the others) won't run into the logic error in my approach.