One of the most exciting things to me in Visual Studio 2010 ALM is the elimination of silos around development, project management, and quality assurance. In previous versions, these roles and activities were isolated and disconnected, with little traceability between them.
In Visual Studio 2010 ALM these silos are removed, and there is now traceability across the developer, project management, and quality assurance roles thanks to the emphasis on testing in Visual Studio 2010. The introduction of Microsoft Test Manager, included with Visual Studio 2010 Ultimate and Visual Studio 2010 Test Professional, for creating and managing Test Plans and manually running Test Cases has filled a much-needed gap in the ALM space.
To visualize the traceability, we have created the Visual Studio 2010 ALM Traceability Matrix to show the relationships between the major work items/artifacts in Visual Studio 2010 ALM. The matrix could include additional links between these items, but we have not included every possible combination for the sake of readability. I have found that this matrix helps people relate the concepts to their own environment and start seeing the benefits of having all of this information centralized in Visual Studio 2010 ALM and Team Foundation Server (TFS) 2010. Below the matrix are some examples of questions that TFS can answer about the traceability between these items. The data warehouse in TFS 2010 can be used to answer many more questions for every level of your organization.
User Stories (1)
How many hours of remaining work are left for this User Story?
Who are the developers working on this User Story?
Is the User Story covered by test cases?
Are the tests passing for the User Story?
Is the User Story done?
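As a sketch of how these questions map onto TFS, the first two can be expressed as a Work Item Query Language (WIQL) link query that returns the Tasks under a User Story; the User Story ID of 42 here is a hypothetical example. Summing the Remaining Work column of the results gives the hours left, and the Assigned To column lists the developers.

```sql
SELECT [System.Id], [System.Title], [System.AssignedTo], [Microsoft.VSTS.Scheduling.RemainingWork]
FROM WorkItemLinks
WHERE ([Source].[System.WorkItemType] = 'User Story' AND [Source].[System.Id] = 42)
  AND ([System.Links.LinkType] = 'System.LinkTypes.Hierarchy-Forward')
  AND ([Target].[System.WorkItemType] = 'Task')
MODE (MustContain)
```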
Tasks (2)
What bugs have been fixed for a User Story?
Is the task complete so that the test case can be moved to Ready?
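The bug question above can be answered with a similar WIQL link query from the User Story to its linked Bugs (again using a hypothetical User Story ID of 42; in the Agile process template, fixed bugs show up as Resolved or Closed):

```sql
SELECT [System.Id], [System.Title], [System.State]
FROM WorkItemLinks
WHERE ([Source].[System.WorkItemType] = 'User Story' AND [Source].[System.Id] = 42)
  AND ([Target].[System.WorkItemType] = 'Bug')
MODE (MustContain)
```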
Test Plan (3)
What stories are in a Test Plan/Iteration?
How many automated tests are in the Test Plan?
How many tests are passing in this Iteration/Release/Test Plan compared to the previous one?
How many bugs were fixed?
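For iteration-scoped questions like the last one, a flat WIQL query over the iteration path is one approach; the project and iteration names here are hypothetical:

```sql
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.WorkItemType] = 'Bug'
  AND [System.State] = 'Closed'
  AND [System.IterationPath] UNDER 'MyProject\Iteration 2'
```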
Test Suites (4)
What groups of test cases exist for the User Story?
Test Cases (5)
Are all of the tests passing for a particular Iteration/Test Plan?
For how many iterations has this test been passing?
Automated Tests (6)
How many tests are automated for a User Story or Iteration/Test Plan?
Are there any regression tests failing?
What is the test coverage for User Stories?
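Test Cases in TFS 2010 carry an Automation Status field, so counting the automated tests for an Iteration can be sketched as a WIQL query like the following (the project and iteration names are hypothetical):

```sql
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.WorkItemType] = 'Test Case'
  AND [Microsoft.VSTS.TCM.AutomationStatus] = 'Automated'
  AND [System.IterationPath] UNDER 'MyProject\Iteration 2'
```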
Changesets (7)
What tests are impacted by this check-in?
What is the User Story and Test Plan for this changeset?
Has this changeset been released?
Builds (8)
What changesets are included in this build?
What test cases are impacted by the code changes in this build?
What build is being used to run the tests against?
What User Stories and/or Test Cases have been tested by this build?
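When a build resolves a bug, TFS stamps the work item's Integrated In Build field, so one way to sketch "what bugs were fixed in this build" in WIQL is the query below (the build name is a hypothetical example):

```sql
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.WorkItemType] = 'Bug'
  AND [System.State] = 'Resolved'
  AND [Microsoft.VSTS.Build.IntegrationBuild] = 'MyProject_20100401.1'
```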
What kinds of questions come to mind for your organization around these items? Send me your thoughts or questions at email@example.com