Sep 21, 2012

I recently took part in a meeting where we discussed how to measure the success of our Agile adoption – how do we know we are improving, and by how much? There are different ways to answer these questions, but overall there was a desire to establish a set of factors to assess each team against – a check-list of sorts.

I have some misgivings about check-lists because they can give the impression of a one-size-fits-all approach, when improving agility should be tailored to each team and organisation. On the other hand, there are factors which most Scrum/Agile practitioners would agree are an important part of the Agile recipe, so there should be some validity to a standardised list. For me, a check-list can work, as long as “shortcomings” are seen as areas to explore and potentially improve, not boxes to tick by just going through the motions to satisfy the next periodic assessment.

Our programme manager suggested the “radar” style of chart as an accessible way to represent a team’s progress with Agile, and indeed this is a very concise way to represent scores on multiple related scales. They look like this:

A to E are the different scales you are assessing against (e.g. requirements management, Scrum adoption) and the distance from the middle on each axis represents the score for that scale. Perfect scores on every scale would result in a completely shaded chart, while the shape of a less-than-perfect set of scores gives a striking illustration of where the highs and lows are.
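
If you want to produce such a chart yourself, here is a minimal sketch using Python and matplotlib; the scale names and scores below are placeholders rather than a real assessment:

```python
# A minimal sketch of a radar ("spider") chart with matplotlib.
# The scale names and scores are placeholders, not real data.
import numpy as np
import matplotlib.pyplot as plt

scales = ["A", "B", "C", "D", "E"]
scores = [7, 4, 9, 5, 6]  # one score per scale, out of 10
angles = np.linspace(0, 2 * np.pi, len(scales), endpoint=False).tolist()

# Close the polygon by repeating the first point at the end.
scores_closed = scores + scores[:1]
angles_closed = angles + angles[:1]

ax = plt.subplot(111, polar=True)
ax.plot(angles_closed, scores_closed)
ax.fill(angles_closed, scores_closed, alpha=0.25)  # shade the covered area
ax.set_xticks(angles)
ax.set_xticklabels(scales)
ax.set_ylim(0, 10)  # a perfect set of scores shades the whole chart
plt.show()
```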

So how can you distil the big confusing world of methodologies, techniques and practices that sit under the Agile umbrella into just those which should apply to almost all teams, regardless of technology, then refine them into a handful of scores that can be displayed on a simple chart? It’s a challenge that could be debated endlessly and met in many different ways, but I have given it a stab. The questions reflect my personal preferences, gripes, and a fondness for Scrum, but you may find them useful – even if only as inspiration (or provocation!). The questions are written to be objective and quickly answerable: true scores one point, false scores zero, and partial fulfilment scores half a point. The objectivity keeps the test quick, with the downside that the questions are not particularly nuanced – more along the lines of “do you do X?”
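
To make the arithmetic concrete, here is a small sketch of that scoring rule in Python; the answers are invented for illustration:

```python
# A sketch of the scoring rule: true scores one point, false zero,
# and partial fulfilment half a point. The answers below are invented.
def category_score(answers):
    points = {True: 1.0, False: 0.0, "partial": 0.5}
    return sum(points[a] for a in answers)

# Ten questions per category, so a perfect category scores 10.
project_management = [True, True, "partial", False, True,
                      True, "partial", True, False, True]
print(category_score(project_management))  # 7.0
```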

Here goes:

Project Management

- The team works in consecutive fixed-length sprints
- The Product Owner role is fulfilled well
- The ScrumMaster role is fulfilled well
- The ScrumMaster is certified
- The team has time-boxed, structured Daily Scrums
- The team has time-boxed, structured planning sessions for each sprint
- The team presents done work in time-boxed, structured review meetings each sprint, including relevant stakeholders
- The team has time-boxed, structured retrospectives each sprint which result in improvement actions being taken
- A sprint burn-down chart is visible in the team’s area
- A release burn-down chart is made visible in the team’s area

Requirements Management

- Every Product Backlog Item (PBI) has a user story covering who, what and why in one sentence
- PBIs have acceptance criteria before development begins
- The team has a Definition of Ready and only includes PBIs in a sprint if they meet it
- The team has a Definition of Done and only considers PBIs complete when they meet it
- The team has a visible task board
- The team has a predictable velocity (story points delivered per sprint)
- PBIs are detailed appropriately given their position in the Product Backlog
- PBIs are sized with appropriate accuracy given their position in the Product Backlog
- The Product Backlog is prioritised
- Through regular grooming/refinement the Product Backlog remains correctly detailed, sized, and prioritised

Build Automation

- Nightly builds are automated
- A build runs on each check-in to source control
- Unit tests run as part of the build process
- Build reports are easily accessible to the whole team
- Team members can easily configure alerts to be sent on build failures
- Deployment is automated to a testing environment
- Functional tests run as part of the build process
- UI tests run as part of the build process
- Code metrics are compiled as part of the build process
- Build reports show the requirements implemented and bugs fixed since the last build

Technical Practices

- The team are competent at TDD and use it appropriately
- Pairing is used appropriately
- The team can recognise code smells and resolve them by refactoring
- Team members are willing and able to generalise e.g. developers can test
- The team understands and applies design patterns e.g. MVC, Gang of Four
- Source control is well structured with a suitable branching strategy, and merges are handled effectively
- Code metrics are analysed and appropriate investigations and actions taken
- The team uses a real-time code analysis tool and resolves its warnings
- Non-code application components (database, configuration files etc.) are treated as source code e.g. refactored and automatically deployed
- Design is allowed to emerge by implementing the simplest design possible to meet current requirements

Business Integration

- The business understands how the Scrum framework relates to them
- The physical environment meets the team’s needs
- The team’s current work and project progress is transparently communicated to the business
- The team has the financial resources it needs to improve agility
- Project dependencies are identified, tracked, and transparent to all
- Contributions from outside development needed to meet the Definition of Done are identified and aligned
- Individual performance assessments, objectives and incentives support the team’s agility
- Reporting and governance requirements support the team’s agility
- The business are very satisfied with the performance of the development team, including the frequency, predictability and transparency with which value is delivered
- Users and/or customers are very satisfied with the product delivered
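
Tying the two together, here is one possible way to hold the answers and produce the five scores for the radar chart; the category names come from the lists above, while the answers are invented:

```python
# Hypothetical wiring of the checklist to the chart: answers per category
# (invented here), points summed per the scoring rule above, and the five
# totals ready to feed into radar-plotting code like the earlier sketch.
POINTS = {True: 1.0, False: 0.0, "partial": 0.5}

assessment = {
    "Project Management":      [True] * 8 + ["partial", False],
    "Requirements Management": [True, "partial", True, True, False] * 2,
    "Build Automation":        [True] * 6 + [False] * 4,
    "Technical Practices":     ["partial"] * 10,
    "Business Integration":    [True, False] * 5,
}

scores = {name: sum(POINTS[a] for a in answers)
          for name, answers in assessment.items()}
for name, score in scores.items():
    print(f"{name}: {score}/10")
```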


If anyone wants to modify or improve this, or create a variant, I’d be happy to collaborate on it.

Footnote: Mike Cohn’s excellent Succeeding With Agile covers a number of Agile assessments, including: Comparative Agility, Sidky Agile Measurement Index (PDF), Agile Evaluation Framework (PDF) and the Shodan Adherence Survey.