Mastering the Requirements Process

I've just received a copy of Addison-Wesley's "Mastering the Requirements Process" 2nd Edition to review over the next few weeks. The blurb says:

Mastering the Requirements Process, Second Edition, sets out an industry-proven process for gathering and verifying requirements with an eye toward today's agile development environments. In this total update of the bestselling guide, the authors show how to discover precisely what the customer wants and needs while doing the minimum requirements work according to the project's level of agility.

I'll be reading it over the next couple of weeks and posting a review after that. Please feel free to post any specific questions and I'll try to answer them in the review.

Re: Mastering the Requirements Process

The book reminded me that I was recently thinking about that perpetual issue of testing non-functional requirements. Maybe the book will hold the answer, but my thinking went like this:

It's relatively easy to get lists of categories of NFRs (usability, latency, security, etc.). To a lesser extent, I can find examples of how NFRs can be specified (e.g. '95% of users with no prior experience of the system must be able to create a new case in less than 4 minutes'). What seems hardest to track down is examples of exactly how to verify that NFRs have been met. Do we really lock 100 novice users in a room and give them a stopwatch each, or are there more elegant (and cheaper) ways of testing NFRs?

This presentation, especially slide 15, got me thinking about identifying proxies for measuring fitness criteria.

Taking the example of getting sign-off on usability, a number of techniques (patterns?) spring to mind:

1) The Sledgehammer Pattern
Specify the requirement as '95% of users with no prior experience of the system must be able to create a new case in less than 4 minutes' and actually use a representative sample of users, gather the stats and get a pass/fail result.
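As a sketch of how the pass/fail decision for this pattern reduces to a simple proportion test (the threshold values and the trial timings below are invented for illustration, not from any real study):

```python
# Illustrative check for the fitness criterion:
# "95% of novice users must create a new case in under 4 minutes".
# TARGET_SECONDS and REQUIRED_PROPORTION come from the stated requirement;
# the trial data is made up for the example.

TARGET_SECONDS = 4 * 60
REQUIRED_PROPORTION = 0.95

def meets_criterion(timings_seconds):
    """Return True if enough users beat the target time."""
    within_target = sum(1 for t in timings_seconds if t < TARGET_SECONDS)
    return within_target / len(timings_seconds) >= REQUIRED_PROPORTION

# Hypothetical trial: 20 novice users, task completion times in seconds.
# Exactly one user (300s) misses the 240s target, so 19/20 = 0.95 passes.
trial = [150, 200, 180, 235, 120, 90, 300, 210, 175, 160,
         140, 130, 220, 225, 110, 95, 205, 185, 170, 230]
print(meets_criterion(trial))  # True
```

With a sample this small the result is sensitive to one or two slow users, which is part of why the cheaper proxy patterns below are attractive.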

2) The Expert Witness Pattern
Keep the requirement but prefix it with 'In the opinion of a mutually agreed third-party expert, 95% of ...'. I guess this is similar to submitting a system to a third party for penetration testing.

3) The Empowered User Panel Pattern
Similar to the expert witness pattern but a panel of representative users, chosen from the stakeholders, are empowered to make the decision. Ideally, if you have access to the entire user group, then they can give a final yes/no decision.

4) The Inference Pattern
The NFR is quantified in terms of data that can be monitored after the system goes into production, for example by analysing logs to see how many users experience validation errors or give up. This might be well-suited to public-facing systems with huge user volumes.
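A minimal sketch of what this pattern might look like in practice, assuming the system emits structured log events (the event format and field names here are invented for the example; a real system would parse its own logs):

```python
# Illustrative proxy metric for a usability NFR, inferred from
# production logs: the fraction of "create case" attempts that were
# started but never completed. The log schema below is hypothetical.

def abandonment_rate(events):
    """Fraction of started cases that were never completed."""
    starts = sum(1 for e in events if e["action"] == "case_started")
    completions = sum(1 for e in events if e["action"] == "case_completed")
    if starts == 0:
        return 0.0
    return (starts - completions) / starts

# Hypothetical day of log events: 100 cases started, 92 completed.
log = (
    [{"user": f"u{i}", "action": "case_started"} for i in range(100)]
    + [{"user": f"u{i}", "action": "case_completed"} for i in range(92)]
)
print(abandonment_rate(log))  # 0.08
```

The NFR would then be phrased against the proxy (e.g. 'abandonment rate below 10% in the first month of production'), which trades the cost of a lab study for the delay of only learning the answer after go-live.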

Has anyone used anything like these or any others?

Re: Mastering the Requirements Process

Apologies. The link to the slides should point to here :o)

Re: Mastering the Requirements Process

Based on recent experience, I'm not sure it's all that easy to come up with NFRs. It's relatively easy to list the type of NFRs that you want but can be quite hard to get a commitment on a realistic figure for each requirement. You're right that defining a metric for that requirement can then also be difficult!

I'm a bit wary of the "empowered user panel" approach described. Using stakeholders to subjectively verify the system's acceptability risks having a user group with a vested interest in the outcome of the test ("sorry, it's unusable without [pet-feature]"). They will (hopefully) also be familiar with the functional specification, and so probably won't provide a very reliable measure of usability.

I have seen the "expert witness" approach employed to give an opinion on usability. However, the feedback given was more appropriate for iterating the functional specification than for signing off against NFRs. Of course, that's not to say that all aspects of usability can be inferred from the functional specification! I think you have to take care to explain clearly to your expert witness what you're testing for, and to treat the defects, improvements and sign-off criteria in their feedback separately.
